March 3, 2021

Sample Chapter: Efficient PL/SQL

  • By Beck, Kallman, Katz, Knox, and McDonald

As a preliminary step, we collect the departmental summary detail and add it to our PL/SQL table. Because deptno is numeric, it can serve directly as the index into our PL/SQL table.

 19   for i in c_dept_salaries loop
 20     v_dept_sal(i.deptno).avg_dept_sal := i.asal;
 21     v_dept_sal(i.deptno).min_dept_sal := i.msal;
 22     v_dept_sal(i.deptno).dname        := i.dname;
 23   end loop;
 24   for each_emp in c_emp_list loop

In our main processing loop, we no longer need any lookups to the dept table. We simply reference the appropriate information from the PL/SQL table (v_dept_sal).

 25      if abs(each_emp.sal -
            v_dept_sal(each_emp.deptno).avg_dept_sal) /
 26         v_dept_sal(each_emp.deptno).avg_dept_sal > 0.20 then
 27        if v_dept_sal(each_emp.deptno).min_dept_sal =
               each_emp.sal then
 28          insert into emp_sal_log
 29            values (each_emp.ename, each_emp.hiredate,
 30                    each_emp.sal,
                       v_dept_sal(each_emp.deptno).dname, 'N');
 31         else
 32           insert into emp_sal_log
 33            values (each_emp.ename, each_emp.hiredate,
 34                    each_emp.sal,
                       v_dept_sal(each_emp.deptno).dname, 'Y');
 35         end if;
 36       end if;
 37     end loop;
 38   end;
 39 /

Procedure created.
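The cache-the-lookup pattern used here is not specific to PL/SQL. As an illustration only (with invented sample data, not the chapter's EMP/DEPT rows), the following Python sketch shows the same idea: a dict keyed by deptno plays the role of the PL/SQL table, so the main loop never re-queries the department data.

```python
# Python analogue of the PL/SQL-table cache (sample data is invented).
# Pass 1 builds a dict keyed by deptno; pass 2 then does O(1) lookups
# instead of a department query per employee.

dept_rows = [  # (deptno, dname, avg_sal, min_sal) — one summary row per dept
    (10, "ACCOUNTING", 2900.0, 1300.0),
    (20, "RESEARCH", 2175.0, 800.0),
]
emp_rows = [  # (ename, deptno, sal)
    ("KING", 10, 5000.0),
    ("MILLER", 10, 1300.0),
    ("SMITH", 20, 800.0),
]

# Pass 1: cache the departmental summary, keyed by deptno.
dept_sal = {deptno: {"dname": dname, "avg": avg, "min": mn}
            for deptno, dname, avg, mn in dept_rows}

# Pass 2: flag employees deviating more than 20% from their dept average.
log = []
for ename, deptno, sal in emp_rows:
    d = dept_sal[deptno]  # O(1) lookup, no extra query
    if abs(sal - d["avg"]) / d["avg"] > 0.20:
        flag = "N" if sal == d["min"] else "Y"
        log.append((ename, sal, d["dname"], flag))
```

As in the PL/SQL version, the trade-off is one extra pass to build the cache in exchange for removing a per-row lookup from the main loop.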

At the cost of a little more code complexity, we have reduced our processing to a single pass through the dept table and a single pass through the emp table. Look at the impressive results we get when testing with the REPTEST.SQL script (after adjusting it to run report_sal_adjustment3).

Table 1-4. Further Refinements to REPORT_SAL_ADJUSTMENT


We have improved performance significantly. More importantly, the scalability now appears to be linear as opposed to exponential. However, with this solution, a new problem lurks under the covers. Storing the departmental information obviously consumes some memory. To see exactly how much memory is being chewed up, we can look at the session statistics using the V$MYSTATS view we defined earlier, after running the test with 5,000 departments.

SQL> col value format 999,999,999
SQL> select * from v$mystats
  2  where name = 'session pga memory max'
  3  /

NAME                                               VALUE
----------------------------------------    ------------
session pga memory max                         3,669,024

Our session used nearly 4 megabytes of memory. But we can now handle more than 50,000 employees within our time constraints. Problem solved...well, not really.
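That memory cost grows with the number of departments cached. As a rough in-process analogue (this is not Oracle PGA accounting, just an illustration of the growth pattern), the following Python sketch sizes a dict-based lookup table at two department counts:

```python
import sys

def cache_size_bytes(n_depts):
    """Rough footprint of a dict caching n_depts summary rows.
    Counts only the dict and its tuple values (an underestimate),
    but enough to show growth proportional to the department count."""
    cache = {d: ("DEPT%05d" % d, 2500.0, 800.0) for d in range(n_depts)}
    return sys.getsizeof(cache) + sum(sys.getsizeof(v) for v in cache.values())

small = cache_size_bytes(500)
large = cache_size_bytes(5000)
# Ten times the departments costs roughly ten times the memory.
```

The point is the same as with the PL/SQL table: an in-memory lookup structure trades memory for speed, and that memory bill scales with the number of cached rows per session.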

A solution exists that is even more efficient and has none of the associated memory overhead. In adding greater levels of complexity to the procedural solution, our developer has been getting further and further from the optimal result. We did not need all that complexity in our PL/SQL—we didn't need PL/SQL at all! The problem can be resolved with SQL using analytic functions. (Note that you will need to use Oracle 9 for this routine to compile as presented. If you have version 8, the insert statement will need to run as dynamic SQL; that is, wrapped within the EXECUTE IMMEDIATE command.)

SQL> create or replace
  2  procedure report_sal_adjustment4 is
  3  begin
  4    insert into emp_sal_log
  5    select e.empno, e.hiredate, e.sal, dept.dname,
  6           case when sal > avg_sal then 'Y'
  7                else 'N'
  8           end
  9    from (
 10      select empno, hiredate, sal, deptno,
 11             avg(sal) over (partition by deptno) as avg_sal,
 12             min(sal) over (partition by deptno) as min_sal
 13      from emp) e, dept
 14    where e.deptno = dept.deptno
 15    and abs(e.sal - e.avg_sal) / e.avg_sal > 0.20;
 16  end;
 17  /

Procedure created.

And that's it! Our procedure has been reduced to a single SQL statement. No lookup tables, no complicated code, just simple SQL. And when we benchmark it using the REPTEST.SQL script, we get the following astounding scalability results shown in Table 1-5.
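The partition-wise AVG pattern at the heart of this rewrite is standard SQL, so you can experiment with it outside Oracle as well. Here is a hedged sketch using Python's built-in sqlite3 module (SQLite 3.25+ supports window functions); the tables and rows are invented stand-ins for EMP and DEPT, not the chapter's test data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    create table dept (deptno integer primary key, dname text);
    create table emp  (empno integer, ename text, sal real, deptno integer);
    insert into dept values (10, 'ACCOUNTING'), (20, 'RESEARCH');
    insert into emp values
        (7839, 'KING',   5000, 10), (7934, 'MILLER',  1300, 10),
        (7369, 'SMITH',   800, 20), (7566, 'JONES',   2975, 20);
""")

# Analytic rewrite: one statement computes the per-department average
# in an inline view and flags employees deviating more than 20% from it.
rows = conn.execute("""
    select e.ename, e.sal, d.dname,
           case when e.sal > e.avg_sal then 'Y' else 'N' end as flag
    from (select ename, sal, deptno,
                 avg(sal) over (partition by deptno) as avg_sal
          from emp) e
         join dept d on d.deptno = e.deptno
    where abs(e.sal - e.avg_sal) / e.avg_sal > 0.20
    order by e.ename
""").fetchall()
```

As in the Oracle version, each employee row carries its department's average alongside it, so the filter and the flag both come from a single pass over the data.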




This article was originally published on February 5, 2004
