Wednesday, 15 October 2014

Using batch select (BULK COLLECT) for fetching records in Python.

This content will be helpful for those who have worked with BULK COLLECT in PL/SQL.

BULK COLLECT fetches records in batches rather than one at a time, which helps optimize data retrieval.

Here is a block of PL/SQL which uses BULK COLLECT for query optimization.

declare
cursor rec is select * from employee_master;
type emprows is TABLE of employee_master%rowtype;
t_emprows emprows;
begin
open rec;
loop
  -- fetch at most 50 rows per round trip
  fetch rec BULK COLLECT into t_emprows LIMIT 50;
  exit when t_emprows.count = 0;
  for i in t_emprows.first..t_emprows.last loop
    dbms_output.put_line(t_emprows(i).emp_name);
  end loop;
end loop;
close rec;
end;

In the above example you can see how BULK COLLECT with the LIMIT clause restricts the number of rows fetched into memory at a time. Instead of one fetch (and one context switch between the PL/SQL and SQL engines) per row, you get one per batch of 50, so a 10,000-row table needs only 200 fetches, and memory usage stays bounded. This speeds up the overall fetch.

The same thing can be done in Python. Here is an example which fetches records from a table and writes them into a file in CSV format.


import cx_Oracle
# con = cx_Oracle.connect('username/password@[ipaddress or hostname]/SID')
# for example con = cx_Oracle.connect('scott/tiger@127.0.0.1/orcl')
con = cx_Oracle.connect('GREENPLY/GREENPLY@192.168.0.109/orcl.localdomain')
cur = con.cursor()
cur.execute('select emp_code,emp_name from employee_master order by 2')
outfile = open("file.csv", "w")
while True:
    # fetch at most 50 rows per round trip, like BULK COLLECT ... LIMIT 50
    rows = cur.fetchmany(50)
    if not rows:
        break
    for row in rows:
        outfile.write(row[0] + ',' + row[1] + '\n')
outfile.close()
cur.close()
con.close()
print("File successfully exported")
