Vorträge im Juni 2018 / Events in June 2018

Dear English readers - please note that the events themselves will be held in German.

June again brings a number of talks worth attending. Since I will be speaking at both events myself, these two are naturally close to my heart:

The first is the "Oracle" focus day on 12.06.2018 at Robotron in Wil. The agenda is very interesting, and you can (and should) register today, before all seats are taken. Topics include, among others: Database 18c new features, and (unknown or unused) features of the Standard Edition 2 and Enterprise Edition database without additional packs such as the Diagnostics and Tuning Pack.

The second event is the Special Day of the Swiss Oracle User Group (SOUG) two days later, on 14.06.2018, at the Trafo in Baden - also a must-attend for everyone working with Oracle products in Switzerland.

I would be happy to see some new and, of course, some familiar faces at these events. By the way - in case you have not heard yet: my employer, Robotron Schweiz GmbH, is now a SOUG sponsor!

Dynamic datapump export without expdp client using DBMS_DATAPUMP API and SQL*Plus (also for PDBs)

It has been a long time since my last blog entry, but I was busy with a larger customer project in which we had to migrate roughly 30 databases to a new environment.

As the project was nearly finished, our customer handed me a script to be scheduled via cron.
The script was supposed to perform a full export every Sunday and, on all other days, a schema export of every schema whose default tablespace is neither SYSTEM nor SYSAUX.

What I saw then was unbelievable; I could hardly stop shaking my head.
The previous consulting company had created a procedure stored in the SYS schema. As if that were not bad enough, this procedure was scheduled from inside the database (using DBMS_JOB), and its only task was - are you sitting down? - to dynamically create a Linux shell script, with a full export for Sunday and the schema exports for all other days.
The last step, and the next reason to shake my head: this dynamically generated script was then scheduled via cron. I hardly need to mention that the export was not made consistent with any FLASHBACK_ parameter.

What the hell...

First rule: Never put your own code into the SYS schema of the database, especially not when it is scheduled inside the database.
Second rule: Don't make things more complicated than they are; use your brain, but keep the moving parts simple and understandable (I deliberately avoid saying "stupid").
Third rule: Don't rely on a job cascade that combines schedules inside and outside the database. The worst part here: if the database fails to write the dynamic shell script, a newly added schema will never be exported - and you may never get a full export (or you get a full export every time).

Well, these three points and quite a few more convinced me to streamline the export a little:
  1. Keep it simple and easy
  2. Make only one job
  3. Don't put any own code into the SYS schema
  4. Make the script dynamic
  5. Make the export consistent
  6. Add own logfile entries

So I created a new script. You can copy and paste it into your editor (notepad/vi) and save it (you may need to fix smart quotation marks introduced by copy/paste). The script exports to the temp directory, using the DB_UNIQUE_NAME as subdirectory. Please create this directory before you start the export, or add "host mkdir -p /tmp/&lv_db" after the create or replace directory statement to create it dynamically as well.
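With that optional addition in place, the directory part of the script would look like this (the create directory statement is the same one you will find in the script below; only the host call is added):

  create or replace directory "DATA_PUMP_DIR" as '/tmp/&lv_db';
  host mkdir -p /tmp/&lv_db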

Afterwards, you can run the script by simply starting SQL*Plus as SYS/SYSTEM and passing the script name. For example, if you save it as JoergsDatapumpExport.sql, you can start it (after setting the environment with oraenv) by calling: $ sqlplus / as sysdba @JoergsDatapumpExport.sql
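To meet the original scheduling requirement, a single crontab entry is then enough, since the script itself decides between the FULL export (Sunday) and the SCHEMA export (all other days). A sketch - the time, the path, and sourcing the profile for the Oracle environment are hypothetical examples, not taken from the actual setup:

  # crontab of the oracle user - one entry is enough
  0 2 * * * . ~/.profile && sqlplus -S / as sysdba @/home/oracle/scripts/JoergsDatapumpExport.sql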

I hope this script is helpful for someone. If you connect to a PDB with SQL*Plus (and have added the mkdir part), you can dynamically export from any PDB. There are also some enhancements I made for my test environment that you could add: parallel export support depending on the number of cores you can use, or stopping the Data Pump export when there is no schema to export - before ugly messages are written to the logfile.
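Inside the PL/SQL block, those two enhancements could be sketched like this (a rough sketch only; the counter name, the degree value, and the error text are my assumptions and not part of the script below):

  l_schema_count pls_integer := 0;   -- add to the declare section
  ...
  -- inside the cursor loop: l_schema_count := l_schema_count + 1;
  -- after the loop, before METADATA_FILTER: stop cleanly if nothing is left to export
  if l_schema_count = 0 then
    DBMS_DATAPUMP.STOP_JOB(l_dp_handle);
    raise_application_error(-20001, 'No schema to export.');
  end if;
  -- parallel export; with a degree > 1, add the %U wildcard to the dump file name
  DBMS_DATAPUMP.SET_PARALLEL(handle => l_dp_handle, degree => 4);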

Comments are welcome.




WHENEVER SQLERROR EXIT FAILURE;
set serveroutput on
set feedback off
column db_unique_name new_value lv_db
select db_unique_name from v$database;
create or replace directory "DATA_PUMP_DIR" as '/tmp/&lv_db';

declare
  l_dp_handle    NUMBER;
  v_day          varchar2(100);
  v_host         varchar2(64);
  v_db           varchar2(30);
  v_schemaexport varchar2(1000);
  l_current_scn  number;

  -- all schemas that do not use SYSTEM or SYSAUX as default tablespace
  cursor c1 is
    select username
      from dba_users
     where default_tablespace not in ('SYSTEM','SYSAUX')
       and username not in ('PERFSTAT');

BEGIN
  select rtrim(to_char(sysdate,'DAY')) into v_day from dual;
  select host_name into v_host from v$instance;
  select db_unique_name, current_scn into v_db, l_current_scn from v$database;

  -- FULL export on Sunday, SCHEMA export on all other days
  if v_day in ('SUNDAY','SONNTAG') then
    l_dp_handle := DBMS_DATAPUMP.open(operation   => 'EXPORT',
                                      job_mode    => 'FULL',
                                      remote_link => NULL,
                                      version     => 'LATEST');
  else
    l_dp_handle := DBMS_DATAPUMP.open(operation   => 'EXPORT',
                                      job_mode    => 'SCHEMA',
                                      remote_link => NULL,
                                      version     => 'LATEST');
  end if;

  -- dump and log file names contain host, database and weekday
  DBMS_DATAPUMP.add_file(handle    => l_dp_handle,
                         filename  => v_host||'_'||v_db||'_'||v_day||'.dmp',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE,
                         reusefile => 1);
  DBMS_DATAPUMP.add_file(handle    => l_dp_handle,
                         filename  => v_host||'_'||v_db||'_'||v_day||'.log',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE,
                         reusefile => 1);

  -- make the export consistent as of the current SCN
  DBMS_DATAPUMP.SET_PARAMETER(handle => l_dp_handle,
                              name   => 'FLASHBACK_SCN',
                              value  => l_current_scn);

  if v_day in ('SUNDAY','SONNTAG') then
    DBMS_DATAPUMP.log_entry(handle  => l_dp_handle,
                            message => 'FULL Export.');
  else
    DBMS_DATAPUMP.log_entry(handle  => l_dp_handle,
                            message => 'SCHEMA Export (without schemas with SYSTEM and SYSAUX default tablespace).');
    -- build the IN list for the schema filter from the cursor
    v_schemaexport := 'IN (';
    for c1_rec in c1 loop
      if c1%rowcount = 1 then
        v_schemaexport := v_schemaexport||''''||c1_rec.username||'''';
      else
        v_schemaexport := v_schemaexport||','||''''||c1_rec.username||'''';
      end if;
    end loop;
    v_schemaexport := v_schemaexport||')';
    DBMS_DATAPUMP.METADATA_FILTER(l_dp_handle, 'SCHEMA_EXPR', v_schemaexport);
  end if;

  DBMS_DATAPUMP.log_entry(handle  => l_dp_handle,
                          message => 'Please wait until the export is completed.');

  DBMS_DATAPUMP.log_entry(handle        => l_dp_handle,
                          message       => 'Export Parameters: Flashback_SCN='||to_char(l_current_scn),
                          log_file_only => 0);

  DBMS_DATAPUMP.log_entry(handle        => l_dp_handle,
                          message       => 'Start time: '||to_char(sysdate,'DD.MM.YYYY HH24:MI:SS'),
                          log_file_only => 0);

  DBMS_DATAPUMP.START_JOB(l_dp_handle);
END;
/
exit;
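Once START_JOB has been called, SQL*Plus exits while the Data Pump job keeps running in the background with its own worker processes. You can watch its progress, for example, with:

  select owner_name, job_name, operation, job_mode, state from dba_datapump_jobs;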