• Recovering a dropped table with ODU -- mining the object_id via LogMiner


    Create a test table:
    conn liang/chengce243
    create table tab_01(id int,name VARCHAR2(128));
    insert into tab_01
    select OBJECT_ID,OBJECT_NAME
    from dba_objects where rownum<=10000;
    commit;
     
    create table tab_02 as select * from tab_01;
    drop table tab_02 purge;
     
    The recovery procedure is:
    1. Take the tablespace containing the table OFFLINE
    2. Use LogMiner to mine the redo logs for the dropped table's data object id
    3. Scan the data: scan extent
    4. If the table structure is unknown, let ODU infer it: unload object data_object_id sample
    5. Recover the table: unload object data_object_id column coltype coltype...
    6. Bring the tablespace back ONLINE
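
    In ODU-prompt terms, steps 3-5 map onto the command sequence below (placeholders in angle brackets; the concrete invocations appear later in this walkthrough):

```
ODU> scan extent tablespace <ts#>
ODU> unload object <data_object_id> sample
ODU> unload object <data_object_id> tablespace <ts#> column <type1> <type2> ...
```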


     
    First take the USERS tablespace, which holds the test table, offline:
    alter tablespace users offline;
     
    Use LogMiner to find the data object id of the dropped table:
    select group#,status from v$log;
     
    col member for a50
    select member from v$logfile where group#=2;
    MEMBER
    --------------------------------------------------
    /u01/oradata/orcl/redo02.log
     
    exec sys.dbms_logmnr.add_logfile(logfilename=>'/u01/oradata/orcl/redo02.log');
    exec sys.dbms_logmnr.start_logmnr(options=>sys.dbms_logmnr.dict_from_online_catalog);
     
    set pagesize 999
    set linesize 160
    col scn for 999999999
    col timestamp for a30 
    col sql_redo for a60 
    ALTER session SET nls_date_format='yyyy-mm-dd hh24:mi:ss';
    select scn,timestamp,sql_redo from v$logmnr_contents where operation='DDL' and sql_redo like '%tab_02%' order by 2 ;
           SCN TIMESTAMP SQL_REDO
    ---------- ------------------------------ ------------------------------------------------------------
        976977 2019-03-07 16:26:59 create table tab_02 as select * from tab_01;
        977005 2019-03-07 16:27:00 drop table tab_02 purge;
     
    select scn,timestamp,sql_redo from v$logmnr_contents where timestamp=to_date('2019-03-07 16:27:00','yyyy-mm-dd hh24:mi:ss') order by 1;
     
           SCN TIMESTAMP SQL_REDO
    ---------- ------------------------------ ------------------------------------------------------------
        977001 2019-03-07 16:27:00 set transaction read write;
        977001 2019-03-07 16:27:00 /* No SQL_REDO for temporary tables */
        977002 2019-03-07 16:27:00 commit;
        977004 2019-03-07 16:27:00 set transaction read write;
        977004 2019-03-07 16:27:00
        977005 2019-03-07 16:27:00 drop table tab_02 purge;
        977008 2019-03-07 16:27:00 /* No SQL_REDO for temporary tables */
        977008 2019-03-07 16:27:00 Unsupported
        977009 2019-03-07 16:27:00
        977009 2019-03-07 16:27:00
        977009 2019-03-07 16:27:00
        977009 2019-03-07 16:27:00 Unsupported
        977009 2019-03-07 16:27:00
        977009 2019-03-07 16:27:00
        977009 2019-03-07 16:27:00
        977010 2019-03-07 16:27:00 Unsupported
        977010 2019-03-07 16:27:00 delete from "SYS"."OBJ$" where "OBJ#" = '87360' and "DATAOBJ
           #" = '87360' and "OWNER#" = '84' and "NAME" = 'TAB_02' and "
           NAMESPACE" = '1' and "SUBNAME" IS NULL and "TYPE#" = '2' and
            "CTIME" = TO_DATE('2019-03-07 16:26:59', 'yyyy-mm-dd hh24:m
           i:ss') and "MTIME" = TO_DATE('2019-03-07 16:26:59', 'yyyy-mm
           -dd hh24:mi:ss') and "STIME" = TO_DATE('2019-03-07 16:26:59'
           , 'yyyy-mm-dd hh24:mi:ss') and "STATUS" = '1' and "REMOTEOWN
           ER" IS NULL and "LINKNAME" IS NULL and "FLAGS" = '0' and "OI
           D$" IS NULL and "SPARE1" = '6' and "SPARE2" = '1' and "SPARE
           3" = '84' and "SPARE4" IS NULL and "SPARE5" IS NULL and "SPA
           RE6" IS NULL and ROWID = 'AAAAASAABAAAVCnAAa';
     
        977011 2019-03-07 16:27:00
        977011 2019-03-07 16:27:00
        977011 2019-03-07 16:27:00
        977011 2019-03-07 16:27:00
        977012 2019-03-07 16:27:00 Unsupported
        977012 2019-03-07 16:27:00 Unsupported
        977013 2019-03-07 16:27:00 commit;
        977016 2019-03-07 16:27:00 set transaction read write;
        977016 2019-03-07 16:27:00 Unsupported
        977016 2019-03-07 16:27:00 Unsupported
        977017 2019-03-07 16:27:00 commit;
        977017 2019-03-07 16:27:00 set transaction read write;
        977017 2019-03-07 16:27:00
        977020 2019-03-07 16:27:00 Unsupported
        977021 2019-03-07 16:27:00 commit;
     
    In the rows at SCN 977010, the statement delete from "SYS"."OBJ$" where "OBJ#" = '87360' and "DATAOBJ#" = '87360' and "OWNER#" = '84' shows that the dropped table's data object id is 87360.
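
    A more targeted way to locate that dictionary row is to filter on the OBJ$ delete directly; the predicates below are a sketch built on the standard V$LOGMNR_CONTENTS columns (OPERATION, SEG_OWNER, TABLE_NAME), not part of the original walkthrough. The LogMiner session should also be closed once the id has been found:

```sql
-- narrow the search to the dictionary delete generated by the DROP
select scn, sql_redo
  from v$logmnr_contents
 where operation = 'DELETE'
   and seg_owner = 'SYS'
   and table_name = 'OBJ$'
   and sql_redo like '%TAB_02%';

-- release the LogMiner session
exec sys.dbms_logmnr.end_logmnr;
```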
     
    set linesize 180
    col FILE_NAME for a50
    col VALUE for a8
    select ts#,file_id fno#,relative_fno rfno#,file_name,value
     from v$tablespace t,dba_data_files d,v$parameter p
     where t.name=d.tablespace_name
     and p.name='db_block_size';
    Paste the query result into control.txt.
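
    ODU's control.txt is expected to mirror those columns, one datafile per line. A plausible entry for this database follows; the datafile path here is hypothetical, and the exact field layout should be checked against the ODU documentation:

```
4 4 4 /u01/oradata/orcl/users01.dbf 8192
```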
     
    Edit the configuration file config.txt and set output_format to DMP:
    [oracle@hncdf odu]$ cat config.txt 
    byte_order little
    block_size 8192
    db_timezone -7
    client_timezone 8
    data_path data
    charset_name ZHS16GBK
    ncharset_name AL16UTF16
    output_format DMP
    lob_storage infile
    clob_byte_order little
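
    The output_format switch can also be flipped non-interactively; this is a hypothetical helper, and a sample config.txt is created here only for illustration (point sed at the real file in the ODU directory):

```shell
# create a small stand-in config.txt for demonstration purposes
printf 'output_format TEXT\nlob_storage infile\n' > config.txt
# rewrite the output_format line in place (GNU sed)
sed -i 's/^output_format .*/output_format DMP/' config.txt
grep '^output_format' config.txt    # prints: output_format DMP
```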
     
    Use ODU's sampling to infer the column data types automatically:
    ODU> scan extent tablespace 4;
     
    scan extent start: 2019-03-07 16:45:46
    scanning extent...
    scanning extent finished.
    scan extent completed: 2019-03-07 16:45:46
     
    ODU> unload object 87360 sample
     
    Unloading Object,object ID: 87360, Cluster: 0
    output data is in file : 'data/ODU_0000087360.txt' 
     
    Sample result:
      object id: 87360
      tablespace no: 4
      sampled 1022 rows
      column count: 2
      column 1 type: NUMBER
      column 2 type: VARCHAR2
     
    COMMAND: 
    unload object 87360 tablespace 4 column NUMBER VARCHAR2 
     
    ODU> unload object 87360 tablespace 4 column NUMBER VARCHAR2
     
    Unloading Object,object ID: 87360, Cluster: 0
    10000 rows unloaded
     
    Bring the USERS tablespace back online:
    SQL> alter tablespace users online;
     
    Import the data with imp:
    imp liang/chengce243 file=data/ODU_0000087360.dmp fromuser=liang touser=liang
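
    A quick sanity check after the import; note that the table name ODU writes into the dump may be a generated one (e.g. ODU_0000087360) rather than the original TAB_02, so list the imported tables first:

```sql
-- find the imported table, then compare against the surviving copy
select table_name from user_tables order by table_name;
select count(*) from tab_01;  -- baseline: the test setup inserted 10000 rows
```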
     
    Reference: http://www.laoxiong.net/category/odu
     
  • Source: https://www.cnblogs.com/liang545621/p/12611603.html