i****a Posts: 36252 | 1 SQL 2008
Is it possible to copy a table from DB1 to DB2? I am not talking about
SELECT/INSERT; I want to copy the whole table over as quickly as possible.
There are a couple of large tables that need to be copied to a new database, and
INSERT will take forever. I am checking whether I can use Red Gate's SQL Backup and
object-level restore in some way. In the meantime, any suggestions? |
g***l Posts: 18555 | 2 Tables are objects; there are properties you have to set, like ownership and
user. What if a user does not exist, or a table with the same name already exists in the
target database? You have to make manual decisions. Copying a table is no
different from re-creating the table, setting its properties, and copying the data
over. |
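A minimal T-SQL sketch of that kind of pre-flight check, with hypothetical object names (DB2 as the target database, dbo.BigTable as the table to copy):

-- Bail out if the table already exists in the target database before recreating it there.
IF OBJECT_ID(N'DB2.dbo.BigTable', N'U') IS NOT NULL
    RAISERROR(N'dbo.BigTable already exists in DB2 - resolve manually before copying.', 16, 1);
ELSE
    PRINT N'dbo.BigTable does not exist in DB2 yet - safe to create it there.';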
gy Posts: 620 | 3 We have a table with over 4M rows. The developer used the SELECT/INSERT method, which took
over 20 minutes and made the log file grow very big (over 20GB). After I switched to
SSIS, it took less than 3 minutes, and the log file did not actually grow...
(BTW, it is one step of a daily job) |
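A rough T-SQL analogue of that kind of bulk, minimally logged load, assuming the target database is in SIMPLE or BULK_LOGGED recovery and the destination is an empty heap (conditions under which SQL 2008 can minimally log such an insert); the table names are placeholders:

-- Bulk-style copy into an empty heap; minimally logged under the assumptions above.
INSERT INTO DB2.dbo.BigTable WITH (TABLOCK)
SELECT *
FROM DB1.dbo.BigTable;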
i****a Posts: 36252 | 4 Yeah, I'd need to do a bulk insert, or insert in batches.
I have 2 tables, 17M and 58M rows. I am trying to see if I can get away from
copying data via SQL.
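A minimal sketch of the insert-in-batches approach, assuming an ascending integer key named Id to walk; the object names and the 500,000-row batch size are illustrative:

-- Copy in 500k-row chunks; each batch is its own transaction, so the log can
-- truncate between batches under SIMPLE recovery.
DECLARE @batch INT = 500000, @maxId BIGINT = 0;
WHILE 1 = 1
BEGIN
    INSERT INTO DB2.dbo.BigTable (Id, Col1, Col2)
    SELECT TOP (@batch) s.Id, s.Col1, s.Col2
    FROM DB1.dbo.BigTable AS s
    WHERE s.Id > @maxId
    ORDER BY s.Id;

    IF @@ROWCOUNT = 0 BREAK;                          -- nothing left to copy

    SELECT @maxId = MAX(Id) FROM DB2.dbo.BigTable;    -- advance the watermark
END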
【Quoted from gy's post #3 above】
|
i****a Posts: 36252 | 5 The target database is a new, blank DB, but I don't think it's possible.
Even object-level restore wouldn't let me restore to a different DB.
【Quoted from g***l's post #2 above】
|
j*****n Posts: 1781 | 6 1. generate CREATE TABLE scripts (including indexes) from the source;
2. run the CREATE TABLE scripts on the target, but hold off on the non-clustered
index scripts for now;
3. use the Data Export Wizard, set up identity insert if needed; save it as a
package file (.dtsx) but don't execute it;
4. open BIDS and create a new SSIS project;
5. add the package to your project;
6. in the data flow task, change the properties of the data destination to
commit once every 500,000 rows;
7. make sure your DB recovery model is SIMPLE (you can always change it back
to the original setting afterwards; see the T-SQL sketch after this list);
8. run the package in BIDS; 58M rows shouldn't take more than a few hours;
9. create all the indexes when the transfer is done.
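The T-SQL side of steps 3 and 7, as a sketch; TargetDB and the table name are placeholders, and the original recovery model is assumed to be FULL:

-- Step 7: switch the target DB to SIMPLE recovery for the load.
ALTER DATABASE TargetDB SET RECOVERY SIMPLE;

-- Step 3: only needed if the destination must keep the source identity values and
-- you are loading with plain INSERTs rather than the wizard's identity-insert option.
SET IDENTITY_INSERT TargetDB.dbo.BigTable ON;
-- ... the load runs here ...
SET IDENTITY_INSERT TargetDB.dbo.BigTable OFF;

-- Afterwards, put the recovery model back (assumed FULL here).
ALTER DATABASE TargetDB SET RECOVERY FULL;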
【Quoted from i****a's post #4 above】
|
j*****n Posts: 1781 | 7 Attached a pic for your reference.
【Quoted from j*****n's post #6 above】
|
g***l Posts: 18555 | 8 Isn't BCP the tool made exactly for this kind of job?
【Quoted from i****a's post #5 above】
|
i****a Posts: 36252 | 9 Thanks. I am familiar with SSIS, but we probably don't have hours, and the
uncertainty about the actual time needed is going to throw the deployment off. I
can try this at a smaller scale on the staging server, but we can't afford any
surprises in production.
That's why I am trying to look at alternatives.
I think it has come down to restoring a backup of the original DB under a new
name, stripping that copy down to just those 2 data tables, and
deploying the new schema and objects onto it. The backup/restore will
take a long time too, but at least the time it takes is known.
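A sketch of that restore-under-a-new-name step; the backup path, logical file names, and target file paths are placeholders, so check the real logical names with RESTORE FILELISTONLY first:

-- List the logical file names inside the backup.
RESTORE FILELISTONLY FROM DISK = N'D:\Backups\SourceDB.bak';

-- Restore the backup as a separately named copy, relocating its files.
RESTORE DATABASE SourceDB_Copy
FROM DISK = N'D:\Backups\SourceDB.bak'
WITH MOVE N'SourceDB_Data' TO N'D:\Data\SourceDB_Copy.mdf',
     MOVE N'SourceDB_Log'  TO N'D:\Logs\SourceDB_Copy_log.ldf',
     STATS = 10;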
【Quoted from j*****n's post #6 above】
|
j*****n Posts: 1781 | 10 Would you consider scheduling a job and letting it run overnight?
【Quoted from i****a's post #9 above】
|
g***l Posts: 18555 | 11 I've done this kind of thing before: BCP IN into an intermediate table, then RENAME and RECOMPILE. The swap itself
takes less than a second, but be sure to check the record counts before the swap, because BCP doesn't raise errors. |
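A sketch of that count-check-then-swap step, assuming the data has already been bulk loaded into a staging table with bcp in or BULK INSERT; all object names here are placeholders:

-- Compare row counts first, since the bulk load will not surface errors on its own.
DECLARE @src BIGINT, @stg BIGINT;
SELECT @src = COUNT_BIG(*) FROM SourceDB.dbo.BigTable;
SELECT @stg = COUNT_BIG(*) FROM dbo.BigTable_Staging;

IF @src = @stg
BEGIN
    BEGIN TRAN;
    EXEC sp_rename 'dbo.BigTable',         'BigTable_Old';
    EXEC sp_rename 'dbo.BigTable_Staging', 'BigTable';
    COMMIT;
    EXEC sp_recompile N'dbo.BigTable';   -- the RECOMPILE step from the post above
END
ELSE
    RAISERROR(N'Row count mismatch - staging table was not swapped in.', 16, 1);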
g***l Posts: 18555 | 12 But really the best setup for a table that needs a full dump this often is to keep it in a DB on one
server, with an identical DB on a second, offline server. Load and test
the data there first, then change the application server's connection
string, and the other server becomes the offline one. That is the more reliable way to do it.
BCP can only be run at night anyway, gives no real quality guarantee, and runs against production.
If the inventory has to be refreshed several times during the day and the data volume is huge,
BCP gets difficult. |