Regarding loading into the database


Question and Answer

Connor McDonald

Thanks for the question, sona.

Asked: February 10, 2017 - 7:08 am UTC

Last updated: February 23, 2017 - 2:54 am UTC

Version: SQL Developer 1.5.3

Viewed 1000+ times

You Asked

I have a 56k chunk of data. When I load this data into my database using TOAD, it takes almost 7-8 hours to load.

Can you please suggest a solution? When I use SQL Developer, errors such as "task rolled back" are thrown.


and Connor said...

56k ? Is that bytes, rows, megabytes ?

I've done a test on my laptop here that lets me load about 900,000 rows per second into a standard help table.

You need to give us (a lot) more information and context around what you are doing and what you want to achieve.


Comments

Loading large volume of data (around 56k) into database

sona sh, February 15, 2017 - 1:45 pm UTC

There are 56,000 rows in XLS format, and it takes almost 7-8 hours to load them into the Oracle database using TOAD.
Connor McDonald
February 16, 2017 - 3:40 am UTC

Save the file as CSV, and use SQL Loader or an external table.

Or see this blog

http://www.thatjeffsmith.com/archive/2014/12/sql-developer-4-1-easier-excel-imports/
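As a rough illustration of the SQL*Loader route suggested above, a minimal control file might look like the following. This is a sketch only: the column names, date format, and file path are assumptions, not details from the thread (HR_NEW_TABLE is the table named later in the comments).

```sql
-- org.ctl: hypothetical control file; column list is an assumption
OPTIONS (SKIP=1)                  -- skip the header row of the CSV
LOAD DATA
INFILE 'C:\Users\Desktop\Org.csv'
INTO TABLE hr_new_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( first_name
, last_name
, hire_date   DATE "YYYY-MM-DD"
)
```

It would then be run from the command line with something like `sqlldr userid=your_user control=org.ctl log=org.log` (credentials here are placeholders). For 56,000 rows this should take seconds, not hours.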

Loading large chunk of data into database

sona sh, February 16, 2017 - 10:11 am UTC

Great article, I would say, but I always get this error when loading the 56k data:

Import Data into table HR_NEW_TABLE from file C:\Users\Desktop\Org.xls. Task canceled and import rolled back. Task Canceled.

Could you please shed some light on the same?

Regards
Sona

RE

GJ, February 17, 2017 - 12:26 pm UTC

Did you try what Connor suggested (save as CSV) and use SQL*Loader?

"Import Data into table HR_NEW_TABLE from file C:\Users\Desktop\Org.xls"

"Task canceled and import rolled back. Task Canceled"
The file used is Org.xls, and I'm sure that's not a SQL*Loader error.

Loading large volume of data (around 56k) into database

sona sh, February 22, 2017 - 8:19 am UTC

The solution provided was really helpful, but my file's data contains name column fields such as (john,Sam), so loading the data using a control file will not give the relevant result. Can you please suggest something on the same?



Connor McDonald
February 23, 2017 - 2:54 am UTC

You can choose to skip header rows in the SQL Developer import.

[Image: SQLDEV_IMPORT_DATA]
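On the control-file concern raised above: if the name fields contain embedded commas, the usual fix is to quote those fields in the exported CSV and declare OPTIONALLY ENCLOSED BY '"'. An external table handles this the same way. The sketch below is an assumption-laden illustration; the directory path, table names, and columns are invented for the example, not taken from the thread.

```sql
-- Hypothetical external table over the exported CSV.
-- Directory path, table, and column names are assumptions.
CREATE OR REPLACE DIRECTORY data_dir AS 'C:\Users\Desktop';

CREATE TABLE org_ext (
  first_name VARCHAR2(50),
  last_name  VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                                          -- skip the header row
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('Org.csv')
);

-- Load into the target table in one pass
INSERT /*+ APPEND */ INTO hr_new_table
SELECT first_name, last_name FROM org_ext;
```

With the fields quoted in the CSV (e.g. "john,Sam"), the embedded comma is kept inside the value rather than splitting it into two columns.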

Loading large volume of data (around 56k) into database

sona sh, February 27, 2017 - 2:17 pm UTC

Thanks a lot for your help.