Question

I have a spreadsheet containing around 3,000 records, and I need to insert all of this data into a new table. A batch insert mechanism seems like a good fit here.

So I tried a simple example:

    <cfquery datasource="cse">
        insert into Names
        values
        <cfloop from="1" to="3000" index="i">
            ('#i#')
            <cfif i LT 3000>,</cfif>
        </cfloop>
    </cfquery>

But since SQL Server 2008 only allows 1,000 rows per INSERT ... VALUES statement, I am getting an error.

So how can I split the insert into separate batches of 999 records each and execute them?

Was it helpful?

Solution

You can use a BULK INSERT statement, which copes with extremely large datasets.

The data will need to be in a CSV file, and you'll need a variable holding the file's location.

    <cfquery datasource="cse">
        BULK INSERT Names
        FROM '#variables.sCSVLocation#'
    </cfquery>
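
For illustration, a minimal sketch of what that can look like (the path, the header-row assumption, and the terminator options are placeholders describing a typical CSV, not part of the original answer; note that the file must be readable by the SQL Server service account, since BULK INSERT runs on the database server):

    <!--- Sketch only: adjust the path and the WITH options to match the actual file --->
    <cfset variables.sCSVLocation = "C:\data\names.csv">
    <cfquery datasource="cse">
        BULK INSERT Names
        FROM '#variables.sCSVLocation#'
        WITH (
            FIRSTROW = 2,          <!--- skip a header row, if the CSV has one --->
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\r\n'
        )
    </cfquery>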

If you have a reason not to use BULK INSERT and want to break the insert into loops of 999 rows, then work out how many records are in the dataset and divide by 999 to get the number of times you need to loop, as sketched below.
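
A minimal sketch of that chunked approach, assuming the spreadsheet values have already been read into an array (the names array, the [colName] column, and the cf_sql_varchar type are all placeholders, not part of the original answer):

    <!--- Sketch only: "names" is a placeholder array holding the ~3000 spreadsheet values --->
    <cfset batchSize = 999>
    <cfloop from="1" to="#arrayLen(names)#" step="#batchSize#" index="start">
        <!--- Each pass inserts at most 999 rows, staying under the 1000-row limit --->
        <cfset finish = min(start + batchSize - 1, arrayLen(names))>
        <cfquery datasource="cse">
            INSERT INTO Names ([colName])
            VALUES
            <cfloop from="#start#" to="#finish#" index="i">
                (<cfqueryparam value="#names[i]#" cfsqltype="cf_sql_varchar">)
                <cfif i LT finish>,</cfif>
            </cfloop>
        </cfquery>
    </cfloop>

As a side benefit, cfqueryparam parameterizes the values, which the plain string interpolation in the question's code does not.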

Other tips

    <cfquery datasource="cse">
        <cfloop from="1" to="3000" index="i">
            <!--- Each SQL INSERT can only handle 1000 rows of data --->
            <cfif (i MOD 1000) EQ 1>
                INSERT INTO Names ([colName])
                VALUES
            </cfif>
            ('#i#')
            <cfif i LT 3000><cfif (i MOD 1000) NEQ 0>,</cfif>#CHR(13)##CHR(10)#</cfif>
        </cfloop>
    </cfquery>
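
This starts a fresh INSERT INTO ... VALUES statement every 1,000 rows (SQL Server's per-statement limit for a VALUES list), so the single cfquery sends three 1,000-row statements to the server in one batch.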
Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow