Recommended record count per transform? (1 reply)
Arline, if you're using v4.0 or higher I would not expect any issue with this amount of data.
Even with earlier versions this should be fine, but it will take quite a bit of time to chug through.
Realisable Software Ltd provides code-free, cost-effective applications integration solutions for SMEs. Our core IMan product is designed to integrate almost any application with a number of Sage solutions and online payment processors.
Looking to purchase IMan, please see our resellers here.
Realisable Software
Ph: +44 (0) 208 123 1017
Copyright © Realisable. All rights reserved.
Realisable is a registered trademark
I'm building a few transforms that insert data from an SQL view into an Orchid Extender table in Sage 300.
The last one I did is meant to insert about 112,000 rows into the table. Is that too much? I got a 'timeout' message on the scheduler screen, and when I query the AUDITLOG table I see that the process started but never finished; no errors were logged. The server seems sluggish as well, so I'm not sure where or how to isolate the issue.
Just curious if in general I should build a transform to handle the first X rows of data, then do a second pass that handles the next X rows that have not already been processed, or what the best practice should be there.
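For the "first X rows, then the next X rows" idea, one common pattern is keyset pagination: track the last key value you processed and fetch only rows beyond it on each pass. The sketch below is a minimal illustration using an in-memory SQLite table; the table and column names (`source_view`, `id`, `payload`) are hypothetical stand-ins for your actual view, not anything specific to IMan or Sage 300.

```python
import sqlite3

# Stand-in for the SQL view being read (hypothetical names and data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_view (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO source_view VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(1, 251)])

BATCH_SIZE = 100   # tune to whatever the target table handles comfortably
last_id = 0        # high-water mark: the last key already processed
batch_sizes = []

while True:
    # Keyset pagination: fetch only rows beyond the last processed key.
    rows = conn.execute(
        "SELECT id, payload FROM source_view WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, BATCH_SIZE),
    ).fetchall()
    if not rows:
        break
    # ...insert this batch into the target table here, then advance the mark
    # so a rerun after a failure resumes where it left off...
    last_id = rows[-1][0]
    batch_sizes.append(len(rows))

print(batch_sizes)  # → [100, 100, 50]
```

Because the query filters on `WHERE id > last_mark` rather than using `OFFSET`, each pass stays fast even deep into the table, and persisting the high-water mark between runs gives you the "skip rows already processed" behaviour for free.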
Thanks for any advice.