Add batch size option to data deployment
It would be great to be able to pick the batch size used when doing data deployments.
We have to work with some orgs using poorly performing managed packages that are blowing up due to CPU limits.
We don't have many options, so being able to reduce the batch size, and hence the CPU load, would be really useful.
This is now complete and can be set at a team level at https://app.gearset.com/account/data-settings
The data loader that I am paying for is practically useless to me if I cannot specify the batch size. You even have batches in metadata deployments when comparing metadata between sandboxes, so why can't you use a similar feature for data deployments? You would essentially be bulkifying the data, which would give you a competitive advantage over products like Autorabit. It took us years to get rid of Autorabit and adopt Gearset, and it is exactly for this type of reason: the data loader in Gearset (although it has come a long way) is not as robust.
Gab Harbour commented
This is a no-brainer. You should implement this ASAP; otherwise you can't migrate data to orgs that have Apex or flows running, which is 90% of enterprise orgs.