Help us improve Gearset
We love getting feedback from our users on how we can make Gearset even better. Post your ideas for improvements, new features, and bug fixes alike, and vote for others – let us know what’s important to you.
73 results found
-
Allow deployment of Email header/footer changes
Detect changes to Email Footers and allow those changes to be deployed.
1 vote -
Add Support for Pre/Post Data Deployment Scripts
Some of my more complex data objects require that I turn off Validation Rules during deployment and turn them back on again afterwards.
Options that would work:
BEST: A list of Validation Rules under each object, with an option to temporarily disable them on a one-by-one basis during the deployment. A "Disable All Validation Rules During Deployment" option would also be helpful if this feature is implemented.
BETTER: Ability to specify metadata xml to apply to each sobject pre and post deployment.
GOOD: Ability to specify anonymous Apex to execute pre and post deployment. This would involve a callout to the Metadata API, which…
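As an illustration of the "GOOD" option, a pre/post script run outside of Gearset might toggle Validation Rules through the Salesforce Tooling API. This is only a sketch of the idea: the instance URL, session token, and rule Ids are placeholders, and it assumes the Tooling API's ValidationRule object exposes a writable Metadata field.

```python
# Hypothetical pre/post deployment step: toggle a Validation Rule's active flag
# via the Salesforce Tooling API. Assumes you already have an access token and
# instance URL (e.g. from an OAuth flow); all values below are placeholders.
import requests

INSTANCE_URL = "https://example.my.salesforce.com"   # placeholder: your org's instance URL
ACCESS_TOKEN = "00D...session_token"                  # placeholder: a valid session token
API = f"{INSTANCE_URL}/services/data/v59.0/tooling"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}

def set_validation_rule_active(rule_id: str, active: bool) -> None:
    """Fetch the rule's metadata, flip its 'active' flag, and write it back."""
    record = requests.get(f"{API}/sobjects/ValidationRule/{rule_id}", headers=HEADERS)
    record.raise_for_status()
    metadata = record.json()["Metadata"]
    metadata["active"] = active
    patch = requests.patch(
        f"{API}/sobjects/ValidationRule/{rule_id}",
        headers=HEADERS,
        json={"Metadata": metadata},
    )
    patch.raise_for_status()

RULE_IDS = ["03d000000000001AAA"]  # placeholder: Ids of the rules to toggle
# Pre-deployment: disable the rules.
for rid in RULE_IDS:
    set_validation_rule_active(rid, False)
# ... run the data deployment here ...
# Post-deployment: re-enable the rules.
for rid in RULE_IDS:
    set_validation_rule_active(rid, True)
```

The two loops before and after the deployment mirror the pre/post hooks being requested.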
18 votes -
Support Salesforce standard Country/State Picklist Mapping for Data Masking
Right now, using the data masking feature for data deployments, the masked values for country and state are real values chosen at random. There's no guarantee that the country and state will be a valid pairing.
This causes issues for orgs that use the standard Country/State picklists from Salesforce.
The goal would be for the masking feature to pull from the Salesforce default values and make sure that the state and country form a valid pairing, as opposed to randomizing both separately.
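For illustration, a masking step that keeps country and state consistent could draw both values from a table of valid pairings, roughly like this (the pairings below are a tiny sample and the field names are assumptions; a real implementation would pull the full standard Country/State picklist values from the org):

```python
import random

# Sample of valid country/state pairings (assumption: a real implementation would
# load the full standard Country/State picklist values from Salesforce Setup).
VALID_PAIRINGS = {
    "United States": ["California", "New York", "Texas"],
    "Canada": ["Ontario", "Quebec", "British Columbia"],
}

def mask_address(record: dict) -> dict:
    """Replace country and state together so the pairing is always valid."""
    country = random.choice(list(VALID_PAIRINGS))
    record["BillingCountry"] = country
    record["BillingState"] = random.choice(VALID_PAIRINGS[country])
    return record

print(mask_address({"Name": "Acme", "BillingCountry": "France", "BillingState": "Ohio"}))
```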
1 vote
Gearset will now only use US states and “United States” for the country, meaning that the pairings are always valid.
-
Schedule data deployments
The current data deployment functionality works fine for one-off deployments, but if you are trying to keep specific data in sync across multiple orgs, the ability to schedule data deployments for a given time in the future would let you deploy on a recurring basis automatically. This becomes especially helpful when combined with the current data filtering options.
20 votes -
Deploy data from local repository.
It would be very helpful if we could deploy data from a local repository. We have different namespaces for internal packages and it would make data migration significantly easier if we could do it from a local machine where we'd switch the namespaces.
3 votes -
Ability to update value of Reference field for a data deployment
"Select reference fields to deploy"
Ability to set the value of a reference field for the deployed data directly, instead of searching for a match on the Target.
Ex. Loading Opportunities from a Source into a Target that contains 5 different record types; when uploading to the Target, we'd specify a single RecordTypeId, matching one that exists on the Target, for the entire data load.
Similar to how the Owner reference can ignore the Target's users and default to the running user of the data deployment.
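As an illustration, the requested behaviour amounts to a per-deployment override of the reference field; a minimal sketch, with a placeholder RecordTypeId and field names assumed for the example:

```python
# Placeholder: a RecordTypeId that exists in the target org.
TARGET_RECORD_TYPE_ID = "012000000000001AAA"

def override_record_type(rows: list[dict]) -> list[dict]:
    """Force every Opportunity in the load to a single target record type,
    instead of trying to match the source record type in the target org."""
    for row in rows:
        row["RecordTypeId"] = TARGET_RECORD_TYPE_ID
    return rows
```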
3 votes -
Allow complex filters and child relationships on filters for data loading
Allow complex filters and child relationships on filters for data loading. For example, filter the Account to only load if the Owner is an Active user in the system.
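As a sketch of the kind of filter being asked for, the cross-object condition is easy to express in SOQL; shown here via the simple_salesforce library, with placeholder credentials:

```python
from simple_salesforce import Salesforce

# Placeholders: supply real credentials for your source org.
sf = Salesforce(username="user@example.com", password="password", security_token="token")

# Only load Accounts whose Owner is an active user - a parent-relationship
# filter that plain field-level filters on Account can't express.
accounts = sf.query_all("SELECT Id, Name FROM Account WHERE Owner.IsActive = true")
print(accounts["totalSize"])
```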
10 votes -
Allow record id as the first criteria for matching lookup fields
When deploying data records from production to sandbox, the default matching criteria for the owner field will not typically work...
Gearset currently matches on email and profile, but the email addresses for users get modified in sandboxes. So, no match is found for users other than the one who created the sandbox.
It would be great if the first check was whether the user record Id matches; if not, use email + profile; and if there's still no match, use the current user.
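A rough sketch of that matching order (the record shapes and helper below are assumptions for illustration, not Gearset's actual matching code):

```python
def match_owner(source_owner: dict, target_users: list[dict], running_user_id: str) -> str:
    """Return the target user Id to use for a source record's owner.

    Order of preference: 1) same record Id, 2) same email + profile,
    3) fall back to the user running the deployment.
    """
    # 1. Record Id match - works when the sandbox copy kept the same user Ids.
    for user in target_users:
        if user["Id"] == source_owner["Id"]:
            return user["Id"]
    # 2. Email + profile match - the current behaviour, which breaks when
    #    sandbox refreshes modify user email addresses.
    for user in target_users:
        if (user["Email"] == source_owner["Email"]
                and user["ProfileId"] == source_owner["ProfileId"]):
            return user["Id"]
    # 3. Default to the running user.
    return running_user_id
```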
2 votes -
Preserve auto-number field values during a data deployment
Staging environments typically have integrations with other systems that hold references to auto-number field values.
A data deployment re-evaluates auto-number fields instead of keeping the values from the source. If data deployment jobs could automate the following, it would probably work:
1. changing the type of the auto-number field to text in the target org
2. potentially removing apex references to the field
3. deploying the data
4. rolling back metadata modifications in steps 1 and 2
1 vote -
Delete records from the target org during a data deployment
Provide options to delete records from the Target Org. This would help clean up test data and reload it from the Source.
17 votes -
Add problem analyzers to the data deployment app
Much like the problem analyzers for metadata deployments in Gearset, it would be great if you could add a problem analysis feature for data deployments after having configured what you want to deploy.
Some scenarios will almost certainly cause the data deployment to either totally fail or only partially deploy some of the records and skip records it can't deploy. Providing automatic fixes (e.g. choosing not to include duplicates) would improve the number of deployments that work 100% of the time.
1 vote -
Select all objects in a data deployment
It would be useful to select all objects in the data deploy screen to make it faster to select a large number of objects (even better would be the ability to group objects and select whole groups at once)
5 votes -
Trigger a data deployment from a CI job
When we pull down a version between our local and staging environments, we would like to also automatically pull in the testing data from another project.
One of the things we are trying to set up is taking a version from Prod and then running GhostInspector through our core flows. Ideally, every 4 hours we would take a backup of the metadata plus locked-down actual data, then run tests against it.
3 votes
- Don't see your idea?