Help us improve Gearset
We love getting feedback from our users on how we can make Gearset even better. Post your ideas for improvements, new features, and bug fixes alike, and vote for others – let us know what’s important to you.
82 results found
-
Remove the "run as specific user" completely from the dashboard prior to deployment (set the options to "Run as
Add during problem analysis:
Remove the "run as specific user" completely from the dashboard prior to deployment (set the options to "Run as logged-in user"). Deploy, then change it back to whatever you need. Based on https://developer.salesforce.com/forums/?id=906F0000000DCJPIA4
3 votes -
Improve the UI around deployment objects dependent on objects not in the comparison
Deployment of certain types of objects requires that other objects on which they are dependent be included in the deployment. If the needed objects have not been included in the selected items and they are not part of the comparison, Gearset will display the following message:
Problem: Some items reference components that were not downloaded from either the source or the target.
Solution: Removing the elements that reference missing components from the affected items will make the deployment more likely to succeed.
Below this is displayed a table with a column of checkboxes at the left and three additional columns:…
1 vote -
Display "No Difference" metadata sources on Deployment Summary page
On the "Deployment Summary Page" it would be great to add a column to the list of metadata components added which shows for the "No Difference" items showing the source piece(s) of metadata which is causing this "No Difference" to be included.
This would dramatically help in the narrowing down of why a piece of metadata is included - especially if that piece of "No Difference" is a managed package and cannot be altered. In a large deployment, trying to identify a root cause for why a component was added is quite difficult. Simply giving a way to track back…
1 vote -
Improve error handling for invalid XML metadata files
Currently, when there is an error in an XML metadata file (usually introduced by git merges), Gearset produces an error in either the Initializing Comparison or Checking Deployment stage, stating simply that "An unknown error occurred."
If Gearset instead ran the XML from source control through an XML validator using the Partner WSDL XMLSchema, then we could get file-level validation errors if the XML doesn't conform to what Salesforce's Metadata API will accept.
e.g.:
$ xmlstarlet val -e -s ~/Downloads/salesforce-schema.xml force-app/main/default/profiles/Marketing.profile-meta.xml
force-app/main/default/profiles/Marketing.profile-meta.xml:17108.15: Element '{http://soap.sforce.com/2006/04/metadata}field': This element is not expected.
force-app/main/default/profiles/Marketing.profile-meta.xml - invalid
1 vote -
Suppress irrelevant Suggested Fixes when deploying to git
The problem analysis comes up with a number of Suggested Fixes which may make sense for improving Salesforce deployments, but which are unproductive when applied to git deployments.
For example, the suggestion to remove TASK.WHAT_NAME, or to omit Standard Objects, will only serve to create a git repo which is an inaccurate reflection of the source org. Each time we deploy to that git repo, we will continue to have the same changes to select or omit, since the repo is missing objects and fields from the source org.
When the deployment target is a git repo, all suggested fixes…
6 votes -
Deployment Notes Persist on Failure
When a deployment fails, deployment notes are sometimes wiped, and it is laborious to retype them - sometimes I'll copy and paste them into Notepad before deploying because I know I'm going to have to type them again. Notes should persist (or at least be configurable to persist) from one comparison run through to a successful deployment.
2 votes -
Better handle Shares on Report and Dashboard Folders in pre-validation
During the pre-validation of a deployment, validate that the shares exist for report and dashboard folders. Currently, if they do not exist in the target, they fail during deployment. The only option today is to clean them up in the source org after the error and then redeploy. Instead, it would be useful if Gearset removed them during the pre-validation.
1 vote -
Queue Deployments
Current State:
A few devs have work in their individual sandboxes that is ready to be pushed to the higher .dev sandbox org
1st Dev deploys work to .dev
2nd Dev deployment fails because org is locked
Idea:
A few devs have work in their individual sandboxes that is ready to be pushed to the higher .dev sandbox org
1st Dev deploys work to .dev
2nd Dev's deployment is queued up; 2nd Dev will be notified via email/browser when the queued deployment has started
3 votes -
Add the ability to run static code analysis against an org or source control, then simply click a refresh button to re-run it and see the results
This will give you a "run code analysis -> see results -> fix issues -> push to source control (or an org) -> repeat" workflow.
3 votes -
Offer to push components that passed validation even if some items failed
For large payloads (thousands of artifacts) it can be tedious to resolve a few minor issues when 99% of the package will deploy successfully. Having a feature to push the components that pass validation would be helpful. The deployment summary and deployment reports would then include a count and list of the items that failed validation and were omitted from the package, so they can be investigated.
9 votes -
-
Better Targeted Static Code Analysis Feedback Suppression
Currently, if I want to suppress errors from static analysis, the suppression seems to be all-or-none at the class level.
I would prefer to suppress only the specific errors when they happen.
(e.g. in the method or on the line.)
1 vote
Hi!
Our static code analysis is built on PMD. Currently, the only error suppression that works with Apex code in PMD is the `@SuppressWarnings` annotation. (This was introduced for Apex in PMD version 6.0.0). As you say, you can suppress all rules, or specific rules, at the class level, but can’t make more fine grained suppressions, unfortunately.
There are other methods of suppressing errors within PMD for other languages, but these don’t currently work with Apex. We’ll keep an eye on future versions of PMD to see if they introduce new methods of suppressing warnings for Apex code.
-
Allow custom rules in PMD
PMD support is great, but it would also be useful to be able to have a set of custom rules for my team.
e.g. we have a code library of classes that the team ought to use instead of coding their own solutions every time. Some of these cases could be spotted by a PMD rule, so it would be great to be able to add my own rules for this.
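As a sketch of what this could build on: PMD already supports custom XPath-based rules defined in a ruleset file. The rule name, message, and XPath below are purely illustrative (not an existing Gearset feature), but the ruleset structure is standard PMD 6:
<?xml version="1.0"?>
<ruleset name="Team custom rules" xmlns="http://pmd.sourceforge.net/ruleset/2.0.0">
  <description>Custom rules shared by the team</description>
  <!-- Hypothetical rule: flag raw System.debug calls so devs use the shared logging class instead -->
  <rule name="UseTeamLogger"
        language="apex"
        message="Use the team's Logger class instead of System.debug"
        class="net.sourceforge.pmd.lang.rule.XPathRule">
    <properties>
      <property name="xpath">
        <value>//MethodCallExpression[@FullMethodName='System.debug']</value>
      </property>
    </properties>
  </rule>
</ruleset>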
9 votes -
Adjust dependency on managed package versions for Apex[Class|Page]
We've found that, when developing classes against managed packages, the class ...meta.xml includes the package version numbers. The problem is that these are fixed to the exact version installed (e.g. 9.4, 9.3 etc.; example below).
But if the managed package is then upgraded, Salesforce, in its infinite wisdom, doesn't update these, so when you come to deploy next time you have to go and manually update all the version numbers - or, if you have a different version in another sandbox (to test the newer one), the same thing happens.
Seems like something a cool deployment tool could handle and offer to fix up on…
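For illustration, a rough sketch of the kind of ...meta.xml fragment being described, with a made-up namespace and version numbers:
<?xml version="1.0" encoding="UTF-8"?>
<ApexClass xmlns="http://soap.sforce.com/2006/04/metadata">
    <apiVersion>52.0</apiVersion>
    <packageVersions>
        <!-- Pinned to the exact installed version; a different version in the target means manual edits before deploying -->
        <majorNumber>9</majorNumber>
        <minorNumber>4</minorNumber>
        <namespace>somePackage</namespace>
    </packageVersions>
    <status>Active</status>
</ApexClass>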
3 votes -
Provide the option to skip problem analysis
Sometimes during a complicated deployment I need to try a few different strategies to get things to work. For example, I may need to send up just one piece or set of metadata first.
The problem is, each time I do this I have to wait for problem analysis to complete, which takes a couple of minutes. Normally during a deployment I do want problem analysis to run, but if I want to quickly send up a single object, I have to wait a few minutes while Problem Analysis runs, and the time spent waiting adds up.
Maybe a quick deployment…
6 votes -
Avoid the Analyzer suggesting to deploy a workflow's SObject when that SObject and all referenced fields exist in the target
Use case:
1. Deploy from source to target a changed Workflow (from active to deactivated)
2. Target already has all of the workflow's referenced components (fields used in filter criteria, fields used in Field Update)
The Analyzer will tell you "Add the following to the deployment" and something that looks like:
Deploy All
- object.WfName
-- object and its subcomponents
-- object and its subcomponents
-- object and its subcomponents
Since the object is already in the target, as are the subcomponents, this message is alarmingly misleading and could inadvertently lead to deploying an object that is not yet ready.
The above message…
1 vote -
Field Tracking Analyzer message is misleading
If the source org includes a new custom field with field history tracking enabled (TRUE), the deployment analyzer reports the following when deploying to the target org:
'Fields with history tracking enabled cannot be deployed before it is enabled on the object. You should remove these history tracking changes from the deployment'
The specific use case was the OrderItem object.
The Analyzer can clearly detect whether field history is enabled on the target org's object (which it was) - the message is spurious.
The Analyzer can arguably detect that the target org's limit (20) will be exceeded by the deployment of the selected custom fields (which could add/remove field…
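For reference, the dependency being checked here is between two metadata flags; a rough sketch with illustrative names (the custom field shown is made up):
<!-- Target org already has history tracking enabled on the object (e.g. OrderItem) -->
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <enableHistory>true</enableHistory>
</CustomObject>
<!-- The new custom field being deployed only asks for tracking on the field itself -->
<CustomField xmlns="http://soap.sforce.com/2006/04/metadata">
    <fullName>Some_Field__c</fullName>
    <trackHistory>true</trackHistory>
    <type>Text</type>
    <length>80</length>
</CustomField>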
1 vote -
When migrating ReportFolders, either remove running user or let me map users between Sandboxes/environments
The SharedTo User doesn't exist in the target org, causing my deployment to fail.
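For context, a rough sketch of the folder share fragment that trips this up, assuming the standard ReportFolder metadata elements (the username and folder name are made up):
<ReportFolder xmlns="http://soap.sforce.com/2006/04/metadata">
    <folderShares>
        <accessLevel>View</accessLevel>
        <!-- Deployment fails when this user doesn't exist in the target org -->
        <sharedTo>jane.doe@example.com.uat</sharedTo>
        <sharedToType>User</sharedToType>
    </folderShares>
    <name>Sales Reports</name>
</ReportFolder>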
2 votes -
Allow PMD to be configured by uploading an XML file with rules
We use PMD locally in our IDEs to analyse code during development, and the whole team shares a ruleset file in XML. It excludes the rules we don't like and configures other rules.
We can configure Gearset to do the same through the UI, but I'd rather be able to just upload the file; then I'll know they are the same set of rules.
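For illustration, a rough sketch of the kind of shared ruleset file this means - referencing a built-in Apex category, excluding one rule, and reconfiguring another. The rule names are standard PMD Apex rules, but treat the exact category paths and property names as assumptions to verify:
<?xml version="1.0"?>
<ruleset name="Shared team ruleset" xmlns="http://pmd.sourceforge.net/ruleset/2.0.0">
  <description>The same rules the team already runs locally in their IDEs</description>
  <!-- Pull in a whole category but drop a rule we don't like -->
  <rule ref="category/apex/codestyle.xml">
    <exclude name="IfStmtsMustUseBraces"/>
  </rule>
  <!-- Reconfigure another rule's threshold -->
  <rule ref="category/apex/design.xml/ExcessiveClassLength">
    <properties>
      <property name="minimum" value="1500"/>
    </properties>
  </rule>
</ruleset>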
3 votes -
Add a note against individual components in a draft deployment
While comparing two orgs for differences and saving the draft deployment, the files fall into two groups:
1. Files where I'm sure about and happy with the differences - so I tick them.
2. Files that are important to deploy, but whose differences I have yet to discuss with some peers from the team (dev / support). I would like to mark these and/or comment against them with some useful notes (e.g. impact or risk analysis related notes). Once these are discussed, I can then tick them - indicating I am happy to deploy them.
Once I mark the files as either 1 or 2…
2 votes
- Don't see your idea?