Use Field Aliasing for ConvertCurrency

This post is more of a cautionary tale so you won't be a bonehead like me. I recently wrote some code that queried OpportunityLineItem and used the convertCurrency() function to get the TotalPrice in the user's currency. The problem was that later in the code, I updated that same record. Because the query had already converted the value, the update wrote the converted amount back into TotalPrice while the record's currency code stayed in the original (foreign) currency.

My solution was to use a new feature in Spring '16 that lets you alias convertCurrency() fields in SOQL. With the field aliased, the converted amount lives under the alias instead of overwriting TotalPrice, so updates no longer change the monetary value. Problem solved!

OpportunityLineItem oli = [SELECT Id, convertCurrency(TotalPrice) usdprice
                           FROM OpportunityLineItem
                           WHERE Id = '00k4B000003ESxW'];
// The aliased value isn't a real field on the object, so read it with get()
Decimal usdPrice = (Decimal) oli.get('usdprice');
// Safe to update: TotalPrice itself was never overwritten with the converted value
update oli;

Avoiding Hard Coded URLs

Hopefully you know that hard coding links in Salesforce is a no-no. You might one day be migrated from one instance to another (NA1 to NA9999, or whatever they are up to these days). You also want to make sure that your links work in all your sandboxes. There is nothing worse than clicking on a link in a sandbox and all of a sudden you are in production!

First of all, I recommend that you implement My Domain. This means that you won't have to worry about moving from one instance to another. Plus, you need it enabled to use Lightning Components. You want those, right?

For the most part within Salesforce, you can simply drop the protocol and host portion of the URL and use relative URLs. So, a URL button that takes you to a Visualforce page can just have '/apex/MyPage'.

The first “gotcha” is with managed packages. If you want to link to pages in a managed package, then you can still use a relative URL. Just use the format of ‘/apex/NAMESPACE__ManagedPage’.

Another good trick is to use the URLFOR function within Visualforce. This future proofs your pages so that if Salesforce ever changes the URL scheme (hello, Lightning), everything will still work. For example, to get the link to edit an account, I can use the following syntax.
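The exact markup from the original post isn't reproduced here, but a minimal sketch looks like this (assuming an `account` record variable supplied by a standard controller):

```html
<!-- URLFOR resolves the standard Edit action for the account, so the page
     keeps working even if Salesforce changes the underlying URL scheme -->
<apex:outputLink value="{!URLFOR($Action.Account.Edit, account.Id)}">
    Edit Account
</apex:outputLink>
```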


Email templates are another tricky part of links. Since the email is being sent out, you can’t use relative URLs. You need to link into your instance. For the object that is triggering the email, you can use the merge field called Detail Link. This looks like {!Opportunity.Link}.

But what if you want to provide a link in your email to a related record? For example, an email gets sent from an opportunity, but you want a link to the account. First, let's create a formula field on the object that will trigger the email. Call it Base URL, make it a text formula field, and give it the following formula:

LEFT($Api.Partner_Server_URL_360, FIND( '/services', $Api.Partner_Server_URL_360))

You can now use this formula field in your email templates to link to other records or Visualforce pages. So, from an opportunity email, the merge field would be: {!Opportunity.Base_URL__c}{!Opportunity.AccountId}.

By implementing these changes, you can be assured your links will always go to the correct instance.

Salesforce Productivity With Chrome Extensions

I’ve seen several posts recently about Chrome extensions that help with your productivity in Salesforce. Here are a few I use every day.

Boostr for Salesforce

This new extension has several features. My favorite is that it removes the placeholder text from the search box in setup. How many times have you clicked into that box before the page finishes loading?

Salesforce Admin Check All

This handy little extension adds a checkbox at the top of admin page lists so you can quickly grant access to a big list of fields.

Salesforce API Field Names

I have mixed results with this extension. When it works, it replaces the labels on a page with the API names. It doesn't work on feed-based layouts, and sometimes the order gets off. I haven't quite figured out why.

Salesforce Colored Favicons

If you ever have multiple orgs open at once, this extension changes your favicon's color based on the instance you are on. This lets you easily know whether you are in production or a sandbox.

Salesforce Quick Login As

I've saved my favorite for last. You can log in as another user from any page in Salesforce, and it takes you directly to that page as the user. No more going to Setup, finding the user, logging in, and then navigating to the page you want to look at.


Deleting a Single Item From the Recycle Bin

Sometimes you just want to remove a single item or a subset of items from the recycle bin without emptying the entire thing. I found this long-standing idea on IdeaExchange, and then realized I could do it in Apex.

The first step is to find the Id of the record you want deleted from the recycle bin. I use Workbench to create my SOQL query and query deleted items. Be sure to select the option to include deleted and archived records. Here's a query to get the Ids of deleted cases.

SELECT Id FROM Case WHERE IsDeleted = true

Once I have the Id, I can use Workbench to execute anonymous Apex. Just put a comma-separated list of Ids in the List:

Database.emptyRecycleBin(new List<Id> {'500U000000IonltIAB'});

Get Ready for Dreamforce 15

With just over a week left until the hordes descend on San Francisco, there’s a lot to prepare for. The big announcement for Dreamforce is going to be the new Lightning Experience. This new look of Salesforce is going to change everything! Rather than wait till you get to Dreamforce to begin learning about Lightning, I recommend taking a look at the new Trailhead modules. By finishing those modules, you’ll be better prepared and ready to jump into more advanced topics when you get to Dreamforce.

The Admin Trail – Migrating to Lightning Experience is best for existing admins who are interested in the impact Lightning will have on their org. It walks you through the basics of Lightning. One of my favorite parts is the chart that compares the differences between Lightning and Classic. If you only do one thing before Dreamforce, look at this chart!

On the developer side of things, the Developer Trail – Lightning Experience walks developers through Visualforce and Lightning Components. Unfortunately, since Lightning isn't GA yet, there aren't the hands-on projects you get with a lot of other modules. One of the coolest things Salesforce is doing as part of the Lightning push is sharing design best practices. The module on the Lightning Design System is fantastic for getting an understanding of how to build pixel-perfect components.

With these new Trailhead modules, you’ll be prepared for Dreamforce and ready to attend some awesome sessions!

If you are still looking for sessions (that don’t have anything to do with Lightning) to attend, here’s my shameless plug for the two sessions I’m presenting:

Writing Effective and Maintainable Validation Rules Tuesday, September 15, 12:00 – 12:40

Writing Effective and Maintainable Validation Rules Friday, September 18, 10:00 – 10:40

Apex Testing Best Practices Thursday, September 17, 4:30 – 5:10

Trailhead: The Awesome New Way to Learn Salesforce

I'll admit it, I didn't get Trailhead when it was launched at Dreamforce last year. There were only a few badges, and I didn't really see what the big deal was. Fast forward a year, and holy cow! New Trailhead content just keeps getting added. I love the humor that the Trailhead team injects into some of the modules. Catter, anyone? I also love how Trailhead actually makes you work in an org and checks your work.

As content in Trailhead increases, I’ve been pointing more and more people to Trailhead to learn Salesforce. Want to move into an admin role? Take the Admin – Beginner and the Admin – Intermediate trails. Want to see why I love developing on the Salesforce platform? Take the Developer – Beginner and Developer – Intermediate trails.

I’m sure we’ll be hearing a lot about Trailhead at Dreamforce next month.  Speaking of Dreamforce, there is a great badge on what to expect and what to do. This one has a lot of the humor I appreciate so much.

To get started in Trailhead, you need two Salesforce logins. The first login you need is your profile login. I use a developer org login for this. It is the same login I use for the forums and success community. The benefit here is that it isn’t tied to your production org login, so when you get a dream job at another company, your profile will move with you. The second login you need is also a developer org. This login will be used to complete your challenges. It is important to use a developer org for this login as you may need features not available in a sandbox or your production org. And why would you clutter a production org with all of that stuff anyway?

Once you have your logins squared away, you are ready to start learning. One of the cool things the Trailhead team is doing is providing modules on add-on products. One such example is Event Monitoring. By going through this module, you can gain an understanding of these new features and decide if it is something you think your production org would benefit from. You can be much better informed about the features without ever having to go through a sales pitch!

So, get on Trailhead and start exploring. Once you get a few badges, you’ll be hooked and want to collect them all!


Deploy Faster with Summer ’15 Features

Buried in the pages of the Summer ’15 release notes is a little feature that you might easily pass by. If you take advantage of this feature, then you can drastically speed up your deployments to production. This new feature allows you to choose which tests are run in a deployment.

First of all, the biggest change is that tests are only run when you are deploying Apex classes or triggers. This means that your deployments will immediately be sped up if you are only deploying objects, new fields, workflow rules, etc. If you do have Apex in your deployment package, then by default all local tests are run. Local tests are those that are not namespaced (i.e., not part of a managed package).

The cool thing is that you can now make a change to your build.xml file to specify the tests to be run. When you do this, only the tests you specify in the build.xml file are run, and Salesforce only checks the code coverage from those tests against the code you are deploying. As long as the tests pass and the code coverage is good, your deployment will succeed.
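A sketch of what that looks like in build.xml, assuming the standard `sf:deploy` Ant task; the test class names and credential properties are placeholders:

```xml
<target name="deployCode">
  <!-- Run only the named tests instead of every local test in the org -->
  <sf:deploy username="${sf.username}" password="${sf.password}"
             serverurl="${sf.serverurl}" deployRoot="src"
             testLevel="RunSpecifiedTests">
    <runTest>MyClassTest</runTest>
    <runTest>MyTriggerTest</runTest>
  </sf:deploy>
</target>
```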

What does this mean in a real-life situation? I ran a couple of scenarios and saw marked improvements. If I deploy a single class the old-fashioned way, the tests in my org take about 30 minutes to complete. If I instead specify a specific test class, my deployment time drops to 2 minutes. This is huge when you think about needing to deploy an emergency patch. You no longer need to wait for all tests to complete.

A word of caution: if you have the luxury of time, I highly recommend you run all local tests and use the quick deploy feature. This allows you to stage your release and make sure everything is working properly. Use the specific tests option only in cases where you need to get something to production fast.

Upgrade your Migration Toolkit and start getting faster deployments! Let me know how you plan on using this new feature.

Streamline Your Deployments

We had a release to production last night, and I marveled at how smoothly things went compared to the first release we did as a team. In this post, I’ll outline the critical components of building a successful release package.

1. Documentation

Before you even start work on features or bug fixes, have a system in place to document what is being asked for and how it will be used. We have a custom built set of objects in Salesforce to track this. Each feature or bug fix is assigned to a release and sprint and status is tracked as it moves through the process. We also add metadata as a related list to the features so we know what was changed. I’ve written some code that builds a package.xml based on all the items in the release. We use this package.xml file during our deployments.
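The generated file is just a standard package.xml manifest; here's a minimal sketch with placeholder member names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <!-- One types element per metadata type in the release -->
        <members>MyFeatureClass</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>Opportunity.My_New_Field__c</members>
        <name>CustomField</name>
    </types>
    <version>34.0</version>
</Package>
```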

2. Development Environment

Each person making changes that will go into the release should do their work in their own sandbox. By staying in your own sandbox, you are in no danger of overwriting someone else's changes.

3. Version Control

Using version control ensures you can easily trace changes back to a person and a feature request. We use a repository on GitHub that contains a complete snapshot of our metadata. Each feature request is developed in its own branch that later gets merged.

4. Continuous Integration

Before a feature gets merged into the release branch in Git, we test each one with CI. We use Snap CI for this. When a pull request is created, Snap does a validation build and runs all unit tests to make sure nothing is broken. If the change breaks the build, we know about it early and can fix it before it gets merged and sent on to users for testing.

5. Deployment Tool

All deployments are performed using the Ant Migration Tool. This allows for the creation of a release package with all the metadata in it that we can deploy again and again to different sandboxes. We test the deployment at least three times before releasing to production.

6. Sandboxes

We use a lot of sandboxes when preparing for a release. Below are a few of the main sandboxes we use.

Staging – this is a config-only sandbox used as the target of our CI process. This sandbox is also used for user demos and basic testing by the development team.

UAT – this is a full sandbox that we use for user acceptance testing. We deploy from Staging to UAT about 2 weeks prior to the release to give users enough time to test and sign off on the changes. Important: if a user group doesn’t sign off on a change, we will pull their feature from the release.

Backup – this config-only sandbox is created a day before the release and serves as a complete backup of the org's metadata. It is easier to create a sandbox than to do a backup using Eclipse, Ant, or MavensMate. This sandbox is used in case we need to review how something worked prior to the release (e.g., did that break with this release?). We can also restore quickly using change sets from that org if things go horribly wrong.

7. Quick Deploy

A recent feature Salesforce added is Quick Deploy. This allows you to do a validation deploy to production during which all tests are run. If it validates, you can click Quick Deploy to deploy the changes to production without having to run the tests again. The day of the release, we validate the package against prod and then wait for the designated release time to click the Quick Deploy button. This saves us so much time.
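A sketch of the validation step, again assuming the standard Ant Migration Tool `sf:deploy` task with placeholder credential properties:

```xml
<target name="validate">
  <!-- checkOnly runs all tests and validates the package without deploying;
       a passing validation can then be quick-deployed from Setup -->
  <sf:deploy username="${sf.username}" password="${sf.password}"
             serverurl="${sf.serverurl}" deployRoot="src"
             checkOnly="true"/>
</target>
```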


DataTables in Visualforce, part 2

In the last post, I showed how easy it is to turn a plain HTML table into a functional DataTable. In this post, I'll show how you can build on that and make the table even more dynamic. In this example, we'll build a DataTable that lists accounts and lets you expand an account to see a list of its contacts.

First, instead of using a controller to return data for the table and building it with an <apex:repeat>, I am using JavaScript Remote Objects to get 100 accounts. Note that the limit is 100 records at a time, so if you have lots of accounts, you may need to implement filtering or pagination (maybe a topic for a future post). The biggest benefit of Remote Objects is that we don't need a controller. We use the "ajax" option to specify the data for the table. The biggest issue I ran into was that DataTables expects an object with a "data" property; you can't just return the records from the remote object call.

In the columns option for the DataTable, I have to specify where to pull each column from. This is done with { "data": "_props.Phone", "defaultContent": "" }. The defaultContent option tells the DataTable to replace any undefined properties with a blank, preventing errors from being thrown.
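Putting the two pieces together, here's a sketch of the initialization. It assumes the page declares a remoteObjectModel for Account (so SObjectModel.Account exists); the table id and field names are illustrative:

```javascript
// Hypothetical setup: <apex:remoteObjectModel name="Account" fields="Id,Name,Phone"/>
var acct = new SObjectModel.Account();

$('#accountTable').DataTable({
    ajax: function (data, callback) {
        acct.retrieve({ limit: 100 }, function (err, records) {
            if (err) { alert(err.message); return; }
            // DataTables expects an object with a "data" property,
            // not the bare array of records
            callback({ data: records });
        });
    },
    columns: [
        { data: '_props.Name' },
        // defaultContent swaps undefined values for a blank string
        { data: '_props.Phone', defaultContent: '' }
    ]
});
```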

Next, we have to handle the clicks that expand a row and display contacts. This code is fairly straightforward: another remote object query gets the list of contacts for the account, and I then just build up an HTML table to append in the child section.
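A sketch of that click handler, using the DataTables child-row API and a Remote Objects query for contacts; the selectors and field names are illustrative:

```javascript
// Hypothetical setup: <apex:remoteObjectModel name="Contact" fields="Id,Name,AccountId"/>
$('#accountTable tbody').on('click', 'td.details-control', function () {
    var row = $('#accountTable').DataTable().row($(this).closest('tr'));
    if (row.child.isShown()) {
        row.child.hide();   // collapse an already-expanded row
    } else {
        var con = new SObjectModel.Contact();
        con.retrieve(
            { where: { AccountId: { eq: row.data()._props.Id } }, limit: 100 },
            function (err, records) {
                if (err) { alert(err.message); return; }
                // Build a simple HTML table of contact names for the child section
                var html = '<table>';
                records.forEach(function (r) {
                    html += '<tr><td>' + r.get('Name') + '</td></tr>';
                });
                row.child(html + '</table>').show();
            });
    }
});
```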

Here’s a screen shot of what it looks like when we are all done.

[Screenshot: datatables ajax]

And here is the entire page.

DataTables in Visualforce, part 1

I've been doing a lot of work with jQuery DataTables. They can add a nice custom interface for your Salesforce data with very little coding. I'm going to devote the next few posts to ways to use DataTables in your Visualforce pages. In this first post, we'll keep it pretty simple, using a standard HTML table created with an <apex:repeat> tag and then using jQuery to turn the table into a DataTable with zero configuration. Then we'll do a couple of extra things to make our table a little more functional.

First, we need a controller that gives us the data we need. Our example will be listing contacts in Salesforce.

Now, here is a Visualforce page that turns that ugly table into a nice looking DataTable with sorting and filtering, plus some styling. Pretty easy, right?

Alright, let's get a little fancy. We want a drop-down listing all the accounts so we can filter by account name. We also want to default the order of the rows to contact last name. To do the sorting, we specify order with an array. The columns are zero-based, so last name, which is the 3rd column, will be index 2.
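That default ordering can be sketched like this (the table id is a placeholder):

```javascript
// Initialize the DataTable sorted ascending by the 3rd column
// (last name), which is zero-based index 2
$('#contactTable').DataTable({
    order: [[2, 'asc']]
});
```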

To add the drop down, we use jQuery to get the unique values from the first column, sort them and then append them to a select component on the page. We also add a handler to the select component to filter when the value changes. If “All” is selected, then it clears the filter.
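A sketch of that drop-down logic. The unique-value helper is pure JavaScript; the jQuery wiring below it is illustrative and assumes hypothetical `contactTable` and `accountFilter` ids:

```javascript
// Pure helper: unique, sorted values for the filter drop-down.
// In the page, the input comes from table.column(0).data().toArray().
function uniqueSorted(values) {
    return Array.from(new Set(values)).sort();
}

// Hypothetical page wiring:
// var table = $('#contactTable').DataTable();
// uniqueSorted(table.column(0).data().toArray()).forEach(function (name) {
//     $('#accountFilter').append('<option>' + name + '</option>');
// });
// $('#accountFilter').on('change', function () {
//     var val = $(this).val();
//     // "All" clears the filter; otherwise search column 0 for the value
//     table.column(0).search(val === 'All' ? '' : val).draw();
// });
```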

And here's what it looks like.

[Screenshot: datatables1]

In the next post, I'll show how to leverage Visualforce remoting to add some dynamic features to a DataTable.