So-Called Advent Calendar Day 5 - With Sharing is Caring
In the spirit of the holiday season, this is a series of short blog posts covering random things I have learned while doing Salesforce development, one for each day of Advent.
For a long time, I was under the impression that creating an Apex class with the with sharing declaration was enough to enforce not just sharing rules, but also field level security (i.e. FLS) and CRUD access on objects. This was (mistakenly) reinforced by the fact that if I was trying to edit a record in a Visualforce page and couldn't, then running the class without sharing usually "fixed" that. But that wasn't because my user didn't have the correct CRUD access on said record, it was because they didn't have the correct sharing access to edit that particular record.
I've worked in a lot of Salesforce orgs where record access wasn't kept very private - basically everyone was one step down from being a system admin. All records were accessible, and everyone had the ability to read or edit any field. And so for an embarrassingly long time I didn't actually need to worry about how security worked. As long as record access was kept behind a login to their Salesforce org, the client often thought that was secure enough (for better or worse).
So here’s my current understanding of record security in Salesforce, hopefully this will be insightful for you as well. AND PLEASE CORRECT ME IF I’M STILL WRONG BECAUSE I REALLY NEED TO KNOW.
Sharing - This controls whether or not a User can Read or Edit a particular SObject record (i.e. a particular Account record out of all the Accounts in the org). Org wide defaults create a baseline of access that can be expanded via sharing rules, manual sharing, or the role hierarchy. One thing to note is that "Write" sharing access refers to the ability to update a particular record. Create access is not controlled by sharing rules. with sharing enforces these rules in Apex. without sharing allows Apex to bypass sharing access and read or edit any record in the system, regardless of the running user.
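As a minimal sketch of the difference, consider the same query running in two otherwise identical classes (the Account object here is just an illustration):

```apex
// With sharing: queries apply the running user's sharing rules, so
// records the user cannot see are silently filtered out of the results.
public with sharing class SharedAccountReader {
    public List<Account> readAccounts() {
        return [SELECT Id, Name FROM Account];
    }
}

// Without sharing: the same query runs in system context and returns
// every Account in the org, regardless of the running user's access.
public without sharing class UnsharedAccountReader {
    public List<Account> readAccounts() {
        return [SELECT Id, Name FROM Account];
    }
}
```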
CRUD - This stands for Create, Read, Update, Delete. This defines whether a user has the ability to perform those four actions on a particular SObject at all. If you don't have Update access on Account, even if Account has an org wide default of Public Read/Write, you still won't be able to update any accounts. This also works the other way around. If you have Update access on Account, but Account has an org wide default of Private or Public Read Only, you can only update the Accounts you have the appropriate write sharing access to. Because Apex runs in a system context, this is not enforced unless you manually enforce it by checking permissions before doing DML! with sharing does NOT enforce this!
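One way to check CRUD manually is through the object's describe result before doing DML. A minimal sketch, assuming Account as the example object (the class name and custom exception here are made up for illustration):

```apex
public with sharing class AccountUpdater {
    public class CrudException extends Exception {}

    public void safeUpdate(Account acct) {
        // with sharing does NOT check this for us: verify the running
        // user has Update access on the Account object before the DML.
        if (!Schema.sObjectType.Account.isUpdateable()) {
            throw new CrudException('No Update access on Account');
        }
        update acct;
    }
}
```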
FLS - This stands for Field Level Security and controls a User's ability to Read/Edit a particular field. This is superseded by CRUD access, i.e. if you do not have Edit access to an SObject, you won't be able to edit a field even if you have edit access on that particular field. Similar to CRUD, this is also not enforced in Apex! Apex can still read fields returned by a SOQL query even if the running user does not have read access to those fields. (Ways to enforce this will be discussed on another Advent day!)
I hope that's both correct and straightforward enough to clarify permissions for anyone who is still confused.
So-Called Advent Calendar Day 4 - Virtual Mocking
In the spirit of the holiday season, this is a series of short blog posts covering random things I have learned while doing Salesforce development, one for each day of Advent.
When writing unit tests, I try to avoid doing anything that would hit the database, like creating records or running SOQL queries. Inserting records in tests not only slows tests down, it also inadvertently creates a dependency between your test class and any triggers in your organization. Or in the case of BigObjects, you just can't insert records because they'll persist in your org even after the test is done (we'll discuss this on another day!). While you will want a larger integration test that actually hits the database to test things end to end, generally your unit tests are more focused and also more numerous. So even if just for the sake of speed, I try to think about whether I can write unit tests without any DML at all.
This becomes especially tricky if the method you are testing has any SOQL queries in it. While using dependency injection or the Stub API can be useful to mock out services that do DML, sometimes it's a bit too much to create a separate dependency for your method. Consider the following class, which queries some records and returns the first one to match some criteria:
public with sharing class Finder {
    public SObject findTheThing() {
        List<SObject> things = [SELECT Name FROM Thing__c];
        for (SObject thing : things) {
            if (isThisTheThing(thing)) {
                return thing;
            }
        }
        return null;
    }

    private Boolean isThisTheThing(SObject theThing) {
        return theThing.get('Name') == 'The thing';
    }
}
Because of that SOQL query in findTheThing, it seems like you have to insert a record to test the positive use case where a match is found. I want to avoid hitting the database, but it doesn't seem right to introduce a whole separate class here just to mock out the SOQL query. That's where virtual classes can be handy. Let's rewrite the class above to move the SOQL query into its own separate virtual method:
public with sharing virtual class Finder {
    public SObject findTheThing() {
        List<SObject> things = getTheThings();
        for (SObject thing : things) {
            if (isThisTheThing(thing)) {
                return thing;
            }
        }
        return null;
    }

    private Boolean isThisTheThing(SObject theThing) {
        return theThing.get('Name') == 'The thing';
    }

    protected virtual List<SObject> getTheThings() {
        return [SELECT Name FROM Thing__c];
    }
}
With a virtual class, I can create an extension of the class in my test that mocks out the virtual method.
@isTest
private with sharing class Finder_TEST {
    @isTest
    private static void shouldFindTheThing() {
        SObject expectedThing = new FinderMock().findTheThing();
        System.assertEquals('The thing', expectedThing.get('Name'), 'The correct thing should be returned');
    }

    private class FinderMock extends Finder {
        protected override List<SObject> getTheThings() {
            return new List<SObject>{
                new Thing__c(Name = 'The thing'),
                new Thing__c(Name = 'Not The Thing')
            };
        }
    }
}
By extending the virtual Finder class I am able to override the virtual getTheThings method while the rest of the logic is kept in place. Now I can test the findTheThing method without having to insert any records, and the test runs super fast! I could also do the same for the negative case by just altering the list of records that gets returned by the mock.
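For instance, the negative case might look something like this, following the same pattern inside the test class (the mock class and method names here are just illustrative):

```apex
@isTest
private static void shouldReturnNullWhenNoThingMatches() {
    SObject result = new NoMatchFinderMock().findTheThing();
    System.assertEquals(null, result, 'Null should be returned when nothing matches');
}

// A mock whose query returns only non-matching records,
// exercising the path where findTheThing falls through to null.
private class NoMatchFinderMock extends Finder {
    protected override List<SObject> getTheThings() {
        return new List<SObject>{ new Thing__c(Name = 'Not The Thing') };
    }
}
```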
For these unit tests, I don't care about the actual SOQL query because the tests are focused on the matching logic, not on whether the SOQL query ran correctly. I should certainly add a test that does hit the database to test the SOQL query, but by mocking the virtual method I only have to hit the database for the tests that require it instead of incidentally for every test. By adopting this pattern, your test suite will run much faster, and that's a little less time you have to spend just staring at that deployment circle, waiting for that green checkmark.
So-Called Advent Calendar Day 3 - Retro Forgiveness
In the spirit of the holiday season, this is a series of short blog posts covering random things I have learned while doing Salesforce development, one for each day of Advent.
Retro is one of my favorite scrum ceremonies and I think it is the most important one. Without taking the time to reflect and celebrate what we have done, we cannot possibly set ourselves up for a better future. On my current team, we start with the Retro Prime Directive:
The Prime Directive prohibits Starfleet personnel and spacecraft from interfering in the normal development of any society, and mandates that any Starfleet vessel or crew member is expendable to prevent violation of this rule
I’m sorry, that’s the wrong one.
“Regardless of what we discover, we understand and truly believe that everyone did the best job they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand.” –Norm Kerth, Project Retrospectives: A Handbook for Team Review
I love how this quote encourages you to frame your perspective positively before reflecting on the past. How many times have you looked at some legacy code or some existing solution and thought "What the fuck was this person thinking?" That only an incompetent developer would build something like this, build something that you now have to fix? And how many times has that rage actually helped you solve the problem?
The Retro Prime Directive encourages you to avoid all of that and to instead consider the context of the situation.
Maybe there was external pressure that forced them to rush through a solution.
Maybe the “right” way to do it causes some hidden side effects and the “shit code” was the best solution they could come up with at the time to get around it.
And maybe they just straight up didn't know any other way, but they were the only person there who could do it.
We have all been that past author that wrote the legacy code someone else had to fix. You deserve to be forgiven and you should afford others that forgiveness as well.
This is easier said than done. It is so much easier to complain about legacy code and admittedly a little more fun. But when you focus on blame, you get stuck in the past. Instead, I encourage you to focus on seeing what you can learn to help guide what you build moving forward.