Big Data is a term used for large volumes of data. Organizations can use this data to build insights that lead to smart decisions and strategic business moves. Big Objects and Async SOQL are used to build big data solutions in Salesforce, and this blog highlights ten important concepts to know about Big Objects within Salesforce:
#10 Infrastructure built to support large volumes
Traditional relational databases struggle at the scale of hundreds of millions or billions of records, so Salesforce created a separate infrastructure to support them. This infrastructure is built on proven big data technologies from the Hadoop ecosystem, including Apache HBase and Apache Phoenix.
#9 Well Integrated With Salesforce
Although big data lives in a world of its own, we can still access these values using most of the tools we are accustomed to working with in Salesforce. Some key features are:
- Support lookup relationships with standard and custom objects
- Support field-level security (FLS) and CRUD permissions
- Create Big Object records using the Bulk and SOAP APIs
- Query them with SOQL (with some limitations) and create them with Apex. However, they are not available in the standard UI, so we need to build a custom Visualforce or Lightning UI to interact with them.
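For example, a record can be created from Apex with `Database.insertImmediate()`. This is a minimal sketch; the big object `Customer_Interaction__b` and its fields are hypothetical names, not part of any org by default:

```apex
// Hypothetical big object Customer_Interaction__b with index fields
// Account__c (lookup) and Game_Platform__c (text).
Customer_Interaction__b interaction = new Customer_Interaction__b();
interaction.Account__c       = '001xx000003GYcVAAW'; // sample record Id
interaction.Game_Platform__c = 'PC';
interaction.Score__c         = 4500;

// insertImmediate writes the record synchronously. If a record with the
// same index values already exists, it is overwritten (upsert-like behavior).
Database.SaveResult result = Database.insertImmediate(interaction);
System.debug(result.isSuccess());
```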
#8 Doesn’t impact data storage
Big Object storage doesn't count against the organization's data storage limit. Instead, it has a separate limit based on the license type the org has acquired.
#7 Must define an Index that serves as a unique Identifier
As of API version 40.0, defining an index is mandatory. An index can combine up to five fields and serves as the primary key, or unique identifier, for Big Object records.
The index plays an important role on records:
When inserting a record, if a record with the same index values already exists, that existing record is updated instead of a new one being created. In this way, an insert acts like an upsert for Big Objects.
If we query Big Object records using SOQL, filtering and ordering are only possible on the fields specified in the index.
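As a sketch, assume a hypothetical big object `Customer_Interaction__b` whose index is defined on `Account__c` and then `Game_Platform__c`; a valid query filters on those index fields, in index order:

```apex
// Filters must reference index fields, in the order they appear in the
// index, without skipping any.
List<Customer_Interaction__b> rows = [
    SELECT Account__c, Game_Platform__c, Score__c
    FROM Customer_Interaction__b
    WHERE Account__c = '001xx000003GYcVAAW'
    AND Game_Platform__c = 'PC'
];
```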
#6 Deleting data in Big Objects
The Apex method Database.deleteImmediate() deletes data in a custom big object. It takes an sObject whose index field values identify the rows to remove. For instance:
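A minimal sketch, assuming a hypothetical big object `Customer_Interaction__b` indexed on `Account__c` and `Game_Platform__c`:

```apex
// Populate only the index fields; they identify the row(s) to delete.
Customer_Interaction__b target = new Customer_Interaction__b();
target.Account__c       = '001xx000003GYcVAAW';
target.Game_Platform__c = 'PC';

Database.DeleteResult result = Database.deleteImmediate(target);
System.debug(result.isSuccess());
```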
Note: Repeating a successful deleteImmediate() operation produces a success result, even if the rows have already been deleted.
#5 Big Objects are packageable
The metadata of Big Objects is very similar to that of custom objects, except that the API name ends with __b. While packaging, we simply select the custom object component type and then pick the Big Object we want to include in the package.
#4 Big Object metadata can be deployed to/from Sandbox
Big Object metadata is copied as a part of the sandbox refresh, but its data is not. Similarly, it can be deployed to production via change sets.
#3 Search, Reports and Dashboards are not allowed on Big Objects
As Big Objects are designed for very large data volumes, these features are not yet available in Salesforce. However, there are a few workarounds for reporting on Big Objects:
- Use Einstein Analytics, which can report directly on Big Objects
- Summarize the information you want to report on with Async SOQL, store the result in an intermediate custom object, and then report on that custom object
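For context, Async SOQL jobs are submitted as REST requests (a POST to the org's async-queries endpoint). This is a sketch of a request body under those assumptions; the big object, the target custom object, and all field names are hypothetical:

```json
{
  "query": "SELECT Account__c, COUNT(Score__c) c FROM Customer_Interaction__b GROUP BY Account__c",
  "operation": "insert",
  "targetObject": "Interaction_Summary__c",
  "targetFieldMap": {
    "Account__c": "Account__c",
    "c": "Interaction_Count__c"
  }
}
```

The job runs asynchronously; once it completes, ordinary reports and dashboards can be built on the `Interaction_Summary__c` custom object.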
#2 Field Type Workarounds
Big Objects only support the following field types: Date/Time, Email, Lookup Relationship, Number, Phone, Text, Text Area (Long), and URL.
However, we can use workarounds, like creating a formula field on a custom object and copying that value as text into the targeted Big Object.
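For instance, an unsupported value such as a checkbox or formula result can be stringified before it is written to a text field on the big object. The objects and fields below are hypothetical:

```apex
// Assume the big object Customer_Interaction__b stores an unsupported
// checkbox value from Opportunity in a plain text field Won_Text__c.
Opportunity opp = [SELECT Id, IsWon FROM Opportunity LIMIT 1];

Customer_Interaction__b snapshot = new Customer_Interaction__b();
snapshot.Account__c       = '001xx000003GYcVAAW';
snapshot.Game_Platform__c = 'PC';
snapshot.Won_Text__c      = String.valueOf(opp.IsWon); // boolean copied as text

Database.insertImmediate(snapshot);
```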
#1 Big Objects limitation
Big Objects do not support DML transactions that mix Big Objects with standard or custom objects; such operations must run in separate transactions. Also, Salesforce does not support flows, triggers, or Process Builder on Big Objects.
If you want to know more about Big Objects, check out the Big Objects Trailhead module to understand how to work with them.
The Salesforce platform makes Big Data easy for businesses and developers. And at Appirio, we help you with the right tools and techniques to manage big data in your organization. Come talk with us to find out more about how Appirio can help energize your workers, your customers, and your business with a different experience. Reach out to Salesforce Experts today!