#DataCloud

2025-11-04

Salesforce remains a CRM leader with ~20% market share. Analysts rate it a Moderate Buy with ~25% upside. Best to buy on pullbacks or when AI and Data Cloud growth accelerates. Risks include slowing core growth, high valuation, and macro uncertainty. Long-term investors may benefit from dips.

#Salesforce #CRM #Investing #TechStocks #AI #DataCloud #StockMarket #TECHi

Read the full article here: techi.com/when-should-you-buy-

2025-07-16

Getting Started with Salesforce Data Cloud: Your Roadmap to Unified Customer Insights

It’s not uncommon for businesses to lose track of their customers when data lives in too many places. Data scattered across various systems, from CRM and marketing automation to e-commerce platforms and mobile apps, creates “data silos” that hinder a complete understanding of customer behavior and preferences. This leads to misleading metrics, redundant communications, and missed opportunities for truly personalized engagement. This is where Salesforce Data Cloud steps in, offering a solution to connect, harmonize, and activate all your customer data, transforming it into actionable insights.

Evolving from Salesforce CDP (Customer Data Platform) and formerly known as Genie, Salesforce Data Cloud is designed to create a unified picture of your customer. It enables you to bring together data from any source, regardless of its format, using low-code tools and advanced architectural foundations such as the lakehouse architecture and Hyperforce. The ultimate goal is not just to aggregate data, but to empower every part of your organization, from marketing and sales to service and commerce, with real-time, intelligent actions.

This guide will walk you through the essential phases of getting started with Salesforce Data Cloud.

Why Data Cloud? The Core Problem It Solves

The primary challenge Salesforce Data Cloud addresses is the elimination of data silos. Imagine a customer interacting with your brand through multiple touchpoints: they browse your website, sign up for a newsletter, make a purchase through your e-commerce platform, and contact customer service. Each interaction generates data, but this data often resides in separate systems, each managed by a different team or individual. Without a unified view, you might send generic emails, offer irrelevant products, or even annoy customers with redundant communications because you don’t recognize them as the same individual across all these systems.

Data Cloud provides a unified picture by ingesting data from diverse sources, including Salesforce CRM, Marketing Cloud, Commerce Cloud, Amazon S3, Google Cloud Storage, Azure, Workday, and SAP, using a rich library of pre-built connectors or flexible APIs. This consolidation is crucial for building unified customer profiles that represent a complete, 360-degree view of each individual, avoiding misleading metrics and improving personalization.

Beyond just collection, Data Cloud is built to make data actionable. It enables you to perform transformations and aggregations to generate calculated insights (e.g., Customer Lifetime Value, engagement scores), segment your audience with precision, and trigger real-time actions across various channels. Its architecture, based on a lakehouse model on Hyperforce, supports high-volume data ingestion and processing at the metadata level, ensuring efficiency and scalability.

It’s also important to note Data Cloud’s consumption-based pricing model: you pay only for the services you use, which makes efficient data management even more critical. Despite improvements in recent years, estimating Data Cloud costs remains a challenge.

Phase 1: Planning and Discovery – Laying the Groundwork

Any successful Data Cloud implementation begins with a meticulous planning and discovery phase. This foundational step ensures alignment with business goals and prepares the ground for effective data management. Data Cloud is a platform where most of the implementation time should be spent on preparation and design. Rushing these phases is costly, causing rework and frustration.

Define Business Objectives and Use Cases

Before diving into technicalities, ask fundamental questions:

  • Why are you starting a data platform solution?
  • What is the vision for this Data Cloud solution?
  • What are your primary use cases, and are they aligned with top business priorities?
  • How will you measure the success of the implementation?

For optimal results, start small. Focus on one or two core use cases initially. This iterative approach allows you to:

  • Identify platform nuances.
  • Understand source systems and their data quality.
  • Develop robust data dictionaries.
  • Monitor use cases, then expand.

Ultimately, you should catalog the available data and build a prioritized list of use cases based on their tangible business value.

Understanding Roles and Ownership

A Data Cloud implementation necessitates a strong partnership between IT and marketing/business teams. Clearly define who owns what:

  • CDP Administrator/Platform Owner: Manages the Data Cloud platform.
  • Data Roles: Responsible for creating data pipelines.
  • Marketing Roles: Focus on audience creation, campaign execution, and strategy.
  • Customer Insights and Analytics Teams: Leverage the unified data for reporting and analysis.

Align these roles with your organization’s existing structure to ensure all necessary stakeholders are involved from the outset.

Data Inventory and Quality

This is arguably the most critical aspect of planning. Prepare a thorough data dictionary or inventory that comprehensively lists all data sources, preferred ingestion methods, necessary transformations, and how they relate to your defined use cases.

  • Field-Level Data Inspection: Scrutinize individual fields for accuracy, identify primary keys, and assess whether data needs normalization or denormalization.
  • Data Profiling Tools: These are invaluable for understanding your data. They can analyze field distribution, completion rates, and help identify relevant fields. Profiling helps confirm if your approach will stay within free credit limits and accelerates the design phase.
  • Clean Data Upstream: It cannot be stressed enough: clean and sanitize your data at the source system before ingestion. Data Cloud is a unification tool, not primarily a data cleansing or deduplication tool. Ingesting bad or unnecessary data can significantly increase credit consumption and lead to inaccurate results.
  • Prioritize Data: Avoid the common pitfall of trying to bring in “all the data”.
  • Data Type Alignment: For Zero-Copy integrations, ensuring data type alignment between your source schema (e.g., Snowflake) and Data Cloud’s data model objects (DMOs) is crucial to prevent mapping issues.
  • Unique Keys: Data Cloud operates on an upsert (update or insert) model. Ensure every row in your data files has a unique key (either a single field or a composite key) to prevent incorrect merging of records during ingestion (see the hygiene sketch after this list).
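As a minimal illustration of upstream hygiene, the following Python sketch normalizes a match field, builds a composite unique key, and dedupes a file before ingestion. The file and column names are hypothetical; adapt them to your own extract.

    # Hypothetical file and column names; adjust to your own extract.
    import pandas as pd

    df = pd.read_csv("crm_contacts.csv")

    # Normalize the fields used for matching before ingestion, not after.
    df["email"] = df["email"].str.strip().str.lower()

    # Composite key: source system + source record id is unique per row, so
    # Data Cloud's upsert model updates rows instead of merging strangers.
    df["unique_key"] = df["source_system"] + ":" + df["source_record_id"].astype(str)

    # Drop exact duplicates so repeated loads upsert rather than inflate counts.
    df = df.drop_duplicates(subset="unique_key", keep="last")
    df.to_csv("crm_contacts_clean.csv", index=False)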

Phase 2: Architecture and Setup – Building the Foundation

Once the planning is complete, the next phase involves architecting and setting up Data Cloud to receive and process your data.

Connector Selection and Data Ingestion

Salesforce Data Cloud offers flexible ways to ingest data:

  • Out-of-the-Box (OOTB) Connectors:
    • Prioritize using OOTB connectors for Salesforce CRM, Marketing Cloud, Commerce Cloud, Amazon S3, Google Cloud Storage, and Azure. These are pre-built and minimize effort.
  • Ingestion API (Batch vs. Streaming):
    • Batch Ingestion: Ideal for front-loading historical data or ingesting large volumes at scheduled, off-peak hours. Data is typically sent in CSV format.
    • Streaming Ingestion: Designed for near real-time ingestion of small batches of data, such as user actions on websites or POS system events. Data is typically sent in JSON format.
    • Setup Process: First, create an Ingestion API connector, which defines the expected schema and data format. Then, create a data stream for each object you intend to ingest through that connector. (An end-to-end call is sketched after this list.)
    • Authentication: Secure API calls require setting up Connected Apps in Salesforce, leveraging OAuth flows like JWT for authentication.
    • API Limits: Be aware of limitations, such as 250 requests per second for streaming APIs and a 200 KB payload size per request. These are important for designing your ingestion strategy.
    • Schema Mistakes: If you get a data type wrong in your schema, you generally cannot change it directly after creation.
  • Web & Mobile SDK:
    • These SDKs are purpose-built to capture interaction data from websites and mobile applications, such as page views and clicks.
    • Key Benefits: They come with built-in identity tracking (managing both anonymous and known user profiles) and cookie management, simplifying the process of linking anonymous activity to known profiles once a user identifies themselves.
    • Consent Management: The SDKs also include integrated consent management, ensuring data is only collected and used with user permission.
    • Sitemap: A powerful feature that allows for centralized data capture logic across multiple web pages, reducing the need to embed code on every page.
    • Experience Cloud Integration: For Experience Cloud sites, a new integration feature provides a data kit that simplifies setup and automatically captures standard events.
    • SDK vs. Ingestion API for Web: For web and mobile applications, the SDK is generally preferred over the Ingestion API because it handles authentication more securely (no client-side exposure) and streamlines data capture.
  • Zero-Copy Integration:
    • This revolutionary feature allows Data Cloud to directly access live data stored in external data lakes and warehouses like Snowflake, Databricks, Google BigQuery, and AWS (S3, Redshift) without physically moving or duplicating the data.
    • Advantages: Offers near real-time data access, eliminates data duplication, and extends the value of existing data lake/warehouse investments.
    • Important Considerations: Data type alignment between your source system and Data Cloud is critical for successful mapping. Also, be prepared for network and security configurations (e.g., VPC, IP whitelisting) to ensure secure connectivity between Data Cloud (hosted on AWS) and your external cloud environments.
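To make the Ingestion API flow concrete, here is a minimal Python sketch of the streaming path: a JWT bearer token exchange, the swap for a Data Cloud token, and one small JSON payload. The connector name (web_events), object name (page_view), username, and key file are hypothetical placeholders, and while the endpoint paths follow Salesforce's documented flow, verify them against your org before relying on this sketch.

    # Minimal sketch of the streaming Ingestion API path. The connector
    # ("web_events"), object ("page_view"), username, and key file are
    # hypothetical; endpoint paths follow Salesforce's documented flow.
    import time
    import jwt        # PyJWT
    import requests

    CLIENT_ID = "<connected-app-consumer-key>"
    USERNAME = "ingest.user@example.com"
    PRIVATE_KEY = open("server.key").read()   # key registered on the Connected App

    # 1. JWT bearer flow: trade a signed assertion for a core access token.
    assertion = jwt.encode(
        {"iss": CLIENT_ID, "sub": USERNAME,
         "aud": "https://login.salesforce.com",
         "exp": int(time.time()) + 300},
        PRIVATE_KEY, algorithm="RS256",
    )
    core = requests.post(
        "https://login.salesforce.com/services/oauth2/token",
        data={"grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
              "assertion": assertion},
    ).json()

    # 2. Swap the core token for a Data Cloud token and tenant endpoint.
    dc = requests.post(
        core["instance_url"] + "/services/a360/token",
        data={"grant_type": "urn:salesforce:grant-type:external:cdp",
              "subject_token": core["access_token"],
              "subject_token_type": "urn:ietf:params:oauth:token-type:access_token"},
    ).json()

    # 3. Stream one small JSON batch (stay under the 200 KB payload limit).
    resp = requests.post(
        f"https://{dc['instance_url']}/api/v1/ingest/sources/web_events/page_view",
        headers={"Authorization": f"Bearer {dc['access_token']}"},
        json={"data": [{"event_id": "e-001", "url": "/pricing",
                        "ts": "2025-07-16T10:00:00Z"}]},
    )
    resp.raise_for_status()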

Data Harmonization and Modeling

After data is ingested into Data Cloud, it enters the harmonization and modeling stage:

  • Data Lake Objects (DLOs): When data first enters Data Cloud, it’s stored in DLOs, which are essentially raw, un-transformed representations of your source data.
  • Data Model Objects (DMOs): DMOs represent Data Cloud’s canonical data model. The next crucial step is to map your DLOs to DMOs, transforming the raw data into a standardized structure that Data Cloud understands and uses for downstream processes.
  • Standard vs. Custom DMOs/Fields: Data Cloud provides standard DMOs (e.g., Account, Contact, Individual). Leverage these where possible. For unique business requirements or custom fields from your source systems, you have the flexibility to create custom DMOs or add custom fields to standard DMOs.
  • Formula Fields: These are powerful tools within Data Cloud, similar to Salesforce CRM formulas. Use them to augment your data (e.g., create composite unique keys for identity resolution) or cast data types if mismatches occurred during ingestion.
  • Interim DLOs: In complex scenarios, consider creating “interim DLOs.” These can be used as an intermediate step to maintain additional business context, perform standardization, or scrub data before it’s mapped to the final target DMOs.
  • Data Categories: When setting up data streams, you assign a category to the data, which influences how it’s used:
    • Profile Data: Contains identification information (like name, email, address) and is crucial for identity resolution.
    • Engagement Data: Represents event-driven interactions (e.g., website clicks, purchases, mobile app logins). This data is typically used for aggregated statistics and behavioral insights.
    • Other: For data that doesn’t fit neatly into the above categories.
  • Data Spaces: Data Cloud allows you to logically separate data using data spaces. These function similarly to business units in Marketing Cloud, enabling you to manage data for different regions, brands, or entities, and ensuring compliance with regulations like PDPA, GDPR, or CCPA by controlling data visibility and access.
  • Relational Model: Maintain a comprehensive data dictionary that details your entire data model, including relationships between DLOs and DMOs.

Phase 3: Unification

With your data ingested and harmonized, the next critical phase is unification, where disparate customer profiles are brought together into a single, comprehensive view.

Identity Resolution

Identity Resolution is the core capability that enables Data Cloud to build a single, unified customer profile from various data sources. This process is crucial to:

  • Avoid inflating your customer metrics.
  • Prevent sending redundant communications.
  • Enhance personalization across all touchpoints.

The identity resolution process is typically two-fold (a conceptual sketch follows the list):

  1. Matching Rules: These rules define the criteria for identifying when different records belong to the same individual. Examples include using fuzzy matching for first names (allowing for minor variations), exact matching for last names and email addresses, or linking records based on social handles.
    • Party Identification Model: Leverage external identifiers like loyalty member IDs or driver’s license numbers to enhance matching accuracy. This model helps link profiles across systems that might not share common direct identifiers.
    • Required Match Elements: Be aware of specific requirements when unifying accounts or individuals.
  2. Reconciliation Rules: Once potential matches are identified, reconciliation rules determine which attribute values will represent the unified profile. For instance, if a customer has multiple email addresses across different source systems, you can define rules to select the “most frequent” email, or prioritize data from a “source of truth” system.
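To make the two steps concrete, here is a conceptual Python stand-in. This is not Data Cloud's actual engine, which is configured declaratively; the sketch just mimics one fuzzy-first-name-plus-exact-email match rule and a most-frequent-value reconciliation rule.

    # Conceptual stand-in for match-then-reconcile (not Data Cloud's engine).
    from collections import Counter
    from difflib import SequenceMatcher

    records = [
        {"first": "Jon",      "email": "jon@example.com", "phone": "111"},
        {"first": "John",     "email": "jon@example.com", "phone": "222"},
        {"first": "Patricia", "email": "pat@example.com", "phone": "333"},
    ]

    def matches(a, b):
        # Match rule: fuzzy first name (tolerates minor variations) AND exact email.
        fuzzy = SequenceMatcher(None, a["first"].lower(), b["first"].lower()).ratio() > 0.8
        return fuzzy and a["email"] == b["email"]

    # 1. Matching: group records that satisfy the match rule (naive O(n^2) pass).
    groups = []
    for rec in records:
        for group in groups:
            if any(matches(rec, other) for other in group):
                group.append(rec)
                break
        else:
            groups.append([rec])

    # 2. Reconciliation: the most frequent value per attribute wins.
    for group in groups:
        unified = {field: Counter(r[field] for r in group).most_common(1)[0][0]
                   for field in ("first", "email", "phone")}
        print(unified)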

Key Considerations for Identity Resolution:

  • Thorough Data Understanding: A deep understanding of your data, including unique IDs, field values, and relationships, is paramount for configuring effective matching and reconciliation rules.
  • Start with Unified Profiles Early: Even if your initial match rates are low, begin building calculated insights and segments against unified profiles from the outset. This prepares your Data Cloud environment for seamless integration of new data sources in the future.
  • Credit Consumption: Identity resolution is a credit-intensive operation (e.g., 100,000 credits per million rows processed). While incremental processing is improving efficiency, careful planning of how often identity resolution runs is essential to manage costs (a back-of-the-envelope estimator follows this list).
  • Anonymous Data: By default, the Marketing Cloud Personalization connector sends events only for known users. Enabling anonymous events drastically increases data volume and credit consumption, and you should note that Data Cloud doesn’t reconcile anonymous events to known users out of the box. You’ll need to implement custom solutions for that reconciliation.
  • Data Quality is Paramount: The success of identity resolution hinges on the quality of your incoming data. If your source systems contain “garbage” (inaccurate or inconsistent data), your unified profiles will reflect that. Therefore, prioritize cleaning your source data before bringing it into Data Cloud.
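As a back-of-the-envelope example using the rate quoted above, the arithmetic is simple; the volume and run frequency below are made-up inputs, so plug in your own.

    # Rate quoted above; volumes and frequency are made-up inputs.
    CREDITS_PER_MILLION_ROWS = 100_000

    def monthly_credits(rows: int, runs_per_month: int) -> float:
        return rows / 1_000_000 * CREDITS_PER_MILLION_ROWS * runs_per_month

    # 4M profile rows resolved daily: 4 x 100,000 x 30 = 12,000,000 credits.
    print(f"{monthly_credits(4_000_000, 30):,.0f} credits/month")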

Phase 4: Activation – Turning Data Into Actions

The final, and arguably most impactful, phase is activation. This is where you use your unified, intelligent data to drive personalized customer experiences and automate workflows across various channels.

Calculated Insights

Calculated Insights allow you to perform aggregations and transformations on your data to derive meaningful metrics. These can include:

  • Customer Lifetime Value (LTV)
  • Engagement Scores
  • Total Deposit per Month
  • Propensity to Buy

These insights enrich your unified customer profiles, providing deeper understanding and enabling more sophisticated segmentation and personalization strategies.
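In Data Cloud, calculated insights are defined against DMOs; purely as an illustration of the aggregation shape, here is a pandas stand-in with made-up column names computing a lifetime value and a simple engagement score per unified individual.

    import pandas as pd

    orders = pd.DataFrame({
        "unified_individual_id": ["u1", "u1", "u2"],
        "order_total": [120.0, 80.0, 45.0],
        "sessions_last_90d": [14, 14, 3],
    })

    # One row per unified profile: lifetime value and a simple engagement score.
    insights = orders.groupby("unified_individual_id").agg(
        lifetime_value=("order_total", "sum"),
        engagement_score=("sessions_last_90d", "max"),
    )
    print(insights)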

Segmentation

Data Cloud’s segmentation capabilities enable you to create dynamic audience segments based on any harmonized attribute or calculated insight. This allows for precise targeting of specific customer groups.

  • Building Segments: Use the intuitive segment builder to drag and drop fields and apply criteria. You can combine rules with AND/OR logic to refine your audience.
  • Nested Segments: This feature allows you to incorporate one segment within another. However, be mindful of limitations, such as a maximum of 50 filters per segment.
  • Publishing: Publish segments to various activation targets. While Marketing Cloud Personalization supports only “standard publish,” other targets might allow “rapid publish” for faster audience delivery.

Activation Targets and Activations

After creating segments or calculated insights, you define activation targets, the destinations where you send this actionable data. Data Cloud offers broad activation capabilities:

  • Marketing Cloud: Push segments into Marketing Cloud data extensions for email personalization and Journey Builder entry events. You can also use Data Cloud data to influence different journey paths within Marketing Cloud, for example, by attaching custom attributes to Contact Builder.
  • Advertising Platforms: Directly send customer segments to major advertising platforms like Google, Meta, and Amazon for targeted campaigns.
  • Salesforce Flow: Initiate real-time Salesforce automation (Flows) based on data changes, calculated insights, or streaming events processed by Data Cloud. You can configure this via Data Actions.
  • Webhooks: Data Actions can also trigger webhooks to send data to virtually any third-party system (a minimal receiver is sketched after this list).
  • Data Lakes & Warehouses: Securely share harmonized profiles, segments, or insights back to external platforms like Snowflake, Databricks, or Google BigQuery.
  • Business Applications: Push unified data or activate segments directly into other downstream business applications like ERP systems or other analytics tools.
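As an illustration of the webhook target, a minimal receiver might look like the following Flask sketch. The payload shape is an assumption, so validate it against what your Data Action actually sends.

    # Assumed payload shape; validate against what your Data Action sends.
    from flask import Flask, request

    app = Flask(__name__)

    @app.post("/datacloud/webhook")
    def handle_data_action():
        event = request.get_json(force=True)
        # Forward to a queue or downstream system; here we just log it.
        print("Data Action event:", event)
        return "", 204

    if __name__ == "__main__":
        app.run(port=8080)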

Platform Monitoring

Consistent monitoring of your Data Cloud platform is crucial post-implementation. This includes:

  • API Ingestion Monitoring: Track data flow from MuleSoft or other APIs to Data Cloud.
  • Segment Publications: Verify that segments are publishing correctly and yielding expected results. Issues can occur if upstream data ingestion or unification breaks.
  • Activations: Ensure data is successfully reaching its intended activation targets.
  • Status Alerts: Subscribe to status.salesforce.com for updates on your instance to stay informed about any maintenance or performance degradations.

Key Lessons Learned & Continuous Evolution

Salesforce Data Cloud is a dynamic product that undergoes rapid evolution, with new features and changes rolling out frequently, often on a monthly basis, outside of the major seasonal releases. Staying current is key to maximizing your investment.

Key lessons from real-world implementations:

  • Stay Connected: Maintain close communication with your Salesforce account team, participate in partner Slack channels, and engage with Trailblazer communities. This helps you stay informed about upcoming features, pilot programs, and best practices.
  • Non-Reversible Data Ingestion: Be extremely diligent in your planning, especially regarding data types and unique keys. Correcting bad data types or core stream elements after you ingest and activate data is highly difficult and often requires you to delete downstream segments, calculated insights, and even DLO/DMO mappings to re-implement. Plan ahead to avoid costly rework.
  • Marketing Cloud Connector Caution: The Marketing Cloud connector will bring in all subscriber data from your Marketing Cloud instance, including data from multiple business units. This can significantly impact your profile counts and potentially lead to overages if not anticipated and managed. Understand what’s in your “all subscribers” table before connecting.
  • Consumption Costs: Data Cloud operates on a consumption-based model, so every operation has a cost.
    • Data Ingestion: Volume of data ingested directly impacts cost.
    • Batch Transforms: These process the entire dataset for every execution, potentially burning significant credits even if data hasn’t changed.
    • Identity Resolution: This is a credit-intensive process.
    • Segmentation: Publishing segments also consumes credits. Carefully plan your data volumes, refresh schedules, and automation frequencies to manage and optimize credit consumption.
  • Zero-Copy Considerations: While revolutionary, ensure data type alignment between your source systems (e.g., Snowflake, Redshift) and Data Cloud. Also, factor in time for network and security setup for private connections between cloud environments.
  • Optimize Journeys for Data Cloud: Instead of trying to force Data Cloud activations into existing, potentially inefficient Marketing Cloud Journey structures, take the opportunity to remediate and optimize your journeys for best practices aligned with Data Cloud’s capabilities.
  • Data Cloud is NOT a Cleansing Tool: Reiterate this fundamental point: Data Cloud is primarily a data unification tool, not a data cleansing tool. It is your duty to ensure your source data is clean and accurate before it enters Data Cloud.
  • No Master Data Management (MDM) Solution: Data Cloud adopts a “key ring” approach to identity, focusing on linking various identifiers to a unified profile, rather than aiming to be a traditional “golden record” MDM solution.
  • Consent Management: The Web SDK includes built-in consent management. If you are using the Ingestion API, you will need to implement custom solutions to handle user consent requirements.
  • AI Integration: Data Cloud offers robust AI capabilities. You can build your own regression models using Einstein Studio with your Data Cloud data, or integrate external AI models from platforms like Amazon SageMaker, Google Vertex AI, Databricks, and even large language models from OpenAI or Azure OpenAI. This enables predictive analytics and smarter decision-making.

Conclusion

Salesforce Data Cloud represents a significant step forward in leveraging customer data. By breaking down silos, unifying profiles, and providing powerful activation capabilities, it empowers businesses to deliver hyper-personalized experiences and drive intelligent actions across their entire enterprise.

To get started, you need to take a strategic approach, plan carefully, understand your data deeply, and commit to continuous learning as the platform evolves. By prioritizing use cases, ensuring data quality upstream, and leveraging the diverse ingestion and activation methods, you can successfully implement Data Cloud and unlock the full value of your customer insights. The journey may present challenges, but a truly unified and actionable customer view – once implemented and maintained effectively – will be an invaluable asset for your business.

Explore related content:

Bring Customer Data into Slack with Salesforce Channels

How to Earn the Salesforce Data Cloud Consultant Certification

Can You Use DML or SOQL Inside the Loop?

How to Quickly Build a Salesforce-Native Satisfaction Survey Using SurveyVista

#DataCloud #MarketingCloud #Salesforce #SalesforceAdmins #SalesforceDevelopers

Image of a customer's face surrounded by unified customer information from email to sale.
Tomáš Gál (tomas_sk)
2025-06-01
2025-03-25

Stop Agentforce Dev Orgs From Expiring

Have you ever discovered—often too late—that the Salesforce Dev Org you worked so hard to set up has suddenly expired? You spend hours or even weeks configuring a demo, exploring new features, or practicing your development skills, only to log in one day and be greeted by an expiration notice. It happens more frequently than we realize, especially given that new specialty and trial orgs sometimes have shorter lifespans. A one-month trial here, a week-long environment there—before you know it, all that effort can vanish without warning.

Dev Orgs Don’t Last As Long As They Did In The Past

In the past, standard Salesforce developer orgs might have lasted six months or even a year. Then came specialized developer orgs for different industries, along with new trial orgs that offered limited time frames. Eventually, Data Cloud and now Agentforce Dev Orgs arrived. While these are fantastic for exploring cutting-edge features—like real-time data capabilities and new AI solutions—they usually come with a much shorter shelf life. Salesforce has recently announced new Agentforce Data Cloud developer orgs that expire after 45 days. It is indeed better than just a week, but the clock is still ticking.

Salesforce does send notification emails to remind users that their developer orgs are nearing expiration. However, these alerts can sometimes blend in with other automated messages, or they may land in spam folders. Whether you are a seasoned developer juggling multiple environments or a newcomer trying to learn the platform, it can be easy to lose track of which org is about to expire and which is still active.

The Challenge: Tracking Expiration Before It’s Too Late

Consider how critical these development and testing environments can be. They allow you to:

  • Build proof-of-concept solutions without risking a live production org.
  • Experiment with new Salesforce features or packages in a controlled setting.
  • Demo functionality to colleagues, stakeholders, or potential clients.

If one of these invaluable environments suddenly expires, you may lose crucial configurations, data setups, and more. The result is wasted time and frustration as you scramble to recreate everything in a fresh org.

While the standard practice is to keep track of all login details in a password manager or a spreadsheet, not everyone is consistent with that. Even if you meticulously store your credentials, you might not remember to log in regularly to reset the expiration countdown. That is exactly why an automated reminder can save you from this common pitfall.

Introducing a Schedule-Triggered Flow to Send Automatic Warnings

The solution is surprisingly straightforward: create a schedule-triggered flow in Salesforce that checks for inactivity in your Agentforce or Data Cloud developer org. If the system sees that you have not logged in for a certain number of days (in this case, 40), it sends you an email alert. This way, you will have time to log in and reset the expiration clock before day 45 sneaks up on you.

You can build this flow yourself fairly easily, or you can install an unmanaged package I have created. This package contains a flow that runs each night to review login history. If the flow sees that your last login is older than the set threshold, it pings your email to remind you to hop back in. This single step could spare you hours of rebuilding or, worse, losing all the work you have done.

Below is a walkthrough of how to set up this scheduled flow. You will also find tips for customizing it to your specific username if you have changed it from the default (which normally ends with @agentforce.com).

Step 1: Confirm Your New Developer Org’s Basics
First, head to the Setup menu. Check your username and confirm that it ends with @agentforce.com. Confirm that your email address is accurate. You can also check that the Org URL includes “org farm,” indicative of the new Agentforce and Data Cloud Dev Orgs.

Agentforce and Data Cloud developer orgs often come with constraints—particularly around how many AI requests you can make. That said, for most learning or exploration scenarios, these limits will not be prohibitive.

Step 2: Build or Install the Flow
If you opt to build the flow from scratch, you will create a Schedule-Triggered Flow that runs once daily. Set it to run near midnight to ensure it checks your org’s login activity for that day.

In this flow, you will do the following:

  1. Define a constant (e.g., usernameSearchString) that points to any user whose username ends with @agentforce.com.
  2. Query the User object to find any records whose Username matches that constant. In most new developer orgs, there should only be one match.
  3. Retrieve the LoginHistory for that user, sorting records in descending order by login time. This step ensures you get the most recent login date.
  4. Compare that login date to the current date minus 40 days. If the last login is older than 40 days, proceed to send an email alert. If it is not, the flow does nothing.

If you have changed your default username, you will need to adjust your constant to match the portion that identifies your user. Otherwise, the out-of-the-box approach (i.e., checking for @agentforce.com) will work fine. For comparison, a script-based version of the same daily check is sketched below.
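This Python sketch mirrors the flow's logic from outside the org: query the user, fetch the most recent login, and warn when it is older than 40 days. It assumes the simple-salesforce library and placeholder credentials; note that the script's own API login is itself recorded in LoginHistory.

    # Placeholder credentials; requires the simple-salesforce library. Note
    # that the script's own API login is itself recorded in LoginHistory.
    from datetime import datetime, timedelta, timezone
    from simple_salesforce import Salesforce

    sf = Salesforce(username="me@agentforce.com", password="...",
                    security_token="...")

    user = sf.query(
        "SELECT Id, Username FROM User "
        "WHERE Username LIKE '%@agentforce.com' LIMIT 1"
    )["records"][0]

    last = sf.query(
        f"SELECT LoginTime FROM LoginHistory WHERE UserId = '{user['Id']}' "
        "ORDER BY LoginTime DESC LIMIT 1"
    )["records"][0]["LoginTime"]

    # LoginTime arrives like "2025-03-01T12:34:56.000+0000".
    last_login = datetime.strptime(last, "%Y-%m-%dT%H:%M:%S.%f%z")
    if datetime.now(timezone.utc) - last_login > timedelta(days=40):
        print(f"Warn: {user['Username']} last logged in {last_login:%Y-%m-%d}")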

Step 3: Configure the Email Alert
Within the flow, you will set an Action to send an email. This email typically goes to the address tied to the user record (the one you provided when creating the developer org). The email is straightforward but crucial. You might specify a subject line like “Warning: Your Developer Org Will Expire Soon” or something equally attention-grabbing.

For the email body, you can create a Text Template. Include pertinent details such as the last login date, the username, and a short explanation that logging in again will reset the expiration countdown. Include enough information that you can quickly locate the correct credentials in your password manager. A typical text template might read:

Subject:
Warning – Action Needed! Developer Org Nearing Expiration

Body:
Hello, This is an automated reminder that your Agentforce developer org, associated with username {!User.Username}, is nearing its 45-day expiration window. Your last login was {!LoginHistory.LoginTime}. To prevent expiration, please log in as soon as possible.

Thank you!

You can, of course, modify the text to your preference. The main goal is to ensure you see this email and know exactly which org requires attention.

Step 4: Activate and Confirm the Flow
When you install the flow template from the unmanaged package, it might come pre-activated. If you build one from scratch, remember to Activate it. Then, check under Scheduled Jobs in Setup to confirm the flow is listed to run nightly.

If everything is configured correctly, you will receive an email once your last login date surpasses 40 days. At that point, you know you have a five-day grace period before the 45-day expiration hits—plenty of time to jump back into the org and keep it alive.

Step 5: Enjoy Peace of Mind
That is it! No more frantic searches through your password vault only to discover your demo is gone. The next time you spin up a new developer org for Data Cloud exploration, build this simple safety net. You will save yourself the frustration of expired orgs and lost work.

Install the Unmanaged Package to Leverage the Scheduled Flow

The link for the unmanaged package is as follows:

https://login.salesforce.com/packaging/installPackage.apexp?p0=04tgK0000000TOb

Bonus: How To Add the Org URL to the Email

One of our YouTube subscribers asked whether the Org URL can be added to the email. This is a very good suggestion, and I am planning on adding it to the unmanaged package in the future.

If you want to add the URL, please follow these instructions:

  1. Create a URLFormula resource. Your formula will read as follows: LEFT($Api.Partner_Server_URL_630, FIND('/services', $Api.Partner_Server_URL_630))
  2. Go to your email body Text Template and view source (view as Plain Text), and replace the username resource {!Get_User.Username} with the following:
    <a href="{!URLFormula}">{!Get_User.Username}</a>

This will make the username value on the email clickable. Once the user clicks on the username, they will be taken to the login page for the Org.

Final Thoughts on Agentforce Data Cloud Developer Orgs

These new developer environments are wonderful for trying out the latest Salesforce innovations. You can test specialized features that may not be available in a more traditional developer edition. The big trade-off is their comparatively short expiration period. Forty-five days is enough for many use cases, yet it is still short enough that forgetting just one login can lead to losing your entire environment.

With the scheduled trigger flow approach, you can have the best of both worlds: a flexible, feature-rich developer org with a relatively short lifespan, backed by a reliable reminder system preventing accidental expirations.

Watch the video on YouTube for additional details 👇

Explore related content:

Salesforce Summer ’25 Preview: Major Flow Changes to Watch For

How to Get Your AI-Powered Enhanced Developer Org with Agentforce and Data Cloud

TDX 2025 News: Salesforce Agentforce 2dx and AgentExchange

My Inaugural Agentforce-Assisted Flow Building Experience

#Agentforce #DataCloud #DevOrg #Salesforce #SalesforceAdmins #SalesforceDevelopers

Yonhap Infomax News (infomaxkorea)
2025-02-27

Snowflake reports strong Q4 earnings, expands Microsoft partnership, driving stock surge in after-hours trading amid growing importance in data and AI sector

en.infomaxai.com/news/articleV

2025-02-23

🟢 Demo: SAP Business Data Cloud | SAP Business Unleashed youtu.be/OkwQimWDeos?si=UNGdcA via @YouTube
(and find related Videos in the SAP channel - see below)

#SAP #SAPBDC #GenAI #LLM #DataCloud #DataLake #SAPChampions #SAPBW #SAPDatasphere @sap

🇪🇺 Paul Schoonhoven 🍋🍉 (vosje62@mastodon.nl)
2025-02-18

Dutch central government keeps ministries' documents and email in its own data centers for now - #NRC
nrc.nl/nieuws/2025/02/18/docum

Good news on the cloud data front!

Undoubtedly with thanks to @bert_hubert (and many others) for his efforts toward this policy change now under way.

Archive link: archive.ph/UpNel

#DataCloud #Overheid #Microsoft

In response to "the changing situation and your concerns," SSC-ICT has "adjusted the design of the workplace," the state secretary now writes. The adjustment is intended to sharply reduce dependence on the cloud companies. "Given the changing national and international playing field, the storage of documents and email will remain in our own data centers, pending further policy on this point."
The central government is also working on "an exit strategy" and is investigating "autonomous alternatives as soon as they become available," partly in cooperation with "European partners."
Forceark Academy (forcearkacademy)
2025-01-27

What is Data Cloud? 🌐
It’s the future of unified customer insights! Data Cloud connects and harmonizes all your data in real-time, empowering you to deliver personalized experiences like never before.

For more details visit: zurl.co/2nxbM

2024-09-24

Dreamforce is now in the rearview mirror, so I’d like to give a few of my own reflections on the event. The one term that took center stage in Salesforce’s ecosystem: Agentforce. Introduced only weeks ago, Agentforce is now integrated across multiple platforms like Slack and Tableau, positioning itself as a key player in Salesforce’s vision for the future. This article explores how Agentforce, Data Cloud, and new trends like consumption-based pricing are reshaping Salesforce, and what it all means for the future of CRM.

Image from Salesforce+

Agentforce is everything

In a meeting with Salesforce right before Dreamforce, I heard Salesforce experts answer the ultimate question about Agentforce: What is Agentforce? Is it the new Copilot or Einstein bot? The answer was “Everything is an agent.” When I heard that, I should have known what was coming at Dreamforce. Almost every session had Agentforce in it. We only heard the term for the first time a few weeks back, yet Salesforce announced that Agentforce is GA (generally available) in Slack, GA in Tableau, and so on. That’s because all existing Copilot solutions were relabeled as Agentforce. What sets Agentforce apart from Salesforce’s previous pitches in this area? Salesforce emphasizes that Agentforce can take actions, and it will provide you with a set of out-of-the-box actions. If you want to learn more about Agentforce, head over to our dedicated blog post HERE.

Data Cloud is still important

When I reviewed the schedule for Dreamforce, I saw that Data Cloud still had a keynote, but I was skeptical. Will Data Cloud's prominence fade in a few years? I am still not sure. However, after watching the sessions, I see that Data Cloud is required to deploy Agentforce. Whether that will remain so in the future, I am uncertain. What I can tell you for now is that Data Cloud is here and still holds a good share of the center stage.

Consumption-based pricing is the way forward

There is a general trend among clients: they are tired of fixed per-seat subscription pricing. This is not isolated to Salesforce, by the way. I heard that the list price for Agentforce is $2 per interaction. Most of you know that Data Cloud is also sold on a consumption basis. There is a clear direction in Salesforce’s approach, and it is expected to continue: we are heading toward a consumption-based pricing future.

Conversational Interface is where it’s at

Watching these Dreamforce sessions, many of us may not have realized that Salesforce is setting a vision for how users are expected to interact with their CRM in the future. The future is conversational. Clicking buttons, toggling sliders, or dragging items from one box to another will be a thing of the past. Instead, you’ll converse with your CRM. Typing may be necessary for now, but soon you’ll be able to talk and listen.

ROI takes center stage instead of Trust

When AI came along, Salesforce pushed one pitch very hard: it is the only provider that can give you the secure Trust Layer you need. Salesforce was the sheriff in the Wild West. After more than a year, it is evident that the Trust pitch did not land. Do we not have the Trust Layer anymore? We do. But the focus has changed: all demos now center on the ROI of the implementation rather than the Trust Layer.

Slack’s Salesforce Channels: The first attempt to dethrone Chatter

Let’s face it: until now, Slack has not done much to integrate with Salesforce, and AI capabilities have been slow to arrive as well. At Dreamforce, Salesforce announced Salesforce Channels in Slack. Now you can create a channel in Slack for your Salesforce object records. Your record field values appear in Slack, and the Slack conversations are displayed on the Salesforce page layout. Sounds very promising, does it not? We will see how well it is implemented when it goes GA. For clients using both Slack and Salesforce, this could be the beginning of the end for Chatter.

Image from Salesforce+; Lidiane Jones speaking at the Main Keynote

Salesforce Foundations comes bearing gifts

Salesforce announced Foundations, a free upgrade that brings key cross-department features to every Salesforce customer at no additional cost. Foundations includes Sales, Service, Marketing, Commerce, and Data Cloud features, all in a new, enhanced UI. This makes it easier for businesses to get a unified view of their customers and build connected, AI-powered experiences across every customer touchpoint. Salesforce has been making worthwhile attempts to bring various cloud capabilities into core under the same umbrella, making it easier for smaller clients to adopt new features.

Admin Role is changing

I watched the admin keynote, and the content did not seem to be geared toward admins. What is the reason for that? There are two possibilities:

  1. Salesforce is pushing hard to sell Data Cloud and Agentforce, and the content was shaped this way due to that motive.
  2. Salesforce sees that the admin role needs to change and, therefore, is pushing the discussion in this direction.

In the admin demos, we saw user management, Data Cloud, Agentforce, integrations, flow, and invocables. The content was a mashup of pretty advanced capabilities. Are the admins expected to upskill to this scope? Will the new admin job posts include all these capabilities? I don’t know. What I know for sure is that the change is coming. Prepare for the inevitable by getting AI certified for FREE for a limited time.

What do you think?

Join the conversation.

Explore related content:

Top 9 Salesforce Winter 25 Flow Features

What is the Salesforce AI Specialist Certification?

Exploring Salesforce Admin Certification Paths

https://salesforcebreak.com/2024/09/24/dreamforce-2024-reflections/

#Admin #Agentforce #AI #ArtificialIntelligence #Certification #DataCloud #Dreamforce #Foundations #ROI #Salesforce #Slack #Update

Abhinav Gupta (abhinavguptas)
2024-03-28

❤️ AI Modeling Made Easy: Discover Model Builder's Zero-ETL Magic 🪄

🥸 For more, watch the full video of the @salesforce TrailblazerDX 2024 Keynote:
youtube.com/watch?v=8vzwVG2z-vI

Abhinav Gupta (abhinavguptas)
2024-03-27

Semantic Search & Vector Databases in @salesforce Data Cloud

🥸 For more, watch the full video of the Salesforce 2024 Keynote:
youtube.com/watch?v=8vzwVG2z-vI

Abhinav Gupta (abhinavguptas)
2024-03-23

🤩 We ❤️ the co-pilot actions 🤖 announcement at TrailblazerDX 2024

😎 Clip from discussion with Aditya, Lead Dev Evangelist at @salesforce

😎 The full video podcast will premiere soon on:
🔗 youtube.com/@abhinavguptas/
🔗 podcasters.spotify.com/pod/sho

Abhinav Gupta (abhinavguptas)
2024-03-22

🚀 Einstein Copilot: One app assistant seamlessly integrating data, logic, and insights.

🥸 For more, watch the full video of the @salesforce TrailblazerDX 2024 Keynote:
youtube.com/watch?v=8vzwVG2z-vI

Abhinav Gupta (abhinavguptas)
2024-03-21

🚀 Einstein Copilot is for Every Employee | Turtle Bay Case Study

🥸 For more, watch the full video of the Salesforce TrailblazerDX 2024 Keynote:
youtube.com/watch?v=8vzwVG2z-vI

Abhinav Gupta (abhinavguptas)
2024-03-20

🤩 Bajaj's leap with Data Cloud: Millions of profiles. Fast. Vast. Unheard-of

🥸 For more, watch the full video of Salesforce TrailblazerDX 2024 Keynote >
youtube.com/watch?v=8vzwVG2z-vI

@salesforce

Abhinav Gupta (abhinavguptas)
2024-03-20

🍿 How are Netflix, HBO, Prime, and Disney making similar movie recommendations?

🥸 Watch the full video on the @APEX Hours YouTube channel here - youtube.com/watch?v=wSBVhKy4PQ8

Abhinav Gupta (abhinavguptas)
2024-03-20

✨ Decoding Vector Databases for Salesforce Professionals

🥸 Watch the full video on the Apex Hours YouTube channel here - zurl.co/M3sE
