From local CDF to managed Cloud service?

Posted 2 years ago

In our organization, there exists a CDF document that provides insights based on data embedded in it. This comes with several issues:

  • No proper access management: (Basically) everyone can download the .cdf file from a shared drive -- including the underlying data. It would be desirable to exert control over who can access both the .cdf file and the data that is embedded within it.

  • Manual data updates: The original author has to regularly update the embedded data by hand; there is no pipeline that feeds up-to-date data into the .cdf file. That's time-consuming and should be outsourced.

  • Cumbersome access: Instead of asking people to download a file, we'd much rather share a URL where they can find the CDF embedded in a web page. This page could be protected by some authentication mechanism and would prevent the endless duplication of said file.

Hence my question: as we're hardly the first ones experiencing this, how is this typically solved? I read that CDFs can be deployed to the Wolfram Cloud, but how does that integrate with enterprise access management? And how is confidential data dealt with -- can the Wolfram Cloud be connected to local data sinks?

I'm curious to hear your experiences. :)

POSTED BY: Just Another Guy
Posted 22 days ago

1. Cloud Deployment and Access Management

When you deploy your CDF notebook to the Wolfram Cloud, it becomes a CloudObject with a stable URL, solving your access issue. The key is to separate the interactive front-end from the confidential back-end data.

A. Sharing the Application (URL)

Instead of a file download, you use the CloudDeploy function to publish the notebook, which provides a single, permanent URL (e.g., https://www.wolframcloud.com/obj/YourName/YourApp).
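For example, a minimal sketch (the Manipulate below just stands in for your CDF's interactive content, and the object name "Reports/InsightsApp" is a placeholder):

    (* Deploy an interactive interface to a stable cloud URL *)
    app = CloudDeploy[
      Manipulate[Plot[Sin[f x], {x, 0, 2 Pi}], {f, 1, 5}],
      "Reports/InsightsApp"]
    (* app is a CloudObject; its URL is what you share instead of the .cdf file *)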

B. Controlling Access (Authentication)

Access management is handled by setting the Permissions option on the deployed CloudObject:

- "Public": anyone with the URL can view and interact with the application, but cannot see the underlying code or data.

- "Authenticated": restricts access to users who log in with a valid Wolfram ID (e.g., your organization's members).

- Specific users/groups: you can grant access to a limited list of email addresses or defined user groups for more granular control.

By setting the front-end application's permissions to a limited group and serving only dynamically generated content, you let users interact with the app without ever being able to download or inspect the raw data.
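As a concrete (and hedged) sketch, with placeholder email addresses and object name:

    (* Restrict the deployed app to specific colleagues *)
    CloudDeploy[
     Manipulate[Plot[Sin[f x], {x, 0, 2 Pi}], {f, 1, 5}],
     "Reports/InsightsApp",
     Permissions -> {
       "alice@example.com" -> {"Read", "Interact"},
       "bob@example.com" -> {"Read", "Interact"}}]

    (* Permissions can also be adjusted after deployment *)
    SetPermissions[CloudObject["Reports/InsightsApp"],
     "carol@example.com" -> {"Read", "Interact"}]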

2. Handling Confidential Data and Automating Updates

Instead of embedding data, you externalize it into a secure, version-controlled cloud data resource.

A. Data Isolation and Security

- Extract Data: remove all raw data from the CDF notebook.

- Store Securely: store the data as a separate, private CloudObject -- for example, a CloudExpression, a private data notebook, or a JSON file.

- Restrict Data Access: set the permissions on the data object to "Private" (owner-only). This prevents users from accessing the data directly, even if they know the object's URL.

- Connect App to Data: your deployed application uses functions like CloudGet or CloudExpressionGet to securely read the data when the user loads the app (see the sketch after this list). Since the application runs as you (the owner), it has permission to read the data, but the end user does not.
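A minimal sketch of that split (object names are placeholders, and data stands for whatever Dataset or expression your app needs):

    (* Run by you, the owner (manually or from a scheduled task):
       store the data as a private cloud object *)
    CloudPut[data, "Reports/InsightsData", Permissions -> "Private"];

    (* Inside the deployed application: read the data back.
       The app executes under the owner's credentials, so this succeeds
       even though end users cannot fetch the object themselves. *)
    data = CloudGet["Reports/InsightsData"];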

B. Automated Data Updates

You can automate the data update process with the ScheduledTask function:

  1. Create a separate script/function that performs the manual update logic (e.g., querying a database, cleaning data, and preparing it).
  2. Use CloudPut to save the fresh data to your private CloudExpression or data file.
  3. Wrap this script in a ScheduledTask to run automatically at a specified time (e.g., daily at 2:00 AM).

    CloudDeploy[
     ScheduledTask[
      Module[{newData},
       (* Logic to fetch/process data goes here; the URL below is a
          hypothetical placeholder for your real source *)
       newData = Import["https://internal.example.com/report-data.csv", "Dataset"];

       (* Overwrite the secure cloud data object that the deployed app reads *)
       CloudPut[newData, "YourPrivateDataExpression"]
       ],
      "Daily"
      ],
     "DailyDataUpdate"
     ]
    

3. Enterprise Access and Local Data Sinks

Connecting the Wolfram Environment to systems behind your corporate firewall (local data sinks, internal databases) requires a deployment option that can bridge the two networks.

|  | Wolfram Public Cloud | Wolfram Enterprise Private Cloud (EPC) |
|---|---|---|
| Authentication | Wolfram ID (single sign-on possible with paid tiers) | Integrates with Active Directory (AD) or LDAP for full enterprise access control |
| Local data access | Limited (the system must be publicly reachable via HTTPS) | Full (deployed within your firewall; direct, secure connections to SQL, file shares, APIs) |
| Scale | Shared resources | Dedicated, scalable infrastructure fully managed by your organization |

For organizations dealing with highly confidential, internal data and requiring integration with existing IT infrastructure (like connecting to local data sinks), the solution is typically the Wolfram Enterprise Private Cloud (EPC) or Wolfram Application Server.

These products are installed behind your firewall and can speak directly to your internal resources, while still providing the URL-based sharing and automated workflow benefits of the cloud platform.
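To give a feel for what a "direct connection to SQL" can look like from a kernel running behind the firewall (EPC or a local Wolfram Engine feeding the cloud), here is a hedged sketch using the standard DatabaseLink package; the host, database, table, credentials, and cloud object name are all hypothetical placeholders:

    Needs["DatabaseLink`"]

    (* Hypothetical connection details; replace with your internal database *)
    conn = OpenSQLConnection[
       JDBC["MySQL(Connector/J)", "internal-db.example.com:3306/reports"],
       "Username" -> "svc_reports", "Password" -> "********"];

    (* Pull the rows the dashboard needs, then close the connection *)
    kpiRows = SQLExecute[conn, "SELECT day, region, revenue FROM kpi_daily"];
    CloseSQLConnection[conn];

    (* Push the fresh data to the private cloud object read by the deployed app *)
    CloudPut[kpiRows, "YourPrivateDataExpression"]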

POSTED BY: Rob Pacey