Bloom with QIQ Orchestrate
BLUF: The mandatory migration from SAP ECC to SAP S/4HANA, driven by the approaching end of mainstream maintenance for ECC, poses significant challenges for organizations of every size. Our deep integration with both ECC and S/4HANA provides an automated approach to ingesting, transforming, and migrating data to the latest version.
Connecting to Your Data
The QIQ Orchestrate platform provides a full suite of data connection tools accessible from the web and hosted on AWS. For SAP migrations in particular, we can connect to SAP environments directly or ingest a static snapshot. After ingestion, we leverage an end-to-end software-defined data integration (SDDI) model to extract all of the relevant data and prepare it for transformation.
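To make the two ingestion modes concrete, the minimal sketch below models a connection configuration. All field names (host, client, snapshot_path) are illustrative placeholders, not the actual QIQ Orchestrate schema.

```python
# Minimal sketch of an ingestion configuration covering both supported modes.
# Field names here are hypothetical, not the real QIQ Orchestrate schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SapIngestConfig:
    mode: str                            # "direct" (live connection) or "snapshot" (static export)
    host: Optional[str] = None           # SAP application server, required for direct mode
    client: Optional[str] = None         # SAP client number, required for direct mode
    snapshot_path: Optional[str] = None  # location of a static export, required for snapshot mode

direct = SapIngestConfig(mode="direct", host="sap-prod.example.com", client="100")
snapshot = SapIngestConfig(mode="snapshot", snapshot_path="s3://client-bucket/sap-export/")
```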
Transforming Your Data
Cleaning: Source systems often export data with common "cleanliness" issues, such as incorrect data types, poor handling of null/empty values, or unwanted whitespace in string values. Our tools provide opinionated transformation options to fix these issues (and more).
Renaming: By using the source's metadata, we can rename the output tables and columns to be descriptive and self-explanatory, rather than sticking with an often non-human-readable schema.
Joining: Source systems often store related information (such as metadata) in separate tables, for example when conforming to a normalized data model. Our tools use their understanding of the source's data model to join these tables together, producing de-normalized, rich output datasets.
Filtering: Unwanted rows are filtered out automatically, for example to de-duplicate change-data-capture inputs. Each of these four steps is illustrated in the sketch below.
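The following pandas sketch walks through all four steps on toy data. The SAP-style column names (VBELN, NETWR, KUNNR, NAME1) are used only for flavor; the logic, not the schema, is the point, and none of this is the actual QIQ Orchestrate transform code.

```python
# Illustrative sketch of the four transformation steps on toy data.
import pandas as pd

orders = pd.DataFrame({
    "VBELN": ["001 ", "002", "002"],    # order numbers with stray whitespace and a duplicate
    "NETWR": ["10.5", "20.0", "20.0"],  # net values exported as strings
})
orders["KUNNR"] = ["C1", "C2", "C2"]
customers = pd.DataFrame({"KUNNR": ["C1", "C2"], "NAME1": ["Acme", "Globex"]})

# Cleaning: fix data types and strip unwanted whitespace.
orders["VBELN"] = orders["VBELN"].str.strip()
orders["NETWR"] = orders["NETWR"].astype(float)

# Renaming: map cryptic source columns to descriptive names using metadata.
orders = orders.rename(columns={"VBELN": "order_id", "NETWR": "net_value", "KUNNR": "customer_id"})
customers = customers.rename(columns={"KUNNR": "customer_id", "NAME1": "customer_name"})

# Joining: de-normalize related tables into one rich output dataset.
enriched = orders.merge(customers, on="customer_id", how="left")

# Filtering: drop duplicate rows, e.g. from change-data-capture inputs.
deduped = enriched.drop_duplicates(subset="order_id", keep="last")
```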
Migrating to S/4HANA
Once transformation is complete, migration to S/4HANA proceeds in much the same manner as the original ingestion. Through our SDDI model, we can connect directly to the various S/4 datastores and export the freshly transformed data. This can be done after the S/4 environment has been initialized; alternatively, we can package and export the data prior to initialization.
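The sketch below models the two export paths just described: a direct load into an initialized target datastore, and packaging to files for loading after initialization. The connection URL and table names are placeholders, and this is a conceptual illustration rather than the actual export mechanism.

```python
# Hedged sketch of the two export paths; `target_url` and table names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

deduped = pd.DataFrame({"order_id": ["001"], "net_value": [10.5]})

def export_direct(df: pd.DataFrame, target_url: str) -> None:
    """Write transformed data straight into an initialized target datastore."""
    engine = create_engine(target_url)  # e.g. a SQL endpoint on the target side
    df.to_sql("orders", engine, if_exists="append", index=False)

def export_package(df: pd.DataFrame, path: str) -> None:
    """Package data as files for import once the S/4 environment exists."""
    df.to_parquet(path, index=False)

export_package(deduped, "orders.parquet")
```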
Fallout Shelter
BLUF: Ransomware protection is inherent in the Foundry platform and, by extension, QuantumIQ products. We have expanded this protection into a secure, automated data enclave with alerting capabilities and resistance to social engineering attacks.
In recent years, malicious actors have increasingly turned to ransomware attacks to extort organizations. These attacks compromise one or more of the host's systems and encrypt critical information or entire storage systems, often paralyzing operations entirely. The private key necessary for decryption is withheld unless the demanded ransom is paid to the attackers (usually in cryptocurrency). These attacks can be particularly devastating for healthcare providers and other organizations performing critical or time-sensitive operations.
One of the myriad reasons we chose to partner with Palantir to develop our products is their unwavering commitment to providing a secure platform for development. Built from the ground up with security and privacy in mind, Palantir's products (including Foundry) are trusted across government and the commercial sector. Palantir's security pedigree includes a host of compliance standards – the complete list can be viewed at https://palantir.safebase.us/.
At QuantumIQ we build our products within the Palantir Foundry ecosystem to ensure that each one inherits the same rigorous commitment to privacy and security. Our products provide implicit ransomware protection for any involved data sources; additionally, that same protection can be extended to unrelated data sources at the client's request.
During the normal data ingestion process, a "digital twin" of relevant data is created within Foundry to power the various functions our products provide. This secure data copy is versioned and maintained to ensure that any data corruption or loss can be immediately rectified. In the event of a ransomware attack the client can recover any data protected by QuantumIQ, alleviating operational paralysis while the situation is undergoing remediation. In the event that all of the client's critical data is protected with us, the need for remediation is completely removed – simply re-image the affected hardware and sync the data from the data enclave.
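To make the versioning idea concrete, here is a toy model of snapshot-based recovery. It captures only the concept – every ingest produces a retained version, and recovery reads the last snapshot predating the attack – and is not the Foundry versioning API.

```python
# Toy model of version-based recovery; not the Foundry versioning API.
from datetime import datetime, timezone

class VersionedDataset:
    def __init__(self):
        self._versions = []  # list of (timestamp, payload) snapshots

    def ingest(self, payload: bytes) -> None:
        """Each sync appends a new retained version of the data."""
        self._versions.append((datetime.now(timezone.utc), payload))

    def recover(self, before: datetime) -> bytes:
        """Return the most recent snapshot taken before a given time,
        e.g. the moment an attack was detected."""
        good = [payload for ts, payload in self._versions if ts < before]
        if not good:
            raise LookupError("no snapshot predates the given time")
        return good[-1]
```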
Connecting to Your Data
Our partnership with Palantir gives us access to the full range of data connection tools within the Foundry platform. Among these tools are over 150 native data connectors covering a full range of standard enterprise data sources, including cloud-based object stores, file systems, databases, and data warehouses. Structured, unstructured, and semi-structured data are all wrapped in a dataset, which is used internally to assign permissions and track the lineage of your data. In addition to providing a fully versioned view of your data, we also provide a suite of data health and upload scheduling tools to ensure that your data is fresh and correct whenever access is required.
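As an illustration of the kind of check the health tooling performs, the short sketch below flags a dataset as stale when its last sync exceeds an allowed age. The threshold and function name are hypothetical, chosen only for the example.

```python
# Illustrative freshness check; threshold and names are placeholders.
from datetime import datetime, timedelta, timezone

def is_fresh(last_updated: datetime, max_age: timedelta = timedelta(hours=24)) -> bool:
    """Flag a dataset as stale if its last sync exceeds the allowed age."""
    return datetime.now(timezone.utc) - last_updated <= max_age

synced_at = datetime.now(timezone.utc) - timedelta(hours=2)
assert is_fresh(synced_at)  # within the 24-hour freshness window
```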
Eliminating Single Points of Failure through Decoupled Data Access
One of the most potent tools used by adversarial agents in the modern security space is social engineering. Rather than directly attacking a system, it is often much simpler to deceive or otherwise compromise individuals who already have privileged access to it. In these cases, malicious behavior becomes far more difficult to detect and prevent.
Acknowledging that no organization is immune to attacks of this nature, we employ security principles designed to eliminate any single point of failure caused by an individual's credentials being compromised. From the outset, our entire platform operates on the principle of least access: no account is given read or write permissions to any dataset by default. In addition to project and role-based access restrictions, markings and restricted views provide granular access controls down to the row and column level for applicable datasets. Users can also be granted access to datasets on a temporary basis. It is upon this last feature that we’ve built an additional layer of security to ensure the safety of your data.
When engaging with us to secure your data, a unique set of credentials is created and designated as the owner of all data ingested from your organization. Authentication of those credentials is split between the customer and us: the password is set and maintained by the customer’s organization, and a hardware authenticator is retained by us. This account is responsible for granting data access to our internal developers on a timed basis so they can handle ingestion, backup scheduling, and any alerting that may need to be implemented. Once configuration has been completed or the time has expired, only the owner account retains access. Afterwards, only a joint effort by both the customer and us will permit access to the owner credentials; your data is inaccessible to any single party within our system.
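The sketch below models the timed-access idea described above: every grant carries an expiry, and each access check re-validates it. The class and account names are invented for illustration and do not reflect our internal implementation.

```python
# Minimal sketch of timed access grants; names are illustrative only.
from datetime import datetime, timedelta, timezone

class TimedGrant:
    def __init__(self, user: str, dataset: str, duration: timedelta):
        self.user = user
        self.dataset = dataset
        self.expires_at = datetime.now(timezone.utc) + duration

    def is_active(self) -> bool:
        """Every access check re-validates the expiry window."""
        return datetime.now(timezone.utc) < self.expires_at

# Owner account grants a developer temporary access for configuration work.
grant = TimedGrant("dev@quantumiq", "client-ingest", timedelta(hours=8))
assert grant.is_active()  # access is allowed only while the window is open
```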
Exit with QIQ Orchestrate
BLUF: Migrating between virtual machine providers and environments can require significant resources across your organization, often straining the balance between completion time and completion cost. Our cloud-based migration platform facilitates and expedites this process through an intuitive, repeatable software pipeline that scales and adapts to your organization’s needs.
Connecting to Your Data
The QIQ Orchestrate platform provides a full suite of data connection tools accessible from the web and hosted on AWS. Among these tools are over 150 native data connectors covering a full range of standard enterprise data sources, including cloud-based object stores, file systems, databases, and data warehouses. Structured, unstructured, and semi-structured data are all wrapped in a dataset, which is used internally to assign permissions and track the lineage of your data. In addition to providing a fully versioned view of your data, we also provide a suite of data health tools to ensure that your data is correct.
Reusable Pipelines for VM Migration
Extensible and reusable data pipelines are critical for an efficient, scalable migration process. The need is amplified when migrating across fragmented versions or to diverse destinations. With config-mapping transforms developed either in-house or through partnerships with VM providers, we can automate the migration process at scale. We work with your organization to identify your specific migration workflow, including both the data to transfer and the configuration details that need to be migrated. Once established, the migration workflow is integrated into an existing pipeline, which can then be executed in parallel for each of your VMs. A simplified sketch of such a config-mapping transform follows.
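The sketch below shows what a config-mapping transform might look like: one provider's VM descriptor is translated into another's, including unit conversions. The field names on both sides are invented for illustration and are not any vendor's real schema.

```python
# Hedged sketch of a config-mapping transform; all field names are invented.
def map_vm_config(source: dict) -> dict:
    """Map a source VM descriptor to the destination provider's format."""
    return {
        "name": source["vm_name"],
        "cpu_count": source["vcpus"],
        "memory_mb": source["ram_gb"] * 1024,  # unit conversion is part of the mapping
        "disks": [{"size_gb": d["capacity_gb"]} for d in source["storage"]],
    }

source_vm = {"vm_name": "app-01", "vcpus": 4, "ram_gb": 16,
             "storage": [{"capacity_gb": 100}]}
target_vm = map_vm_config(source_vm)  # executed in parallel across the whole fleet
```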
[Figure: An example of the Pipeline Builder interface – each of the discrete components can be swapped as necessary to accommodate the client’s particular migration workflow.]
Contact Us for more information