Data Delivery Methods

The Four Most Commonly Used Data Integration Delivery Types

Data integration is a combination of technical and business processes used to combine data from disparate sources in order to answer important questions. The process generally supports the analytic processing of data by aligning, combining, and presenting each data store to the end user.
Organizations increasingly rely on data integration tools for enterprise-wide data delivery, data quality, governance, and analytics. Data integration allows organizations to better understand and retain their customers, support collaboration between departments, reduce project timelines with automated development, and maintain security and compliance.
If you’re just beginning your search for a new data integration tool, it’s important to arm yourself with knowledge about the different delivery methods traditional software products offer. Sifting through the various styles of integration tools can be confusing, so we put together this list of the four major data delivery techniques common in integration tools.
Bulk/Batch
Acts as a support mechanism for extract, transform, and load (ETL) processes that consolidate data from primary databases. This involves bulk and/or batch data extraction that draws data from data stores across systems and the organization. It is an efficient way of processing large data volumes over a period of time: data is collected and processed, and batch results are produced.
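As a rough sketch of the idea, the following Python snippet performs a batch extract-and-load from two source extracts into a consolidated SQLite store. The file names, table layout, and column names are hypothetical; a production bulk/batch tool would add scheduling, transformations, and error handling.

```python
import csv
import sqlite3

def batch_load(csv_path, conn, batch_size=500):
    """Read one source extract and load it into the consolidated store in batches."""
    with open(csv_path, newline="") as f:
        batch = []
        for row in csv.DictReader(f):
            batch.append((row["customer_id"], row["amount"]))
            if len(batch) >= batch_size:
                conn.executemany("INSERT INTO orders VALUES (?, ?)", batch)
                batch.clear()
        if batch:  # flush the final partial batch
            conn.executemany("INSERT INTO orders VALUES (?, ?)", batch)
    conn.commit()

conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS orders (customer_id TEXT, amount REAL)")
# One scheduled run consolidates extracts drawn from several systems.
for source in ("crm_orders.csv", "web_orders.csv"):
    batch_load(source, conn)
```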
Data Virtualization
Allows users to create a virtual abstraction layer that provides a single view of all the data residing in the underlying databases, instead of having to run an ETL process to load the data into an analytic framework. This enables the user to piece together databases, data warehouses, and even cloud services to gain a comprehensive view of the data that matters most.
Message-Oriented Movement
Groups data into messages that applications can read so data can be exchanged in real time. This depends on a message bus that is triggered by events and delivers data packets to application integration technologies. Often, middleware is involved, acting as a software or hardware infrastructure that supports sending and receiving messages between distributed systems.
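To make the pattern concrete, here is a minimal in-process sketch of message-oriented movement using only Python's standard library. The queue stands in for the message bus; in practice a broker such as Kafka or RabbitMQ would play that role, and the event fields shown are invented for illustration.

```python
import json
import queue
import threading

bus = queue.Queue()  # stands in for a message bus / broker topic

def publisher():
    # An event in the source system triggers a message onto the bus.
    event = {"type": "order_created", "order_id": 42, "amount": 19.99}
    bus.put(json.dumps(event))
    bus.put(None)  # sentinel: no more messages

def consumer():
    # The consuming application reads messages as they arrive (near real time).
    while True:
        msg = bus.get()
        if msg is None:
            break
        event = json.loads(msg)
        print("received", event["type"], "for order", event["order_id"])

t = threading.Thread(target=consumer)
t.start()
publisher()
t.join()
```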
Data Replication
The frequent copying of data from one database to another so that all users share the same level of information, resulting in a distributed database that gives users access to the data relevant to their own tasks. Replication provides the synchronization needed to manage growing data volumes while giving users access to real-time information.
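A minimal sketch of the idea, using SQLite and an assumed updated_at column to find changed rows. Real replication tools typically read the database's transaction log instead, and the schema here is hypothetical.

```python
import sqlite3

def replicate_changes(source, replica, last_sync):
    """Copy rows changed since the last sync from the primary to the replica."""
    rows = source.execute(
        "SELECT id, name, updated_at FROM customers WHERE updated_at > ?",
        (last_sync,),
    ).fetchall()
    replica.executemany(
        "INSERT OR REPLACE INTO customers (id, name, updated_at) VALUES (?, ?, ?)",
        rows,
    )
    replica.commit()
    # Remember the newest change we saw, to use as the next sync point.
    return max((r[2] for r in rows), default=last_sync)

source = sqlite3.connect(":memory:")
replica = sqlite3.connect(":memory:")
for db in (source, replica):
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)")
source.execute("INSERT INTO customers VALUES (1, 'Ada', '2024-01-01T00:00:00')")
source.commit()

last_sync = replicate_changes(source, replica, "1970-01-01T00:00:00")
print(replica.execute("SELECT * FROM customers").fetchall())
```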
5 Data Integration Methods and Strategies – Talend

The ability to create massive amounts of data is mind-blowing. If only the ability to harness insights from this data kept pace with the ability to create it. Now, with exciting advancements in data integration, that gap is narrowing.
But how is data integration helping companies generate business intelligence? We’ll answer that question by explaining the five types of data integration, listed below, and how cloud computing is impacting this growing field.
Manual data integration: Data managers must manually conduct all phases of the integration, from retrieval to presentation.
Middleware data integration: Middleware, a type of software, facilitates communication between legacy systems and updated ones to expedite integration.
Application-based integration: Software applications locate, retrieve, and integrate data by making data from different sources and systems compatible with one another.
Uniform access integration: A technique that retrieves and uniformly displays data, but leaves it in its original source.
Common storage integration: An approach that retrieves and uniformly displays the data, but also makes a copy of the data and stores it.
What is data integration?
Data integration is the process of combining data from different sources to help data managers and executives analyze it and make smarter business decisions. This process involves a person or system locating, retrieving, cleaning, and presenting the data.
Data managers and/or analysts can run queries against this merged data to discover business intelligence insights. With so many potential benefits, businesses need to take the time to align their goals with the right approach.
To get a better understanding of data integration, let’s dive into the five types (sometimes referred to as approaches or techniques). We’ll discuss the pros and cons of each type and when to use each one.
1. Manual data integration
Manual data integration occurs when a data manager oversees all aspects of the integration, usually by writing custom code. That means connecting the different data sources, collecting the data, cleaning it, and so on, all without automation.
Some of the benefits are:
Reduced cost: This technique requires little maintenance and typically only integrates a small number of data sources.
Greater freedom: The user has total control over the integration.
Some of the cons are:
Less access: A developer or manager must manually orchestrate each integration.
Difficulty scaling: Scaling for larger projects requires manually changing the code for each integration, and that takes time.
Greater room for error: A manager and/or analyst must handle the data at each stage.
This strategy is best for one-time instances, but it quickly becomes untenable for complex or recurring integrations because it is a very tedious, manual process. Everything from data collection, to cleaning, to presentation is done by hand, and those processes take time and resources.
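For illustration, here is what a hand-rolled, one-time integration might look like in Python. The file names and columns are hypothetical; the point is that every step, retrieval, cleaning, joining, and presentation, is written and maintained by hand.

```python
import csv

# A one-off, hand-written merge: join CRM contacts with billing totals by email.
contacts = {}
with open("crm_contacts.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Manual cleaning: normalize the join key by hand.
        contacts[row["email"].strip().lower()] = row["name"].strip()

with open("billing.csv", newline="") as f, open("report.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["name", "email", "total_spend"])
    for row in csv.DictReader(f):
        email = row["email"].strip().lower()
        if email in contacts:  # manual join logic
            writer.writerow([contacts[email], email, row["total_spend"]])
```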
2. Middleware data integration
Middleware is software that connects applications and transfers data between them and databases. It’s especially handy when a business is integrating stubborn legacy systems with newer ones, as middleware can act as an interpreter between these systems.
Some of the benefits are:
Better data streaming: The software conducts the integration automatically and in the same way each time.
Easier access between systems: The software is coded to facilitate communication between the systems in a network.
Some of the cons are:
Less access: The middleware needs to be deployed and maintained by a developer with technical knowledge.
Limited functionality: Middleware can only work with certain systems.
For businesses integrating legacy systems with more modern systems, middleware is ideal, but it’s mostly a communications tool and has limited capabilities for data analytics.
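As a toy example of the translation role middleware plays, the sketch below converts a hypothetical fixed-width record from a legacy system into the JSON shape a modern system might expect. The record layout and field names are invented for illustration.

```python
import json

# The legacy system exports fixed-width records; the modern system expects JSON.
# The middleware layer translates between the two so neither side has to change.
LEGACY_RECORD = "0001Ada Lovelace        19991231"

def legacy_to_modern(record: str) -> dict:
    """Parse a fixed-width legacy record into the modern system's schema."""
    return {
        "customer_id": int(record[0:4]),
        "name": record[4:24].strip(),
        "signup_date": f"{record[24:28]}-{record[28:30]}-{record[30:32]}",
    }

def forward(record: str) -> None:
    payload = json.dumps(legacy_to_modern(record))
    # In a real deployment this would be sent to the modern system's API.
    print("forwarding:", payload)

forward(LEGACY_RECORD)
```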
3. Application-based integration
In this approach, software applications do all the work. They locate, retrieve, clean, and integrate data from disparate sources, making data from different systems compatible with one another so it can move easily between them.
Some of the benefits include:
Simplified processes: One application does all the work automatically.
Easier information exchange: The application allows systems and departments to transfer information seamlessly.
Fewer resources are used: Because much of the process is automated, managers and/or analysts can pursue other projects.
Some of the cons include:
Limited access: This technique requires special, technical knowledge and a data manager and/or analyst to oversee application deployment and maintenance.
Inconsistent results: The approach is unstandardized and varies across the businesses offering it as a service.
Complicated setup: Designing the application(s) to work seamlessly across departments requires developers, managers, and/or analysts with technical knowledge.
Difficult data management: Accessing different systems can lead to compromised data integrity.
Sometimes this approach is called enterprise application integration, because it’s common in enterprises working in hybrid cloud environments. These businesses need to work with multiple data sources — on-premises and in the cloud. This approach optimizes data and workflows between these environments.
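A minimal sketch of the pattern: the application itself retrieves records from two sources and reconciles them on a shared key. The fetch functions below are hypothetical stand-ins for real API clients (for example, a CRM SDK and a billing API).

```python
# The application locates, retrieves, and integrates the data itself.

def fetch_crm_customers():
    # Stand-in for a call to a CRM system's API.
    return [{"id": "C1", "name": "Ada"}, {"id": "C2", "name": "Grace"}]

def fetch_billing_accounts():
    # Stand-in for a call to a billing system's API.
    return [{"customer_id": "C1", "balance": 120.0}, {"customer_id": "C2", "balance": 0.0}]

def integrate():
    balances = {a["customer_id"]: a["balance"] for a in fetch_billing_accounts()}
    # Make the two systems' records compatible by joining on the shared key.
    return [
        {"id": c["id"], "name": c["name"], "balance": balances.get(c["id"], 0.0)}
        for c in fetch_crm_customers()
    ]

for record in integrate():
    print(record)
```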
4. Uniform access integration
This technique accesses data from even more disparate sets and presents it uniformly. It does this while allowing the data to stay in its original location.
Some of the advantages are:
Lower storage requirements: There is no need to create a separate place to store data.
Easier data access: This approach works well with multiple systems and data sources.
Simplified view of data: This technique creates a uniform appearance of data for the end user.
Some of the difficulties are:
Data integrity challenges: Accessing so many sources can lead to compromising data integrity.
Strained systems: Data host systems are not usually designed to handle the amount and frequency of data requests in this process.
For businesses needing to access multiple, disparate systems, this is an optimal approach. If the data request isn’t too burdensome for the host system, this approach can yield insights without the cost of creating a backup or copy of the data.
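To illustrate, the sketch below presents a uniform customer record assembled on demand from two live sources, without storing a copy anywhere. The source schemas are hypothetical.

```python
import sqlite3

# Two sources with different schemas; the data stays where it lives.
sales = sqlite3.connect(":memory:")
sales.execute("CREATE TABLE orders (cust TEXT, total REAL)")
sales.execute("INSERT INTO orders VALUES ('Ada', 50.0)")

support = sqlite3.connect(":memory:")
support.execute("CREATE TABLE tickets (customer_name TEXT, open_count INTEGER)")
support.execute("INSERT INTO tickets VALUES ('Ada', 2)")

def uniform_view(customer: str) -> dict:
    """Present one uniform record per customer, fetched live from each source."""
    total = sales.execute("SELECT total FROM orders WHERE cust = ?", (customer,)).fetchone()
    tickets = support.execute(
        "SELECT open_count FROM tickets WHERE customer_name = ?", (customer,)
    ).fetchone()
    return {
        "customer": customer,
        "order_total": total[0] if total else 0.0,
        "open_tickets": tickets[0] if tickets else 0,
    }

print(uniform_view("Ada"))
```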
5. Common storage integration (sometimes referred to as data warehousing)
This approach is similar to uniform access, except it involves creating and storing a copy of the data in a data warehouse. This leads to more versatility in the ways businesses can manipulate data, making it one of the most popular forms of data integration.
Some of the benefits are:
Reduced burden: The host system isn’t constantly handling data queries.
Increased data version management control: Accessing data from one source, rather than from multiple disparate sources, leads to better data integrity.
Cleaner data appearance: The stored copy allows managers and/or analysts to run numerous queries while maintaining a uniform appearance of the data.
Enhanced data analytics: Maintaining a stored copy allows managers and/or analysts to run more sophisticated queries without worrying about compromised data integrity.
Some of the cons are:
Increased storage costs: Creating a copy of the data means finding and paying for a place to store it.
Higher maintenance costs: Orchestrating this approach requires technical experts to set up the integration and then oversee and maintain it.
Common storage is the most sophisticated integration approach. If businesses have the resources, this is almost certainly the best approach, because it allows for the most sophisticated queries. That sophistication can lead to deeper insights.
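A compact illustration of the common storage approach: extracts from two hypothetical operational systems are copied into a warehouse table, and the analytical query runs against the copy rather than the live systems.

```python
import sqlite3

# Common storage: data is copied out of the source systems into a warehouse,
# so analytical queries never strain the operational systems.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_orders (source TEXT, customer TEXT, amount REAL)")

# Stand-ins for extracts pulled from two operational systems.
web_orders = [("Ada", 50.0), ("Grace", 75.0)]
store_orders = [("Ada", 20.0)]

warehouse.executemany("INSERT INTO fact_orders VALUES ('web', ?, ?)", web_orders)
warehouse.executemany("INSERT INTO fact_orders VALUES ('store', ?, ?)", store_orders)

# Sophisticated queries run against the stored copy.
for customer, total in warehouse.execute(
    "SELECT customer, SUM(amount) FROM fact_orders GROUP BY customer ORDER BY customer"
):
    print(customer, total)
```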
Which data integration strategy is right for your business?
The race to the cloud has left systems scattered in on-premises, hybrid, and cloud-based environments. Data integration is a smart way to connect these disparate systems so businesses can effectively analyze their data.
Deciding which strategy is right for any business means understanding the complexity of the systems that need to integrate. If you only need to integrate a handful of systems, a manual approach may be sufficient.
Enterprise businesses, however, will likely need to integrate multiple, disparate systems, which requires a multi-functional integration strategy.
To give you some guidance, we’ve outlined the best scenario for each approach:
Manual data integration: Merge data for basic analysis between a small number of data sources.
Middleware data integration: Automate and translate communication between legacy and modernized systems.
Application-based integration: Automate and translate communication between systems and allow for more complicated data analysis.
Uniform access integration: Automate and translate communication between systems and present the data uniformly to allow for complicated data analysis.
Common storage integration: Present the data uniformly, create and store a copy, and perform the most sophisticated data analysis tasks.
There are many aspects to consider in your choice of a data integration strategy. Along with the above benefits, consider the following when choosing your data integration strategy:
Create a data governance strategy. Take stock of your data to understand its quality and how you want to analyze it, and make sure the governance strategy aligns with business objectives.
Understand which cloud service provider is best for you. With so many providers and platforms, it’s wise to take the time to understand which provider/platform meets the business’ needs now and in the future.
Choose a data integration provider carefully. If you’re going to hire a data integration firm, research which ones have the breadth and depth of tools to provide a comprehensive service.
Decide which systems to update. Updating every system is the best option, but that’s expensive. Consider which ones are essential to update and which ones aren’t.
The cloud and the future of data integration
The breathtaking growth of cloud capabilities will continue to transform businesses in exciting ways. As these advancements march on, data integration strategies will become more complex.
We can’t see the future, but we do know that as the relationship between mobile technologies and cloud computing intensifies, managers, analysts, and execs will be less tied to workplaces. They’ll be able to access data, run complicated queries across disparate systems, and retrieve the results in real-time on a hand-held device — anywhere they want. This ability means data integration tools will need to work seamlessly across devices and on different networks.
Businesses will also start sharing their data. That requires data integration approaches that work not just within a business, but between organizations. The need for this increased access will drive data integration architects to develop even more robust capabilities. And cloud-based platforms will enable this sharing on even larger scales, across businesses, and at ever-increasing speeds.
Data integration tools
From manual to common storage, we’ve covered the main types of data integration. Businesses best implement these strategies by adopting data integration tools, but how do you know which tool to use?
A good integration tool has the following characteristics:
Portability: Movement between on-premises and the cloud is essential. Portability allows organizations to build data integrations once and run them anywhere.
Ease of use: Tools should be easy to understand and easy to deploy.
Cloud compatibility: Tools should work seamlessly in a single cloud, multi-cloud, or hybrid cloud environment.
The best tools are comprehensive and combine the capabilities above. Talend Data Fabric, for example, is a single suite of apps that collects, governs, transforms, and shares data by offering a host of features like self-service apps, pervasive data quality, and smart governance. These services span all data sources from end to end so that you can conduct your data integration quickly and comprehensively.
While some businesses are still producing more data than they can effectively analyze, data integration strategies are helping close that gap. As these strategies become more refined and elaborate, it can be challenging to pick the right one for your business. The stakes, however, have never been higher.
The right data integration strategy can translate into insights and innovation for years to come. Consider your needs, your goals, and which type of approach matches both, so you make the best decision for your business.
Common Data Integration Techniques and Technologies Explained

Most organizations of medium to large size use a wide array of applications, each with its own databases and data stores. Whether these applications run on-premises or in the cloud, their usefulness depends on their ability to share data with one another. Data integration applications facilitate that sharing, but the question remains: what is data integration?
In this blog, we will discuss what data integration is in general, the various data integration approaches, and how to integrate data from different sources.
What is Data Integration?
The process of consolidating data from multiple applications and creating a unified view of data assets is known as data integration. As companies store information in different databases, data integration becomes an important strategy to adopt, as it helps business users integrate data from different sources. Consider, for example, an e-commerce company that wants to extract customer information from multiple data streams or databases, such as marketing, sales, and finance. Data integration would consolidate the data arriving from these databases and make it available for reporting and analysis.
Enterprise data integration is done using different data integration techniques or strategies depending on a business’s unique requirements. Therefore, it is important to assess which data integration approach is right for your business.
Data integration is a core component of several different mission-critical data management projects, such as building an enterprise data warehouse, migrating data from one or multiple databases to another, and synchronizing data between applications. As a result, there are a variety of data integration applications, technologies, and techniques used by businesses to integrate data from disparate sources and create a single version of the truth. Now that you understand what data integration is, let’s dive into the different data integration techniques and technologies.
Types of Data Integration Techniques
The need for data integration in business intelligence arises when data is coming in from various internal and external sources. This is achieved by using one of the three different types of data integration techniques, depending on the heterogeneity, complexity, and volume of data sources involved.
Let’s take a look at these data integration approaches one by one and see how they can help improve business intelligence processes.
Data Consolidation
As the name suggests, data consolidation is the process of consolidating or combining data from different data sources to create a centralized data repository or data store. This unified data store is then used for various purposes, such as reporting and data analysis. In addition, it can also act as a data source for downstream applications.
One of the key factors that differentiate data consolidation from other data integration techniques is data latency: the amount of time it takes to retrieve data from the sources and transfer it to the data store. The shorter the latency period, the fresher the data available in the data store for business intelligence and analysis.
Generally speaking, there is usually some level of latency between the time updates occur in the source systems and the time those updates are reflected in the data warehouse or data store. Depending on the data integration technologies used and the specific needs of the business, this latency can be a few seconds, a few hours, or more. However, with advancements in data integration technologies, it is possible to consolidate data and transfer changes to the destination in near real time or real time.
Data Federation
Data federation is a data integration technique that is used to consolidate data and simplify access for consuming users and front-end applications. In the data federation technique, distributed data with different data models is integrated into a virtual database that features a unified data model.
There is no physical data movement happening behind a federated virtual database. Instead, data abstraction is used to create a uniform user interface for data access and retrieval. As a result, whenever a user or an application queries the federated virtual database, the query is decomposed and sent to the relevant underlying data sources. In other words, data federation serves data on an on-demand basis, unlike data consolidation, where data is integrated to build a separate, centralized data store.
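As a toy illustration of that decomposition, the sketch below shows a federated query split across per-source handlers and the partial results merged into one record. The sources and their handlers are hypothetical stand-ins for real connectors.

```python
# The "virtual database" holds no data itself: it decomposes each request,
# sends a subquery to each relevant source, and merges the results.

SOURCES = {
    "crm": lambda customer: {"name": customer, "segment": "enterprise"},
    "billing": lambda customer: {"balance": 310.0},  # ignores the key for brevity
}

def federated_query(customer: str) -> dict:
    """Decompose the request, query each source on demand, merge the results."""
    result = {}
    for name, handler in SOURCES.items():
        result.update(handler(customer))  # subquery against the live source
    return result

print(federated_query("Ada"))
```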
Data Propagation
Data propagation is another technique for data integration, in which data from an enterprise data warehouse is transferred to different data marts after the required transformations. Since the data continues to be updated in the data warehouse, changes are propagated to the dependent data marts in a synchronous or asynchronous manner. The two common data integration technologies used for data propagation are enterprise application integration (EAI) and enterprise data replication (EDR), discussed below.
Different Data Integration Technologies
Data integration technology has evolved at a rapid pace over the last decade. Initially, Extract, Transform, Load (ETL) was the only technology available for batch data integration. However, as businesses continued to add more sources to their data ecosystems and the need for real-time integration arose, new technologies were introduced.
Here is a roundup of the most popular data integration technologies in use today:
Extract, Transform, Load (ETL)
Probably the best-known data integration technology, ETL, or Extract, Transform, Load, is a process that involves extracting data from a source system, transforming it, and loading it into a target destination.
ETL is used for data consolidation primarily and can be conducted in batches or in a near-real-time manner using change data capture (CDC). Batch ETL is mostly used for bulk movements of data, such as during data migration. On the other hand, CDC is a more suitable choice to transfer changes or updated data to the target destination.
During the ETL process, data is extracted from a database, ERP solution, cloud application, or file system and transferred to another database or data repository. The transformations performed on the data vary depending on the specific data management use case, but common ones include data cleansing, data quality checks, data aggregation, and data reconciliation.
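Here is a compact ETL sketch covering all three steps, with cleansing, aggregation, and a simple quality check in the transform stage. The raw records and the region/amount fields are invented for illustration, and the load step simply prints instead of writing to a real target.

```python
from collections import defaultdict

# Extract: raw rows standing in for a real source extract.
raw = [
    {"region": " north ", "amount": "100.0"},
    {"region": "NORTH", "amount": "50.5"},
    {"region": "south", "amount": "n/a"},  # bad value to be cleansed out
]

def transform(rows):
    totals = defaultdict(float)
    for row in rows:
        region = row["region"].strip().lower()  # cleansing
        try:
            totals[region] += float(row["amount"])  # aggregation
        except ValueError:
            continue  # quality check: drop rows that fail validation
    return totals

def load(totals):
    for region, total in sorted(totals.items()):
        print(f"{region}: {total}")  # stand-in for writing to a target database

load(transform(raw))
```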
Enterprise Information Integration (EII)
Enterprise Information Integration (EII) is a data integration technology used to deliver curated datasets on an on-demand basis. Also considered a type of data federation technology, EII involves the creation of a virtual layer or a business view of underlying data sources. This layer shields the consuming applications and business users from the complexities of connecting to disparate source systems having different formats, interfaces, and semantics. In other words, EII is a technology that allows developers and business users alike to treat a range of data sources as if they were one database and present the incoming data in new ways.
Unlike batch ETL, EII can handle real-time data integration and delivery use cases very easily, allowing business users to consume fresh data for data analysis and reporting.
Enterprise Data Replication (EDR)
Used as a data propagation technique, Enterprise Data Replication (EDR) is a real-time method of moving data from one storage system to another. In its simplest form, EDR involves moving a dataset from one database to another database with the same schema. More recently, however, the process has grown more complex, involving heterogeneous source and target databases, with data replicated at regular intervals, in real time, or sporadically, depending on the needs of the enterprise.
While both EDR and ETL involve bulk movement of data, EDR is different because it does not involve any kind of data transformation or manipulation.
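In sketch form, the distinction looks like this: a straight copy between two databases that share the same schema, with no transformation along the way. The table and databases are hypothetical.

```python
import sqlite3

# EDR in its simplest form: move a dataset between two databases that share
# the same schema, with no transformation applied along the way.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE events (id INTEGER, payload TEXT)")
source.executemany("INSERT INTO events VALUES (?, ?)", [(1, "a"), (2, "b")])

rows = source.execute("SELECT id, payload FROM events").fetchall()
target.executemany("INSERT INTO events VALUES (?, ?)", rows)  # straight copy
target.commit()
print(target.execute("SELECT COUNT(*) FROM events").fetchone()[0], "rows replicated")
```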
In addition to these three key data integration technologies, enterprises with complex data management architectures also make use of Enterprise Application Integration (EAI), Change Data Capture (CDC), and other event-based and real-time technologies to keep up with the data needs of their business users.
Looking to implement automated data integration software for your business? Learn in detail how Astera can help you take advantage of these data integration techniques and create an agile data ecosystem: get in touch with our support department to find out which data integration approach works for your use case, or download a free trial of Centerprise and get started right away!

Frequently Asked Questions about data delivery methods

What is a data delivery system?

DSS is a plug-in-based data delivery system extension of the application-oriented DBMS Phasme. … It is designed to satisfy the maximum of application requirements and information systems’ needs, and to get maximum database performance out of today’s hardware trends.

What are data integration methods?

Data integration is the process of combining data from different sources to help data managers and executives analyze it and make smarter business decisions. This process involves a person or system locating, retrieving, cleaning, and presenting the data.

Which technology is used for bulk movements of data?

Extract, Transform, Load (ETL). Batch ETL is mostly used for bulk movements of data, such as during data migration.
