Cloud computing firm Appian has championed its newly released Appian Data Fabric system, claiming its real-time read/write capability is not matched by any competing vendor.
The system, which launched as part of Appian 22.4 last week, was the central talking point of the keynote that opened the company's annual Appian Europe conference. The solution combines the automation and low-code development for which Appian is known across a single virtual data model, allowing customers to access unified data and identify areas in which to implement systems such as robotic process automation (RPA).
Appian insisted that while rival data fabric systems are available, none can match its ease of adoption. The firm said 45% of its customers have already adopted Data Sync, the source-data caching process at the heart of the data fabric system, and that, for all its added benefits, Appian Data Fabric is only a small step up from these existing systems.
The workloads, automations, and queries set up by customers already laid the groundwork for Appian Data Fabric, as metadata from all of this can be used to optimise data management within the platform.
For example, by analysing how a customer builds its processes, what it automates, and which queries those workloads require, Appian’s data fabric can automatically repartition and re-index data within its virtual model to ensure it can be accessed with maximum efficiency.
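The general technique being described, using query metadata to decide what to index, can be illustrated with a minimal sketch. This is not Appian's implementation; the query-log format, column names, and threshold are all hypothetical, chosen only to show the idea of mining access patterns for indexing decisions.

```python
from collections import Counter

def suggest_indexes(query_log, threshold=2):
    """Suggest columns worth indexing, based on how often each
    column appears in query filters across the workload."""
    counts = Counter()
    for query in query_log:
        for column in query["filters"]:
            counts[column] += 1
    # Columns filtered on repeatedly are candidates for an index.
    return [col for col, n in counts.items() if n >= threshold]

# Hypothetical workload metadata gathered from customer queries.
log = [
    {"filters": ["customer_id", "region"]},
    {"filters": ["customer_id"]},
    {"filters": ["order_date"]},
]
print(suggest_indexes(log))  # ['customer_id']
```

A real system would weigh many more signals (query frequency, data volume, write load), but the core loop of deriving physical layout from observed usage is the same.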
“When other clients say ‘oh yeah, we have a data fabric,’ what they’re talking about is probably much more like a data lake,” said Michael Beckley, CTO and a founder at Appian.
“They might be able to analyse data from different locations. But being able to write transactions to the fabric in real-time, and read from it, that’s what the Appian data service can do, that no one else can.”
In his keynote address, Beckley also stressed that the data fabric is an improvement to the system that customers already use rather than a new product, and comes with no new training requirements or added cost.
He also stated that the company sees Data Fabric as an alternative to rip-and-replace migrations and vendor lock-in, giving the example of a data fabric that connects Oracle, Microsoft, and Salesforce alongside the Appian database.
“The answer is not just Appian. The answer is using Appian to take advantage of the technology investments you already have. And with Appian’s data fabric, we’re confident you’ll be able to do that.”
A data fabric is a data management architecture that integrates multiple data pipelines or cloud environments in an end-to-end data environment, allowing customers to access their data through an optimised access point that requires little to no code.
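The architecture described above can be sketched in a few lines: one access point that reads from, and writes to, several underlying sources. The class, source names, and record shapes below are illustrative only, not Appian's API; the point is the single interface over heterogeneous backends, including the real-time writes the article highlights.

```python
class DataFabric:
    """Toy virtual data layer: one read/write access point
    over several registered data sources (assumed names below)."""

    def __init__(self):
        self.sources = {}

    def register(self, name, store):
        # 'store' stands in for a connector to a real system
        # (e.g. a CRM or ERP database); here it is just a dict.
        self.sources[name] = store

    def read(self, source, key):
        return self.sources[source][key]

    def write(self, source, key, record):
        # Writes go straight through to the underlying source.
        self.sources[source][key] = record

fabric = DataFabric()
fabric.register("crm", {})
fabric.register("erp", {})
fabric.write("crm", 42, {"name": "Acme", "tier": "gold"})
print(fabric.read("crm", 42)["tier"])  # gold
```

A production fabric adds caching, schema mapping, and access control on top, but the customer-facing contract is this simple: one place to read and write, regardless of where the data lives.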
The firm claimed that Appian Data Fabric has doubled search speeds, requires customers to use 75% less code to build charts, and allows zero-code apps to be built up to ten times faster than previously possible.
Last year, Gartner named data fabric as a key technology trend for 2022, noting its potential for adding to existing infrastructure. Since then, a number of companies have offered data fabric systems, such as Google Cloud.
Down the line, Appian said its existing data system could also be used to measure parameters such as carbon impact across a number of silos.
Gartner recently published findings naming sustainability a key technology trend for 2023, noting that businesses investing in it are doing so to tackle disruption and to reap its short- and long-term benefits.
Malcolm Ross, deputy CTO at Appian, told IT Pro that regulatory pressure increases in a recession, and that environmental, social, and corporate governance (ESG) will be core to this.
“ESG is definitely the next GDPR-level type of regulatory oversight that companies will have to report their carbon footprint. They will have to report how their governance modelling, their board, is accurately collecting the culture of that organisation or that country as well.”
Ross also stated that efficiency savings through efforts such as automating the back office are unlikely to provide meaningful results where sustainability is concerned. He argues that only holistic oversight, allowing weaknesses to be properly identified through processes such as big data analytics, will suffice.
“You’re not going to become better at ESG and sustainability by just doing automation. What you need is visibility.”
Head of process mining Karina Buschsieweke told IT Pro that Appian’s data fabric is already capable of providing this visibility, and that growing regulatory pressures might lead to more specific focus on this process down the line.
“How process mining works is, you have attribute data, so you have a process and you can add as many attributes to it as you like, so it can be location, product, etc.,” said Buschsieweke.
“But it could also be CO2 as a further data attribute. Now, of course, today the major challenge for all companies is where to get the CO2 impact data from, because that’s also a whole different issue – how to standardise it, and make it comparable.”
“However, if you have the data already, you could do this with our solution today. And then probably, there could be features that might even make it easier for customers than this. I think at some point in the future, companies will be required to report on CO2 the same way as they have to report their euros.”
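Buschsieweke's point, that CO2 is just another attribute on process events, can be sketched with a toy event log. The log format, activity names, and `co2_kg` figures below are invented for illustration; they are not Appian data, but they show how emissions roll up per process step once the attribute exists.

```python
from collections import defaultdict

# Hypothetical process-mining event log: each event carries
# arbitrary attributes, here including a CO2 figure per activity.
events = [
    {"case": 1, "activity": "pick", "co2_kg": 0.4},
    {"case": 1, "activity": "ship", "co2_kg": 2.1},
    {"case": 2, "activity": "pick", "co2_kg": 0.5},
]

def co2_by_activity(log):
    """Total the CO2 attribute per process step."""
    totals = defaultdict(float)
    for event in log:
        totals[event["activity"]] += event["co2_kg"]
    return {activity: round(kg, 2) for activity, kg in totals.items()}

print(co2_by_activity(events))  # {'pick': 0.9, 'ship': 2.1}
```

As the quote notes, the hard part is sourcing and standardising the CO2 figures in the first place; once they are attached to events, the analysis is ordinary process mining.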
See the original article here: ITPro