Dave Duggal
Founder / Managing Director at EnterpriseWeb LLC
Originally posted on LinkedIn
Over the last decade, IT has increasingly made data accessible and made systems, infrastructure and devices programmable via APIs, giving developers more power (and responsibility). That shifted effort left, but it didn't materially reduce the effort or complexity.
Now those same APIs, particularly when exposed through a high-level abstraction, can integrate and train AI/ML, expose NLP interfaces, and interact directly with Generative AI. This is the logical and inevitable trajectory toward Autonomous Systems, which can observe their own behavior, troubleshoot and mitigate issues in a continuous closed loop. The demand for real-time, data-driven platforms that optimize actions and processes is hurtling IT towards autonomous, self-organizing systems.
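To make the closed-loop idea concrete, here is a minimal Python sketch of an observe / diagnose / mitigate cycle. The metric name, threshold and mitigation action are hypothetical stand-ins, not anything from EnterpriseWeb; the point is only the shape of the loop.

import random
import time

def observe():
    # Stand-in for telemetry collection via an API (hypothetical metric).
    return {"error_rate": random.uniform(0.0, 0.1)}

def diagnose(metrics, threshold=0.05):
    # Flag a problem when the observed error rate crosses the threshold.
    return metrics["error_rate"] > threshold

def mitigate():
    # Stand-in for a remediation action, e.g. scaling out or restarting a service.
    print("mitigation applied: scaling service")

def control_loop(iterations=5, interval=1.0):
    # Continuous observe -> diagnose -> mitigate cycle (bounded here for the example).
    for _ in range(iterations):
        metrics = observe()
        if diagnose(metrics):
            mitigate()
        time.sleep(interval)

if __name__ == "__main__":
    control_loop()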
All forms of AI/ML inherently favor models over hard-coded logic and dynamic, non-linear processes over static workflows, because declarative and dynamic systems fully separate domain logic from implementation, allowing for the greatest runtime flexibility and agility, which is the whole point of being real-time.
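A small sketch of what that separation looks like in practice (tier names, thresholds and actions are purely illustrative): the domain logic lives in a data model that is interpreted at runtime, so behavior changes by swapping the model, not by recompiling code.

# Domain logic expressed as data, not hard-coded branching.
policy_model = {
    "gold":   {"max_latency_ms": 20, "action": "scale_out"},
    "silver": {"max_latency_ms": 50, "action": "alert"},
}

def apply_policy(tier, observed_latency_ms, model=policy_model):
    # The runtime interprets the model; replacing the model changes
    # behavior with no code change.
    rule = model[tier]
    if observed_latency_ms > rule["max_latency_ms"]:
        return rule["action"]
    return "no_op"

print(apply_policy("gold", 35))    # -> "scale_out"
print(apply_policy("silver", 35))  # -> "no_op"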
It will start with low-hanging fruit. RPA was really a workaround for the challenge of formally integrating data across silos, so it is ripe for informal automation methods like chatbots. Complex, enterprise-grade, end-to-end automation is not going away anytime soon. However, Large Language Models (LLMs) will expedite the death of static, tightly-coupled, vertically-integrated stacks and compiled applications, because those architectures are not responsive or agile by design.
A high-level abstraction provides a unified interface for an LLM to semantically and syntactically understand a domain. The breadth of the model effectively sets the scope for end-to-end automation across business silos, ecosystem partners and SaaS services. The depth of the domain model defines how it can work across layers of technology protocols (OSI layers 1-7).
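One way to picture breadth and depth (a hypothetical catalog, not EnterpriseWeb's actual model) is a machine-readable inventory of domain objects and operations that can be handed to an LLM as context: breadth is the set of domains covered, depth is the protocol layers each operation reaches.

import json

# Hypothetical domain catalog: breadth = business domains covered,
# depth = the technology layers each operation touches.
domain_catalog = {
    "billing": {
        "operations": ["create_invoice", "refund"],
        "layers": ["application"],
    },
    "network_service": {
        "operations": ["deploy_vnf", "scale_vnf"],
        "layers": ["application", "transport", "network"],
    },
}

def catalog_as_llm_context(catalog):
    # Serialize the catalog so it can be placed in an LLM prompt or tool list.
    return json.dumps(catalog, indent=2)

print(catalog_as_llm_context(domain_catalog))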
LLMs are naturally aligned with high-level, model-driven Intelligent Automation platforms like EnterpriseWeb, which are dynamic, loosely-coupled and horizontally-architected, because their event-driven integration, orchestration and automation "middleware" capabilities are designed to be dispatched on-demand (FaaS) and dynamically configured by models that provide context and policies.
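A toy version of that pattern, with illustrative event and function names rather than the product's API: an event arrives, the platform looks up context and policy in a model, and only then dispatches a short-lived function configured by that model.

# Registry of on-demand functions (FaaS-style: nothing runs until dispatched).
def deploy_service(config):
    print(f"deploying with {config}")

def scale_service(config):
    print(f"scaling with {config}")

functions = {"deploy": deploy_service, "scale": scale_service}

# The model supplies context and policy that configure the function at dispatch time.
model = {
    "service_request": {"function": "deploy", "config": {"replicas": 2, "region": "us-east"}},
    "load_spike":      {"function": "scale",  "config": {"replicas": 5}},
}

def handle_event(event_type):
    # Event-driven dispatch: resolve behavior from the model, then invoke on demand.
    entry = model[event_type]
    functions[entry["function"]](entry["config"])

handle_event("service_request")
handle_event("load_spike")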
EnterpriseWeb, which is API-first, can expose its models and capabilities to LLMs through a unified interface, just as it does for developers. Over time, LLMs can take on increasingly complex design and management decisions.
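The sketch below shows the general hand-off: the same capabilities offered to developers are described to an LLM as callable tools. The LLM response is stubbed and the operation names are hypothetical; it illustrates the pattern, not an actual integration.

import json

# Unified, API-first surface exposed to both developers and LLMs (hypothetical operations).
def list_services():
    return ["cpe-service", "firewall-service"]

def heal_service(name):
    return f"healing workflow started for {name}"

tools = {"list_services": list_services, "heal_service": heal_service}

def fake_llm(prompt):
    # Stand-in for a real model call: returns a structured tool invocation.
    return json.dumps({"tool": "heal_service", "args": {"name": "cpe-service"}})

def run_llm_request(prompt):
    # Parse the model's structured output and route it to the matching capability.
    call = json.loads(fake_llm(prompt))
    return tools[call["tool"]](**call["args"])

print(run_llm_request("The CPE service is degraded; fix it."))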
Stay tuned! Wait until you see how we've integrated Generative AI for enterprise-grade service orchestration.
Copyright 2023, EnterpriseWeb LLC