Model Driven Engineering and Artificial Intelligence
In contrast to traditional manual programming, the combined use of Model Driven Engineering and Artificial Intelligence is a hot topic in modern software engineering.
Whoever starts developing software in a Model Driven Engineering (MDE) environment never leaves that context. No one wants to regress. Going back to manual programming means wasting time, losing productivity, losing agility, and losing the ability to change the way the people around you work. Fundamentally, it means losing quality of life, and not only from a professional perspective.
Software complexity is increasing
Software complexity has been growing over time. This complexity results from many factors: the size and number of components that make up a solution, the number of layers that support an information system (hardware, operating systems, DBMS, communications, browsers, programming), the number of different technologies required for its operation (including an increasing number of programming languages), the rapid and accelerating evolution (and obsolescence) of these technologies, non-functional requirements (such as efficiency, testability or user experience), interactions with many other systems…
Software development, nowadays, is an extremely complex and highly professionalized activity.
Overcoming complexity is our mission
“Overcoming complexity is the mission of Software Engineering” – Vasco Amaral, NOVA-FCT University.
This mission faces a great new challenge: software stakeholders – those who use it, who rely on it to make decisions, or who use it to exercise their rights (users, managers, citizens) – increasingly want to be present during design and even to be included in the production of the solution.
Software Models must be generalized
Software Engineering is far behind other engineering fields regarding the use of Models. Certainly, we can expect that the construction of a building is preceded by a detailed plan of the building, a blueprint. Civil Engineering has this practice fully instituted, but Software Engineering does not. The Software Model, which would be the equivalent to the building blueprint, in most cases does not exist or it is embedded and hidden in the software code.
This delay is not excused by the “youth” of Software Engineering, which, although more recent than other engineering fields, has existed since the 1960s. Six decades is more than enough time to generalize a good practice if its benefits were perceived and recognized.
In fact, when it comes to modeling, software engineers have not kept up with their engineering peers over the last thirty years.
A short story demonstrates it. In the 1980s, CAD (Computer-Aided Design), CAM (Computer-Aided Manufacturing) and CASE (Computer-Aided Software Engineering) systems emerged simultaneously. Thirty years later, all students of Civil Engineering (and Architecture) learn how to use CAD at university; all of them use CAD, even for projects outside the field of study in which they learned it; and when they take a job, they continue to use CAD in all their activities. This did not happen with CASE. Ask a computer engineering student what the acronym CASE stands for and they will not know.
Artificial Intelligence winter… and spring
The same was true of applying Artificial Intelligence to software development. In the 1980s, a number of well-known researchers were discussing this subject, around inference engines, specific languages like PROLOG or LISP, and expert systems. They advocated, for example, the primacy of declarative programming over imperative programming. They were a small step away from the use of Models and Artificial Intelligence.
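The contrast those researchers advocated can be illustrated with a small, hypothetical example (in Python rather than PROLOG or LISP, purely for accessibility): the imperative version spells out how to compute a result step by step, while the declarative version states what the result is and leaves the iteration implicit.

```python
# Hypothetical illustration of imperative vs. declarative style.
# The data and names here are invented for the example.

orders = [("alice", 120), ("bob", 80), ("carol", 200)]

# Imperative: describe HOW to compute the result, step by step.
big_spenders_imperative = []
for name, total in orders:
    if total > 100:
        big_spenders_imperative.append(name)

# Declarative: describe WHAT the result is; the machinery is implicit.
big_spenders_declarative = [name for name, total in orders if total > 100]

assert big_spenders_imperative == big_spenders_declarative == ["alice", "carol"]
```

Inference engines and expert systems pushed this declarative idea further: the expert states rules, and the engine works out how to apply them.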
Expectations regarding Software Engineering and Artificial Intelligence were once very high. Herbert Simon, a Nobel laureate in Economics, stated as far back as 1965: “Machines will be capable, within twenty years, of doing any work a man can do.”
In recent years, Artificial Intelligence is fashionable again. There is plenty of hype around it, and a lot of financial resources and many young talents are directed to this domain, which is very good.
Software development has, however, become separated from the themes addressed in Artificial Intelligence. One could even consider that expert systems have been bullied out of the picture, reserving AI for robotics, for natural language processing and for Machine Learning, which some of us, just a couple of years back, would have called statistics or forecasting methods.
Quidgest has been a pioneer in AI and Models since its establishment
Since its foundation in 1988, Quidgest has been a pioneer in the use of Artificial Intelligence and Models. Curiously, in this order.
AI at Quidgest and ALFAIA
In 1989, Quidgest started with Artificial Intelligence, creating an expert system for the financial evaluation of agricultural investment projects, named ALFAIA and written in PROLOG.
The pioneering enthusiasm of Helder Coelho, at that time professor at FCUL and ISEG, was an inspiration for the first Quidgest Artificial Intelligence projects.
ALFAIA allowed each expert to jump from 0.6 to 25 projects evaluated per day, a level of productivity and quality of decision making that banks analyzing credit requests today would envy.
From this moment onwards, the quest for a drastic increase in productivity through the use of information technologies was already definitively and irrevocably installed at Quidgest. Our mission became, and remains, to play a leading role in the technological revolution of our time and in the digital transformation of the world.
Models and AI Inference at Quidgest and GENIO
In 1990, Quidgest started using Models and Inference to develop complex software. From this point, Genio was born.
First, we identified patterns able to unambiguously define any information system. By choosing, adapting, and grouping patterns, we could, from that moment, create the model (the equivalent of the civil engineering blueprint) for each information system.
Second, and based only on that model, we started automatically generating the corresponding software (source code) through AI and inference.
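The two steps above, a declarative model plus automatic generation of source code from it, can be sketched minimally. This is an invented illustration of the general idea only; the `model` dictionary and `generate_class` function are hypothetical and bear no relation to Quidgest’s actual Genio meta-model.

```python
# Hypothetical sketch: deriving source code entirely from a declarative model.

model = {
    "entity": "Customer",
    "fields": [("name", "str"), ("email", "str"), ("credit_limit", "float")],
}

def generate_class(m: dict) -> str:
    """Generate source code (here, a Python dataclass) from the model alone."""
    lines = [
        "from dataclasses import dataclass",
        "",
        "@dataclass",
        f"class {m['entity']}:",
    ]
    lines += [f"    {name}: {typ}" for name, typ in m["fields"]]
    return "\n".join(lines)

source = generate_class(model)
assert "class Customer:" in source
assert "credit_limit: float" in source
```

The essential property is that the code is never edited by hand: to change the software, you change the model and regenerate.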
Like an ultra-marathon runner, Genio constantly develops its capacity, advancing both in expanding the model and in the corresponding automatic generation.
The teachings of Ana Lucas and the beauty and simplicity of the concepts she transmitted to us about database structures, were fundamental to begin this journey. It is a journey which we are still traveling today.
Why our approach to Software Engineering has been a success
ALFAIA and Genio, Quidgest’s pioneering systems, were already declarative (in line with MDE) and already used inference engines (in line with Artificial Intelligence). Both allowed for extraordinary increases in software developers’ productivity.
Gradually, however, we sadly saw that both the industry and academia left us alone. Failed projects and results far below expectations have led to widespread disbelief in the use of Models and Artificial Intelligence for software development.
As is often the case in the IT world, after all the initial excitement came a deep disappointment. Only a handful of companies around the world persisted with this line of reasoning. Quidgest was one of them.
There are three main reasons why Software Engineering turned away from using Models and AI:
- the absence of a development discipline under a Total Quality Management perspective;
- a process-driven approach to software development, that is, process identification as the basis for software design;
- and UML (Unified Modelling Language).
Besides these, though we will not go into detail here, there is the absence of any pressure on the current software industry, a very profitable cash cow, to become more efficient.
Starting with development discipline. Caravela, an IST flagship project in the early 1990s, allowed (in fact, required) that, after code generation, someone performed under-the-table tricks to ensure the solution worked. This bad practice, in total contradiction with good quality practices, prevented problems from being solved at the source and corrected for future use. It created a “Technological Debt” which, like other debts, tended to accumulate.
It should be noted, by the way, that the ignorance of the traditional software industry in relation to Total Quality concepts continues to be glaring. This is today the only industry where “Quality Assurance” is isolated from production.
In Genio, “under-the-table” tasks to fix code are not allowed. If an error is found, it is fixed at the source, either the Model or the Inference Engine, before producing a new and correct version.
Caravela received more attention than its contemporary Genio, which was a shame because its failure resonated longer in the academic mind than the success of Quidgest’s solution.
Data-Driven versus Process-Driven Design
What should be the basis for software construction: processes or data structure?
There was a brief period, with the emergence of relational DBMSs in the 1980s, during which data structure, not process flow, was the first step in software design. Obviously, any software has both data and processes. But the order in which they are considered in software design (or “Design Thinking”) is not irrelevant.
Modeling, during that small time window, was data-driven and not process-driven. Before and after it, processes ruled. The MRP / ERP of the 1970s and most of the software projects from the mid-1990s onwards began their development by collecting processes. The growing popularity of stories in SCRUM, Agile and even Scaled Agile also corresponds to a process-driven choice.
Whether you approach software design with processes or stories, nothing is comparable in simplicity and mathematical elegance to the relational model proposed by Edgar F. Codd for the structure of a relational database.
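That elegance can be shown in a few lines. In the data-driven approach, the relational structure (the schema) is defined first, and the “processes” are then expressed declaratively as queries over it. The tables and figures below are invented for illustration.

```python
# Minimal, hypothetical illustration of data-driven design with
# Codd's relational model: structure first, declarative queries second.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE invoice  (id INTEGER PRIMARY KEY,
                           customer_id INTEGER REFERENCES customer(id),
                           amount REAL NOT NULL);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Alice')")
conn.executemany("INSERT INTO invoice VALUES (?, ?, ?)",
                 [(1, 1, 100.0), (2, 1, 50.0)])

# The 'process' (billing total) is just a declarative query over the structure.
total, = conn.execute(
    "SELECT SUM(amount) FROM invoice WHERE customer_id = 1").fetchone()
assert total == 150.0
```

A new process over the same data rarely changes the schema; a new process in a process-driven design typically spawns yet another diagram.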
Starting from processes, the leap to modeling and artificial intelligence is much more difficult, and the results are murkier.
In addition, the creation of process documentation, in contrast to data documentation, is an endless job. Very profitable for consultants, but desperately time-consuming and expensive for those who need the software.
Unified Modelling Language
Finally, Unified Modelling Language (UML). The association of Models with UML may be responsible for the divorce of many young students and software engineers from Model-Driven Development (MDD). MDD and UML are not synonyms; quite the opposite. Quidgest is an enthusiastic advocate of MDD and a fierce critic of almost everything in UML. In UML, only the data models are relevant, although they can be simplified.
Unfortunately, the melting pot in which this unification was cooked gave equal treatment to many diagrams that only add noise to an already confusing situation.
The fundamental problem is that UML simply cannot bridge the gap between requirements and the solution:
- UML accepts, uncritically, everything we put on paper;
- reality is too complex to be represented in UML;
- when the software is updated, the UML documentation is not guaranteed to be updated as well.
Some tools to fill these fundamental UML gaps have been tested by academia, but not by the industry, which had already concluded that the problem lies in weaknesses intrinsic to UML.
In Software, all you need is… the Model
The model required for software development cannot take the civil engineering blueprint as its reference. A software model must be much more powerful. It is not just an independent artifact that is consulted during the building phase. A software model works more like DNA: it must be a full representation of the software, from which the entire software is derived, through AI inference and automatic means.
One of the most significant innovations of Quidgest’s Genio, when compared to UML, is that Genio was born in the opposite direction: from real software to the model, not from the model to the intended software. And this is an absolute paradigm shift.
Within Genio, models are always partnered with artificial intelligence, allowing each model to be quickly transformed into valuable software. Imagine if holding the civil engineering drawings were enough to get the building finished. That does not happen, yet, in building construction (though BIM does point in that direction), but we have grown used to seeing this process in 3D printing. Genio is the 3D printer of software.
Further information about Genio Model plus AI inference
For those who want to delve into the subject, the Genio “language”, that is, the meta-model used by Quidgest and its partners (and even some customers), is available. We provide training at our Academy. There are licenses for the developer community and for university campuses. There are forums and support teams. There is even a dedicated methodology (Hyper Agile QED), an extension of Scaled Agile suited to modeling and AI-powered software generation.
Genio meta-model covers all the features required for a model, according to Herbert Stachowiak’s classic definition:
- it is separated from reality and it is an isolated asset;
- it abstracts detail and reduces complexity;
- and it fully describes the software, without any degree of ambiguity.
Therefore, unlike UML and other representations, the Genio model is a model that links and bridges the gap between Requirements and Products.
Furthermore, Genio’s capability to transform the model into the desired product, automatically granting consistency and generating all the source code in a few minutes, shows the full power of Quidgest’s Artificial Intelligence.
A strong market solution for highly complex systems
These two components, Models and AI, applied to highly complex corporate or governmental software, such as core vertical systems or ERP solutions with several thousand database tables, make Quidgest’s Genio a unique product.
Simultaneously, Genio fulfills the most ambitious wishes of everyone who champions DevOps, Digital Transformation, Design Thinking or Lean IT:
- drastic reduction on time-to-market of fully working solutions;
- continuous improvement by design;
- continuous integration and production cycles of hours, not weeks;
- a short learning curve;
- business autonomy to create new solutions;
- quality assured by poka-yoke, embedded both in modeling and in the generation process;
- management tools regarding security, testability, or complexity metrics for budgeting.
Many of the pursuers of these goals do not, however, know yet that Genio already exists.
For those who have been working in Software Modelling and Artificial Intelligence, it is certainly gratifying to know that systems as complex as a university ERP or Government Shared Services are developed through these new paradigms and compete, with obvious advantages, with similar systems produced through manual programming.
On the cutting edge of Software Engineering
State-of-the-art reviews in recent academic papers make it obvious that what we have developed at Quidgest, as the result of three decades of research, demanding projects and co-innovation with our customers, is far ahead of what is being done around the world.
At Quidgest, we are ready to collaborate on the further dissemination of our approach to Software Engineering, based on Models and Artificial Intelligence. We have therefore challenged academia to publish joint scientific papers on these topics. And, of course, we challenge our current and future customers to test us with even more ambitious ideas and products.
Note: This article is the result of a very fruitful exchange of ideas with professors Alberto Silva (DEI-IST), Vasco Amaral (NOVA-FCT), João Varajão and João Álvaro Carvalho (both from Minho University), in the context of planning the “MDE+AI Talks” being organized quarterly during 2019, together by Quidgest and Departamento de Engenharia Informática (DEI) from Instituto Superior Técnico (IST).
João Paulo Carvalho – Quidgest Senior Partner
Read the Portuguese version here