Knowledge transfer research and development
No progress without research. No market success without innovative solutions. That's why we are looking for research and development experts from companies, universities, and non-profit organizations who want to work with us.
SELMO is continuously developing its product and application possibilities to offer the highest possible customer benefit. The internal development department is a key driver of progress for SELMO's platform and hardware independence. Cooperation with other institutions and partners from science and industry plays an important role and offers valuable synergies.
By joining forces, we want to help meet economic and social demands with technological solutions that are born out of practical experience and find their place in science. In this way, we want to make a sound contribution to responsible and smart digitization.
SELMO did not invent the logic of a step chain, nor its modeling, nor its generation. What we have invented is a patented process that describes step chains (state machines) in a 100% state-controlled way in a model. As a result, the PLC sequence logic program that our SELMOstudio automatically generates describes the system in its entirety. To the best of our knowledge of automation technology, we are the only ones who have successfully developed the technology and a tool for this purpose.
The task of automation technology is to make process sequences repeatable and to map them reliably. It should check and monitor signals and data, generate information from them, and read in data from the higher levels of a machine. Its job is to output this information to operator interfaces, MES (manufacturing execution system), or ERP (enterprise resource planning) software systems. Theoretically. In practice, there are several challenges to overcome, which we touch on in the following topics:
To fulfill the tasks mentioned above, technical functions such as cylinder, drive, or system controls are required. These functions are usually developed once and are hardware-dependent; continuous, hardware-independent development is missing in automation practice. Moreover, the specifications as to which data should be collected in and from a function are not always clear. As a result, the programmer, who usually sits at the end of the engineering chain, does not program the data out: it remains invisible. This is why digitization projects for big data, IoT, edge computing, and Industrie 4.0 do not always run as smoothly as planned. The necessary control data is created, but defining the data needed for display or acquisition is complex. Manual programming, which links functions to logic, therefore usually lacks a clear overview of exactly which functions to use. To break out of this dilemma, many programmers design structured functions and test them.
It is no secret that the programmer is often required to reproduce important information in the program. In practice, an end position on a cylinder can quickly be faked with a timer: after two seconds it is simply assumed to be "there". This is how fuzziness arises in the evaluation of information from data. These are blind spots in data analysis and in the monitoring of automation solutions. Should it be in the hands of the programmer to quickly reproduce some signals theoretically, and thereby unintentionally introduce fuzziness into a software system? From the process and the requirements, it must be clear from the beginning which data goes in and which data comes out. Which data is ultimately important must be evaluated from within the business process, and as early as possible; otherwise, it must be assumed that the programmed data is limited. The programmer cannot do this alone, and yet he is always faced with the challenge of "reworking" in an already complex system.
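The timer shortcut described above can be sketched in a few lines. This is a hypothetical illustration (the function names and the two-second threshold are our assumptions, taken from the example in the text), contrasting a timer-based "end position" with a real sensor feedback:

```python
# Hypothetical sketch of the timer shortcut: the "end position" is simply
# assumed to be reached after two seconds, whether or not the cylinder
# ever arrived. This is the blind spot described above.
def end_position_by_timer(elapsed_seconds):
    return elapsed_seconds >= 2.0          # no real feedback from the process

def end_position_by_sensor(sensor_signal):
    return bool(sensor_signal)             # a real data point from the process

# The timer reports "there" even when the cylinder is jammed:
print(end_position_by_timer(2.5))    # True, regardless of the real position
print(end_position_by_sensor(False)) # False: the sensor exposes the fault
```

The timer variant produces data that looks valid but carries no information about the physical state, which is exactly the fuzziness that later undermines data analysis.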
If the functions are clearly defined and separated from the logic, then the data provides flexibility in the system. The logic allows process steps to run repeatedly in a discrete-event system. For this, the automation expert uses a formalism: an automaton as a state graph, Petri nets (PN), or simple flowcharts. With these, the sequence and its transitions can be mapped. This procedure creates a deterministic finite automaton (DEA), which is usually executed as a Moore automaton. This means: in each state, outputs are switched, and these in turn trigger the transition, i.e. the switch to the next state. A practical example: a valve switches and the cylinder moves to its end position. The transition condition is fulfilled by the end-position signal, and the automaton switches onward. New states and new actions follow. These actions are the functions, which generate data and expect data in return. If a drive is to move an axis, it must be controlled depending on the logic. How the function works is fixed; when, how fast, and to which position it is executed is determined by the logic. This interaction only works in a discrete-event system: the function must report back that the action has been fulfilled and that a transition has occurred.
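The valve/cylinder example above can be sketched as a Moore automaton. State names, signals, and outputs here are illustrative assumptions for the sake of the example, not SELMO's model format:

```python
# Moore automaton: outputs depend only on the current state.
OUTPUTS = {
    "idle":      {"valve_open": False},
    "extending": {"valve_open": True},   # valve switched -> cylinder moves
    "extended":  {"valve_open": True},   # hold position
}

# Transitions fire only when the feedback signal (the transition condition)
# is reported back by the function.
TRANSITIONS = {
    ("idle",      "start"):        "extending",
    ("extending", "end_position"): "extended",  # cylinder reports end position
}

def step(state, signal):
    """Advance one step; stay in the current state if no transition matches."""
    return TRANSITIONS.get((state, signal), state)

state = "idle"
state = step(state, "start")         # valve_open becomes True, cylinder moves
state = step(state, "end_position")  # end-position signal fulfills transition
print(state, OUTPUTS[state])
```

Note the division of labor: the `TRANSITIONS` table is the logic, while the feedback signals come from the functions; neither can drive the sequence alone.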
If the number of state variables in a system increases, the size of the system's state space grows exponentially. This is called the "state explosion problem" in the research literature, and it has occupied much of the research in model checking for about 30 years. Numerous techniques have been developed and tested to tackle it through model checking, an automatic verification technique for hardware and software systems that have finite states or finite-state abstractions. Model checking is also used, among other things, to validate PLC software. Such validation often fails because of the state explosion problem, because of interpretation errors between departments, or because of manual copy and typing errors, despite the best efforts of programmers.
100% DESCRIBABLE MODEL
How can a DEA (deterministic finite automaton) be described in a model in such a way that all the information is there to automatically convert the model into an application for a real-time system?
What structure could combine logic, system, and parameters (data) and map them onto a simple formalism?
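To make the question tangible, here is one conceivable structure (emphatically not SELMO's patented format, and all names are illustrative): a single declarative model that holds logic, system functions, and parameters, from which a generator could derive executable artifacts.

```python
# One conceivable declarative model combining logic, system, and parameters.
# Everything here is an illustrative assumption, not SELMO's model format.
MODEL = {
    "states": ["home", "extending", "extended"],
    "initial": "home",
    "logic": [  # transitions as (from_state, condition_signal, to_state)
        ("home", "start", "extending"),
        ("extending", "end_position", "extended"),
    ],
    "system": {  # outputs switched per state (Moore style)
        "home": [],
        "extending": ["valve_on"],
        "extended": ["valve_on"],
    },
    "parameters": {"end_position_timeout_s": 2.0},
}

def generate_transition_table(model):
    """Toy 'generation' step: flatten the model into a lookup table that a
    runtime or code generator for a real-time target could consume."""
    return {(src, cond): dst for src, cond, dst in model["logic"]}

table = generate_transition_table(MODEL)
print(table[("home", "start")])  # extending
```

The point of such a structure is that nothing lives outside the model: every state, transition, output, and parameter is declared in one place, so generation can be complete by construction.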
100% STATE CONTROL
How can the complexity problem of "state explosion" be solved in the modeling, realization and verification of software? How can what SELMO does in practice be scientifically proven?
GENERATION OF ALL DATA POINTS
How can functions be modeled with all relevant unit data points and verification options and automatically translated to real-time platforms?
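One way to picture a function carrying all of its data points is a declaration in which every value, including units in the names, is part of the function's interface. This is a hypothetical sketch under our own naming assumptions, not SELMO's schema:

```python
from dataclasses import dataclass, asdict

# Hypothetical sketch: a drive function declared together with all of its
# data points, so every value is visible for acquisition by construction.
# Field names and units are illustrative assumptions.
@dataclass
class DriveFunction:
    target_position_mm: float = 0.0
    velocity_mm_s: float = 0.0
    actual_position_mm: float = 0.0
    in_position: bool = False

    def execute(self, target_mm, velocity_mm_s):
        """Idealized motion: the axis reaches its target immediately."""
        self.target_position_mm = target_mm
        self.velocity_mm_s = velocity_mm_s
        self.actual_position_mm = target_mm
        self.in_position = True
        return self.in_position  # feedback for the logic's transition

drive = DriveFunction()
drive.execute(100.0, 50.0)
print(asdict(drive))  # every data point is exported; none stays invisible
```

Because the data points are part of the declaration rather than the programmer's discretion, nothing can remain unprogrammed and invisible, and the same declaration could drive both code generation and data acquisition.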
What is the potential of SELMO's digital-first principle for the engineering development process? Because the software is the only genuinely digital part of a machine, it must shape the engineering significantly and early on.
BEHAVIORAL PREDICTIONS OF MACHINES
How can the behavior of a machine be digitally captured and predicted? We want to go further than the digital twin, which focuses on the system structure. The goal is to capture every bit and every data point at all times. Only then will AI or big data deliver valuable insights from processes and reveal correlations between system and result. Productivity and behavior in logistics are then available in real time.