Model best practices


This help topic looks at best practice for USoft data modelling in an event processing context. See also:

Best practice for USoft rules in an event processing context.

Best practice for USoft error handling in an event processing context.

Best practice for USoft logging in an event processing context.

In USoft, a model may be described using Business Areas and Business Objects in USoft Teamwork. A Business Object is a topic or an area of interest in the business or in the information system and is a subdivision within a Business Area. Within a Business Object you can specify which domains, tables and relationships belong to the area of interest. 

In USoft, such a model is implemented by declaring domains, tables, columns and their attributes in USoft Definer. At runtime, a model implementation is used to store data. An event processing system typically needs to combine two or more streams of data, so the model must be designed in such a way that data coming from different streams is stored in different, non-overlapping logical model partitions. If a table is shared between two logical partitions, and two events from two different streams are processed at the same time and update the same record, locks may occur. Finding and debugging lock problems is difficult, especially if they are not noticed until the system goes live.
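As a minimal sketch of this contention problem (this is not USoft code; the record names and the use of one lock per record are purely hypothetical), the following Python fragment shows how two streams that update the same record serialize on its lock, while streams that update records in disjoint partitions never wait for each other:

import threading
import time

# One lock per "record" stands in for a row lock held during a transaction.
record_locks = {
    "shared_row": threading.Lock(),     # record in a table shared by both streams
    "stream_a_row": threading.Lock(),   # record owned by stream A only
    "stream_b_row": threading.Lock(),   # record owned by stream B only
}

def process_event(stream, record_key):
    with record_locks[record_key]:      # acquire the row lock
        print(f"{stream} locked {record_key}")
        time.sleep(0.5)                 # simulate work inside the transaction
    print(f"{stream} released {record_key}")

# Entangled design: both streams hit the same record and must wait in turn.
entangled = [
    threading.Thread(target=process_event, args=("stream A", "shared_row")),
    threading.Thread(target=process_event, args=("stream B", "shared_row")),
]

# Disentangled design: each stream touches only its own partition.
disentangled = [
    threading.Thread(target=process_event, args=("stream A", "stream_a_row")),
    threading.Thread(target=process_event, args=("stream B", "stream_b_row")),
]

for t in entangled + disentangled:
    t.start()
for t in entangled + disentangled:
    t.join()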

In USoft's data-oriented approach to event processing, data streams are shaped as input queues and output queues into and out of the tables that you model. The contents of a Business Object allow you to express which tables belong to which area of interest (in this case, which data stream). A table may be shared by neighbouring Business Objects, but you can also express that a table is owned by one specific Business Object.

When designing your logical partitions in this way, you may find that a table is logically shared by two different Business Objects. In an event processing context, such a design is not appropriate, because it indicates that the two data streams will become entangled and could lock each other out:

[Figure: SD_clip0028]

The solution is to disentangle the model by extending it. For example, you can accommodate two input queues by breaking up the flow structure and creating a new (internal) queue that picks up data from both input queues. Instead of belonging to two overlapping areas, the tables are pulled further apart: they end up belonging to three distinct areas. In this way you can detect and preclude possible lock issues in the design and implementation phase.

[Figure: SD_clip0029]
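The same decoupling can be sketched, purely as an analogy and not as USoft code, with three in-memory queues: two input queues that stand in for the input queue tables and one internal queue that stands in for the new internal queue table. All names (input_queue_a, input_queue_b, internal_queue) are hypothetical. Each stream writes only to its own input queue and to the internal queue, so the two streams never compete for the same record:

import queue
import threading

input_queue_a = queue.Queue()
input_queue_b = queue.Queue()
internal_queue = queue.Queue()

def forward(source, label):
    """Move events from one input queue into the internal queue."""
    while True:
        event = source.get()
        if event is None:               # sentinel: this stream is finished
            break
        internal_queue.put((label, event))

def consume():
    """A single consumer processes the combined stream."""
    while True:
        item = internal_queue.get()
        if item is None:
            break
        print("processed", item)

workers = [
    threading.Thread(target=forward, args=(input_queue_a, "A")),
    threading.Thread(target=forward, args=(input_queue_b, "B")),
    threading.Thread(target=consume),
]
for w in workers:
    w.start()

input_queue_a.put("order 1")
input_queue_b.put("payment 1")
input_queue_a.put(None)                 # close stream A
input_queue_b.put(None)                 # close stream B

workers[0].join()
workers[1].join()
internal_queue.put(None)                # close the internal queue
workers[2].join()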

 

See also

Event Processing