

Su-Field Analysis for Information Processing System


Published June 2, 2015

Contributing Author: German Voronov

Originally presented at TRIZfest 2013; Proceedings Page 9

Keywords: Su-Field Analysis, Data, Function, Knowledge, Information Processing System.


1. Problem formulation

In this paper, we present a new concept of Su-Field analysis adapted for information processing systems.

2. Publication review

Previous works [1-6] presented several TRIZ methods applicable to IT systems, but none of them addresses Su-Field analysis for IT systems.

In [7], Rubin proposed applying Su-Field analysis to IT systems, extending the concept of “substance” to “element” while keeping the concept of “field,” so Su-Field became El-Field. Petrov [8] proposed substituting “action” for “field,” and El-Field became “El-Action.” In [9], Petrov suggested an adaptation of the law of increasing degree of Su-Field.

In this paper, we propose a new concept of El-Action analysis.

3. Definition

A new methodology of Su-Field analysis for IT systems is proposed. This methodology includes three components: Data, Function and Knowledge – DFK.

  • Data – the incoming information to be processed by the system.
  • Function – the processing applied to the incoming information.
  • Knowledge – the aggregation of all proven, empirical, or other a-priori information about how the data is to be processed.

The main difference between Knowledge and Data lies in structure and availability. Knowledge is defined during the design or update of the system and exists independently of the incoming Data. When new facts or relations are discovered, the Knowledge base is updated and the processing is adjusted to the new understanding. In certain cases, past Data becomes part of the Knowledge available for processing new Data.

Data, Function and Knowledge are parts of DFK, and the corresponding methodology is called DFK Analysis.

4. DFK Analysis

The system is called uncontrolled if its function is fixed and does not depend on knowledge. Such a system is referred to as an Incomplete DFK (1):

[Diagram (1): Incomplete DFK]

Usually, some prior knowledge about the incoming data is available and is used to adjust the functionality of the system accordingly. Such a system forms a Simple DFK, shown in (2):

[Diagram (2): Simple DFK]

The system may adapt its functionality autonomously, by analyzing the incoming data and selecting the best approach to process it. This is a Complete DFK (or DFK), presented in (3):

[Diagram (3): Complete DFK]
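As a minimal illustration (all names and signatures are hypothetical, invented for this sketch), the three degrees can be expressed as three processing patterns:

```python
def incomplete_dfk(data, function):
    """(1) Incomplete DFK: a fixed function; no knowledge is involved."""
    return function(data)

def simple_dfk(data, function, knowledge):
    """(2) Simple DFK: prior (external) knowledge adjusts the processing."""
    return function(data, knowledge)

def complete_dfk(data, select_function, knowledge):
    """(3) Complete DFK: the system analyses the input itself, derives
    internal knowledge, and selects the best function for this input."""
    derived = {"size": len(data)}                  # knowledge extracted from the data
    function = select_function(knowledge, derived)
    return function(data, {**knowledge, **derived})
```

The progression (1) → (2) → (3) is visible in the signatures alone: the knowledge argument first appears, then begins to influence which function is applied.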


We believe that the proposed concept is helpful for analyzing the effectiveness of and improving existing solutions, as well as for designing new ones.


Let’s consider a system for data compression.

  1. When the input data type is unknown, the only reliable approach is lossless compression, which achieves relatively low compression ratios. This system employs no knowledge and is therefore an Incomplete DFK (1).
  2. When the data type is known, e.g. image or audio, compression schemes specific to that type can be applied, such as JPEG for images or MP3 for audio streams. Dedicated compression schemes exploit the data structure to achieve much higher compression performance, so the system's effectiveness is higher than in the first example. This system uses only external knowledge provided from outside, without any analysis of the input, and is therefore a Simple DFK (2).
  3. To achieve optimal compression performance for a particular input, an image compression system analyses the input and determines the type of the image (e.g. photo, drawing, text, medical, etc.) in order to select an optimal method for that specific type. This system uses both the external knowledge provided from outside and internal knowledge collected by analyzing the input data, and is therefore a Complete DFK (3).
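A toy sketch of the three compression systems, using Python's standard zlib as a stand-in for real codecs; the type-detection heuristic and the codec table are purely illustrative:

```python
import zlib

def compress_incomplete(data: bytes) -> bytes:
    # (1) Unknown input type: only a generic lossless method is safe.
    return zlib.compress(data)

# (2) External knowledge: a codec chosen per declared data type.
CODECS = {
    "generic": lambda d: zlib.compress(d),
    "text":    lambda d: zlib.compress(d, 9),   # stand-in for a type-specific codec
}

def compress_simple(data: bytes, declared_type: str) -> bytes:
    # The type is supplied from outside; the input itself is not analysed.
    return CODECS.get(declared_type, CODECS["generic"])(data)

def compress_complete(data: bytes) -> bytes:
    # (3) The system inspects the input itself to infer its type.
    is_text = all(32 <= b < 127 or b in (9, 10, 13) for b in data)
    return CODECS["text" if is_text else "generic"](data)
```

Note that each step up the DFK scale adds a source of knowledge: none, then a declared type, then a type inferred from the data itself.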

Laws of DFK Degree Increase

Information processing systems tend to increase their degree of DFK according to the following laws. We state here three laws of DFK degree increase:

  1. Law of multistage processing.
  2. Law of multiple source processing.
  3. Law of accommodation.

The law of multistage processing

The law of multistage processing states: any information processing system tends to process data in stages. That is, as processing complexity rises, processing is divided into several stages. There are a number of distinct reasons for multistage processing:

  1. Distributed systems. Systems in which information processing is performed by different components.

Examples: data transmission systems, client-server systems, sensor-display systems.

  2. Development optimization. Complex systems are divided into components so that each component can be developed and verified independently. Minimizing the interfaces between components makes system development more efficient and faster.

Examples: virtually all processing systems are divided into components.

According to the law of multistage processing, a Simple DFK becomes a Simple Multistage DFK. In a Simple Multistage DFK, each processing stage is independent of all other stages; only data is shared. The Simple Multistage DFK transforms into a Coordinated Multistage DFK, where part of the knowledge is shared between the stages. Finally, the Coordinated Multistage DFK becomes a Common Multistage DFK, in which all knowledge is fully shared between all processing stages (Fig 1).

[Fig 1]


  1. Simple Multistage DFK: sensor/display system. An image taken by a camera is shown on a web page and presented on a monitor. The camera and monitor do not share any information.
  2. Coordinated Multistage DFK: data transmission system. Certain preprocessing of the data can be performed by the transmitter, and post-processing may be carried out by the receiver. Since there is a link between the receiver and transmitter, some metadata about the processing can be passed by the transmitter to the receiver.
  3. Common Multistage DFK: a processing algorithm divided into subroutines can have a shared structure containing all the knowledge.
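What changes between the three multistage variants is only how knowledge flows between the stages. A minimal sketch (the pipeline API is hypothetical):

```python
def run_simple(stages, data):
    # Simple Multistage DFK: stages share only the data stream.
    for stage in stages:
        data = stage(data, {})             # every stage sees empty knowledge
    return data

def run_coordinated(stages, data):
    # Coordinated Multistage DFK: each stage passes partial knowledge
    # (metadata) forward to the next stage along with the data.
    knowledge = {}
    for stage in stages:
        data, knowledge = stage(data, knowledge)
    return data

def run_common(stages, data, knowledge):
    # Common Multistage DFK: one knowledge structure, visible to and
    # updatable by every stage.
    for stage in stages:
        data = stage(data, knowledge)
    return data
```

For example, under `run_common` an early stage can record a property of the data (say, its maximum) that a later stage then uses, which is impossible under `run_simple`.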

The law of multiple sources processing

The law of multiple sources processing states: an information processing system with multiple sources tends to process those sources jointly. That is, multiple inputs carrying the same or different sorts of information can be jointly processed to exploit the correlation between them.

An example of multiple sources of information is video capture, where both visual and audio data are captured. The relation between them can be exploited to improve speech recognition, noise reduction and video compression.

According to the law of multiple sources processing, any Simple Multiple Source DFK tends to become a Coherent Multiple Source DFK. In a Coherent Multiple Source DFK, all sources of information are observed and used for processing, but each source is processed independently. A Coherent Multiple Source DFK may transform into a Coordinated Multiple Sources DFK, where knowledge is partially shared by the processing subsystems of each source. Finally, a Coordinated Multiple Sources DFK may fold into a Shared Multiple Sources DFK with centralized knowledge handling (Fig 2).

[Fig 2]
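A sketch of the two extremes, coherent (independent processing) versus shared (centralized knowledge); the function names and the knowledge features are invented for illustration:

```python
def process_coherent(audio, video, f_audio, f_video):
    # Coherent Multiple Source DFK: both sources are observed,
    # but each is processed independently.
    return f_audio(audio, {}), f_video(video, {})

def process_shared(audio, video, f_audio, f_video):
    # Shared Multiple Sources DFK: one centralized knowledge store is
    # built from both sources and is available to both processors.
    knowledge = {
        "audio_energy": sum(a * a for a in audio),
        "motion": sum(abs(v) for v in video),
    }
    return f_audio(audio, knowledge), f_video(video, knowledge)
```

The shared variant lets, say, the audio processor consult a property of the video stream (no motion → the sound is probably background noise), which the coherent variant cannot do.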

The law of accommodation

The law of accommodation states: an information processing system tends to accommodate past data to improve its performance. That is, a system tends to learn from information received in the past and adjusts itself to produce optimal results for incoming information.

Examples: unsupervised learning systems, such as speech recognition, search engines, etc.

A Static DFK does not change in time, having a constant a-priori set of knowledge and a fixed functionality. A Static DFK may become a Learning DFK if the knowledge base changes with the incoming data. When not only the knowledge but also the function applied to the data changes with time, such a system is an Evolving DFK (Fig 3).

[Fig 3]
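A minimal sketch of the three degrees of accommodation (the smoothing filter and update rule are invented for illustration): a smoother whose knowledge is an estimate of the signal mean.

```python
class StaticDFK:
    """Static DFK: knowledge (the mean estimate) is fixed at design time."""
    def __init__(self, prior_mean):
        self.mean = prior_mean
    def process(self, x):
        return 0.5 * x + 0.5 * self.mean        # function never changes

class LearningDFK(StaticDFK):
    """Learning DFK: knowledge accommodates past data; the function stays the same."""
    def process(self, x):
        self.mean += 0.1 * (x - self.mean)      # knowledge base updates
        return 0.5 * x + 0.5 * self.mean

class EvolvingDFK(LearningDFK):
    """Evolving DFK: both knowledge and function change with time."""
    def __init__(self, prior_mean):
        super().__init__(prior_mean)
        self.n = 0
    def process(self, x):
        self.mean += 0.1 * (x - self.mean)
        self.n += 1
        w = min(0.5, self.n / 100)              # the function itself evolves
        return (1 - w) * x + w * self.mean
```

The distinction is mechanical: in the Static class only `x` varies, in the Learning class `self.mean` also varies, and in the Evolving class the combining rule itself varies.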


Complex DFK

The laws presented above are also applicable in combination. The example below shows a Common Multistage Shared Multiple Source Learning DFK (Fig 4).

There are two sources of information, D1n and D2n. The shared knowledge Kn learns the data behavior with each time sample. The function F1n applied to D1n is evolving, so D1n–F1n–Kn is an Evolving DFK. On the source D2n, multistage processing is applied by the functions F2 and F′2 with a common learning knowledge Kn.

[Fig 4]
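Using the same toy conventions as the earlier sketches, the combined system can be outlined as follows (the update rule and the concrete functions are invented for illustration):

```python
class SharedKnowledge:
    """Kn: learns the data behaviour from every incoming sample."""
    def __init__(self):
        self.mean = 0.0
    def learn(self, x):
        self.mean += 0.1 * (x - self.mean)

def f1(x, k):            # F1n: evolving function applied to D1n
    return x - k.mean

def f2(x, k):            # F2: first stage applied to D2n
    return x - k.mean

def f2_prime(x, k):      # F'2: second stage on D2n, sharing the same Kn
    return x / (1.0 + abs(k.mean))

def step(d1, d2, k):
    # One time sample: both sources feed the shared, learning knowledge;
    # D2 undergoes two-stage (F2, then F'2) processing.
    k.learn(d1)
    k.learn(d2)
    return f1(d1, k), f2_prime(f2(d2, k), k)
```

All three laws are present at once: two sources feed one knowledge store (multiple sources), D2n passes through two coupled stages (multistage), and Kn updates with every sample (accommodation).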


In this paper, we proposed a novel approach to the analysis of information processing systems. The approach is based on three elements – Data, Function and Knowledge – and differs from classical Su-Field analysis. New laws for the design and analysis of such systems were proposed. We believe that the proposed concept is helpful for analyzing the effectiveness of and improving existing solutions, as well as for designing new ones.


  1. Rea K.C. Using TRIZ in Computer Science – Concurrency / The TRIZ Journal, Aug. 1999
  2. Rea K.C. TRIZ and Software – 40 Principle Analogies, Part 1, Part 2 / The TRIZ Journal, Sep., Nov. 2001
  3. Mann D. TRIZ for Software? / TRIZ Journal, Oct. 2004.
  4. Mann D.L. Systematic (Software) Innovation, IFR Press, 2007
  5. Odintsov, I.; Rubin, M., TRIZ methods in SW development to enhance the productivity / Software Engineering Conference in Russia (CEE-SECR), 2009 5th Central and Eastern European. – P. 276 – 280
  6. Zadesenets I. Using TRIZ to Resolve Software Interface Problems. / TRIZ Journal, May, 2009
  7. Rubin M.S. Elepoli and the Universal System of Standards for Solving Inventive Problems, 2010 (in Russian)
  8. Petrov V. The TRIZ concepts to information technology. /Papers of TRIZ-FEST-2011 Conference /Collection of Scientific Papers, MATRIZ, Saint Petersburg, 2011. – P. 66, 201-206.
  9. Petrov V. The Law of Increasing Degree of Su-Field. International research conference “TRIZfest-2012”. – Lappeenranta; St. Petersburg, August 2-4, 2012: conf. proc. / MATRIZ. – 154 p. P. 49, 50-57.