Neural Networks.

Artificial Neural Networks, also called Neural Networks, are a field of Artificial Intelligence Research, used in Computer Sciences & many other sciences.

NN have practical uses: they are universal approximation systems representing multi-dimensional data sets, they have the ability to learn & adapt to changing environment conditions, & the ability to abstract (generalize) acquired knowledge.

Research on living organisms' nervous systems is a basis for systems theory & its uses in practice.

There are models of a neuron cell with:
- weighted addition of input signals - with weights representing synaptic importance; an input signal stimulates or inhibits the neuron, & how much depends on its weight measure,
- an activation function.
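
The neuron model above can be sketched in a few lines of Python - this is a minimal illustration, with the sigmoid chosen here as one common activation function, not the only possible one:

```python
import math

def neuron_output(inputs, weights, bias=0.0):
    """Weighted addition of input signals; positive weights stimulate,
    negative weights inhibit, & the magnitude decides how much."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Activation function: a sigmoid, squashing the sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-weighted_sum))
```

For example, with all inputs & weights at zero the sigmoid gives its midpoint value of 0.5.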

As neurons activate, the weights of their synaptic connections can increase as well, depending on the weight measure.

Neurons (cells, graph nodes) & synapses (connections) can be used in brain modelling as well, even if it's a model of a non-human, artificial machine brain with a different design & properties.

Living beings' brain activity interacts both ways with the nervous system & spine, affecting muscles & more.

In machines, an artificial synaptic-neuronal system called an Artificial Brain can interact via electricity with external devices as well - cameras (or, more abstractly, sensors), or other electronic machines such as engines & more.

Certain aspects of NN Research caught my attention for now:
- Signal Flow Graphs,
- Initial & Modified Weighted Measures of Data Importance for AI's tasks,
- Networks that Adapt,
- Pattern Recognition,
- Neuronal Competition.

See also, if You wish, or need, ... : Token Game, Stitie Space & Stitie Machine 'Sunsail'.

Source: [53], insights.


Idempotence, Invariants, State, Observable Moments.

Idempotence & State.

There's confusion & ambiguity with idempotence definitions in Computer Sciences & Mathematics.

In Mathematics, an unary operation (or function) is idempotent if, whenever it is applied twice to any value, it gives the same result as if it were applied once; i.e., ƒ(ƒ(x)) ≡ ƒ(x). For example, the absolute value function, where abs(abs(x)) ≡ abs(x).

There's more in this Wikipedia article as well.

In the context of this blog, we can call an object's method call 'strongly idempotent' when calling it multiple times - any number of calls - gives the same result, on condition that the passed parameters are the same.

In the context of this blog, we can call an object's method call 'weakly idempotent' when calling it multiple times - any number of calls - gives the same result, on condition that the object's state is the same between (& at) the method calls & the passed parameters are the same; that is - this requires independence from external conditions, such as database data or the CPU clock's state, for example.
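
A small Python sketch of the distinction - the class & method names here are hypothetical, chosen only to illustrate the two definitions above:

```python
class Counter:
    def __init__(self):
        self.value = 0

    def set_to(self, n):
        # Strongly idempotent: the result depends only on the parameter,
        # so any number of calls with the same parameter gives the same result.
        self.value = n
        return self.value

    def doubled(self):
        # Weakly idempotent: same result as long as the object's state
        # stays the same between the calls & no external conditions apply.
        return self.value * 2

    def increment(self):
        # Not idempotent: each call changes the state & the result.
        self.value += 1
        return self.value
```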


Invariants are properties of a class or method that hold all the time, or at least in all of the 'observable moments'.

For example, we can have an invariant that states:
- during the observable moments, the state variable 'a' has a value less than 5.
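
That example invariant can be checked in code at method boundaries - the observable moments - as in this hypothetical sketch (the clamping rule is invented here purely for illustration):

```python
class Bounded:
    """Checks the invariant 'a < 5' at each observable moment,
    i.e. on construction & on method exit."""
    def __init__(self, a=0):
        self.a = a
        self._check_invariant()

    def _check_invariant(self):
        assert self.a < 5, "invariant violated: a must stay below 5"

    def add(self, n):
        # Inside the method the invariant may be temporarily broken;
        # it must hold again when the moment becomes observable.
        self.a += n
        if self.a >= 5:
            self.a = 4  # clamp, so the invariant holds on exit
        self._check_invariant()
        return self.a
```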

Observable Moments, Unobservable State.

Observable moments have something to do with Physics, but in Computing the notion is also related to concurrency mechanisms.

In Computer Sciences, access to objects' methods can be 'synchronized' - using the Monitor mechanism, for example; then only one thread or process can enter the method at once, during an 'observable moment'.

Other processes or threads have to wait in a queue until the initial thread or process leaves; during that wait the other threads or processes cannot access the method & state, so it's a 'nonobservable moment' for them - at least when all state-accessing methods are synchronized with the same wait queue. There are other causes of 'nonobservable moments' as well - for example, when an inner or external mechanism blocks access to the object's state & methods.
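
A monitor-like sketch of this in Python, using one lock to guard all state-accessing methods (the class & its methods are hypothetical examples, not from any particular library):

```python
import threading

class SynchronizedAccount:
    """One lock guards all state-accessing methods, so only one thread
    observes or changes the balance at a time - a monitor-like sketch."""
    def __init__(self):
        self._lock = threading.Lock()
        self._balance = 0

    def deposit(self, amount):
        with self._lock:          # other threads wait in the lock's queue
            self._balance += amount

    def balance(self):
        with self._lock:          # reading is an 'observable moment' too
            return self._balance
```

With many threads depositing concurrently, the final balance is deterministic because every access happens in its own observable moment.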

This slightly vague explanation can be reduced to process/thread synchronization & concurrency, even if it's internally managed by an interpreter, or included by a compiler in the executable code. This works the same both with time division between a nonconcurrent CPU's processes or threads, as well as with multi-core or multi-processor concurrency.

There's also the possibility of 'hardware blocks' that make parts of the code 'nonresponsive', their state 'unobservable'.


Relations & Similarity in Computing & Mathematics.

Data Collections & Relations, Relation Criteria.

We can organize data - in databases, for example - then check how the data relate to each other.

For example we can have a set of 'words':
 { 'plane', 'helicopter', 'plate', 'dish', 'cup', 'star', 'knife', 'place' }.

We can define a 'greater length of word' relation, for example, that states that a 'word' is in this relation with another 'word' if it consists of more 'characters'.

For example, the word 'helicopter' with its 10 'characters' is in the 'greater length' relation with 'plate' with its 5 'characters'; but 'plate' is not in the same relation with the word 'helicopter' - we can say that this relation is 'not reversible'.

We can define other relations as well - for example, a minimal amount of the same 'characters' at the same positions in a 'word'.

We can define a 'minimum of 3 same characters in a same-length word' relation; then the words 'plane', 'plate' & 'place' are in this relation; this relation is reversible.

Let's note that the amount of the same 'characters' at the same positions is 4 for the above 'words' - therefore lesser-amount criteria regarding character similarity at positions are met as well.

We can define a lot of different relation criteria - for similarity relations or for other relations - using programming languages for example.

A relation criterion is a function that accepts zero, one, two, or more objects (their ordering is meaningful & important in this case), then returns a boolean value (true or false) depending on whether the criteria are met or not. This has uses in data categorization & sorting, for example, as well as in defining conditional instructions & preconditions; perhaps more as well.
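
Both relations from the 'words' example above can be written as such boolean criterion functions - a minimal sketch:

```python
def greater_length(a, b):
    """'Greater length of word' relation: ordered & not reversible."""
    return len(a) > len(b)

def same_chars_at_positions(a, b, minimum=3):
    """Same-length words sharing at least `minimum` characters at the
    same positions; this relation is reversible (symmetric)."""
    if len(a) != len(b):
        return False
    return sum(x == y for x, y in zip(a, b)) >= minimum
```

For example, `greater_length('helicopter', 'plate')` holds but the reverse does not, while `same_chars_at_positions('plane', 'plate')` holds in both directions.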

Relations & Similarity in the Relational Databases.

In relational databases we have a 'like' operator & more.

Relations are also called 'tables', for data meeting the 'in relation' criteria can be grouped into well-defined tables; the semantics (meaning) is that data are in relation if they are in the same table or in the same view.

If we have a table with semantic similarity, then this can be used as well.

For example, we can have an 'air vehicles' table - a similarity relation nevertheless - that can contain the { 'plane', 'helicopter' } data set; we can say that 'plane' & 'helicopter' are similar semantically (according to their meanings) because both are 'air vehicles'.

If something is not in this table (for example: 'air drone'), then this does not yet mean so strongly that it is dissimilar - only that it's in an 'uncertain dissimilarity relation'; we can call the 'uncertain dissimilarity relation' a 'weak dissimilarity relation' as well. Computers can see semantic similarities fairly well if they are programmed well & have enough well-structured data; there are fields of Artificial Intelligence responsible for data discovery & learning from databases, to handle a database's unknown or partly-unknown structure.
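
The 'air vehicles' table & the 'like' operator can both be tried out with an in-memory SQLite database - a small sketch, with a hypothetical table name:

```python
import sqlite3

# Membership in the same table models semantic similarity.
conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE air_vehicles (name TEXT)")
conn.executemany("INSERT INTO air_vehicles VALUES (?)",
                 [('plane',), ('helicopter',)])

rows = conn.execute("SELECT name FROM air_vehicles").fetchall()

# The LIKE operator checks textual similarity to a pattern.
like = conn.execute(
    "SELECT name FROM air_vehicles WHERE name LIKE 'pla%'").fetchall()
```

Here `rows` contains both 'air vehicles', while `like` matches only 'plane' - textual & semantic similarity are different relations.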

There's much more of mathematics in that as well, including 'Relational Model' Theory.

Similarity Relation with Regular Expressions.

Character strings can be analyzed using Regular Expressions.

Regular Expressions can be used to form 'Patterns'; then we can analyze whether a character string (or strings) 'matches this pattern'.

When two or more character strings match the same pattern, then we can say they are similar that way.
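A short sketch with Python's `re` module, using the 'words' from the earlier example - the pattern here is chosen only for illustration:

```python
import re

# 'pla', then any single character, then 'e' - a 5-character pattern.
pattern = re.compile(r'^pla.e$')

words = ['plane', 'plate', 'place', 'helicopter', 'cup']
# Words matching the same pattern are similar that way.
similar = [w for w in words if pattern.match(w)]
```

The three words 'plane', 'plate' & 'place' all match, so they are similar in the sense of this pattern.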


Programs Learning.


Attempts to create programs that learn are not motivated by a desire to eliminate the effort of programmers & designers.

Attempts to create programs that learn are not a challenge to more classic software engineering.

Attempts to create programs that learn are not motivated by the challenges of software complexity - that is solved by modern analysis & design instead.

Attempts to create programs that learn are motivated by the complexity of certain types of tasks given to be solved by software - complexity that hinders or makes impossible formulating correct & fully detailed algorithms that solve these problems.

Intuition & Imagination.

A program that learns can be imagined as an abstract algorithm that can be parametrized to completion. Learning is then acquiring the proper parameters that make it a detailed, concrete algorithm solving the tasks given by the software constructor. Parameter acquisition uses historical data.
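
A deliberately tiny sketch of this idea: the abstract algorithm below is a threshold classifier, & 'learning' is acquiring its one parameter from historical data (the midpoint rule here is an invented, simplistic learning rule, for illustration only):

```python
def make_classifier(threshold):
    """The abstract algorithm - concrete only once parametrized."""
    return lambda x: x >= threshold

def learn_threshold(history):
    """Acquire the parameter from historical data: the midpoint between
    the largest negative & the smallest positive example."""
    positives = [x for x, label in history if label]
    negatives = [x for x, label in history if not label]
    return (min(positives) + max(negatives)) / 2

# Historical data: (value, label) pairs.
history = [(1.0, False), (2.0, False), (4.0, True), (5.0, True)]
classify = make_classifier(learn_threshold(history))
```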

Hypothesis, Knowledges & Skills.

Parameters acquired during the learning process are called - depending on their type & on the assumed point of view - 'knowledge' or 'skill'.

The parameters acquired during autonomous learning are often called a 'hypothesis', coming from a 'hypothesis space' that contains every hypothesis the student can use to perform the task(s). This terminology emphasizes both the uncertain state of the knowledge or skill acquired by a student, as well as the fact that it is acquired autonomously. The uncertain state of a skill or knowledge makes it non-infallible for the task(s) given.

The difference between knowledge & skill is fairly fluid - it is not strict.

It's more a skill than knowledge when we require the program to perform a certain sequence of operations acquired during the learning process; this is also called 'procedural knowledge'.

It's more a knowledge when we can say it's a selection for a certain case, a choice for a single decision. We can discover how to interpret certain input data - for example, its type or how it's related with other objects; this is also called 'declarative knowledge'.

A knowledge or a skill alone is also called knowledge in the narrower sense; knowledge together with skill is also called knowledge in the wider sense.

Initial & Acquired Parameters.

We do not know the perfect initial parameters for the task(s) - otherwise we would not need AI; we still use initial parameters to start the learning with.

During the learning process a change occurs - parameters are acquired, stored & used. This change occurs because of 'experiences'; we can treat these 'experiences' as 'training information' in this case, at least.
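
The change driven by 'experiences' can be sketched as a simple parameter update loop - a hypothetical, minimal learning rule (not from any particular source) that nudges a stored parameter toward each new experience:

```python
def update(parameter, experience, learning_rate=0.1):
    """One autonomous change: move the stored parameter a little
    toward the new 'experience' (the training information)."""
    return parameter + learning_rate * (experience - parameter)

parameter = 0.0                      # an initial parameter, not a perfect one
for experience in [10.0, 10.0, 10.0]:
    parameter = update(parameter, experience)
# After each experience the parameter drifts toward the observed value.
```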

Source: [52].

See also, if You wish or need, ... : Learning Definition.

Learning Definition.

Artificial Intelligence is a software component of a computer system that learns.

def. Learning of a computer system is any autonomous change in this system, occurring because of experiences, that leads to a performance increase.

An Autonomous Change is a change that the system introduces in itself - not one caused by an external factor such as recompilation using a compiler with better optimization, hardware changes, etc.

Performance Increase depends on well-defined Quality Criteria; it should & can be measured using strict & logical means.

Performance Decrease is, for example, forgetting - data loss; not every change leads to a performance increase.

Experience is data acquired from observation or experimentation performed by the AI.

System that learns is also called: a Student.

Source: [52].


Systems that Learn, Artificial Intelligence.

'Machine Learning' is a field of Computer Sciences that considers Artificial Intelligence Software & its construction.

This software has the capability to learn - or, in simple terms, a capability to increase the quality of its task performance, based on past experiences.

A program that learns can be imagined as a program that uses an abstract algorithm that needs concretization to perform certain tasks. Such an algorithm needs to be filled with details that are not known beforehand.

Learning is transforming this algorithm with 'empty places' into an algorithm that fulfills the constructor's needs, by choosing the proper parameters (details) to fill the 'empty places'.

These parameters, acquired during the learning process, are named 'knowledges' or 'skills'.

Algorithms for acquiring or perfecting 'knowledges' or 'skills' are named 'Learning Algorithms'.

In the literature there are very many learning algorithms. These can be categorized according to the data representation of 'knowledges' or 'skills', the types of tasks for which the 'knowledges' or 'skills' are used, as well as by the method(s) of acquiring 'experiences' - named 'training information'.

The main motivation for learning AI is to handle algorithms too complex & unknown to handle precisely, including software handling unknown environment influences - for example, robot navigation on real streets, where there are too many unknown factors, such as holes in the ground, wind, animals, weather changes, etc.

The second main motivation is handling tasks with too many parameters to handle - too expensive to execute precisely.

The third main motivation is that it is a rewarding subject, well documented in the proper literature, but still challenging - with a lot of possibilities to earn praise, from scientific works to PhD subjects.

Source: [52].

See also, if You wish: Learning Definition, Programs Learning, Neural Networks, Artificial Intelligence Learning Aspects, An Example Design of Artificial Intelligence for Martial Arts.


Modelling the Tree of Life.

The Qabalistic Tree of Life,
in the Servants of the Light organisation's Hermetic theory.

Anything can be modelled, including the Qabalistic Tree of Life of the Hermetic Qabalah.

Anything can be modelled using Stitie Space, if/when computing resources allow.

Mindful Imaging module can be custom, well-tailored for this task as well.

(an unfinished article, the coding will follow as well).


Token Game.

i am not sure what exactly the 'Token Game' is in the context of Petri Nets, but in the context of the 'Ola AH' Programming Language & Modelling, it is a software construction method that involves:

- designing software models,
- filling software models with code & the initial data,
- designing the data flow (tokens can contain 'payload' data, can be in different places at different times),
- checking & managing properties of the model graph with tokens, such as the possibility of deadlocks, bottleneck relief management, & the detection, management & handling of the graph's cycles, etc.,
- model/tokens application management at the runtime,
- conditionality & events,
- probably more.

Space that contains code & data (state) - including above-mentioned 'tokens' - might be distributed, using either or both GRIDs & clusters, using message queues as well.

Probably a Tuple Space (also called 'Linda') might be used to represent conditions: the presence of a condition tuple, or the lack of one, might inform us about a condition holding or not. Events occurring might model cause(s) happening. A cause-event occurrence might make condition(s) appear in or disappear from the Tuple Space(s), or just trigger code to start running.

Let's assume that for certain code parts to start, there's need for:
- cause(s) occurring at the proper moment(s) in time,
- condition(s) holding at the proper moment(s) in time.
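
The causes & conditions above can be sketched with a plain Python set standing in for the Tuple Space - all names here are hypothetical illustrations, not 'Ola AH' syntax:

```python
# A plain set plays the role of the Tuple Space.
tuple_space = set()

def condition_holds(name):
    """Presence of a condition tuple means the condition holds."""
    return ('condition', name) in tuple_space

def fire_cause(effects):
    """A cause-event occurrence makes conditions appear or disappear."""
    for action, condition in effects:
        if action == 'add':
            tuple_space.add(('condition', condition))
        else:
            tuple_space.discard(('condition', condition))

def can_start(required_conditions):
    """A code part may start only when all required conditions hold."""
    return all(condition_holds(c) for c in required_conditions)
```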

'Token Game' is an 'Ola AH' Programming Language's 4GL method.

for some non-strict ideas inspired by Abstracted Petri Nets, see: Decision Filters.

see also, if You wish or need, ... : 'Talking Objects' Solutions, Object Relationships Modelling & Analysis, Neural Networks, Stitie Space & Stitie Machine 'Sunsail', Stitie Machine 1.2 'Satellite', Causes & Conditions.

(a probably unfinished post; it probably will be edited in the future).

MATEN, Prism & Modelling Software Parts.

Every software part can be modelled.

Every software part can be modelled in 3D using Stitie Space.

MATEN & Prism Functionalities can be used to reform graphs - to invoke or transform different forms - for security, for speed, for task changes.

Mindful Imaging can be used to visualize; it can be interactive - to manage forms manually, with AI hints, or to oversee management automation - with or without a proper Artificial Intelligence.

All of the above ways will have a place in Idiomatic Programming with the 'Ola AH' Programming Language.

Both 'Ola AH' Programming Language Concurrency Nicety, as well as Decision Filters are examples of Modelling Software parts in 3D using Stitie Space.

See also, if You wish or need, ... : Agile Transformation of Information State.


Decision Filters.


There are very many ways of modelling decision-making process using computers.

Decision filters are one of these.

Weighted Precondition Sets.

We can have a set of tuples (there's a tool called Linda, also known as a Tuple Space).

Each of the tuples in this set consists of: (precondition, score).

WPS is connected with precondition producers, as well as with postcondition receivers.

When a producer decides that a precondition is met, the tuple is added to the set.

When the sum of the score values in the set reaches or exceeds a threshold value, the set is completed - a completion event fires & the chosen tuples (the postcondition receiver decides which tuples to take; it takes as few as possible) are transferred to a postcondition receiver node. When all the required postcondition tuples reach a receiver (or a receiver/producer), a decision is made. This decision might be to produce another precondition, or something else as well.

Data flows in one direction only, in this model.
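
The WPS mechanics described above can be sketched in Python - a minimal, hypothetical illustration (the class & method names are invented here, & the 'as few as possible' rule is simplified to taking only the tuples the receiver actually needs):

```python
class WeightedPreconditionSet:
    """Tuples of (precondition, score) accumulate until the sum of
    scores reaches the threshold; then the set is completed."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.tuples = []

    def add(self, precondition, score):
        """A producer decided that a precondition is met."""
        self.tuples.append((precondition, score))
        return self.completed()

    def completed(self):
        return sum(score for _, score in self.tuples) >= self.threshold

    def take(self, needed):
        """The receiver chooses which tuples to take - only those
        it actually needs, as few as possible."""
        chosen = [t for t in self.tuples if t[0] in needed]
        for t in chosen:
            self.tuples.remove(t)
        return chosen
```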

Decision Filters.

By designing & connecting producers, WPS-es & receivers properly, we can model decision making in a way.

For example, we can model a part of computer game decision filter that way:

We can react if we have enough forces & when we have enough of warning signals, in a proper way.

Abstract Preconditions/Causes Collection.

WPS part can be abstracted, to include other ways of handling preconditions or causes.

For example:
- whether a producer can remove a tuple when it's no longer valid,
- whether there's a score part at all in a precondition tuple,
- whether a tuple contains additional payload data,
- what are abstract preconditions collection completion requirements,
- probably more options as well.

Other parts (producers, receivers) can be abstracted as well.

See also, if You wish or need, ... : Causes & Conditions.

(to be continued, probably).


Managed Concurrency Nicety in 'Ola AH' Programming Language.


Nice Concurrency in 'Ola AH' Programming Language is about processor time loans.

These loans & repayments can be managed by a 'leader' machine.

Managed Concurrency Flow Graph.

Using Stitie Space, a Concurrency Flow Graph can be built dynamically, during the application's runtime - then it can be visualized using Stitie Space's Mindful Imaging part - then it can be managed & modified as necessary.

Such management can be automatic, or can be done manually by an administrator during the program's runtime, to relieve the application's bottlenecks.

Threads start when preconditions are met - when there are enough input data tokens waiting for the threads' consumption.

Threads run concurrently then, producing output data tokens at their end.

Such thread communication can be modelled, managed & rearranged as necessary - new producer/consumer threads can be added at processing bottlenecks, & their loan/repayment strategy can be managed as well - one can think of it as a 'slider' that changes nicety priority.

Stitie Space can be used to model processing threads graph, and data flow routes as well.

That way, concurrency bottlenecks can be relieved by assigning additional resources, which either depletes 'processing reserves' or puts stress on other application parts.

This extra 'stress cost' can occasionally be reasonable - for example, when the rest of the application waits for bottlenecks to finish their tasks.

'Ola AH' Programming Language & Nice Concurrency Automation.

In the 'Ola AH' Programming Language, the Thread Object will have two extra properties:
- 'nicety' value - which determines how much of processor time it gets during a time period,
- 'nicety group' value - which determines with which threads it engages in loans/repayments.
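
Since 'Ola AH' is a planned language, the 'nicety' idea can only be simulated - here in Python, under the assumption (mine, not the source's) that within one 'nicety group' processor time is granted in proportion to each thread's 'nicety' value:

```python
def time_shares(threads, period=100):
    """threads: list of (name, nicety, group) tuples.
    Returns time units per thread, split proportionally
    inside each nicety group."""
    groups = {}
    for name, nicety, group in threads:
        groups.setdefault(group, []).append((name, nicety))
    shares = {}
    for members in groups.values():
        total = sum(nicety for _, nicety in members)
        for name, nicety in members:
            # Higher nicety -> a larger share of the period.
            shares[name] = period * nicety / total
    return shares
```

Loans & repayments between threads of the same group would then amount to temporarily shifting nicety values between them.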

'Ola AH' Programming Language Nicety Automation for a GRID.

For GRIDs, 'Ola AH' Programming Language will Include:
- 'worker object' class - basic unit of code execution in a GRID that consumes 'input data tokens' & produces 'output data tokens',
- 'worker strategy' class for transferring code to run on a different 'worker object',
- 'foreign strategy' field on a 'worker object' with a foreign IP Address for guest strategy coming from other distributed 'worker object',
- a 'workload dispatcher' object that manages 'worker strategy' exchanges & time loans/repayments between the distributed 'worker objects', according to their 'nicety' & 'nicety group'. A 'workload dispatcher' is also a 'worker object'.

See also, if You wish or need, ... : Stitie Machine 1.1 'Sunsail' & Stitie Space, Stitie Machine 1.2 'Satellite' & Stitie Space, Clusters.


The Linguistic Parser & AI.

... a year or years ago i had an insight that i should (later in my life) write a 'linguistic parser'.

what is the 'linguistic parser', however?

it's a parser that can analyze natural language - either with dictionary words, written or spoken, or with images, films ... & build & model data structures in memory more or less correctly.

a 'linguistic parser' should take the dictionary & meanings, as well as the natural grammar, into consideration.

... i imagine that both Statistics as well as Artificial Intelligence are useful for the 'linguistic parser' - i should study these as well.

Example: http://nlp.stanford.edu:8080/parser/.

i also read that in about three decades Artificial Intelligence will reach or go past the point of the human brain's capacity - it's both an exciting as well as a terrifying possibility.

with properly modelled data structures, AI can process & act on these, to fulfill its goals.