High Performance Computing and Communications Glossary 2.1

A significant part of the material of this glossary was adapted from material originally written by Gregory V. Wilson which appeared as "A Glossary of Parallel Computing Terminology" (IEEE Parallel & Distributed Technology, February 1993), and is being re-printed in the same author's "Practical Parallel Programming" (MIT Press, 1995). Several people have contributed additions to this glossary, especially Jack Dongarra, Geoffrey Fox and many of my colleagues at Edinburgh and Syracuse.

Original version is from NPAC at <URL:http://nhse.npac.syr.edu/hpccgloss/>

Original author: Ken Hawick, khawick@cs.adelaide.edu.au

See also the index of all letters and the full list of entries (very large)

Sections: A B C D E F G H I J K L M N O P Q R S T U V W X Y Z

O

oblivious(adj.) Working in the same fashion regardless of surroundings or history. An oblivious scheduling strategy always schedules processes in the same way, no matter how much work they have done in the past; an oblivious routing algorithm always routes messages in the same direction, regardless of local load. See also adaptive.
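
For illustration (this sketch is not part of the original entry), dimension-order routing on a 2D mesh is oblivious: the direction chosen depends only on the current and destination coordinates, never on link load or past decisions. A minimal C sketch, with made-up type and direction names:

#include <stdio.h>

/* Oblivious (dimension-order) routing on a 2D mesh: the decision is a pure
 * function of position and destination -- no load or history is consulted.
 * An adaptive router would, by contrast, also inspect channel occupancy. */
typedef struct { int x, y; } coord;
typedef enum { EAST, WEST, NORTH, SOUTH, ARRIVED } direction;

direction oblivious_route(coord here, coord dest)
{
    if (here.x < dest.x) return EAST;   /* always resolve x first ...        */
    if (here.x > dest.x) return WEST;
    if (here.y < dest.y) return NORTH;  /* ... then y, regardless of traffic */
    if (here.y > dest.y) return SOUTH;
    return ARRIVED;
}

int main(void)
{
    coord here = {0, 0}, dest = {2, 3};
    printf("first hop: %d\n", oblivious_route(here, dest));
    return 0;
}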

OEM (n.) Original Equipment Manufacturer; a company which adds components to someone else's computers and sells the result as a complete product.

OLTP (n.) On-line transaction processing; handling transactions (such as deposits and withdrawals) as they occur. An application area of great importance to banks and insurance companies.

Omega network (n.) a composition of shuffle-exchange networks with programmable switches.
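
As a hedged illustration (the function below is an example, not taken from the glossary), an omega network built from 2x2 switches is self-routing: at stage i a message follows the upper or lower output of its switch according to bit (n-1-i) of the destination address, independently of where it entered the network.

#include <stdio.h>

/* Destination-tag routing through an omega network of n stages of 2x2
 * switches (N = 2^n terminals).  At stage i the message takes the upper
 * output if bit (n-1-i) of the destination is 0, the lower output if it
 * is 1; the switch setting does not depend on the source terminal. */
void omega_route(unsigned dest, int n_stages)
{
    for (int i = 0; i < n_stages; i++) {
        int bit = (dest >> (n_stages - 1 - i)) & 1;
        printf("stage %d: %s output\n", i, bit ? "lower" : "upper");
    }
}

int main(void)
{
    omega_route(5, 3);   /* route to terminal 5 (binary 101) in an 8-way network */
    return 0;
}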

OOF (n.) out-of-frame condition counter; increments on every change in the framing status of a circuit or device.

operating system (n.) That software responsible for providing standard services and supporting standard operations such as multitasking and file access. See also kernel.

operation oriented language (n.) a programming language using remote procedure calls as the principal means of interprocess communication and synchronization.

optimal (adj.) Cannot be bettered. An optimal mapping is one that yields the best possible load balance; an optimal parallel algorithm is one that has the lowest possible time-processor product.
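
As a hedged illustration of the time-processor product (the figures and code below are an example, not part of the original entry): summing n numbers on p processors takes roughly n/p time plus the cost of combining the p partial sums, so the time-processor product stays proportional to n, the cost of the best sequential algorithm, and the parallel algorithm is cost-optimal.

#include <stdio.h>

/* A sketch of a cost-optimal parallel sum using an OpenMP reduction
 * (compile with -fopenmp; without it the pragma is simply ignored). */
double parallel_sum(const double *a, long n)
{
    double sum = 0.0;
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < n; i++)
        sum += a[i];
    return sum;
}

int main(void)
{
    double a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    printf("%g\n", parallel_sum(a, 8));
    return 0;
}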

optimization block (n.) a block of code (rarely a whole subprogram, but often a single DO-loop) in which a compiler optimizes the generated code. A few compilers attempt to optimize across such blocks; many just work on each block separately.

optimization problem (n.) a problem whose solution involves satisfying a set of constraints and minimising (or maximising) an objective function.
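
In conventional notation (a generic formulation, not part of the original entry), such a problem asks for an x that minimises an objective f subject to the constraints:

\[
  \min_{x} \; f(x)
  \quad \text{subject to} \quad
  g_i(x) \le 0,\; i = 1,\dots,m,
  \qquad
  h_j(x) = 0,\; j = 1,\dots,p .
\]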

OR-parallelism (n.) A form of parallelism exploited by some implementations of parallel logic programming languages, in which the terms in disjunctions are evaluated simultaneously, and the parent computation allowed to proceed as soon as any of its children have completed. See also AND-parallelism.
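
A hedged sketch of the control structure (in C with POSIX threads, not a logic-programming implementation): each thread evaluates one alternative of a disjunction, and the parent proceeds as soon as any alternative reports success. The predicate below is invented for the example; compile with -pthread.

#include <pthread.h>
#include <stdio.h>
#include <stdbool.h>

#define ALTERNATIVES 3

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  done = PTHREAD_COND_INITIALIZER;
static int  finished  = 0;       /* how many alternatives have completed */
static bool succeeded = false;   /* did any alternative succeed?         */

static bool try_alternative(int which)
{
    /* stand-in for evaluating one branch of the disjunction */
    return which == 2;
}

static void *worker(void *arg)
{
    int which = *(int *)arg;
    bool ok = try_alternative(which);

    pthread_mutex_lock(&lock);
    finished++;
    if (ok)
        succeeded = true;
    pthread_cond_signal(&done);
    pthread_mutex_unlock(&lock);
    return NULL;
}

int main(void)
{
    pthread_t tid[ALTERNATIVES];
    int ids[ALTERNATIVES];

    for (int i = 0; i < ALTERNATIVES; i++) {
        ids[i] = i;
        pthread_create(&tid[i], NULL, worker, &ids[i]);
    }

    /* Proceed as soon as one alternative succeeds (or all have failed). */
    pthread_mutex_lock(&lock);
    while (!succeeded && finished < ALTERNATIVES)
        pthread_cond_wait(&done, &lock);
    printf("disjunction %s\n", succeeded ? "succeeded" : "failed");
    pthread_mutex_unlock(&lock);

    for (int i = 0; i < ALTERNATIVES; i++)
        pthread_join(tid[i], NULL);
    return 0;
}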

OS (n.) See operating system.

OSF (n.) Open Software Foundation; an organization established by a number of the major computer manufacturers to set software standards.

OSPF (n.) Open Shortest Path First; a proposed interior gateway protocol (IGP) for the Internet.
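
OSPF is a link-state protocol in which each router runs Dijkstra's shortest-path-first algorithm over its link-state database. A minimal sketch of that computation (the toy topology and names are assumptions for this example, not OSPF internals):

#include <stdio.h>
#include <limits.h>

#define N 5                 /* number of routers in this toy topology */
#define INF INT_MAX

/* Dijkstra's shortest-path-first algorithm, O(N^2) over an adjacency
 * matrix of link costs; dist[] receives the cost from source to each node. */
void shortest_path_first(int cost[N][N], int source, int dist[N])
{
    int visited[N] = {0};

    for (int i = 0; i < N; i++)
        dist[i] = (i == source) ? 0 : INF;

    for (int iter = 0; iter < N; iter++) {
        /* pick the closest unvisited router */
        int u = -1;
        for (int i = 0; i < N; i++)
            if (!visited[i] && (u == -1 || dist[i] < dist[u]))
                u = i;
        if (u == -1 || dist[u] == INF)
            break;
        visited[u] = 1;

        /* relax the links out of u */
        for (int v = 0; v < N; v++)
            if (cost[u][v] != INF && dist[u] + cost[u][v] < dist[v])
                dist[v] = dist[u] + cost[u][v];
    }
}

int main(void)
{
    /* toy link costs; INF means no direct link */
    int cost[N][N] = {
        {   0,  10, INF,  30, 100 },
        {  10,   0,  50, INF, INF },
        { INF,  50,   0,  20,  10 },
        {  30, INF,  20,   0,  60 },
        { 100, INF,  10,  60,   0 },
    };
    int dist[N];

    shortest_path_first(cost, 0, dist);
    for (int i = 0; i < N; i++)
        printf("router %d: cost %d\n", i, dist[i]);
    return 0;
}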

output dependence (n.) See dependence.