Computing Reviews
IBM Systems Journal (v.30 n.4), G. Hoffnagle (ed.), 1991. Type: Journal
Date Reviewed: Dec 1 1993
Comparative Review

This journal issue contains a set of papers on APL collected for its 25th birthday. Rather than analyze them in the order in which they were printed, I have grouped them into four classes: general features of APL, implementation, applications, and history and philosophy.

General features

In “APL2: Getting Started,” by J. A. Brown and H. P.  Crowder,  APL is presented as consisting of three fundamental components: arrays, functions, and operators. Some nice examples of the use of APL2 are given. The presentation is restricted to algorithms that can be expressed as combinations of operations on arrays. Nothing is said of the selection statement or the branch statement, which are part of the language because they are needed for induction when each term of a sequence is a function of its predecessor.
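
Although the paper's examples are in APL2, the array-oriented style it describes can be suggested in plain Python (an illustration only; the function names below are mine, not APL2's):

```python
# Illustrative Python analogues of two array-oriented idioms: a running
# sum (akin to APL's +\ scan) and an inner product (akin to APL's +.x).

def plus_scan(xs):
    """Running sum of xs: element i is the sum of xs[0..i]."""
    total, out = 0, []
    for x in xs:
        total += x
        out.append(total)
    return out

def inner_product(xs, ys):
    """Sum of elementwise products of xs and ys."""
    return sum(x * y for x, y in zip(xs, ys))

prices = [3, 1, 4, 1, 5]
quantities = [2, 0, 1, 3, 1]
print(plus_scan(prices))                  # [3, 4, 8, 9, 14]
print(inner_product(prices, quantities))  # 18
```

The point of the array style is that such algorithms are expressed as whole-array combinations, with no explicit index bookkeeping in the user's code.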

Connectivity between APL and other languages or applications is discussed by T. Wheatley in “Extending the Domain of APL.” Although APL is a wonderful tool for array manipulation, it has no facilities for files. Component file systems provide facilities for arrays to be stored in and retrieved from external files; they are not a mechanism for data interchange with other applications, however, because of the complex format of their records, so a different mechanism was needed. A programmer may also wish to interact with the programming environment, use system variables, or access functions written in a different language. The designers of APL have kept it strictly abstract and machine-independent.

External variables may be declared in an APL program with a specific operator and shared between APL users or with the system or other applications. A control mechanism is provided to synchronize access to those variables. The implementation of communication between APL and the system or other applications is nicely discussed, and suggestions are made for future improvements.

Interfacing APL2 with the X Window System and C is discussed by J. R. Jensen and K. A. Beaty in “Putting a New Face on APL2.” An example illustrates the capabilities of the X Window System when used with APL2. The authors give an overview of the X Window System and discuss the interface design criteria, which include the use of C functions. They then present and discuss an interface with C, including the declaration of C types in APL2.

Wheatley’s paper and Jensen and Beaty’s paper convincingly illustrate the power of APL2 for manipulating the complex data structures present in other languages and systems. The idea is to keep those languages or applications for their more specific tasks (such as managing files, manipulating windows, and providing high-performance code), and to use APL2 for simpler manipulations of their objects.

R. G. Wilhoft discusses “Parallel Expression in the APL2 Language.” Four types of parallelism are recognized. Most APL functions allow data parallelism, in which the same operation is performed on a set of independent data, such as incrementing by one all the values of an array. Algorithm parallelism refers to the possibility of organizing an algorithm so that each step involves data parallelism. Data flow parallelism recognizes that several antecedents of a given operation may be computed in any order or in parallel. Task parallelism does the same thing at a higher level. The last two types require some kind of synchronization process. The paper discusses the possibilities, and the limits, of implementing parallel operation for the various APL functions and operators. This good paper is well documented and realistic about the facilities and limits of APL.
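
Data parallelism, the first of Wilhoft's four types, can be sketched in Python (a toy illustration under my own naming, not the paper's code): because each element is independent, the array can be split into chunks that are processed in any order.

```python
# Data parallelism sketch: increment every element of an array by one.
# Elements are independent, so chunks may run concurrently in any order.
from concurrent.futures import ThreadPoolExecutor

def increment_chunk(chunk):
    return [x + 1 for x in chunk]

def parallel_increment(xs, workers=4):
    # Partition into roughly equal chunks; correctness does not depend
    # on the order in which the chunks are executed.
    n = max(1, len(xs) // workers)
    chunks = [xs[i:i + n] for i in range(0, len(xs), n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(increment_chunk, chunks)  # preserves chunk order
    return [y for chunk in results for y in chunk]

print(parallel_increment([1, 2, 3, 4, 5, 6, 7, 8]))  # [2, 3, 4, 5, 6, 7, 8, 9]
```

By contrast, data flow and task parallelism require synchronization, since one computation may have to wait for another's result.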


Implementation

R. Trimble presents “Storage Management in IBM APL Systems.” APL objects are dynamic: the shape and size of an array may be modified by functions or operations. APL can only be interpreted, not compiled, and storage must be managed at execution time. Space has to be allocated to make room for a new array and freed when the array becomes obsolete. Garbage collection is a key issue. Its cost is discussed, and several ways are given to improve space allocation and minimize garbage collection. Curiously, the simple solutions turn out to be efficient.

M. Alfonseca, D. Selby, and R. Wilks describe IL, a generator of interpreters for APL. The idea is to design an intermediate language that is close to classical machine languages and able to support the main operations of an APL interpreter. A standard interpreter is written in this language. Porting it to a new machine is done by writing a compiler for IL, but the compiler does not have to be efficient because it is used only once. It may be written in APL on a machine that has an APL interpreter. This generator has been used to implement nine interpreters, with a considerable reduction in the amount of work to be done.


Applications

“The Foundations of the Suitability of APL2 for Music” are presented by S. J. Jordan and E. S. Friis. Musical notation is complex: it gives the pitch and duration of sequences of notes. Pitch has two psychological attributes: tone height and tone chroma. The paper discusses how APL2 structures are suited to representing such complex musical notation.

“APL2 as a Specification Language for Statistics,” by N. D. Thomson, describes the APL Statistics Library. Basic statistical functions are given in APL2, then more complex functions are constructed by combining them. This application is nice because APL2 is perfectly suited for these kinds of algorithms and data structures.
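
The compositional style attributed to the library can be suggested in Python (illustrative names only; this is not the APL Statistics Library's code): primitives such as the mean are defined once, and higher-level statistics are built by combining them.

```python
# Basic statistical primitives, then variance and correlation defined
# by composing them (population variance; Pearson correlation).
import math

def mean(xs):
    return sum(xs) / len(xs)

def deviations(xs):
    m = mean(xs)
    return [x - m for x in xs]

def variance(xs):
    return mean([d * d for d in deviations(xs)])

def correlation(xs, ys):
    dx, dy = deviations(xs), deviations(ys)
    return sum(a * b for a, b in zip(dx, dy)) / (
        math.sqrt(sum(a * a for a in dx)) * math.sqrt(sum(b * b for b in dy)))

xs = [1.0, 2.0, 3.0, 4.0]
print(variance(xs))                           # 1.25
print(correlation(xs, [2.0, 4.0, 6.0, 8.0]))  # ~1.0 (perfect correlation)
```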

“Advanced Applications of APL: Logic Programming, Neural Networks, and Hypertext,” by M. Alfonseca, is less convincing. Efforts have been made to write an inference machine in APL; the problem is speed. The same problem arises for Prolog emulation in APL. The only possibility is to implement a Prolog-like inference processor in a lower-level language accessible from APL. Its use is described. Although the author claims that auxiliary processors are a part of APL, it is clear that APL by itself cannot allow reasonable implementation of inference machines. Neural networks are briefly presented. Hypertext can be implemented in APL in an object-oriented style, using general arrays.

A. Aharon et al. describe the use of APL2 for “Verification of the IBM RISC System/6000 by a Dynamic Biased Pseudo-random Test Program Generator.” They explain why test program generation for a new architecture is a difficult problem: the complexity of verification grows much faster than the complexity of designs; the architecture has to be modeled; sequences of instructions must be generated with a lot of constraints; the possibility of interrupts and address translations must be taken into account; the generator must be flexible; and so on. APL was chosen for its ability to describe operations on bits concisely. The dynamic test generation first chooses an instruction, initializes the corresponding facilities, and then executes the instruction and updates the facilities. The development of this product is not a negligible effort, but the authors consider it minor compared to the resources required to achieve similar verification with manually written or purely randomly generated test programs.

History and philosophy

The term APL is attributed to a suggestion made by A. Falkoff when a name was needed for the programming language built on the ideas in Ken Iverson’s book [1]. The history of APL systems is presented by Falkoff. The first interpreter started in November of 1966 on an IBM 360 system. It had the main facilities of APL and was highly interactive. The implementation under CMS enabled the use of variable-size workspaces. The lack of file facilities led to the concept of shared variables. At the end of the 1970s, APL2 was introduced to extend the language to generalized arrays, complex numbers, and possibilities for communication with the system or other languages. Early APL interpreters had been implemented on small machines, such as the IBM 1130. A dedicated APL machine (the IBM 5100) was built in 1974. Now, APL may be used on a PC. The paper emphasizes the fact that all these developments have been performed by small teams.

D. B. McIntyre discusses “Language as an Intellectual Tool: from Hieroglyphics to APL.” He begins, “The APL language, a language with symbols and not words, is one of the intellectual triumphs of our time.” This first sentence gives a good idea of what is discussed in the paper. The author insists on the use of symbols to represent operations. The paper includes an interesting presentation of arithmetic notations in Egypt. Our present arithmetic signs were introduced relatively late (in 1557), and were not easily accepted. McIntyre thinks that “APL’s concise notation helps us grasp the intellectual content of an algorithm without the distraction of extraneous and irrelevant matters prescribed by a machine.” Examples illustrate this point. Hindu-Arabic numerals and logic notation are revisited, and the APL notations are shown to be adequate to represent them. The paper ends with Iverson’s memorable phrase, “notation as a tool for thought.”

Iverson ends this issue by giving his personal view of APL. He starts with the first aspects he developed in his teaching at Harvard: APL as a simple, precise, executable notation for teaching a wide range of subjects. To achieve this goal today, a dialect of APL is needed that is available as shareware, inexpensive enough to be acquired by students, printable on standard printers, and usable on a variety of computers. The result is “J.” The paper describes the main features of J: its terminology (with nouns, verbs, pronouns, adverbs, conjunctions, punctuation, and copulas), spelling, functions, arrays, name assignment, grammar, and order of execution. Tacit programming (programming without variables) is presented briefly.


Mixed feelings arise from this set of papers. Clearly, APL is a powerful language for array manipulation. A programming language operates on two kinds of objects: simple objects, which have values and for which constants, variables, expressions, and functions exist; and compound objects, which are made of simple objects and can be designated by names but do not have values. In ordinary languages, arrays are compound objects, and elements of an array are accessed through the indexing mechanism. The power of APL comes from the fact that arrays are simple objects. Hence they are not declared, and the language must be interpreted. This is not as inefficient as it sounds, because each APL operation is complex: the time spent interpreting an operation is frequently negligible compared to the time needed to perform it, and the operation itself is implemented in machine code, which increases efficiency.

APL programmers tend to use the facilities of the language to avoid explicit loop construction and do not hesitate to construct large intermediate arrays to produce a scalar result. Is that really efficient? The size of the workspace rapidly becomes a major problem in APL, leading to garbage collection. A comparative study of performances of classical programming with loops, and APL-style programming with intermediate arrays, would have been interesting.
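
The tradeoff the reviewer asks about can be made concrete with a small Python sketch (mine, not from the journal): the array style allocates an O(n) intermediate just to yield one scalar, while a loop needs only constant extra space.

```python
# Two ways to compute a scalar (sum of squares) from an array.

def sum_of_squares_array_style(xs):
    squares = [x * x for x in xs]  # full intermediate array, APL-style
    return sum(squares)

def sum_of_squares_loop_style(xs):
    total = 0                      # constant extra space
    for x in xs:
        total += x * x
    return total

data = [1, 2, 3, 4, 5]
print(sum_of_squares_array_style(data))  # 55
print(sum_of_squares_loop_style(data))   # 55
```

For large workspaces, the intermediate array is exactly the kind of allocation that triggers the garbage collection discussed in Trimble's paper.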

Wheatley says that “the definition of APL is purely abstract.…It makes the language truly machine independent, and avoids bias in favor of particular application areas.” This statement is not at all obvious. The journal contains no discussion of when APL is highly efficient and when it acts like any other language: for instance, it has no simple way to describe the recurrence mechanism in which a term of a sequence is computed from the previous term. Brown and Crowder say that “the structure of the data determines how algorithms are applied rather than determining the controls that a programmer inserts into a program.” This claim is hardly true. All the sorting programs apply to the same data structures. The algorithm determines the control, not the programmer. A tree may be represented as a set of nodes with successors or as a set of nodes with exactly one ancestor. The choice of representation is not trivial: finding the smallest common ancestor of two nodes is easy in the second representation, difficult with the first. The representation to be used depends on the problem to be solved. Using arrays is not a panacea.
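
The recurrence the reviewer has in mind can be shown with a short Python sketch (an illustration of the general point, not code from the journal): when each term depends on its predecessor, no whole-array primitive computes the sequence in one step, and a sequential loop is unavoidable.

```python
# Sequential recurrence: x[k+1] = f(x[k]). The loop cannot be replaced
# by an elementwise operation on a pre-existing array, because each
# term must be computed before the next one can be.

def recurrence(f, x0, n):
    """Return [x0, f(x0), f(f(x0)), ...], n+1 terms in all."""
    xs = [x0]
    for _ in range(n):
        xs.append(f(xs[-1]))
    return xs

print(recurrence(lambda x: 2 * x, 1, 5))  # [1, 2, 4, 8, 16, 32]
```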

The choice of applications presented in this journal is a consequence of these considerations: they are typical situations in which data may be represented as arrays manipulated as simple objects, every operation of a different style being transferred to an auxiliary processor or programmed in a different language.

APL remains an excellent language, because it considers arrays as simple objects and has a powerful set of primitive functions and operators, not because it uses concise notation. Discussions about languages are always passionate, probably because language is the core of a culture. I like APL. I am not convinced, however, that an emphasis on its qualities will induce the reader to move to a different culture.

Reviewer: J. Arsac. Review #: CR124034
[1] Iverson, K. E. A programming language. Wiley, New York, 1962.
This review compares the following items:
  • IBM Systems Journal (v.30 n.4)
  • The IBM family of APL systems
  • APL2: getting started
  • Extending the domain of APL
  • Storage management in IBM APL systems
  • Putting a new face on APL2
  • The APL IL Interpreter Generator
  • Parallel expression in the APL2 language
  • The foundations of suitability of APL2 for music
  • APL2 as a specification language for statistics
  • Language as an intellectual tool
  • A personal view of APL
  • Verification of the IBM RISC System/6000 by a dynamic biased pseudo-random test program generator
  • Advanced applications of APL
Categories: APL (D.3.2), Concurrent Programming Structures (D.3.3), Interpreters (D.3.4), Modules and Interfaces (D.2.2), Music (J.5), Statistical Computing (G.3)