Systems and Ideas
"Don't worry about people stealing your ideas. If your ideas
are any good, you'll have to ram them down people's throats."
- Howard Aiken
I like to think of my research career at the interface between people and
computers as having been driven by the twin forces of
Systems and Ideas. I believe ideas drive the building of systems, which
prove out those ideas and then generate the next set. Sampled here are some
of the ideas that have held me under their sway and the systems that I
have participated in building that tested these ideas and led to new
ones. The references given are those that I think best represent the work,
not necessarily ones I've written. The order given is roughly chronological.
My original plan was that there would be a page or more on each topic,
but it looks like the reality is that there will only be a quick
stream-of-consciousness paragraph. Ah well.
- Economic Models
- I first became enamored with economic models as an undergraduate.
I was a math major taking computer courses and economics courses because
I found both interesting. The idea of modeling a complex system using
math and statistics seemed infinitely cool. A few years ago, my parents,
while cleaning out their house, sent me a copy of my undergraduate
honors thesis. I have to admit
that I didn't go back and read it, but I found it so charming: the onion
skin paper, the press-on math symbols, the memories of a box full of
computer cards of data, and running statistical programs in batch. Ah, youth.
- Semnet
- Certainly one of the most fortuitous occurrences in my career was
applying, during my first summer as an economics graduate student,
for a summer job with Jim Hollan's
research group at the Navy Personnel Research
and Development Center. Jim has an astonishing knack for gathering together
excellent research teams and attacking challenging problems. The first
project I worked on at NPRDC was a tactical training program. The central
idea of this project was to build a semantic network (hence the name) of
information, and then build various generic games (such as flash cards)
off of the relations specified in the network; a small sketch of that idea
follows below. This work was conducted on
Teraks (computers built on the LSI-11 chip with bitmapped graphics, running
UCSD Pascal). I was also involved in porting the code to UCSD Pascal
running on an Apple II. Here is a scanned version of the User's Manual, NPRDC Tactical Training System Version 5.2,
from October 17, 1982.
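Here is a minimal sketch, in Python rather than the original UCSD Pascal, of the Semnet idea as described above: store facts as (subject, relation, object) triples and drive a generic flash-card drill off the relations. The facts and names are invented for illustration, not from the original system.

    import random

    # Hypothetical facts stored as (subject, relation, object) triples;
    # the real network held Navy tactical training content.
    network = [
        ("frigate", "is-a", "surface combatant"),
        ("frigate", "primary-role", "anti-submarine warfare"),
        ("cruiser", "is-a", "surface combatant"),
        ("cruiser", "primary-role", "air defense"),
    ]

    def flash_cards(triples):
        """Turn each relation in the network into a fill-in-the-blank card."""
        for subject, relation, obj in triples:
            yield (f"{subject} --{relation}--> ?", obj)

    def drill(triples, rounds=3):
        """A generic game built off the network: quiz on randomly chosen cards."""
        for question, answer in random.sample(list(flash_cards(triples)), rounds):
            print(question)
            print(f"  answer: {answer}")

    drill(network)

Other generic games (matching, multiple choice) would be built the same way, reading only the relations, never any game-specific content.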
- Mental Models
- Both the maneuvering board work and the work on Steamer were driven
by the idea of fostering correct mental models. It is an interesting
idea that a lot of our expertise is driven by models we have of the
world. I suspect these models are less real than I thought at the time.
They are more fungible, flowing easily between contradictory states.
A reasonable (and by now maybe historical) introduction is: _Mental
Models_, Dedre Gentner and Albert L. Stevens (Eds.), Lawrence
Erlbaum Associates, 1983.
- Direct Manipulation
- Whew. Here is an area of research that got way over-hyped. I buy into
direct manipulation when direct manipulation makes sense. I also buy into
the idea that there is huge power in computation that you take away from
people if you limit them only to direct manipulation. Programming by direct
manipulation makes as much sense as feeding worms into parking meters. Why
would you ever believe it would work? There are powerful ideas in computation,
such as iteration, recursion, and, most importantly, abstraction, that are
better represented symbolically than by moving icons around on the screen.
Having worked on Steamer and the Icon Editor, we knew they were winners
exactly because we were not frustrated by the interface and driven to code
behind the scenes.
- Relative Motion
- A tough training task for the Navy was instilling understanding of
relative motion. Data devices, such as radar, provide only descriptions of
the movement of ships relative to each other, not necessarily referenced to
some fixed frame of reference. At that time, the Navy used a graph-paper
representation, called a maneuvering board, to perform computations in
this relative space; a small sketch of the underlying computation follows below.
I was part of a team that built a training system
on Perq computers for the maneuvering board. This system was eventually transferred
to Xerox Dandelion Lisp machines.
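Here is a small sketch, under my own simplifying assumptions, of the computation a maneuvering board solves graphically: the motion of a contact relative to own ship is the vector difference of the two true-motion vectors. The course and speed values in the example are made up.

    import math

    def to_vector(course_deg, speed_kts):
        """Convert a true course/speed into an (east, north) velocity vector."""
        rad = math.radians(course_deg)
        return (speed_kts * math.sin(rad), speed_kts * math.cos(rad))

    def relative_motion(own_course, own_speed, contact_course, contact_speed):
        """Direction (degrees true) and speed (knots) of the contact relative to own ship."""
        oe, on = to_vector(own_course, own_speed)
        ce, cn = to_vector(contact_course, contact_speed)
        re, rn = ce - oe, cn - on
        drm = math.degrees(math.atan2(re, rn)) % 360   # direction of relative movement
        srm = math.hypot(re, rn)                       # speed of relative movement
        return drm, srm

    drm, srm = relative_motion(own_course=0, own_speed=15,
                               contact_course=270, contact_speed=20)
    print(f"DRM {drm:.0f} deg true, SRM {srm:.1f} kts")

The training system, of course, taught the graphical construction on the board itself; this is just the arithmetic underneath it.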
- Expert Systems and Tutors
- As part of the maneuvering board work, we looked into using expert systems
as tutors. Many of the interesting issues in tutoring, such as how far to
let students go down wrong paths, arose as part of this work. It was
at this time that I began to learn about Lisp, since the expert system
was written in Franz Lisp.
- Steamer
- Since Steamer was a visual system, maybe you should start out with some
Steamer images or this set of black-and-white views compiled in a .pdf file.
I came into the Steamer project after it was well underway, so I don't
know its history accurately. Steamer was an instructional system for
the steam propulsion plant of Navy ships. An existing Fortran simulation
that had been used to drive a physical simulator was converted to Lisp,
and was the basis of a virtual steam plant. A graphics editor was developed
that allowed subject-matter experts to build views onto the simulation. So,
for instance, a subsystem like the Lubricating Oil subsystem could be built
using icons to represent the components, and these components were then
"tapped" into the simulation, so that a student could see and operate the
plant; a small sketch of the tapping idea follows below. The classic paper on
this work, on which I was a co-author, was published in AAAI. This work was all
built on various versions of Symbolics Lisp Machines.
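Here is a rough sketch of the "tapping" idea mentioned above, written in Python rather than the original Lisp, with invented variable names: an icon placed in a view is wired to a named state variable in the running simulation, so it both displays the value and writes student actions back into the plant.

    class PlantSimulation:
        """Stand-in for the virtual steam plant's state."""
        def __init__(self):
            self.state = {"lube-oil-pressure": 14.2, "lube-oil-pump": "off"}

        def get(self, name):
            return self.state[name]

        def set(self, name, value):
            self.state[name] = value

    class Icon:
        """A view component tapped into one simulation variable."""
        def __init__(self, label, sim, variable):
            self.label, self.sim, self.variable = label, sim, variable

        def display(self):
            print(f"{self.label}: {self.sim.get(self.variable)}")

        def operate(self, value):
            # Student interaction with the icon writes back into the simulation.
            self.sim.set(self.variable, value)

    sim = PlantSimulation()
    gauge = Icon("Lube oil pressure", sim, "lube-oil-pressure")
    switch = Icon("Lube oil pump", sim, "lube-oil-pump")
    gauge.display()
    switch.operate("on")
    switch.display()

The point of the graphics editor was that subject-matter experts could assemble such views without writing any of this code.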
- Simulations
-
- Visualization
-
- Tools
- The thing that amazes me most about computer science and how systems are
built is that so many of the tools suck. Seriously suck. Having worked on
Lisp Machines, both Xerox and Symbolics, I was one of the fortunate few who
have worked on machines designed for building code. Everything was integrated,
and actually aimed at the way code is built. For instance, patches, which are
an everyday event in the software process, were built right into the system
definition and editor systems. The whole idea of software is to build
abstractions, and building tools to do the work is one elegant way to move
the workload from the user to the computer. I think that in the areas where I
have been most successful, being able to abstract out the right tool has
been a major contributing factor.
- HITS
- The Human Interface Tool Suite was
an effort at MCC that was amazingly ahead of its time. The idea was to develop a
set of tools that allowed building interfaces across modes and media.
It was an astonishingly collaborative effort among researchers with
experience in a handful of disciplines. The Icon Editor was part of that
effort. I also played a major role, along with Louis Weitzman, in integrating
the parts of this work.
- Icon Editor
- One of the challenges from the Steamer Project was building new icons.
The Icon Editor project, which I worked
on with Louis Weitzman, was an attempt
to build a tool that allowed not only building the visual appearance of an
icon, but also, in a modular fashion, building its behavior.
- Visualization of large Data Spaces
- Soon after I moved to Bellcore (now Telcordia), I began to work on
understanding what was going on in the telephone signaling network (SS7).
See my publications page for details on the
SS7 work.
- ART
- I was a fortunate user of a software system built at Bellcore under
the internal name ART: a 3D visualization system developed by Larry
Stead. It greatly influenced my thinking about what was possible using
3D.
- The Web
- I can vividly remember when, in the Fall of 1993, Todd Outten, then our
SGI system administrator, came into my office and said, "you have to see this."
What "this" was, was the web. Very soon after that, Larry Stead and I
downloaded a copy of httpd from NCSA and had a server running. It was just
awesome to have the instructions in the browser as we worked our way through
the configuration file. Soon after that I began collecting sailing links,
and putting them on my
sailing page.
This was in the days before firewalls, so it was just available. Later I
moved all the work over to the apparent-wind site.
- Community Based Recommending
- This is research built on the simple observation that for most information
sources you consult, others have already used the resource. The question is
how you can use their experience to guide you through these information
resources; a toy sketch of the co-usage idea follows below. For our work,
check out the web site
and/or the CHI paper.
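The toy sketch below illustrates the co-usage observation, not our actual system: score each resource you have not yet consulted by how much its users' histories overlap with yours. The history data is invented.

    from collections import Counter

    # Invented usage histories: person -> set of resources consulted.
    histories = {
        "ann":   {"doc-a", "doc-b", "doc-c"},
        "bob":   {"doc-b", "doc-c", "doc-d"},
        "carol": {"doc-a", "doc-d", "doc-e"},
    }

    def recommend(my_docs, histories, top_n=3):
        """Rank unseen resources by how much their users' histories overlap mine."""
        scores = Counter()
        for person, docs in histories.items():
            overlap = len(my_docs & docs)
            if overlap:
                for doc in docs - my_docs:
                    scores[doc] += overlap
        return [doc for doc, _ in scores.most_common(top_n)]

    print(recommend({"doc-b", "doc-c"}, histories))

Real usage data would of course need weighting for popularity and recency; this only illustrates the guiding observation.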
- Restricted Domain Search
- The ideas here are detailed on my
restricted domain search page.
- Visualizations of the Web
- This work was coupled with the ideas from Restricted Domain Search.
Using a tool built by Kent Wittenburg and Eric Sigman (with an earlier
version built by Louis Weitzman), we looked at various tree representations
for visualizing portions of the web. Here is a
reference to a paper I
co-authored on a system called SiteSeer.
- Searching on the Web
-
- Personal Site Navigation
- Boy. You can never sit still on the web. It seems like everyone is
doing this work now. The question this research looked at was what facilities
you could provide a repeat user of a web site to make return visits easier.
- Desired Advertising and Branding
- My, I hate spam. The trouble is that there are some things that I am
interested in receiving information on. How do you reconcile these two
competing goals? Better understanding these issues is the goal of this
research.
- The Next Cool Thing
- What will it be? The next cool thing needs to be a combination of
a problem worth solving and technology developed far enough along that
it can help.
"Only by laying bare and solving substantive problems can
sciences be established and their methods developed."
-- Max Weber, The Methodology of the Social Sciences, 1949 translation.
I can be reached to discuss these various jottings at:
mbrmbr@acm.org
Last updated: Sun Jan 11, 2009
Copyright 1998-2015 Mark Rosenstein.