Bipscript is a simple scripting language ideal for creating interactive musical "bots" that play along with human performers. With one or more scripts listening and responding in real time, one can build anything from a simple accompanist up to an entire bot orchestra.
A typical script has a main body that runs immediately in order to set up the various connections (audio, MIDI, OSC etc.) and to define the event handlers which, in turn, run after the main body has completed.
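A minimal sketch of this layout is shown below; the class and method names (Midi.Input, onNote) are illustrative assumptions, not necessarily the real API:

```
// main body: runs immediately, sets up a connection
input <- Midi.Input("keyboard")

// define a handler: runs only after the main body has completed,
// each time a live event arrives on the input
input.onNote(function(note) {
    print("received note " + note.pitch)
})

// main body ends here; from now on the handlers respond to events
```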
Most scripts will use this basic structure, but fully defining how the bots behave requires answering a few questions: how they make sound, how they listen, how their output responds to their input, and how tempo is controlled.
Unlike many other programming languages aimed at music creation, bipscript does not provide a native API for sound generation and instead uses audio plugins for that purpose. This allows a range of sounds limited only by the selection of plugins available on the system running the script. Software synthesizers, samplers and effects plugins can all be used to create unique signal paths in much the same way as in a traditional sequencer/DAW environment. Bipscript currently supports the LV2 plugin standard, with support for AU and VST plugins coming soon.
Please see this tutorial lesson for an example of instantiating a plugin and scheduling notes to play on it.
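The rough shape of the idea, as a hedged sketch — the plugin URI is a placeholder, and the constructor and method names (Lv2.Plugin, Audio.StereoOutput, connect, schedule) are assumptions rather than the confirmed API:

```
// hypothetical sketch: instantiate an LV2 synth and wire it to the audio output
synth <- Lv2.Plugin("http://example.org/plugins/simplesynth")  // placeholder URI
output <- Audio.StereoOutput("main", true)
output.connect(synth)

// schedule a note on the plugin: middle C (pitch 60), velocity 100, at measure 1
synth.schedule(Note(60, 100, 0.5), 1)
```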
Of course the listening part is not limited to "instruments" in the traditional sense: events can come from e.g. MIDI control surfaces or anything that sends OSC messages. Just about any external trigger can be used by translating its events to MIDI or OSC format externally and listening on the appropriate input.
The key question for responsiveness is this: how much do the scheduled output events depend on the input events?
For example, notes scheduled in the main body of the script will happen without regard to real-time events. This is because the main body executes before anything else, typically before the piece has even begun. This makes sense for fixed parts, e.g. a particular melody or beat pattern that is to play at a particular measure no matter what else happens.
The more interesting case is when output events depend on input events. In this case the output events are scheduled from within event handlers, usually at a particular point in the timeline. The basic strategy is this: capture information about incoming events as state in the input handlers, then schedule output events based on that state shortly before they are due to play.
An example: imagine a script that creates a variable to store how "busy" a human performer is playing. This variable has an initial value of zero and is incremented in the event handler for incoming events, e.g. note onsets detected on an audio stream. Now create a method that schedules output notes based on the value of this variable and execute this method immediately before each measure.
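The "busy" example above could be sketched roughly as follows; the handler and scheduling names (onOnset, Transport.onMeasure, schedule) are illustrative assumptions, not the confirmed API:

```
busy <- 0   // how "busy" the performer is this measure

// input handler: count note onsets detected on the audio stream
mic.onOnset(function(event) {
    busy += 1
})

// immediately before each measure, schedule output based on the counter
Transport.onMeasure(function(measure) {
    local count = busy < 8 ? busy : 8    // cap the response
    for (local i = 0; i < count; i++) {
        // hypothetical call: evenly spaced drum hits across the measure
        drums.schedule(Note(36, 90), measure, i, count)
    }
    busy = 0    // reset for the next measure
})
```

The point of the pattern is that the handlers only record state; the actual scheduling decision is deferred until just before the notes are needed.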
A constant tempo can be achieved by simply defining a transport master object somewhere in the script. This object takes a BPM value as an argument. The entire piece will play at the given tempo unless the script changes it, e.g. by scheduling a tempo change at a particular point in the timeline.
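In sketch form, assuming a Transport.Master constructor and a schedule method for tempo changes (both names are assumptions, not the confirmed API):

```
// hypothetical sketch: the whole piece plays at 120 BPM...
master <- Transport.Master(120)

// ...unless the script schedules a change, e.g. 140 BPM at measure 33
master.schedule(140, 33)
```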