Creating Interactive Musical Bots with Bipscript

Bipscript is a simple scripting language well suited to creating interactive musical "bots" that play along with human performers. By writing one or more scripts that listen and respond to those performers, one can create a range of musical players, from a simple accompanist up to an entire bot orchestra.

Interactive scripts operate by "listening" to one or more input streams that represent the human performer(s) and using the information from those streams to make decisions about scheduling output events (e.g. MIDI notes) in the transport timeline. Scripts run in a single execution thread with event handlers, much as JavaScript runs in a web browser; handlers can respond to events such as incoming MIDI messages, OSC messages, and note onsets detected on audio input streams.

A typical script will have a main body that runs immediately in order to set up the various connections (audio, MIDI, OSC, etc.) and to define the event handlers, which, in turn, run after the main script has completed.
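In outline, such a script might look like the following sketch. The class and handler names here are assumptions for illustration, not necessarily the exact Bipscript API; see the API documentation for the real calls.

    // main body: runs immediately, sets up connections and defines handlers
    local keyboard = Midi.Input("keyboard")        // input stream from a performer
    local synth = Lv2.Plugin("urn:example:synth")  // plugin instrument for output

    // event handler: runs after the main body has completed, once per incoming event
    keyboard.onNoteOn(function(note) {
        // decide here what, if anything, to schedule in response
    })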

Most scripts will use this basic structure, but fully defining how the bots behave requires answering the following questions:

What instruments or sounds will the bots play?

Unlike many other programming languages aimed at music creation, bipscript does not provide a native API for sound generation and instead uses audio plugins for that purpose. This allows a range of sounds limited only by the selection of plugins available on the system running the script. Software synthesizers, samplers and effects plugins can all be used to create unique signal paths, in much the same way as in a traditional sequencer/DAW environment. Bipscript currently supports the LV2 plugin standard, with support for AU and VST plugins coming soon.

Please see this tutorial lesson for an example of instantiating a plugin and scheduling notes to play on it.
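For orientation, the idea reduces to something like the sketch below; the plugin URI and the method and argument names are placeholders rather than the exact API shown in the lesson.

    // instantiate an LV2 synth plugin by its URI (placeholder URI)
    local synth = Lv2.Plugin("http://example.org/plugins/simplesynth")

    // schedule a middle C (pitch 60, velocity 96, quarter-note long) at measure 1
    synth.schedule(Midi.Note(60, 96, 0.25), 1)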

Alternatively, the script may make no sound itself and instead use other kinds of output streams (e.g. MIDI or OSC) to drive external software or hardware.
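For instance, a script could send its notes to an external hardware synth over a MIDI output stream; as above, the names here are illustrative assumptions rather than the exact API.

    // an output stream to external MIDI hardware or software (assumed names)
    local external = Midi.Output("external synth")

    // schedule the same kind of note events on it instead of on a plugin
    external.schedule(Midi.Note(60, 96, 0.25), 1)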

What instruments will the bot listen to?

Listening to external events is achieved by creating one or more input streams and handling the events that arrive on each stream. A script can define any number of MIDI, OSC, and audio input streams.
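For example, a script might create one input of each kind and attach a handler to each; the constructor and handler names below are assumptions meant to show the pattern, not the exact API.

    local pads = Midi.Input("drum pads")      // MIDI from a controller or keyboard
    local tablet = Osc.Input(7000)            // OSC messages arriving on a UDP port
    local mic = Audio.Input("vocal mic")      // a live audio signal

    pads.onNoteOn(function(note) { /* react to a played note */ })
    tablet.onMessage("/fader1", function(msg) { /* react to an OSC message */ })
    mic.onOnset(function(onset) { /* react to a detected onset in the audio */ })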

Of course, the listening is not limited to "instruments" in the traditional sense: events can come from MIDI control surfaces, for example, or from anything that sends OSC messages. Just about any external trigger can be used by translating its events to MIDI or OSC and listening on the appropriate input.

How interactive are the parts?

In other words, how much do the scheduled output events depend on the input events?

For example, notes scheduled in the main part of the script will happen without regard to real-time events. This is because the main part of the script executes before anything else, typically before the piece has even begun. This makes sense for some parts, e.g. a particular melody or beat pattern that should play at a particular measure no matter what else happens.

The more interesting case is when output events depend on input events. In this case the output events are scheduled from within event handlers, usually at a particular point in the timeline. The basic strategy is to keep some state that the input handlers update as events arrive, and then, shortly before the output is due, schedule output events based on that state.

An example: imagine a script that creates a variable to track how "busy" a human performer's playing is. This variable has an initial value of zero and is incremented in the event handler for incoming events, e.g. note onsets from an audio stream. Now create a method that schedules output notes based on the value of this variable, and execute that method immediately before each measure.
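Sketched in code (again with assumed class and handler names; the linked examples below show the real API), the pattern looks like this:

    local mic = Audio.Input("vocal mic")   // the performer's input stream
    busy <- 0                              // how actively the performer is playing

    // input handler: count every detected note onset
    mic.onOnset(function(onset) {
        busy++
    })

    // run immediately before each measure: schedule output based on the count
    function scheduleMeasure(measure) {
        if (busy > 8) {
            // performer is busy: schedule a sparse response, stay out of the way
        } else {
            // performer is quiet: schedule a denser fill
        }
        busy = 0                           // reset the count for the next measure
    }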

For a concrete example of the above, please see the Robot Jazz Band example. See also this example from the tutorial.

Is the tempo constant or does it vary?

A constant tempo can be achieved by simply defining a transport master object somewhere in the script. This object takes a BPM value as an argument. The entire piece will play at the given tempo unless the script changes it, e.g. by scheduling a change at a particular point in the timeline.
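In practice this amounts to a line or two, along these lines (the object name and the tempo-change call are assumptions):

    // play the whole piece at 120 BPM
    local master = Transport.Master(120)

    // optionally schedule a tempo change, e.g. jump to 140 BPM at measure 33
    master.schedule(140, 33)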

Bipscript also contains native objects for beat tracking an input MIDI or audio stream. When using these objects, the tempo is continually updated to stay in sync with the connected input stream.
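Switching to a tracked tempo is then a matter of replacing the fixed master with a beat tracker attached to an input stream, roughly as follows (the class name and constructor arguments are assumptions; see the API documentation):

    // follow the performer's tempo on a MIDI input, starting from a 120 BPM estimate
    local drums = Midi.Input("drum pads")
    local tracker = Midi.BeatTracker(120, drums)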

Further Reading

Please see the examples, tutorial and API documentation.