- put together a one-shot Ubuntu install script that sets you up with
  scsynth, the necessary jar files, etc.
- create a series of livecode session movies going through how things work
- Audio samples: loading, playing, and manipulating
  - support a variety of file formats, not just wav
  - select line-in/out, mic...
  - show a scope alongside the chooser, and with a single click connect up
    so it's easy to see what you are hooking up with quickly
  - bring up the scope and an FFT window with some helpful controls for
    exploring sound spaces
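  One possible route to "more than just wav" on the JVM side is
  javax.sound.sampled; a minimal sketch (the sample-info helper is made up
  for illustration):

    (import '(javax.sound.sampled AudioSystem AudioFormat))

    (defn sample-info
      "Basic format info for any audio file the JVM can decode."
      [path]
      (let [stream (AudioSystem/getAudioInputStream (java.io.File. path))
            ^AudioFormat fmt (.getFormat stream)]
        {:sample-rate (.getSampleRate fmt)
         :channels    (.getChannels fmt)
         :bits        (.getSampleSizeInBits fmt)
         :frames      (.getFrameLength stream)}))

    ;; (sample-info "samples/kick.aiff") => {:sample-rate 44100.0, :channels 2, ...}
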
- either bring up a window or make it easy to do in code:
  - connect arpeggiators, chord progressors, midi-in, etc., to your live
    instrument definitions so you can mess with the parameters and the
    synthdef while hearing useful audio input
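  A minimal sketch of that idea, feeding a live instrument from a generator
  while you tweak it; my-inst and arpeggiate are made-up names, and the
  println is a stand-in for triggering a real synthdef:

    (defn my-inst [note] (println "trigger" note))   ; stand-in for a synth trigger

    (defn arpeggiate
      "Cycle through chord, calling inst-fn every ms milliseconds.
      Returns a future so the loop can be stopped with future-cancel."
      [inst-fn chord ms]
      (future
        (doseq [note (cycle chord)]
          (inst-fn note)
          (Thread/sleep ms))))

    ;; (def arp (arpeggiate #'my-inst [60 64 67 72] 250)) ; var ref so redefs are picked up
    ;; (future-cancel arp)
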
- develop more of a dsl for:
  - defining instruments
  - managing voices, fx, and busses
  - synthdefs should automatically register their information with a synthdef
    library area so users can easily browse available instruments and fx
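  A minimal sketch of the automatic-registration idea, assuming a
  definst*-style macro (the name, shape, and registry layout are illustrative,
  not the real DSL):

    (defonce instrument-library (atom {}))

    (defmacro definst*
      "Define an instrument fn and register its name and params in the library."
      [name params & body]
      `(do
         (defn ~name ~params ~@body)
         (swap! instrument-library assoc '~name {:params '~params})
         (var ~name)))

    (definst* beep [freq dur] (println "beep" freq dur))

    ;; @instrument-library => {beep {:params [freq dur]}}
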
- Implement some basic optimizations for BinaryOpUGens to do things like:
  - pre-compute constants
  - combine '*' and '+' ops into MulAdd
  - look at BasicOpsUGen.sc in the SuperCollider source
  - we can do this either at the form processing phase, connected to
    (replace-arithmetic ...), or in the ugen connection phase in (detail-ugens ...)
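  The shape of both rewrites, sketched on a toy expression tree (vectors of
  [op arg arg]); the real pass would walk ugen nodes in (replace-arithmetic ...)
  or (detail-ugens ...), and the node representation here is only illustrative:

    (defn fold-constants
      "Pre-compute binary ops whose arguments are both plain numbers."
      [[op a b :as expr]]
      (if (and (number? a) (number? b))
        (case op
          :* (* a b)
          :+ (+ a b)
          expr)
        expr))

    (defn combine-mul-add
      "Rewrite (+ (* a b) c) into a single MulAdd node."
      [expr]
      (if (and (vector? expr)
               (= :+ (first expr))
               (vector? (second expr))
               (= :* (first (second expr))))
        (let [[_ [_ a b] c] expr]
          [:mul-add a b c])
        expr))

    ;; (fold-constants [:* 2 3])               => 6
    ;; (combine-mul-add [:+ [:* sig 0.5] 0.1]) => [:mul-add sig 0.5 0.1]
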
- create a library of things like 12-bar-blues, etc., and make it easy to
  connect them up to instruments.
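  One way such a library entry could look, with the progression as plain data
  (the representation and helper names are assumptions):

    (def twelve-bar-blues
      [:I :I :I :I  :IV :IV :I :I  :V :IV :I :V])

    (def degree->semitones {:I 0 :IV 5 :V 7})

    (defn progression-roots
      "Turn a progression into midi root notes relative to a key, e.g. 60 = C."
      [prog key-root]
      (map #(+ key-root (degree->semitones %)) prog))

    ;; (progression-roots twelve-bar-blues 60)
    ;; => (60 60 60 60 65 65 60 60 67 65 60 67)
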
- add more knowledge about scale construction, modes, etc
- create a library of interesting scales
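  A sketch of scale/mode knowledge as interval data plus a helper to realise a
  scale from a root; the interval values are standard, the names are made up:

    (def scale-intervals
      {:major      [2 2 1 2 2 2 1]
       :minor      [2 1 2 2 1 2 2]
       :dorian     [2 1 2 2 2 1 2]
       :mixolydian [2 2 1 2 2 1 2]})

    (defn scale-notes
      "Midi notes of one octave of the named scale starting at root."
      [root scale-name]
      (reductions + root (scale-intervals scale-name)))

    ;; (scale-notes 60 :dorian) => (60 62 63 65 67 69 70 72)
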
- lazy sequences of notes (combine or separate freq, velocity, duration?)
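  A sketch of the "combine" option: notes as a lazy sequence of event maps
  carrying freq, velocity and duration together:

    (defn midi->hz [n] (* 440.0 (Math/pow 2.0 (/ (- n 69) 12.0))))

    (defn note-seq
      "Infinite lazy sequence of note events walking up from a starting midi note."
      [start-midi]
      (map (fn [n] {:freq (midi->hz n) :vel 0.7 :dur 250})
           (iterate inc start-midi)))

    ;; (take 2 (note-seq 69))
    ;; => ({:freq 440.0, :vel 0.7, :dur 250} {:freq 466.16..., :vel 0.7, :dur 250})
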
- markov models (basically gets us finite state machines too, right?)
  (markov funky-jazz
    a 3 -> b   ; b is 3 times more likely than c
    a -> c
    b -> d
    c -> d
    d -> a)
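  A sketch of what could sit behind that syntax: transitions as a map of
  state -> {next-state weight} (keywords here stand in for the bare symbols
  above), and a lazy walk that picks the next state with probability
  proportional to its weight. Function names are made up:

    (def funky-jazz
      {:a {:b 3, :c 1}   ; from a, b is 3 times more likely than c
       :b {:d 1}
       :c {:d 1}
       :d {:a 1}})

    (defn pick-weighted
      "Choose a key of weight-map with probability proportional to its weight."
      [weight-map]
      (let [total (reduce + (vals weight-map))
            r     (rand total)]
        (loop [[[k w] & more] (seq weight-map), acc 0]
          (if (< r (+ acc w)) k (recur more (+ acc w))))))

    (defn markov-walk
      "Lazy infinite sequence of states starting from state."
      [transitions state]
      (lazy-seq
        (cons state (markov-walk transitions (pick-weighted (transitions state))))))

    ;; (take 8 (markov-walk funky-jazz :a)) => e.g. (:a :b :d :a :c :d :a :b)
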
- clock synchronization:
  - so an overtone client can be in step with an externally generated clock
  - so multiple overtone clients can be in step
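  The underlying idea, sketched minimally: if clients agree on a shared start
  time and tempo, each can compute beat timestamps independently and stay in
  step (the function name is made up):

    (defn beat-time
      "Millisecond timestamp of beat n for a clock started at start-ms with bpm."
      [start-ms bpm n]
      (+ start-ms (* n (/ 60000.0 bpm))))

    ;; (beat-time 0 120 4) => 2000.0   ; beat 4 at 120 bpm is two seconds in
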
- Re-write the tune-up script in Clojure using the shell-out (sh ...) function
  from clojure.contrib
  - will need to figure out how to correctly manage the classpath...
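  A minimal sketch, assuming clojure.contrib.shell-out is on the classpath;
  "ls" is just a placeholder command, not what tune-up would actually run:

    (use 'clojure.contrib.shell-out)

    ;; sh runs the command and returns its stdout as a string
    (println (sh "ls" "-l"))
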
It would be very cool if voices, fx and generators could register various
named knobs, triggers and observable sequences that could be used to interact
with and visualize live-coded musical systems.
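  One possible sketch of the "named knobs" part using plain atoms and
  add-watch; the registry layout and function names are assumptions:

    (defonce knobs (atom {}))

    (defn register-knob!
      "Create a named knob holding an initial value and return its atom."
      [knob-name init]
      (let [knob (atom init)]
        (swap! knobs assoc knob-name knob)
        knob))

    (defn observe-knob!
      "Call f with the new value whenever the named knob changes."
      [knob-name f]
      (add-watch (@knobs knob-name) f (fn [_ _ _ new-val] (f new-val))))

    (def cutoff (register-knob! :cutoff 440))
    (observe-knob! :cutoff #(println "cutoff is now" %))
    (reset! cutoff 880)   ; observer prints "cutoff is now 880"
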
Use: Plasma, graph-based networking
Use: Google App Engine / Wave cloud server
Use: jVSTWrapper (http://jvstwrapper.sourceforge.net/)