Tuesday, 19 September 2017

ScriptSonic for generative ambient

ScriptSonic is an interesting iOS application which did not get the attention it deserved, perhaps because it launched at a very high price ($24.99); since April 2017, however, the price has been much more reasonable ($9.99).

Basically, it allows you to use a flavour of JavaScript to generate MIDI events which can be received by other MIDI-enabled applications on the iPad. It can also play samples and perform some simple processing on them, but in my opinion it is the MIDI side that makes it really interesting.

The user interface is unfamiliar and a bit difficult to use. The first thing you see is the "Projects" screen, from which you can load one of the 17 examples that come with it (and, later, your own creations), each one shown as a thumbnail:


Selecting one and tapping "Import", or double-tapping a thumbnail, takes you to the "Loop" screen, which is where the musical performance happens.


It looks like a tracker, and in a way it is one. You can mute/unmute each track separately, play/stop, etc. But each cell in the tracker contains JavaScript code which is triggered when that cell is played. The code can generate MIDI events (note-on, note-off and CC) and read some special variables which represent the percentage of time elapsed within the cell, or within the track.

This view can alternate between "Play" and "Edit" modes. In "Play" mode (image above), touching the cells generates events which can be read from the JavaScript and used, for example, to change parameters. In "Edit" mode the JavaScript code in each cell is partially visible, and touching a cell opens an editor which allows you to change it (even in real time, for live coding).




In addition, you have four extra "code buffers", unrelated to the main tracker, in which you can write any JavaScript and run it at any time. This code can create new cells and tracks in the main interface, which enables meta-programming: writing JavaScript whose mission is to create the cells and to write into each one the JavaScript to be executed when it is played.
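The two-layer idea can be sketched in a few lines. I use Ruby here for brevity (in ScriptSonic both layers are JavaScript), and the names `sendNoteOn` / `sendNoteOff` and the "write string into cell" step are hypothetical, not the app's real API:

```ruby
# A generator builds, for each cell, the source code that the cell
# will run when the tracker plays it (meta-programming: code that
# writes code). The function names in the generated strings are
# hypothetical stand-ins for ScriptSonic's real MIDI calls.
def cell_source(note, velocity)
  "sendNoteOn(#{note}, #{velocity}); sendNoteOff(#{note});"
end

scale = [0, 3, 5, 7, 10]                        # minor pentatonic offsets
cells = Array.new(8) do
  cell_source(60 + scale.sample, rand(64..100)) # random note and velocity
end

# In the app, each generated string would be written into a tracker cell;
# here we just print the generated code:
cells.each_with_index { |src, i| puts "cell #{i}: #{src}" }
```

The point is that the generator runs once, but the strings it produces run every time their cell comes around in the loop.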



If this sounds confusing, it is because it is! The concepts in this app are new, and weird. The interface is not very polished. Even though everything revolves around writing code, the integrated editor is very primitive, and in addition it provides its own custom keyboard (visible in the figures above) which is rather uncomfortable. You can switch to the standard iOS keyboard, but on the iPad Pro it uses the "legacy keyboard" instead of the keyboard with the extra numbers row.

After the initial learning curve, the possibilities begin to appear. Basically, you can write code to control any of the synthesizers on your iPad.

Examples

I wrote some code (which I can post if anyone is interested) to generate a set of random cells in the "Loop" section, each cell having a different (random) length and containing (meta-generated) JavaScript which either plays a note or a rest. Each track is played at a different speed, so even though each track loops and repeats from the beginning, the whole song never repeats, because of the different speeds. This is the same idea used in "Bay" (see previous posts).
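The "never repeats" claim can be made concrete with a couple of lines: the combined texture only realigns after the least common multiple of the individual loop lengths, which grows very fast. A quick sketch (plain Ruby, not ScriptSonic code; the lengths are made up):

```ruby
# Each track loops independently with its own length; the full texture
# only repeats once all the loops realign, i.e. after the least common
# multiple of all the track lengths.
track_lengths = [7, 9, 11, 13, 16]    # hypothetical lengths, in beats

combined_period = track_lengths.reduce(1) { |acc, len| acc.lcm(len) }
puts combined_period                  # 144144 beats before realignment
```

With just five tracks of modest, mutually "incompatible" lengths, the pattern already takes days to realign at ambient tempos, so in practice the piece never repeats.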


Note that the code only generates note-on and note-off MIDI events (you can select the channel for each of them); ScriptSonic itself does not generate sound. Depending on the synths running on the device, the sound can be very different. Using this same code, I created four different songs:

The first one uses TheraSynth (pads) and ThumbJam (flutes) as synths:



The second one uses the same synths, but different presets (piano in ThumbJam) and a longer reverb.


The third one also uses the Nave synth to play the notes generated by ScriptSonic, as well as ThumbJam percussion driven by Xynthesizr.


Finally, I used the same code to send notes to the Mersenne synth, but using its arpeggio function, which completely changes the feeling: instead of long chords, it produces short repetitions of the same notes in a rhythmic, semi-chaotic pattern (the randomness appears because sometimes there are four notes to arpeggiate, but other times there are only two, or even a single one!). This is the most interesting one, in my opinion.
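The varying density is easy to illustrate: each time a cell fires, a random number of chord tones is held, and the arpeggiator cycles over whatever is held. A Ruby sketch (the `arpeggio_step` function is a stand-in for what the synth does internally, not the Mersenne API; the chord is an example of mine):

```ruby
# Each time a cell fires, 1 to 4 chord tones are held; the arpeggiator
# cycles over the held notes, so the pattern's density keeps changing:
# 4 held notes give a rolling chord, a single one gives a pulse.
chord = [60, 64, 67, 71]              # Cmaj7 as MIDI note numbers

def arpeggio_step(held_notes, step)
  held_notes[step % held_notes.length]  # cycle through the held notes
end

held = chord.sample(rand(1..4))       # sometimes 4 notes, sometimes just 1
8.times { |s| print arpeggio_step(held.sort, s), " " }
puts
```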

Friday, 25 November 2016

Hello SonicPi!

I recently discovered SonicPi, and it is awesome!
It is a programming framework in which you write Ruby code to control the SuperCollider engine and produce music. It is aimed at live performance (live coding), but it can also be used to experiment with algorithmically generated music.
My first idea was that it should not be very difficult to “port” my piece “Bay” from ngen to SonicPi, and indeed it was very straightforward.
The same code can produce the two variations “Bay” and “Bay at night” simply by uncommenting one line. If you don’t have SonicPi installed but want to know what this sounds like, here is a little excerpt.


Here is the code:

# Bay (2005) re-implemented in SonicPi (2016)

# The original Bay was created with ngen and rendered with Csound.
# The original ngen implementation is here
# https://dl.dropboxusercontent.com/u/2374752/csound/bay.gen
# and the csound rendition can be listened at
# https://archive.org/details/jld_bay

# The following algorithm description is extracted
# from the ngen version comments
#
# ; The algorithm is very simple, but leads to interesting results. The
# ; composer selects a note by name (eg. 'a2'), and other parameters
# ; governing the random process. The algorithm produces several
# ; activations of this note, separated by an interval of random
# ; length. The length of the note and its midi velocity are also
# ; random. The composer can specify the limits for these random
# ; quantities.
#
# ; The composer can call this algorithm several times, with different
# ; parameters each time.  Each invocation produces a random stream of
# ; repeating notes, all starting at zero. The random length of the
# ; intervals leads to unexpected melodies, chords and dissonances.

# This function generates a stream of repeated notes
define :generate do |start, reps, n, imin, imax, len, vmin, vmax|
  # Parameters:
  # start: initial number of notes to skip (will produce silence)
  # reps: number of notes the stream produces before dying
  # n:    Note to repeat. If it is an array, one random
  #       note from the array is selected in each iteration.
  #       Do not abuse this feature
  # imin: Minimal interval between note repetitions
  # imax: Max interval between note repetitions
  # len:  Length of the note to produce in each repetition
  # vmin,vmax: range of "velocity" for the notes. In the original
  #       implementation, the values were MIDI velocities. In
  #       this implementation they are used to set amplitude
  #       and cutoff frequency

  reps.times do |r|
    if r<start                  # Wait the specified initial delay
      sleep rrand(imin, imax)
      next
    end
    # Choose a different synth in each iteration
    use_synth (ring :hollow, :prophet, :blade).tick
    # Adjust the volume, depending on the synth selected above
    # (:hollow is much quieter than :prophet, so I pump up the volume)
    vol_multiplier = (ring 1.2, 0.6, 0.7).look

    # Select the note to play in this iteration
    if n.is_a?(Array)
      chosen_note = n.choose
    else
      chosen_note = n
    end



    # Uncomment next line to get "Bay at night" :-)
    # chosen_note += -1 + rand_i(2)

    # Play the note. Several parameters are set to get a random
    # volume and length, between the limits specified
    play chosen_note, amp: vol_multiplier*rrand(vmin, vmax)/120,
      attack: len/2, decay: rrand(0.2,0.8)*len/2, 
      pan: choose((stretch rrand(-1, -0.6), 45, rrand(-0.2, 0.2), 10, rrand(0.6,1), 45)),
      cutoff: rrand(vmin, vmax),
      vibrato_delay: len/2, vibrato_onset:1,
      vibrato_rate: 2, vibrato_depth: 0.09

    # Wait a random time before triggering the note again
    sleep rrand(imin, imax)
  end
end

# The above function is the basic infrastructure. Next, I'll call several instances
# of the function, each one in a separate thread. For this, I'll define some
# arrays with the parameters to be passed to the function in each invocation

use_bpm 40
use_random_seed 1982
piece_length = 40

args = [
  [0, piece_length-3, :f2,9,12,9,60,90],
  [0, piece_length-3,:a2,9,12,9,60,90],
  [0, piece_length-3,:c3,9,12,9,60,90],
  [1, piece_length-2,:e3,9,12,9,60,90],
  [2, piece_length-3,:f3,9,12,9,60,90],
  [1, piece_length,:e4,9,12,9,60,90],
  [3, piece_length-2,:a4,8,12,9,60,70],
  [3, piece_length-2,:b4,8,10,9,50,70],
  [3, piece_length-2,[:c4,:cs4,:d4],10,20,3,60,90],
  [5, piece_length-4,:c5,8,10,9,50,80],
  [5, piece_length-5,:d5,8,10,9,50,90],
  [6, piece_length-5,:e5,8,10,9,50,70],
  [8, piece_length-5, [:b6,:c7,:d7],4,7,5,70,98]
]

# Now, I create another array containing the time at which each instance of the
# function has to be called. I fill the array with random values below 5, so that
# each call starts at a random instant, but all of them in the first 5 secs
instants = []
args.each do
  instants.push(rand(5))
end

# Using "at", I launch several instances of the function, each one at a
# random instant (specified in "instants" array) and with different
# parameters (specified in "args" array). All those functions share
# the same reverb and flanger effect
with_fx :reverb, spread: 0.9, mix: 0.8, room: 0.99 do
  with_fx :flanger do
    at instants, args do |params|
      generate *params
    end
  end
end

Prologue.

I have been interested for some time in generating ambient music using computers, mainly the iPad and the PC, trying different tools for algorithmic composition or improvising with software synthesizers.

One of the first things I published was a piece titled "Bay", algorithmically generated using ngen and rendered on a PC using Csound. The result was uploaded ten years ago to the Internet Archive, and it is still there.

When the iPad arrived, a plethora of new tools and synths were at my disposal. I published some of my experiments on SoundCloud and YouTube, but I realized that all these works were scattered, and that a blog would be a good way to tie them together, while also providing a more personal space to write about the tools and process behind each piece. So, here it is!

I wrote most of this for myself, but if anyone else is reading it, you are welcome. Drop a comment if you like it.