
Incorporate control signals #27

Open
ssfrr opened this issue Sep 7, 2014 · 3 comments

Comments

ssfrr (Owner) commented Sep 7, 2014

(from an email conversation with Shashi Gowda)
At some point we need to allow users to vary AudioNode parameters on the fly. Reactive.jl might be a good conceptual framework and library to handle this. It also might be worth checking out The Haskell School of Music. Right now the basic API looks like this:

s = SinOsc(440)
play(s)
wait(1)
stop(s)

And you'll hear one second of a 440 Hz sine tone. Rather than giving a constant value for the frequency, you can pass another AudioNode instance, e.g.

f = LinRamp(220, 440, 1)
s = SinOsc(f)
play(s)

And you'll hear a sine tone sweeping from 220 Hz to 440 Hz.

With a little knowledge of AudioIO internals you can change the frequency on the fly:

julia> s = SinOsc(220)
julia> play(s)
# oscillator plays at 220hz
julia> s.renderer.freq = 330
# oscillator plays at 330hz

So currently the oscillators support providing a Real value or an AudioNode. It would fit well into the design to accept a Signal as well, so that as the value changes it automatically propagates where it needs to go.
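For illustration, here is a minimal sketch of what accepting a Signal could look like (the names are placeholders, not the actual AudioIO internals): the renderer's frequency field holds either a plain number or a Reactive.jl Signal, and each render block reads the signal's current value, so slider changes propagate automatically.

using Reactive

type SketchSinRenderer{T}
    freq::T            # Float32, or a Signal whose current value is the frequency
    phase::Float64
end

current_freq(f::Real) = f
current_freq(s::Signal) = value(s)   # Reactive.jl's current-value accessor

function render!(r::SketchSinRenderer, buf::Vector{Float32}, samplerate)
    f = current_freq(r.freq)          # re-read the control once per block
    for i in 1:length(buf)
        buf[i] = sin(2pi * r.phase)   # assignment converts to Float32
        r.phase += f / samplerate
    end
    buf
end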

shashi commented Sep 7, 2014

Thanks for opening the issue! :)
Right now in IJulia,

using AudioIO, Interact #, Gadfly
s1 = SinOsc(220)
s2 = SinOsc(220)

@manipulate for f1=100:880, f2 = 110:880
    s1.renderer.freq = f1
    s2.renderer.freq = f2
    #plot(t->sin(f1*2pi*t) + sin(f2*2pi*t), 0, 2pi)
end
play(s1)
play(s2)

gives you sliders that control the frequencies of the two sine waves being played.

That macro call translates to:

lift(f1->s1.renderer.freq = f1, signal(slider(110:880)))
lift(f2->s2.renderer.freq = f2, signal(slider(110:880)))

where signal(slider(110:880)) is a signal of slider values.

This sort of API, requiring side effects on s1 and s2, doesn't feel ideal though. But it's still cool!

What makes the Interact API nice for plotting is IJulia's display mechanism. Interact is able to override display(::Signal{T}) to display an updating view of a value of T, in the richest MIME type T supports.

If play(::Signal{AudioNode{T}}) could be handled similarly (by playing via the renderer in the signal's current AudioNode value at any given time), then we'd gain a lot more power.

In such a setup,

f1 = slider(1:800)
f2 = slider(1:800)

sound = @lift AudioMixer(SinOsc(f1), SinOsc(f2))
play(sound)

would produce the same result.

This would translate to a simpler @manipulate macro call:

sound = @manipulate for f1=1:800, f2=1:800
    AudioMixer(SinOsc(f1), SinOsc(f2))
end

play(sound) # or this could be made implicit

Moreover, it is pretty damn simple to add sources of signals. E.g. it is trivial to turn an accelerometer reading or a mouse-position event stream into a signal, which opens up possibilities of creating sounds from a lot of sources. I don't know much about interfacing musical instruments via MIDI, but once we can receive the events, it should be easy to turn them into a Reactive signal.
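As a rough sketch of that idea (the MIDI read function below is hypothetical; only the Reactive.jl plumbing is the point), an external event stream becomes a signal by pushing each incoming value into an Input:

using Reactive

midi_note = Input(69)                    # Signal{Int} holding the latest note number

@async while true
    n = read_midi_note()                 # hypothetical blocking read from a MIDI source
    push!(midi_note, n)                  # updates the signal; all lifted signals re-run
end

# convert MIDI note number to frequency in Hz
freq = lift(n -> 440 * 2.0^((n - 69) / 12), midi_note)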

An ambitious, and currently infeasible, reactive-programming approach would involve piping actual Float32 values from/into the source/sink at 44100 fps. It would look something like:

t = timestamp(fps(44100))
y1 = @lift sin(2pi * f1 * t)
y2 = @lift sin(2pi * f2 * t)
y = @lift y1 + y2

lift(sink, y)

Someday.

ssfrr (Owner, Author) commented Sep 8, 2014

I'm trying to figure out whether lifting is appropriate here. As I understand it, lift takes a transformation function and a signal, and returns a new signal which is a transformed version of the original. That doesn't seem to be what I want here, as I'm not really thinking of an AudioNode itself as a Signal; it would just take Signals as control parameters.

The SinOscRenderer is a parametric type parameterized on the frequency control, so you can have SinOscRenderer{Float32} for a constant oscillator, or SinOscRenderer{AudioNode} to control the frequency with another audio-rate AudioNode. Right now I'm considering adding SinOscRenderer{Signal}, so once you have a Signal created you can just instantiate the oscillator:

f1 = slider(1:800)
f2 = slider(1:800)

sound = AudioMixer(SinOsc(signal(f1)), SinOsc(signal(f2)))
play(sound)

So rather than using lift, SinOsc(signal(f1)) returns a value of type AudioNode{SinOscRenderer{Signal{Float32}}} (which for convenience is type-aliased to SinOsc{Signal{Float32}}).

It seems that with your example proposal, it's instantiating a new AudioNode every time the signal changes, but maybe I'm misunderstanding an important concept here.

The other thing I would do is set up writemime so that any AudioNode with signal inputs displays widgets for all of them using the Interact.jl machinery.
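A speculative sketch of that writemime idea, using the Julia 0.3-era display API; signal_widgets is a hypothetical accessor that would collect the Interact widgets backing a node's Signal inputs, not something AudioIO provides:

using AudioIO
import Base: writemime

function writemime(io::IO, m::MIME"text/html", node::AudioNode)
    write(io, "<div>")
    for w in signal_widgets(node)    # hypothetical: widgets feeding this node's Signals
        writemime(io, m, w)          # assumes each widget can render itself as HTML
    end
    write(io, "</div>")
end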

shashi commented Sep 8, 2014

That's right, I was suggesting that the manipulate expressions instantiate AudioNodes when the control signals change. :) I imagine AudioNodes as recipes for generating the sound for a short amount of time; we need a different recipe to generate another sound. I imagined the recipes themselves are inexpensive to create: e.g. SinOsc(440) would be just that, an immutable SinOsc type holding the frequency 440. However, I realize this is not how it is implemented. But it's a direction I think would be profitable to pursue. There may be clever ways to get this to work.
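As a sketch of that "cheap recipe" view (again, not how AudioIO is actually implemented), a node could be an immutable value holding its parameters, with the sample at any time a pure function of the recipe:

immutable CheapSinOsc
    freq::Float64
end

immutable CheapMix
    nodes::Vector{CheapSinOsc}
end
CheapMix(nodes::CheapSinOsc...) = CheapMix([nodes...])

# generating a sample is a pure function of the recipe and time
sampleat(o::CheapSinOsc, t) = sin(2pi * o.freq * t)
sampleat(m::CheapMix, t)    = sum([sampleat(n, t) for n in m.nodes])

# e.g. one second at 44100 Hz from a freshly built recipe
buf = Float32[sampleat(CheapMix(CheapSinOsc(220), CheapSinOsc(330)), t/44100) for t in 0:44099]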

In my mind, Reactive is best kept orthogonal to the functions it is intended to be used with. This has some benefits:

  • Users do not need to learn about Signal-specific extensions to the AudioIO API. They only need to know that play(::Signal{AudioNode}) is possible, and can bring their knowledge of the two APIs (Reactive, AudioIO) together and start doing nice things.
  • Composability: arbitrary pure functions can be used, and it becomes easy to build things with Reactive primitives other than just lift, for example:
# Change the AudioNode being played.
switch = togglebuttons(["sine" => SinOsc(440), "noise" => WhiteNoise()])
play(signal(switch))

# Or alternate every second
ticks = every(1)
flip(a, b) = (a[2], a[1])
snd = lift(x -> x[1], foldl(flip, (SinOsc(440), WhiteNoise()), ticks))
play(signal(snd))

I can imagine it would be powerful if signal combinators like merge, sampleon, keepif, keepwhen, dropwhen, and droprepeats were allowed to operate on signals of AudioNodes.
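For example, here is a hedged sketch of gating a signal of AudioNodes with keepwhen, assuming the play(::Signal) discussed above and 2014-era Interact/Reactive names (which may have changed since):

using AudioIO, Reactive, Interact

gate = togglebutton()                      # Bool-valued widget
freq = slider(110:880)

osc   = lift(f -> SinOsc(f), signal(freq))         # a Signal of AudioNodes
gated = keepwhen(signal(gate), SinOsc(440), osc)   # only updates while gate is on

play(gated)   # relies on the hypothetical play(::Signal) from this discussion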

I tried to implement play this way but didn't have much success: the stop condition remains false once set to false. I will take a look at this tomorrow.

switch(a, b) = begin play(b); stop(a); b end
play(s::Signal) = foldl(switch, NullNode(), s) # NullNode plays nothing

shashi added a commit to shashi/AudioIO.jl that referenced this issue Sep 10, 2014
Nodes now hold exactly the amount of information needed to represent
their domain of signals. Removed the "Renderer" abstraction in favor
of `pull`ing samples from a node. A mostly inlinable `sampleat` function
makes `pull` generally fast, and pulling for PortAudio doesn't allocate a
whole lot of memory. Time is consistently better than previous
iterations in most cases; the other cases have a lot of room for
improvement.

One can now create signals of AudioNodes (signals as in Reactive.jl) very
cheaply (a SinOsc takes 216 bytes), and lift such a signal to play a
sound that changes in real time. Using Interact signals and the
@manipulate macro, one can do things like:

    mix = @manipulate for f1=110:880, f2=110:880
        SinOsc(f1) + SinOsc(f2)
    end
    lift(play!, mix)

Removed play and stop in favour of play! and stop!. play! replaces what is
being played; to stop a node, simply remove it from the tree and play
the tree again. cc ssfrr#27