-
I don't understand a lot of what you said - but Pure Data looks like dog shit. I haven't had a really good look at vmix, but my impression is that we're best off not shipping it, and instead setting up a single mix with all the parameters people could want. We should be making the audio stuff simple to use, rather than adding a ton of complexity to it. I kind of feel like the audio customization of the engine is insane and probably out of hand - when compared to other things - like how you have to make a text file to create a texture.
-
I don't know how much game designers want to be messing with all that stuff though? You just want to play sounds, have reverb and echo in big rooms, have it sound all muffled when you're in water. I don't know how vmix fits into sbox. It's not modular, it's defining the mix for the game. Do we let people create their own vmixes and override the global mix?
-
I realize that this is pretty forward-thinking and should be tackled much later on once VMix is actually implemented and Facepunch starts working on an SDK/API for the other tools that s&box has, but I've had this stuck in my head for a little while, and I want to see what other people's opinions are.
When VMix gets implemented fully, I think it'd be best for people to be able to make their own effects (and potentially generators like synths and samplers and such) if there's something VMix doesn't have, and distribute them for others to use, instead of making a feature request in here for, say, convolution reverb, or a sampler, or something like that. DAWs have been taking this approach for decades for effects and generators with the VST plugin standard.
However, VST plugins probably aren't the best approach for VMix for a variety of reasons. In particular, they aren't made to be redistributed for real-time in-game use at all; they're more meant to be used by a musician/audio engineer in a DAW and rendered to an audio file once everything's done. Not to mention the complications that would come with paid plugins and redistributing those...
Something that I think would be a better fit for this is Pure Data. It's an open-source, lightweight visual programming language for making pretty much anything audio-related (or at least, that's the idea I got from everything I've gathered).
Here are some examples of what you can do with it:
https://youtu.be/DJCoOr4uHD4 - Fuzz, Reverb, Tremolo, Overdrive, Synth/Pitch Tracker, and looper effects for guitar (or any audio input really) with controllable parameters
https://www.youtube.com/watch?v=WQ7ifSykvXU - A synthesizer controllable via MIDI
Ideally, the way this could work is that people could import Pure Data patches as nodes, and have that node's inputs and outputs set in the patch, probably in a similar way to how the guy in the first example controlled parameters via MIDI.
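For what it's worth, the usual way to embed Pure Data in another program is libpd, which wraps Pd's DSP engine in a plain C library. Below is a minimal sketch of what the host side could look like, assuming libpd is the embedding route; the patch file (`fuzz.pd`) and the receiver names (`fuzz_amount`, `wet_dry`) are hypothetical and would just be whatever `[receive]` objects the patch author exposes as parameters. This isn't how VMix would necessarily do it, just an illustration of the idea.

```c
// Minimal libpd hosting sketch (assumptions: libpd is available; the patch
// file "fuzz.pd" and receivers "fuzz_amount"/"wet_dry" are hypothetical).
#include <stdio.h>
#include "z_libpd.h"

#define BLOCK_SIZE  64      // Pd processes audio in 64-sample ticks
#define SAMPLE_RATE 44100

int main(void) {
    // Start the Pd engine: no audio input, stereo output.
    libpd_init();
    libpd_init_audio(0, 2, SAMPLE_RATE);

    // Turn DSP on (the same as sending "; pd dsp 1" from a patch).
    libpd_start_message(1);
    libpd_add_float(1.0f);
    libpd_finish_message("pd", "dsp");

    // Load the effect patch; a game would ship this as an asset.
    void *patch = libpd_openfile("fuzz.pd", "./patches");
    if (!patch) {
        fprintf(stderr, "could not open patch\n");
        return 1;
    }

    // Set parameters by sending floats to named receivers inside the
    // patch -- the same idea as the MIDI-controlled parameters in the
    // first video, just driven from code instead of a controller.
    libpd_float("fuzz_amount", 0.8f);
    libpd_float("wet_dry", 0.5f);

    // Render a few blocks of audio. A real host would do this from its
    // audio callback and hand outBuf to the rest of the mixer.
    float inBuf[BLOCK_SIZE] = {0};      // unused (no input channels)
    float outBuf[BLOCK_SIZE * 2];       // interleaved stereo
    for (int i = 0; i < 100; i++) {
        libpd_process_float(1, inBuf, outBuf);
    }

    libpd_closefile(patch);
    return 0;
}
```

The nice part is that the host only ever deals with named parameters and audio buffers, so exposing a patch as a mixer node would mostly come down to mapping those receivers to the node's inputs.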