Currently unstructured notes: an idea for composing audio, video and non-AV content together onto virtual timelines. This mirrors the time-based Canvas in IIIF and could be used by viewers to visualise complex scenes.
Primary focus points:

- Syncing time (see the shared-clock sketch after this list)
- Reliable and easy events
- Seamless "virtual" tracks
Ideas for virtual tracks:

- Adding/removing static images from the DOM (sketched below)
- Changing the position of a viewport/annotation
- Non-time based, using scroll position
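For the first of these, a hedged sketch: a "virtual" image track that only touches the DOM when the playhead enters or leaves its time range. The `StaticImageTrack` name and the `update(time)` contract are assumptions:

```ts
class StaticImageTrack {
  private el: HTMLImageElement | null = null;

  constructor(
    private container: HTMLElement,
    private src: string,
    private start: number,
    private end: number
  ) {}

  // Called on every clock tick with the current timeline time (seconds).
  update(time: number) {
    const active = time >= this.start && time < this.end;
    if (active && !this.el) {
      this.el = document.createElement('img');
      this.el.src = this.src;
      this.container.appendChild(this.el);
    } else if (!active && this.el) {
      this.el.remove();
      this.el = null;
    }
  }
}
```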
React ideas
A time-based component
```jsx
function MyTimeBasedThing() {
  const { time, percent } = useTimeline({ freq: 1 / 60, duration: 5 });

  if (percent < 0.5) {
    return <div>A</div>;
  }

  return <div>B</div>;
}
```
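Nothing commits to an implementation yet, but one way `useTimeline` could be backed is a context-provided player plus a throttled subscription. Everything below is an assumption: `PlayerLike`, `PlayerContext`, `onTimeUpdate`, and treating `freq` as a minimum re-render interval in seconds:

```tsx
import { createContext, useContext, useEffect, useState } from 'react';

// Assumed shape of the player that MediaKit.Provider would expose via context.
interface PlayerLike {
  getCurrentTime(): number;
  onTimeUpdate(handler: (time: number) => void): () => void; // returns unsubscribe
}

const PlayerContext = createContext<PlayerLike | null>(null);

function useTimeline({ freq = 1 / 60, duration }: { freq?: number; duration: number }) {
  const player = useContext(PlayerContext);
  const [time, setTime] = useState(0);

  useEffect(() => {
    if (!player) return;
    let last = -Infinity;
    // Throttle re-renders: update React state at most once per `freq` seconds.
    return player.onTimeUpdate((t) => {
      if (t - last >= freq) {
        last = t;
        setTime(t);
      }
    });
  }, [player, freq]);

  return { time, percent: Math.min(1, time / duration) };
}
```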
Or with keyframes
```jsx
function MyKeyFrameThing() {
  const { time, currentStop } = useTimeline({ stops: [5, 10, 15], duration: 60 });

  // ... only re-renders when transitioning past 5, 10 and 15
}
```
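A possible reading of `currentStop`, assuming stops are sorted ascending. `findCurrentStop` is an illustrative helper, not part of any API; the point is that re-renders only need to fire when this value changes:

```ts
// The last stop the playhead has passed, or null before the first one.
function findCurrentStop(stops: number[], time: number): number | null {
  let current: number | null = null;
  for (const stop of stops) {
    if (time >= stop) current = stop;
  }
  return current;
}

// With stops [5, 10, 15]: time 3 -> null, time 7 -> 5, time 12 -> 10, time 40 -> 15.
```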
And composed when rendering:
```jsx
<MediaKit.Provider player={player}>
  {/* Custom source allows some time-warping, shifting on the timeline */}
  <MediaKit.CustomSource start={0} end={2.5} speed={2}>
    <MyTimeBasedThing />
  </MediaKit.CustomSource>
</MediaKit.Provider>
```
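The time-warping in `CustomSource` could boil down to a pure mapping from player time to local time. `toLocalTime` below is an assumed helper just to show the arithmetic:

```ts
// Map the outer player's time onto the wrapped component's local timeline,
// using the source's start/end window and playback speed.
function toLocalTime(
  playerTime: number,
  { start, end, speed = 1 }: { start: number; end: number; speed?: number }
): number | null {
  if (playerTime < start || playerTime > end) return null; // source is inactive
  return (playerTime - start) * speed;
}

// With start=0, end=2.5, speed=2 the child sees 0..5s of local time while the
// player only advances 0..2.5s — matching MyTimeBasedThing's duration of 5.
```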
Custom player UI:
```jsx
function Controls() {
  const { play, pause, nextTrack, prevTrack } = usePlayerControls();
  const { isPlaying, getTime } = usePlayerState();

  // Element.innerText will be the formatted time (avoids re-renders)
  const timeRef = useTimeRef();

  // Clicking on the element will seek to % of its width
  // A CSS variable with the percentage will also be set, for styling
  const scrubberRef = useScrubberRef();

  return (
    <div>
      <CurrentTime ref={timeRef} />
      <Scrubber ref={scrubberRef} />
    </div>
  );
}
```
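The "avoid re-renders" note could be satisfied by having `useTimeRef` write into the element imperatively. This sketch reuses the assumed `PlayerContext` from the `useTimeline` sketch above and is not a committed API:

```tsx
import { useContext, useEffect, useRef } from 'react';

// Write the formatted time straight into the DOM node, so time updates never
// trigger a React re-render.
function useTimeRef() {
  const player = useContext(PlayerContext); // assumed context from the sketch above
  const ref = useRef<HTMLElement | null>(null);

  useEffect(() => {
    if (!player) return;
    return player.onTimeUpdate((t) => {
      if (!ref.current) return;
      const mins = Math.floor(t / 60);
      const secs = Math.floor(t % 60).toString().padStart(2, '0');
      ref.current.innerText = `${mins}:${secs}`;
    });
  }, [player]);

  return ref;
}
```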
When mixing React and Non-React sources:
- Before the provider is created, you should know basic information about what you want to play, such as the duration.
- Once you have this empty temporal space, anything under the provider can add to the timeline (interactively).
- You can also add non-React elements (such as audio) with the player API, and have components that control the player API itself rather than the audio elements directly.
- When React components mount, they are added to the player instance; they are also added contextually (warping). A mounting sketch follows the code below.
- Dev tools can then still be used to visualise the input on a timeline.
```js
const $el = document.createElement('audio');
$el.src = 'http://example.org/music.mp3';

const media = new MediaKit.HTMLAudio('http://example.org/music.mp3');

const track = new Track();

track.addMedia(media, {
  source: { start: 0, end: 20 },
  target: { start: 0, end: 20 },
});

track.addMedia(media, {
  source: { start: 40, end: 60 },
  target: { start: 20, end: 40 },
});

track.play();
track.pause();
track.seekTo(24);

const sequence = new Sequence();
sequence.addTrack(track);

sequence.play();
sequence.pause();
sequence.seekTo(24);
sequence.next();
sequence.previous();

const composition = new Composition(sequence);
// or new Composition(track), or new Composition(media), or
// new Composition([track, media, track]), which will create a sequence internally.

composition.addSection(0, 20);
composition.addSection(25, 30);
composition.addSection(25, 30);
composition.addSection(25, 30);
composition.addSection(0, 20, 1); // targeting index 1 of the internal sequence

composition.play();
composition.pause();
composition.seekTo(24);
```
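To tie the two worlds together, a React component could register media with the contextual track on mount and tear it down on unmount. This is a hedged sketch layered on the API above: `TrackContext`, `TrackLike`, `removeMedia` and the cleanup behaviour are all assumptions.

```tsx
import { createContext, useContext, useEffect } from 'react';

// The MediaKit global and the Track shape come from the example above;
// `removeMedia` is an assumed counterpart to `addMedia`.
declare const MediaKit: { HTMLAudio: new (src: string) => unknown };

interface TrackLike {
  addMedia(media: unknown, opts: {
    source: { start: number; end: number };
    target: { start: number; end: number };
  }): void;
  removeMedia(media: unknown): void;
}

const TrackContext = createContext<TrackLike | null>(null);

// Registers an audio clip with the contextual track on mount, removes it on unmount.
function AudioClip(props: {
  src: string;
  source: { start: number; end: number };
  target: { start: number; end: number };
}) {
  const track = useContext(TrackContext);

  useEffect(() => {
    if (!track) return;
    const media = new MediaKit.HTMLAudio(props.src);
    track.addMedia(media, { source: props.source, target: props.target });
    return () => track.removeMedia(media);
  }, [track, props.src, props.source, props.target]);

  return null; // nothing rendered; the audio lives on the player's timeline
}
```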
JSX API (idea)

With integration with custom timeline elements:

```jsx
// Only run once.
const CustomSource = defineSource({
  create() {
    return document.createElement('div');
  },
  clear(el) {
    el.innerText = '';
  },
  render(el, data) {
    el.innerText = data.text;
  },
  timestops: [
    { start: 0, end: 10, data: { text: 'Some transcription' } },
    { start: 10, end: 20, data: { text: 'Some other transcription' } },
  ],
});

const player = (
  <audio-sequence>
    <mix>
      <CustomSource start="0" duration="20" />
      <audio src="https://example.org/track1.mp3" start="0" duration="20" />
    </mix>
  </audio-sequence>
);

document.body.appendChild(player);

// Has the same API as an HTMLAudioElement
player.play();
player.pause();
player.currentTime;
```
```html
<!-- At 1 second, the following HTML would be produced: -->
<audio-sequence> <!-- a web component wrapper -->
  <div>Some transcription</div> <!-- our "CustomSource" we created -->
  <audio src="https://example.org/track1.mp3" /> <!-- the native HTML audio -->
</audio-sequence>
```
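How `<audio-sequence>` keeps that markup up to date is left open. One guess is a driver that diffs the active timestop on each tick; `createSourceDriver`, `TimeStop` and `onTimeUpdate` are assumptions here, only `defineSource`'s `create`/`clear`/`render`/`timestops` come from the example above:

```ts
interface TimeStop<T> {
  start: number;
  end: number;
  data: T;
}

// On each time update, find the active timestop and only call clear/render
// when the active stop actually changes.
function createSourceDriver<T>(
  el: HTMLElement,
  hooks: { clear(el: HTMLElement): void; render(el: HTMLElement, data: T): void },
  timestops: TimeStop<T>[]
) {
  let activeIndex = -1;
  return function onTimeUpdate(currentTime: number) {
    const nextIndex = timestops.findIndex(
      (stop) => currentTime >= stop.start && currentTime < stop.end
    );
    if (nextIndex === activeIndex) return; // still in the same stop; skip DOM work
    hooks.clear(el);
    if (nextIndex !== -1) hooks.render(el, timestops[nextIndex].data);
    activeIndex = nextIndex;
  };
}
```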