<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml"
xmlns:og="http://ogp.me/ns#"
xmlns:fb="https://www.facebook.com/2008/fbml">
<head>
<!-- Required meta tags -->
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<!-- Extra meta tags -->
<meta property="og:image" content="static/icon.png"/>
<meta name="description" content="generate music with an variational autoencoder">
<meta name="author" content="mobeets">
<link rel="icon" href="static/favicon.ico">
<title>Neural Synth</title>
<!-- CSS -->
<link rel="stylesheet" href="static/css/bootstrap.min.css">
<link href="static/css/ie10-viewport-bug-workaround.css" rel="stylesheet">
<link rel="stylesheet" href="static/css/styles.css">
<link rel="stylesheet" href="static/css/font-awesome-4.7.0/css/font-awesome.min.css">
<!-- HTML5 shim and Respond.js for IE8 support of HTML5 elements and media queries -->
<!--[if lt IE 9]>
<script src="https://oss.maxcdn.com/html5shiv/3.7.3/html5shiv.min.js"></script>
<script src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js"></script>
<![endif]-->
<!-- Javascript -->
<script src="static/js/lib/jquery.min.js"></script>
<script src="static/js/lib/bootstrap.min.js"></script>
<script src="static/js/lib/p5.js"></script>
<script src="static/js/lib/p5.dom.js"></script>
<script src="static/js/lib/p5.sound.min.js"></script>
<script src="static/js/lib/keras.js"></script>
<script src="static/js/app/sketch.js"></script>
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-98024339-1', 'auto');
ga('send', 'pageview');
</script>
</head>
<body>
<nav class="navbar navbar-default">
<div class="container-fluid">
<!-- Brand and toggle get grouped for better mobile display -->
<div class="navbar-header">
<button type="button" class="navbar-toggle collapsed" data-toggle="collapse" data-target="#bs-example-navbar-collapse-1" aria-expanded="false">
<span class="sr-only">Toggle navigation</span>
<span class="icon-bar"></span>
<span class="icon-bar"></span>
<span class="icon-bar"></span>
</button>
<a class="navbar-brand" href="https://mobeets.github.io">mobeets</a>
<p class="navbar-text">Neural synth</p>
</div>
<!-- Collect the nav links, forms, and other content for toggling -->
<div class="collapse navbar-collapse" id="bs-example-navbar-collapse-1">
<ul class="nav navbar-nav navbar-right">
<li><a href="https://www.github.com/mobeets/neural-synth"><i class="fa fa-github" aria-hidden="true"></i> View on Github</a></li>
</ul>
</div><!-- /.navbar-collapse -->
</div><!-- /.container-fluid -->
</nav>
<!-- <nav class="navbar navbar-toggleable-md navbar-light bg-faded fixed-top">
<a class="navbar-brand" href="#">Jehosafet</a>
<ul class="nav navbar-nav navbar-right">
<li><a href="#"><span class="glyphicon glyphicon-user"></span> View on Github</a></li>
</ul>
</nav> -->
<div class="container">
<div class="main-container">
<h1>Neural synth</h1>
<div class="row">
<div class="col-md-4"></div>
<p class="lead col-md-4">Click the pad below to make music with a variational autoencoder.</p>
</div>
<!-- <p class="text-center">Notes decoded: <span id="model-output"></span></p> -->
<div class="row">
<div class="col-md-4"></div>
<div class="col-md-4">
<div class="btn-group" role="group" aria-label="...">
<button id="notes" class="btn btn-default" type="button"><i class="fa fa-music" aria-hidden="true"></i>: <span id="model-output"></span></button>
</div>
</div>
</div>
<div id="canvas-container"></div>
<div class="row">
<div class="col-md-4"></div>
<div class="btn-toolbar col-md-6 text-center" role="toolbar" aria-label="...">
<div class="btn-group" role="group" aria-label="...">
<button id="drums-toggle" type="button" class="inst-toggle btn btn-primary" data-toggle="button" aria-pressed="false" autocomplete="off"><i class="fa fa-bolt" aria-hidden="true"></i> Drums</button>
</div>
<div class="btn-group" role="group" aria-label="...">
<button id="bass-toggle" type="button" class="inst-toggle btn btn-primary" data-toggle="button" aria-pressed="false" autocomplete="off"><i class="fa fa-bolt" aria-hidden="true"></i> Bass</button>
</div>
<!-- <div class="btn-group" role="group">
<input class="range-slider" type="range" min="0" max="9">
</div> -->
<div class="btn-group" role="group" aria-label="...">
<button id="key-toggle" type="button" class="inst-toggle btn btn-primary" autocomplete="off"><i class="fa fa-key" aria-hidden="true"></i> Key: <span id="cur-key">C maj</span></button>
</div>
</div>
</div>
</div>
<div class="row">
<div class="col-md-2"></div>
<div class="col-md-8 text-justify"><p>A variational autoencoder (<a href="https://arxiv.org/abs/1606.05908"><q>VAE</q></a>) is a type of neural network that learns how to represent high-dimensional data with only a few numbers. In the case of music, we can train an autoencoder to represent all the notes and chords you might play on a piano using only two variables. You can think of this as the network learning a relationship between (x,y) values and a different collection of notes. To generate music, click on the pad below to choose an (x,y) value, and the autoencoder will use that value to generate a collection of notes.</p><p>The generating process is probabilistic, which means the same location on the pad won't generate the same exact notes every time. Still, nearby locations on the pad correspond to similar sounding chords, which makes it easy to learn to control after you play around with it for a while. For example, if you click in the top-left of the pad below, you will likely play a C major chord, while clicking in the top-right of the pad will usually play an A minor chord.</p><p>The notes the network generates depends a lot on the data you train it with. In this case, the autoencoder was trained on 382 of Bach's four-part chorales (<a href="http://www-etud.iro.umontreal.ca/~boulanni/icml2012">source</a>), with the songs transposed to either C major or A minor.</p>
<p>Listen below to hear the results of using a VAE (and two other similar models) to generate an entire song.</p>
<p style="text-align: center">
<iframe width="356" height="200" src="https://www.youtube.com/embed/VF9i9JpEtg8?rel=0" frameborder="0" allowfullscreen></iframe>
</p>
<p>Code and more information are available <a href="https://github.com/mobeets/neural-synth">here</a>.</p>
</div>
</div>
<hr>
</div>
</body>
</html>