To speak in front of people. To see that failure is something that only exists to let me grow. Thank you for being you.
[Orchestra music playing]
We've come so far, and we've reached so high
And we want to stay
[Synth keyboard playing over drum beats]
[It's very catchy]
[The music stopped!]
[ Applause ]
>> You might wonder how we ended up in this situation.
>> So, yeah. Hello, hello, JSConf.
>> We want to show you how this stage was built. And I would love to get the slides now, please, as everything is working. So, yeah, there should be slides.
And the title of this talk is, how... no, the making of the stage. And, yeah. The second slide says
[ Laughter ]
Oh, there it is. Thank you.
[ Applause ]
Yeah, you see? We don't practice anything. We just do everything live. Oh, so the clicker is also not working. Yeah.
So, we are live:js. We consist of, yeah, different people.
The core team consists of Jan, Martin, Matt, Ruth, Sam, and myself. And we would like to explain how all of this is working or not in a small presentation. And I will give the microphone now to Zach. Actually, the clicker.
>> Thank you. Hello. So, we prepared a very easy to understand overview for you about what has happened since about November. We were like planning this.
So, basically, this is what you know. You see, this is kind of the stage. And you might remember the beautiful presentation of the curators you saw today. A nice keynote.
And the keynote didn't work because, you know, Keynote restricts the width to something like 4,000 pixels. We needed to take something else.
Not Keynote, obviously not PowerPoint. So, we took Google Slides. They allowed us to get the 8,204 pixels that we needed. And basically, that was our first little hurdle that we overcame.
So, you might remember that many people have already worked on this presentation, including me doing some background graphics for it. Which also was quite a pain because, you know, if you put SVG into Google Slides, it wouldn't work, so you have to go via EMF files, which is a very old vector format, and so on and so on. That was already a little bit complicated but actually worked out fine.
And, of course, in such an intro, you want an opening title. We asked our beautiful friends to prepare some animation for us, for which they needed Illustrator files, and they did some After Effects magic with it.
And the After Effects magic went into the animation, and into JSON files. And the JSON files were exported to SVG, and that's what you basically saw, in JavaScript. And you also saw a beautiful live video from Ruth and Sam that went in, and you will see and enjoy some more today. That is Canvas and WebGL.
And all of this, including all the slides, had to work in the browser, because that was what we aimed for. That was our concept. And then you also heard beautiful music, made for the browser with web audio by Jan and Matt. And this music actually came from Sam.
Thank you, Sam. And everything here is lit in the color theme and rhythm with his custom interface that he built using DMX.
Thank you, Sam. And, yeah. That's about it. Then we also saw a wonderful, beautiful image, because what is JSConf without its opening? And for the tenth anniversary we had something very special going on here. That was made with WebGL, a beautiful live WebGL presentation, in order, well, you're guessing it, to work in the browser.
Yeah. And somebody had to pull all of this together. Thank you, Martin. So, somebody has to take all that stuff, bring it into the browser, and switch between things. It doesn't always work fine, and it was a sweaty job, but Martin did it.
And we had help from Paul, who went up on the stage and checked our performance. We actually succeeded in doing a good job. Thank you, Paul. But then Paul broke our browser.
But that's okay. Happens sometimes. So, that was basically, in short, a very easy to understand, non-chaotic overview of what has to happen to do this. Thank you.
[ Applause ]
>> Yes. So, this was kind of an overview. Now we want to explain all the small moving parts. So, what we got from JSConf was the song that you heard. And, yeah, we should make a cover out of this.
And the only one doing this stuff in our group is usually Sam with his Game Boys. So, what he did is he created the remix using this framework you can actually see on the screen. And, yeah, you heard it also.
And this is not enough. So, we also met on Monday evening this week to actually have a tech week with the whole crew. And in the end, we had this kind of scrum board with many, many tasks, and hopefully the most important parts are done. And one of the focus things was the intro. So, we actually created some kind of timeline to be sure what we wanted to do.
And we also wanted to do more remixes, because we have two more musicians in our group, and they all want to play with the music. First of all, it's Matt. And second of all, it's Jan, and they want to talk about the stems.
But in the meantime, I've actually stopped using software with web audio, and I've switched to using actual synthesizers, samplers, and drum machines. Because it turns out that actual things are better than virtual things. But, hey.
>> Okay. So, yeah. You can... this is all open source now. So, you can go there.
You can take a look. Currently it's a little bit hard to play.
But we hope that we can fix that at some point, so you can just open the website and play it. So, the brief was to basically create a remix of that song from Sam. And we, as musicians, have certain software that we like to use. One of them is Ableton Live, a professional music-making software, and it was really easy.
It's like an IDE for songwriting, basically. So, it was really easy to do sketches. Much easier than programming something. And so, we did sketches in there. You can pull in the remix, you can pull in audio, you can do all kinds of things.
But at some point, we had to, you know, get this somehow into the browser. And the main language that music tools speak is still MIDI. And MIDI is not ringtones or something, for those that remember ringtones. It is a very old protocol from 1983 which allows musical instruments to talk to each other.
And it's still in wide use. You can see it here on stage. It's great. It's reliable, it's old, it's robust. Much different than, for example, getUserMedia or something.
And, yeah. For some reason, and I still haven't understood why, we now have Web MIDI, which allows us to talk to MIDI instruments from the browser. And so, what we could do is keep all of our arrangement in Ableton Live and send it to the thing that we've built over the last four days or so, which is a combination of different modules. It's a sampler.
It's a synthesizer. It's a playback device. It has lots of effects. A whole bunch of things you can take a look at. And all of this is driven by Ableton Live.
And it's wonderful because it also allows you to set cue points and everything. So, we were able to switch between different scenes of the intro, and we could also send synchronization data to Martin's computer, which would then play certain videos or, you know, not videos. Not actual videos but, you know, animations and 3D animations and stuff like that.
So, yeah. That's how we built this. Take a look at it. Yeah. And we hope we can give you something so that you can play it back on your own.
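The Web MIDI setup described above can be sketched in a few lines. This is a minimal illustration, not the team's actual code: a pure decoder for raw note-on/note-off messages, hooked up to `navigator.requestMIDIAccess` so any MIDI source (such as Ableton Live) can drive it.

```javascript
// Decode a raw MIDI message into a simple event object.
// Status bytes 0x90–0x9F are note-on, 0x80–0x8F are note-off (low
// nibble is the channel); a note-on with velocity 0 also means note-off.
function decodeMidiMessage([status, note, velocity]) {
  const type = status & 0xf0;
  const channel = status & 0x0f;
  if (type === 0x90 && velocity > 0) return { kind: "noteOn", channel, note, velocity };
  if (type === 0x80 || (type === 0x90 && velocity === 0)) return { kind: "noteOff", channel, note };
  return { kind: "other", channel, status };
}

// In the browser, hook it up to incoming MIDI (e.g. from Ableton Live):
if (typeof navigator !== "undefined" && navigator.requestMIDIAccess) {
  navigator.requestMIDIAccess().then((access) => {
    for (const input of access.inputs.values()) {
      input.onmidimessage = (msg) => console.log(decodeMidiMessage(msg.data));
    }
  });
}
```

From events like these, a sampler or synthesizer module can be triggered in the browser.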
>> Okay. So, now let's come to the actual setup on the stage. And this is where Martin should come and say something.
>> Yeah. But I think I would just give you some basic facts about the stage. As you can see, it is big. The main screen is something around 40 meters wide and 5 meters high.
And yeah. Okay. Doesn't work so well.
So, we have five projectors. They are perfectly aligned to create one continuous image. And this one image is roughly 8,000 x 1,000 pixels. We run this from a single PC that has three dual HD outputs.
And so, it outputs something like 11,000 pixels across, but our browser window is a little bit smaller than that.
So, in addition to this big screen and the projectors, we have all these light panels around here. 34 of them, which are controlled by software. And finally, we have this nice little X over there, which has an interesting story behind it. Originally it was supposed to be a canvas and projected on.
But then it turned out that projectors have sort of a focus. So, we couldn't really project anything sharp on it.
So, Matt had the idea to put it full of LEDs. And we were almost ordering 200 meters of these LED strips to make it work. Until suddenly someone who would have been responsible for soldering it all together was like, wait. We have these professional LED panels lying around, and maybe you can just use them.
So, these are 17 panels, each of 192x192 pixels. So, in total, much more than we could have bought with LED strips. And it's controlled with a single full HD signal, which is the full-screen browser window, as you might have guessed. And those are the specs that we received for this.
And it works surprisingly well. If you want to create content for it, there is a CodePen somewhere; if you check Twitter for JSConf, you should be able to find it. There, all of these positions are already, yeah, hardcoded, and you can just start doing that. We'll probably show some of these at least tomorrow.
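The hardcoded-positions idea can be sketched like this. The coordinates below are made up for illustration; the real 17-panel layout lives in the CodePen mentioned above.

```javascript
// Hypothetical panel layout: each LED panel is 192×192 pixels at some
// (x, y) offset inside the full-HD control window. The real coordinates
// for all 17 panels are hardcoded in the CodePen.
const PANEL_SIZE = 192;
const panels = [
  { x: 0, y: 0 },
  { x: 192, y: 192 },
  { x: 384, y: 384 }, // …and so on for the remaining panels
];

// Return the index of the panel (if any) containing a window coordinate,
// so content can be drawn only where a physical panel will show it.
function panelAt(x, y) {
  return panels.findIndex(
    (p) => x >= p.x && x < p.x + PANEL_SIZE && y >= p.y && y < p.y + PANEL_SIZE
  );
}
```

With a lookup like this, a full-screen canvas can restrict its drawing to the regions that actually map onto the LED panels.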
Yeah. And I guess that's it for me. Yeah.
[ Applause ]
Yeah. Yes, Ruth.
>> Yeah, sorry, the slide. It's not here. So, the next thing is: this is Ruth.
>> Yeah, we'll just mention: she did all the design for everything, as well as everything else, by the way. So, yeah.
[ Applause ]
So, yeah. My software. So, I started building a piece of audio visualization software approximately five years ago, I think? This was very much an experiment. Like, we got the Web Audio API about seven years ago, and I started analyzing audio and moving stuff around the DOM.
I started playing with SVGs. And this is a great idea until you chuck about a hundred SVGs at the DOM and analyze audio at the same time and try to move them around. I quickly moved to Canvas.
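The analysis loop behind this kind of audio-reactive visual can be sketched as follows. This is a generic illustration of the Web Audio `AnalyserNode` pattern, not Ruth's actual code: a pure function reduces the frequency bins to a single 0–1 level, which a render loop can then map to whatever is being drawn.

```javascript
// Average the byte frequency data (0–255 per bin) into a 0–1 level.
function averageLevel(bins) {
  if (bins.length === 0) return 0;
  const sum = bins.reduce((a, b) => a + b, 0);
  return sum / (bins.length * 255);
}

// In the browser, feed it from an AnalyserNode on every animation frame:
function startAnalysis(audioContext, sourceNode, onLevel) {
  const analyser = audioContext.createAnalyser();
  analyser.fftSize = 256; // 128 frequency bins
  sourceNode.connect(analyser);
  const bins = new Uint8Array(analyser.frequencyBinCount);
  (function tick() {
    analyser.getByteFrequencyData(bins);
    onLevel(averageLevel(bins)); // e.g. scale a shape, shift a palette
    requestAnimationFrame(tick);
  })();
}
```

The `onLevel` callback is where a generative visual would pick up the signal, driving canvas parameters instead of DOM or SVG mutations.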
But the software which I built, and I had been building it for five years, was tangled spaghetti code. And I was gigging with it, mixing video to the beat, mixing Canvas shaders, and crashing every five minutes. It was a great experiment and I had a really great time making it. But I came to this year thinking it's probably a good time to start a 0.0.1 version instead of a 0.0.0.1 version. And it would be good to do one thing.
So, I started again this year, and for the past couple of weeks, especially during this hack week, I have basically been trying to get it to a stage where I can use it for this conference. Sam is doing visuals with me. It's got four moving parts. And it's doing one thing.
And it focuses on my style as an artist. It's generative. It basically just chucks out some Canvas with some parameters.
I'm analyzing audio and using MIDI as well. And it just generates visuals. That's what I want it to do. No more video mixing, just visuals. And four moving parts: colors, which turn into palettes.
I've got vectors, which turn into grids. I put some physics in; I'm looking forward to that. And shapes. Shapes are going to make sprites.
And when it came to the X, Martin mapped the X to use the normal screen. We knew where the positions were.
I made a CodePen with the coordinates of where these positions were. For me, that was pretty okay. I had made a grid system. I could make a custom grid and just map the coordinates of the X, right? So, if we take a look at that. There's a grid interface in my software.
All I have to do... so this is a normal visualization going on. I can't see what's going on.
>> Yeah, it's working.
>> I just pipe in the coordinates and map nice little squares, like we saw for the opening, which actually didn't take too much effort in the end. So, yeah. This is available on GitHub. The URL is there.
It is in a state of getting it working for this conference. But yeah. That's Bizra.
[ Applause ]
>> Yeah. So, that's powering the O and the X. And we also have a software which is powering the whole stage behind me. And Sam will talk about that.
>> Indeed. Hello. So, yeah, I'm just going to quickly run you through modV. And if all goes well, you should be able to see my desktop in modV in the center.
The X is kind of in the way. But never mind, hey.
Hey, there we go. Perfect. Thank you. So, yeah, this is modV.
Wow. This is a screenshot; the X is in the way. But modV is an environment that I have been building in the browser for, I think, six years this year now. It's Canvas, WebGL. It's all audio reactive.
It's got a whole module system where you can just drop in new modules. So, like, if we just clear everything that's going on, we can just kind of drop in, I don't know, some sound bars.
And then if we just give it some signal through the microphone. Yeah. So, you can kind of see it going there. But then you can stack this, and it's got a whole UI, and you can build modules for it.
Yeah. To get this working because modV has a plugin system, Martin sent me the specs for the wide screen.
So, this whole wide screen is actually a 1080 signal split into two sections. The top section is the left-hand side of the screen, and the bottom section is the right-hand side of the screen. And with the plugin system I could hook in and spawn a new output window, and that's actually just running on another desktop here. If I swipe, that's my desktop background.
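That split can be sketched as a small coordinate-mapping function. The concrete numbers here are assumptions for illustration (a 1920-pixel-wide signal whose two 540-pixel bands carry the left and right halves of the wall); the real resolutions come from the specs Martin sent.

```javascript
// Map a pixel on the virtual wide canvas (2w × h) onto the physical
// signal (w × 2h): the left half of the wall lands in the top band,
// the right half in the bottom band.
function toSignal(x, y, signalWidth, bandHeight) {
  if (x < signalWidth) return { x, y };             // left half → top band
  return { x: x - signalWidth, y: y + bandHeight }; // right half → bottom band
}
```

So for an assumed 3840×540 virtual wall, `toSignal(x, y, 1920, 540)` folds it into one 1920×1080 output window.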
So, yeah, this is just another browser window for the software that I just kind of am tinkering away at in my free time. It's all open source. Go to modV.JS.org. It's all there.
But, yeah. That's a quick run through, really.
[ Applause ]
>> Yeah. So, now we are switching to the software I'm creating. I have been doing this also for five years now. And I'm controlling the lights. And this is what my software looks like.
And it's getting its data based on what we are projecting. And you can see at the top I have a grid of colors which I get from modV, and I'm mapping them to the actual lights that you can see in the venue.
And now we'll just go over to my PC to actually make sense of what I'm saying, so that you can see that the lights are working. So, yeah. This is all also controllable by a MIDI controller, and I can just add different, yeah, visuals in modV, and this is presented live on the lights. Yeah.
Which is pretty cool, I think. And all of this is connected to the DMX port using a custom-created WebUSB controller, as you can see here, using an Arduino. And this is everything in the browser.
There's no server running at all. Yeah. So, you can control lights that are actually everywhere, just with a browser window. Which is cool, I think.
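The browser-to-lights path can be sketched in two steps: pack the colors into a DMX frame, then ship it over WebUSB. This is a generic illustration, not the actual controller code; the channel layout (three consecutive R, G, B channels per fixture from channel 0) is an assumption, since real patching depends on the rig.

```javascript
// Pack RGB colors for a row of fixtures into a 512-channel DMX universe.
// Hypothetical patching: each fixture occupies three consecutive
// channels (R, G, B), starting at channel 0.
function buildDmxFrame(colors) {
  const frame = new Uint8Array(512);
  colors.forEach(([r, g, b], i) => {
    frame[i * 3] = r;
    frame[i * 3 + 1] = g;
    frame[i * 3 + 2] = b;
  });
  return frame;
}

// In the browser, the frame would then be sent to the Arduino,
// e.g. via WebUSB (endpoint number depends on the device firmware):
async function sendFrame(usbDevice, frame) {
  await usbDevice.transferOut(1, frame); // endpoint 1 is an assumption
}
```

The colors themselves would come from sampling the modV grid, closing the loop from visuals to physical lights with no server involved.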
[ Applause ]
Yeah. And that's how we built the stage and how we worked together to actually get you this live show. And you see there are a lot of moving parts which could go wrong. Especially if you're doing everything in the browser.
And to, yeah, get you some kind of goody, we released the song that Sam created on SoundCloud. You can find it on his SoundCloud if you want to.
Yes. And that's it from the talk perspective. We will now do the remixes and the show again. So, have fun.
[ Applause ]
[Electronic music playing!]
>> Live:js remix.
>> Live: JS