Welcome to The Container!

The potential virtual club

'The Container' is a realtime 3D space.


'The Container' v.1.0 is the setup of a future realtime live environment, infused with live streams, (data-driven) parameters, surfaces to project on, and objects and notes to leave. A playground, aka a live scape. It will turn into a live exhibition/conference/VR space. The first of its kind.
Deep into that scape: creating an environment dedicated to next-fresh fem.tech unfoldings/additive technologies, non-racist maths - and yes, within it we will be able to dance - maybe tomorrow, maybe next year.. With the many sounds of relevant real-time streams, combined into a polyphony. Morphing that space with the flows, surplussing their dimensions. Watching knots blossom. Towards complex sphere compositions: - What was the light like?

To open that room: such a realtime 3D space, with plugged-in streams, objects and surfaces to represent the space and map it to (geospatial/virtual/carbon/…) ‘truth’..

set a test run scape.. just think of all the possibilities...


Opening a “clearly” defined frame, a .teaser, a canvas, a virtual spatiotemporal scape. HERE it is! Within: struggling, mapping, slow gates & processors and still: unsophisticated exchange rules plus a missing combining framework that would include all of such possibilities. ->

//SOFT: the question of ‘perfection’ - composition towards randomness and control, ///the true line, never triggered in linear code, plain narration, one string. Next dimensional -form trajectories: scape: polyplex structure of in-between augmenting next? translat:d? the upwinding of a form. towards.
It starts with a three-dimensional box into which content can be mapped.
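As a minimal sketch of that starting point, the box can be modelled as a bounded volume into which normalized values are mapped (plain JavaScript; all names here are hypothetical illustrations, not from an existing codebase):

```javascript
// A minimal sketch of the scape's starting point: a three-dimensional box
// into which normalized coordinates (0..1) can be mapped.
// All names are hypothetical illustrations.
class Box {
  constructor(width, height, depth) {
    this.dims = { x: width, y: height, z: depth };
  }
  // Map a normalized point (each component in 0..1) into box coordinates.
  map(nx, ny, nz) {
    return { x: nx * this.dims.x, y: ny * this.dims.y, z: nz * this.dims.z };
  }
}

const box = new Box(10, 4, 10);
console.log(box.map(0.5, 0.5, 0.5)); // → { x: 5, y: 2, z: 5 }
```

Streams, surfaces and objects would then address positions inside this volume through the same normalized interface.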

/Version 1.0 - Toward stable spaces:

node representation in next augmented scape.
space modelling in (potential) virtual web.

technical overview and testing.

digging for depth
form and __
Certain Measures

toying in p5.js / three.js / x3dom

and getting sucked into all the wonderful streaming data out there



depth of the canvas/scene/other determinations of the scape: dimensions, parameters, groups, namespaces, ‘effectivals’, possible triggers and transformations, translocations and polyspheres and their merged forms. Dimensional compositions. That space. (Relate to (the (outer)) (box)? What are the axes? *To think about, in/out- or beside them.*)

1. SPACE rep
dimensional compositions (across frameworks: 3+D (on web), augmented; frameworks (xml3D, x3dom, Unity+C, ..), ..). Fix it to a real-time map/globe/“physical location within the universe”, augmenting & determining: geo and GIS allocation :X3D :ontologies, their enlargements, namespaces, autoconnections (…); having the canvas, the scene, other determinations of the scape, dimensions, parameters, groups, namespaces, ‘effectivals’, … Sort and combine, make them transparent and effective: outline and fill that space ==> DIMSs infused by dynamic data. (enlargement of space)
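One way to read "fix it to a real-time map/globe": an equirectangular projection from latitude/longitude onto the floor plane of the scape. A sketch under that assumption (plain JavaScript; the function name and the choice of projection are assumptions, one possible allocation scheme among many):

```javascript
// Sketch: pinning the scape to a geospatial "truth" via a simple
// equirectangular mapping of lat/lon onto the box's floor plane.
// One possible allocation scheme, not a fixed design.
function geoToScene(lat, lon, floorWidth, floorDepth) {
  return {
    x: ((lon + 180) / 360) * floorWidth, // longitude spans x across the floor
    z: ((90 - lat) / 180) * floorDepth,  // latitude spans z (north pole at z = 0)
  };
}

console.log(geoToScene(0, 0, 100, 100)); // equator / prime meridian → { x: 50, z: 50 }
```

A richer GIS allocation (ontologies, namespaces, X3D geo nodes) could later replace this flat projection without changing the scene-side interface.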
LATER: gating data streams to dimensional parameters: space with unpredictability / real-time data animating/shaking the x/y/z axes. What form of movement/interaction is spanned? Movement/contact - relativity. -> # fresh trajectory mappings
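The "shaking" of the axes could look like the following sketch: one reading from any live feed is normalized and used to displace a position along x/y/z (plain JavaScript; the sensor range and function names are assumptions, not a real feed spec):

```javascript
// Sketch: one stream sample displaces the scape along its axes.
// The [-100, 100] sensor range is an assumed example, not a real feed spec.
function normalize(value, min, max) {
  return Math.min(1, Math.max(0, (value - min) / (max - min))); // clamp to [0, 1]
}

function shakeAxes(position, sample, amplitude) {
  const n = normalize(sample, -100, 100);
  const offset = (n - 0.5) * 2 * amplitude; // midpoint of the range → no shake
  return { x: position.x + offset, y: position.y + offset, z: position.z + offset };
}

console.log(shakeAxes({ x: 0, y: 0, z: 0 }, 100, 2)); // full-scale sample → { x: 2, y: 2, z: 2 }
```

Feeding each incoming sample through `shakeAxes` per frame is what would give the space its data-driven unpredictability.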

… so much more that is relevant to mention on that


gating and triggering.
-> implementation of realtime (sensor) data (live feeds, formats) - for what gates? E.g. to take those streams as parameters and/or give an interface for effective triggers.
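A gate, in this reading, is a predicate over incoming stream values that fires a trigger when its condition is met. A minimal sketch (plain JavaScript; the threshold, callback and all names are illustrative assumptions):

```javascript
// Sketch: a gate wraps a predicate and a trigger callback.
// Feed it stream values; it fires the trigger only when the gate opens.
function makeGate(predicate, onTrigger) {
  return function feed(value) {
    if (predicate(value)) {
      onTrigger(value);
      return true;  // gate opened
    }
    return false;   // gate stayed closed
  };
}

// Illustrative use: fire when a (simulated) echo amplitude exceeds 0.8.
const fired = [];
const gate = makeGate((v) => v > 0.8, (v) => fired.push(v));
[0.2, 0.5, 0.9, 0.3, 0.95].forEach(gate);
console.log(fired); // → [ 0.9, 0.95 ]
```

The same gate shape works for any feed: the predicate encodes the exchange rule, the callback is the effective trigger into the scene.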

/POV controller movement and multiscape


__REAL TIME Point 1__
a lot of information (which would best fit in a paper or online), some drawings and scripts -> shared information!
__frameworks / technologies / implementations__, javascript, special libraries, (…)

-> knowledge(-to potentially be shared)

In the end it can also become something very simple again, just a trigger, or some small function.. minimalized.. or maybe yet a dancehall for danceable technology (and we are like disco for VR, for hot acid information and a real-time shared environment.. oh.. it will all be so much like in MATRIX.. this could be yr VR Technology Club, fresher than Berghain, future's technology performing live at this real-time VR environment club (accessible through that tiny little rabbit hole teaser #div.box)..) or other stream scapes, gated in realtime from the outside. Most acid infos. Putting those elements together in a VR space… sexy environment for such moves and lines in which we will be able to dance in future..

TheContainer: the virtual space into which I can plug in, time by time, as technology develops. The problem is: what I want to build is just not possible yet within just one tool (WebGL/WebVR/interactive/data-). To keep it small and simple for now (just pointing to those hidden treasures and trying not to want to completely grasp them.. for now), I baked this twerk.
/ problem overall: that SPACE: WebGL for now, as descriptive X3D is not applicable enough yet. p5.js or three.js: p5.js (data-interactivity-expansion possibilities), three.js (VR and space rendering more advanced).. Waiting for WebVR and VR standards/(tools/): the space possibilities will grow (with time and tech) (e.g. one year later, with a sidewall texture mapping live events, and with physics and more interaction possible (within the next WebVR suite or a standard)) (- and I left out the metadata possibilities and much more..). This or/and that. Or much more reduced.

/… this is investigation (and maybe a bigger part of the “production” than the APPLIED, exemplified part). It is very, very open.

floors. other techniques /// functional: other 3D buildscapes:

the information below is it



The Input:

Audio1: Live (Meteor) Radio Radar Echoes, Roswell