CarthoGraffiti
Abstract
This project explores the use of modern tracking
technology to create an interactive field of sound
augmentation that encourages exploration through
movement.
It is a system that translates live motion data into MIDI
signals which, through the spatial discretization, end up
as sound waves heard in real time. It is an audible mirror
of movement.
The design proposal for the project is an installation with
the augmented field at its center, surrounded by responsive
panels that, through an optical illusion, extend into infinity
when activated. The audio and visual feedback are both
representations of the same movement, adding to an
intuitive understanding of the system.
Exploration is a central theme of the project, and several
trials with a variety of guests have guided the system
towards a spatial discretization that encourages the user to
engage with the field. Melodic scales of notes, easily
understandable arrangement logics and hidden discoveries
all add to this engagement.
Intro
References
- Theoretical framework
- Movement and Sound
- The perception of space
- Technology; Tracking and Affect
Methodology
Technical (coding skill) progress
System arrangement
Test / Guest setup
Trials
- 1st test
- Dancers
- Various
Spatial discretization
The boundary
Trials
- Various
The installation
- Features
- Entrance / Initiation
Conclusion
Guests
Annex
- LED-wall
- Early sketches
- Scripts
- Fundamentals of computation
Intro
Architecture as we know it focuses
almost exclusively on one of our
senses: sight. This tendency is not
new; it has been the case at least
since the ancient Greeks, when
Aristotle among others proclaimed
sight the noblest of the senses, often
used to describe "higher thinking"
through proverbs like "the eye of
the soul" or "the light of reason".
A multitude of famous architects,
from Vitruvius and Palladio to
Le Corbusier, have since regarded
sight as the most powerful of the
senses, to some extent favoring
appearance over experience.
When we think of architectural
practice, we think of plans, sections,
diagrams and only very rarely do we
consider sound to be as important
as the visual. This is unarguably a
strange state of affairs considering
the huge impact sound has on our
well-being in a space.
At the core of this project lie
amusement and leisure, and from this
point of departure the goal is to test
the capabilities and possibilities of
modern tracking technology,
not to create animations or analyses,
but sound that correlates in so-called
"real time" with the tracked
movements.
The intention of the project is to
create a space, or field if you will,
that instantly prompts its users to
behave expressively. We've grown
accustomed to modern spaces
like the Copenhagen metro:
spaces that, through their functions,
don't want us there. Hurry, catch the
next train! Escape to the comfort of
your phone! But for the love of god
don't stay! Don't do anything out of
the ordinary!
This project offers an augmentation
of space that encourages playfulness
and exploration, where your
movements are mirrored by music.
It's a spacious instrument that relies
not on skill, but on engagement
and creativity.
The space is ordered with a certain
logic behind it; a logic that you have
to explore as you engage with the
field. You can tell that the feedback
comes in notes of different pitches
and volumes, but which movements
affect which outputs? Maybe you
figure out that the field is arranged in
invisible areas or zones, but where
are the boundaries of these zones,
and do all of them produce the same
kind of feedback? Maybe you reach
a point where you feel you have an
understanding of the invisible setup;
how can this insight then be
used to produce movement patterns
and music?
References
Theoretical Framework
Stan Allen
From Object to Field: Field Conditions in Architecture (1997)
John Frazer
The Architectural Relevance of Cyberspace (1995)
The theoretical framework of this
project is primarily based on Stan
Allen and his notion of the field.
The field, to him, is a space of effect
that doesn't contain matter or
material but rather functions, vectors
and speeds; a space where local
relationships are more important
than overall form. It is a system of
organization capable of producing
vortexes, peaks and protuberances.
"…an architecture that admits
change, accident and improvisation.
It is an architecture not invested in
durability, stability and certainty, but
an architecture that leaves space for
the uncertainty of the real."
And then there is Frazer, who in his
essay "The Architectural Relevance
of Cyberspace" envisions virtual
worlds as an extra dimension that
allows a new freedom of movement
in the natural world; in other
words, how augmentation can break
ground for new ways of expression.
Movement and Sound
Leon Theremin
Theremin instrument (1919)
The theremin, invented by Leon
Theremin, is an instrument you play
without touching it: in other words,
a system that creates sounds
in accordance with the performer's
movements. The instrument relies
on electrical signals and is
notoriously hard to play. (He also
developed other, similar systems
in which the actors were dancers
who produced the sound through
their choreographies.) I want to
develop a system that is available to
all, one that requires no prior
knowledge.
Jeppe Hein
Invisible Labyrinth (2005)
Jeppe Hein is a Danish artist who,
in his work "Invisible Labyrinth",
uses infrared signals connected
to headsets to create a labyrinth
without physical walls, experienced
only in the mind of the viewer.
But where Hein incorporates
"a correct way" into the installation,
I rather work with the uncertainty
of the real.
The Perception of Space
Doug Wheeler
LC 71 NY DZ 13 DW (2013)
James Turrell
Aftershock (2022)
The two American artists Doug
Wheeler and James Turrell are
both pilots and have a very particular
relation to space and its possible
infinity: Wheeler through his
installation that both looks and feels
like the horizon, where the diffusion
of white light in the spherical room
makes it impossible to focus, and
Turrell, whose mixing of colors
creates a space that varies from
infinite to misty and even aggressive
at times. Both work within very
defined spaces, framed by very
ceremonial entrances. But they are
solely concerned with the spectacle
of visualization, where I am more
interested in the movement of the
body encouraged by sound.
Technology; Tracking and Affect
Mette Ramsgaard
I see what you hear (2002)
Sea Unsea (2006)
Nitsan Bartov
Kin-Aesthetics (2022)
CITA also has its share of locally
grown augmentation installations
and projects: Mette Ramsgaard
through works such as "I see what
you hear", where sound becomes
visualization, and "Sea Unsea",
where the movements of dancers
likewise become visible projections.
Nitsan Bartov, who graduated
from CITA last year, also
experimented with modern tracking
technologies, both to augment light
in real time and to create object-
oriented architecture derived from
human motion.
Depth map motion tracking
Kinect
Infrared marker-based motion tracking
OptiTrack
But instead of depth-map
motion tracking I am employing
infrared marker-based motion
tracking, mainly due to its superior
performance in terms of speed.
Methodology
The methodology I've applied
has been an iterative process of
technically and conceptually
developing the system and then
analyzing multiple trials with a
variety of guests, all with only
a minimum of prior knowledge
about the system. These trials have
then been assessed on the basis of
technical functionality, feedback,
aesthetic choices and interpretations
of the guests' intuitive exploration
as affected by spatial discretization,
variations in sound and arrangement logics.
Technical (coding skill) progress
[Timeline figure: Feb. – Jun.]
With no prior coding knowledge,
the project started off with a great
deal of time spent on learning first
Python, and later on also a bit of
C++. The initial coding goal was
to script a synthesizer in Python that
turned MIDI signals into sound
waves in so-called "real time". This
was done by making a dummy
program in which my mouse, moving
over certain areas of the screen,
triggered the MIDI signals for the
synthesizer to play, substituting for
the tracked movement data. Such
methods of movement simulation
have proved extremely useful
throughout the project.
To optimize the speed of the system,
zone calculations were done by
closest point rather than by
inside/outside Brep tests, which
proved far too slow; a minimal sketch
of the closest-point lookup follows.
Two months in, I was finally
ready to start testing.
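
The sketch below is simplified from the annex scripts and is not the project script itself; the flat list of zone centers is illustrative.

# Brute-force closest-point search: the tracker position is assigned to the
# zone whose center is nearest, instead of testing containment in each box.
zones = [(x, y, z) for x in range(-2, 3) for y in range(-2, 3) for z in range(4)]

def closest_zone(px, py, pz):
    best_id, best_dist = None, None
    for i, (zx, zy, zz) in enumerate(zones):
        dist = ((zx - px)**2 + (zy - py)**2 + (zz - pz)**2) ** 0.5
        if best_dist is None or dist < best_dist:
            best_id, best_dist = i, dist
    return best_id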
System arrangement
The system works
through a series of scripts, all relating
back to the motion capture input
data. The input data comes as
coordinates, which are translated
both into MIDI signals that end up as
sound waves through the speakers,
and into electric signals that light up
the LED panels.
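
A condensed sketch of that chain (the UDP port and serial device match the annex scripts; the MIDI message is a placeholder, since the zone lookup is omitted):

import socket, mido, serial

# Motion capture broadcasts coordinates over UDP; each packet fans out to
# MIDI (synthesizer -> speakers) and serial (Arduino -> LED panels).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 5061))
midi_out = mido.open_output()
led_port = serial.Serial('/dev/cu.usbmodem1301', 9600)

while True:
    data, _ = sock.recvfrom(1024)
    tracker_id, x, y, z = data.decode("utf-8").split(", ")
    midi_out.send(mido.Message('note_on', note=60, velocity=64))  # placeholder note
    led_port.write(b'A')  # panel group for the triggered zone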
Test / Guest setup
[Diagram: 8x tracking cameras, video camera position, trackable area]
Trials
1st test
System performance and Spatial discretization
What I am initially testing is the
functionality of the marker glove;
whether or not the movements are
translated into sound fast enough
to feel like "real time" (over 10 ms
is no good); and the merits and flaws
of this specific discretization of the
space. A rough latency check is sketched below.
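
A rough sketch of how such a latency budget can be checked; process_frame is a hypothetical stand-in for one receive-lookup-send cycle:

import time

t0 = time.perf_counter()
process_frame()  # hypothetical: receive coordinates, find closest zone, send MIDI
elapsed_ms = (time.perf_counter() - t0) * 1000
print(f"frame took {elapsed_ms:.1f} ms (budget: 10 ms)")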
Findings
I found that the glove was far too
rough on the skin, constantly
drawing attention to itself instead
of to the sounds. The movement/sound
correlation was on point, and the
string sounds mixed well with the
movements. The spatial arrangement
proved far too simple; in essence
more like a point where the 8 zones
meet than a spacious environment.
[Documentation: video recording, MoCap recording, audio recording, and an animation from the recorded MoCap data]
Dancers
Spatial discretization, Movement and Sound
For the next trials the tracker had
been smoothed and cloned into two
marker gloves. The discretization of
the space is still an array of boxes,
but now consists of 27 instead of 8,
with both the logic of higher notes
up and lower notes down and an
entirely random distribution being
tested.
Different sounds are also tested to
challenge the sound of strings.
Findings
The discretization still proved
too low in resolution, and the
homogeneous box array too simple.
The random distribution of notes
proved too chaotic, whereas
the logic of higher notes up and
lower notes down intuitively
increased the understanding of the
room; once this feature was realized,
it encouraged the dancers to look for
further hidden logics.
The rhythmic sounds made the
dancers more static; they don't
blend as well together as melodic
instruments.
Two trackers proved hard to handle;
the most frequent issue was the
motion capture software confusing
one tracker for the other, flickering
and muddling the soundscape.
Though initially designed as
two handheld trackers, the dancers
quickly introduced other ways of
equipping them: one leg and one hand,
two legs, both around the neck,
and two persons at the same time.
What they were missing were hidden
ways in which the two trackers connect;
what happens when they come close
or move far apart?
The dancers were overall thrilled
to test the system; dancing is at its
core something you do to music, but
this system allows for an entirely
new sensation of creating the music
as you dance it: a symbiosis of
architecture, dance and music, or,
as Stan Allen would call it, a Moiré
effect of several fields overlapping.
Various
Spatial discretization, Exploration and Stimulation
Another important test group is
the various: guests with no
professional relation to movement
or music trying the system.
The focus is again the spatial
discretization and how an intuitive
understanding of it is produced.
What pulling factors seem to be
missing with regard to the intuitive
exploration of the space?
And given a test system
that solely augments the soundscape,
what then catches the eye, and
how could this focus of the gaze
be utilized to deepen the intuitive
understanding of the space?
Findings
Nestor, who is 8, had certain height
limitations that kept him from
experiencing the higher octaves
above him, forcing him to
primarily trigger the notes of the
middle octave. And the notes of a
single octave proved too
simple a constellation to encourage
a deeper exploration of the space.
The space begs to be discretized by
more complex arrangements!
Thomas intuitively stood very
statically in the middle of the room,
missing hidden pull factors
that would encourage wider
exploration.
Both of them very clearly kept their
focus on the cameras, expecting some
sort of response from them.
Spatial Discretization
These trials all point towards a more
elaborate discretization of the space,
and that can be done in several ways.
You can increase the resolution
of the grid, or even mix different
resolutions together.
The discretization can be done
through a Voronoi layout, or of
course a combination of the
grid and the Voronoi.
Or you can start to introduce empty
areas, as a variation where no sound
is triggered.
Having two trackers raises the
possibility of a different
discretization for each.
You can introduce easter-egg zones
that trigger sounds greatly differing
from those of the majority of zones,
or easter eggs that only occur when both
trackers are close in overlapping
constellations; the latter idea is
sketched below. And it is possible to
introduce a surrounding zone that
announces that you have reached the
limit of the tracking cameras.
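
A hedged sketch of the two-tracker easter egg; the function name and the 0.3 m threshold are illustrative:

import math

# Fires only when both tracker positions (x, y, z) are close together,
# as in the overlapping constellations described above.
def trackers_close(right, left, threshold=0.3):
    return math.dist(right, left) < threshold

if trackers_close((0.1, 0.0, 1.2), (0.2, 0.1, 1.3)):
    print("easter egg: trigger the special sound")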
The Boundary
Having an installation as the design
proposal raises some questions, or
limitations, for the augmentation
system, the most essential one being:
what is the boundary of the physical
installation? The sound-augmented
field is in itself potentially vast, or
without a boundary, a fact
that I have chosen to reflect by looking
into the concept of infinity mirrors.
An infinity mirror is an optical
illusion produced by a one-way
mirror, a regular mirror and a light
source in between, which produces a
visible space containing an infinite
depth dimension.
The optical illusion changes
according to your perspective, and
this, in combination with the urge to
look into the vastness of the space,
creates a Rothko-like experience of
being pulled towards the illusion.
Having the light source as a
controllable LED strip offers the
possibility of activating a grid of
panels in correspondence with the
triggering of sound; a condensed
sketch of this follows. This deepens
both the intuitive understanding
of the field of sound and the
overall theme of interactivity: the
installation only offers something if
you engage it with movement.
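
Condensed from the main script in the annex: the tracker's x position selects one of four panel groups, sent to the Arduino as single-letter serial commands.

def panel_command(x):
    # Thresholds follow the trial setup; each letter addresses one LED group.
    if x > 0.55:
        return b'D'
    elif x > 0:
        return b'C'
    elif x > -0.55:
        return b'B'
    return b'A'

serial_comm.write(panel_command(pos_right_hand.x))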
Trials
Various
Spatial discretization and LED boundary
So, with both these new ways of
discretizing the space and the
visualization of the sound
augmentation through the boundary
panels, a new trial was conducted,
here with focus on the effects they
caused; the concept of easter-egg
zones was also introduced as a type
of hidden pull factor.
Findings
Arranging the zones like a 3D
Voronoi diagram greatly enhanced
the drive to explore; every corner of
the system became a new discovery.
The trackable area of the Lighting
Lab proved too small, though,
with easter eggs and zones in general
lying outside the reachable area.
The concept of pitch bending
was interesting, though in this case
implemented through sounds that
varied too much from the overall
soundscape; a different MIDI setup
is needed for it to blend smoothly
with the rest. Audible
spatial concepts like reverb and
areas of vibrato proved to be strong
tools for exploration.
And the screens were a big hit! They
instantly fixate the gaze, and through
visual feedback they greatly improved
the sensation of getting to know the
system as you explore it. A possible
improvement would be to have each
tracker represented by a different
color of light, which when combined
or close to each other would produce
a new, third visualization.
The Installation
The installation space I am proposing
is an almost spherical space with a
walkable platform, surrounded by a
grid of infinity-mirror panels and
speakers. The spherical setup puts
the sound-augmented field at the
center and at the same time allows
the panels to extend infinitely in
almost all directions.
Features
The spatial discretization mixes
resolutions in a radial grid, becoming
denser and more chaotic the further
you move from the initiation spot. The
notes all follow the logic of lower
notes at the base and higher notes at
the top.
The sound field features a reverb
effect that becomes stronger the closer
you are to the boundary, thereby
augmenting, through sound, the
spatial sensation of the room: it feels
larger the closer you are to the
infinite spaces of the panels. A minimal
sketch of this mapping follows.
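
The sketch is condensed from the main script, where the distance to the room center is sent as MIDI control change 91 and 3.6 m is the approximate reachable maximum:

import mido

def reverb_message(x, y, z, max_reach=3.6):
    amount = ((x**2 + y**2 + z**2) ** 0.5) / max_reach  # 0 at the center, ~1 at the boundary
    return mido.Message('control_change', control=91, value=min(127, int(amount * 100)))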
A series of hidden pull factors is
placed around the field, one of
them being the vibrato effect, which
gradually increases over a range of
zones, becoming more and more
present the closer you are to the
source.
And then there are the hidden
easter-egg zones: zones that
produce a pitch-bending effect,
the chronological octave
stack in the middle of the room,
the two green zones that when
triggered simultaneously create
an audible spinning effect around
the installation, a holy choir that
occurs when both hands are
above your head right in the middle,
and simpler effects like the
sound of rain being triggered in
certain zones.
And surrounding it all, an
intensifying pitch declares that
you have reached the edge of the
trackable area.
Entrance / Initiation
The installation requires the marker
gloves, which in themselves can be a
little intrusive. To poetically embed
these in the installation, the first
interaction with it is
the simple question: "Do you accept
cookies?"
Conclusion
The performance of
modern marker-based tracking
technology is fast enough for this
kind of system to feel like real time,
even for a musician.
The natural boundary of the
Lighting Lab proved too small,
and the trials would potentially have
benefited from being conducted in
larger spaces, where you can move
over larger distances.
A spatial discretization that changes
throughout the field encourages
exploration; you want to see what's
around the corner, so to speak.
But it should be balanced: too few
zones make the field boring to explore,
yet the same happens if there are
too many, and the same goes for
grid-like versus Voronoi-like arrangements.
Slight sound alterations that
change gradually across the field
encourage movement over greater
distances: where does the effect end?
Easter eggs add the notion of never
having explored the field completely;
there could always be another
hidden realization.
Melodic sounds led guests to
intuitively explore wider ranges of
the field, whereas rhythmic sounds
induced a more static, repetitive
exploration.
The infinity mirror as a boundary
is very effective in widening the
perception of space.
Visualization stimulates a feeling of
getting to know the system; the ear
and the eye experience different
expressions of the same thing.
Guests
Miriam Enguita
Dancer
Nana Anine
Dancer
Thomas Nordkap
Philosopher
Olatz Cantón
Dancer
Amalie Leth
Designer
Nestor Fraile Henning
Elementary School
Marie Holst
Designer
Peter Knoblauch
Historian
Anders Nordkap
Musician
Søren Henning
Architect
Annex
LED-wall
I've stacked four recordings of the
panels together to visually and
audibly represent the movement of
a single tracker, showcasing the
connection between the sound and
the light, both triggered by movement
and then slowly dissolving their
presence.
Early sketches
Scripts
Properties.py - Main script
import socket, subprocess, time
import pygame, pygame.mixer, rtmidi, mido, serial
# import cProfile
# import pstats
# profiler = cProfile.Profile()

# --- Initialize ---
pygame.init()
pygame.mixer.init()
midiout = rtmidi.MidiOut()
available_ports = midiout.get_ports()
if available_ports:
    midiout.open_port(0)
else:
    midiout.open_virtual_port("Virtual Output")
serial_comm = serial.Serial('/dev/cu.usbmodem1301', 9600)
serial_comm.timeout = 1
# --- Setup ---
def closedown():
    print()
    print()
    print("Over and out")
    print()
    print()

def set_client():
    client.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    client.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    client.settimeout(2.0)
    client.bind(('', 5061))
    return client
class Position:
    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z

    @property
    def live(self):
        return self.x, self.y, self.z

    @live.setter
    def live(self, xyz):
        x, y, z = xyz[0], xyz[1], xyz[2]
        self.x = float(x)
        self.y = float(y)
        self.z = float(z)

    @property
    def previous_pos(self):
        return self.x_o, self.y_o, self.z_o

    @previous_pos.setter
    def previous_pos(self, xyz):
        x, y, z = xyz[0], xyz[1], xyz[2]
        self.x_o = x
        self.y_o = y
        self.z_o = z

    @property
    def dist_to_right(self):
        return self.dist_right

    @dist_to_right.setter
    def dist_to_right(self, dist):
        self.dist_right = float(dist)

    @property
    def dist_to_left(self):
        return self.dist_left

    @dist_to_left.setter
    def dist_to_left(self, dist):
        self.dist_left = float(dist)

    # relocated here from the zone-layout section, where the printed
    # version had misplaced it mid-loop
    @property
    def procent_to_center(self):
        return self.procent_center

    @procent_to_center.setter
    def procent_to_center(self, procent):
        self.procent_center = float(procent)

# moved above the zone layouts, which instantiate it
class Zone:
    def __init__(self, x, y, z, id, note, special=0, played=0):
        self.x = x
        self.y = y
        self.z = z
        self.id = id
        self.note = note
        self.special = special
        self.played = played

    @property
    def play(self):
        return self.played

    @play.setter
    def play(self, played):
        self.played = int(played)
setup_notes = [4, 7, 0, 10, 0, 10, 2, 7, 10, 4, 7, 0, 2, 0, 4, 2, 10, 7, 0, 10,
10, 0, 7, 2, 2, 4, 10, 4, 7, 2, 4, 10, 4, 10, 2, 4, 0, 2, 10, 7,
7, 2, 4, 0, 4, 10, 2, 7, 0, 0, 0, 0, 10, 4, 7, 2, 2, 7, 4, 0,
0, 7, 2, 10, 10, 4, 7, 0, 2, 7, 10, 4, 7, 2, 0, 10, 4, 0, 2, 7,
2, 10, 0, 7, 7, 0, 4, 2, 4, 2, 0, 0, 0, 4, 7, 2, 2, 10, 0, 4]
# --- Zones RIGHT layout ---
base_note_right = 50
x_step, x_dom_s, x_dom_e = 5, -2, 2
y_step, y_dom_s, y_dom_e = 5, -2, 2
z_step, z_dom_s, z_dom_e = 4, 0, 3
zones_right = []
counter = 0
for x in range(x_step):
    for y in range(y_step):
        for z in range(z_step):
            x_p = round(x_dom_s + (x * (abs(x_dom_s) + abs(x_dom_e)) / (x_step - 1)), 3)
            y_p = round(y_dom_s + (y * (abs(y_dom_s) + abs(y_dom_e)) / (y_step - 1)), 3)
            z_p = round(z_dom_s + (z * (abs(z_dom_s) + abs(z_dom_e)) / (z_step - 1)), 3)
            if x > 2 and y > 2 and z > 1:
                # special (easter-egg) corner of the grid
                note_p = base_note_right + 24
                zones_right.append(Zone(x_p, y_p, z_p, counter, note_p, 1))
            else:
                note_p = int(setup_notes[counter]) + int(12 * z_p)
                zones_right.append(Zone(x_p, y_p, z_p, counter, note_p))
            counter += 1
# --- Zones LEFT layout ---
base_note_left = 38
x_step, x_dom_s, x_dom_e = 5, -2, 2
y_step, y_dom_s, y_dom_e = 5, -2, 2
z_step, z_dom_s, z_dom_e = 4, 0, 3
zones_left = []
counter = 0
for x in range(x_step):
    for y in range(y_step):
        for z in range(z_step):
            x_p = round(x_dom_s + (x * (abs(x_dom_s) + abs(x_dom_e)) / (x_step - 1)), 3)
            y_p = round(y_dom_s + (y * (abs(y_dom_s) + abs(y_dom_e)) / (y_step - 1)), 3)  # printed version had abs(z_dom_e), presumably a typo
            z_p = round(z_dom_s + (z * (abs(z_dom_s) + abs(z_dom_e)) / (z_step - 1)), 3)
            note_p = int(setup_notes[counter]) + int(12 * z_p)
            zones_left.append(Zone(x_p, y_p, z_p, counter, note_p))
            counter += 1
# --- Live instance and initial variables ---
pos_right_hand = Position(0, 0, 0)
pos_right_hand.previous_pos = (0, 0, 0)
pos_left_hand = Position(0, 0, 0)
pos_left_hand.previous_pos = (0, 0, 0)
christmas_special = Position(1.5, 1.5, 2.5)
center = Position(0, 0, 0)
fps = 200
timer = pygame.time.Clock()
last_right = 0
laster_right = 0
last_left = 0
# profiler.enable()
try:
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    set_client()
    data, addr = client.recvfrom(1024)
    if data:
        synth = subprocess.Popen(['python', '/Users/jeanhonning/Desktop/2./MIDI/newsynth.py'])
        time.sleep(1)
    while True:
        data, addr = client.recvfrom(1024)
        id_data, x_data, y_data, z_data = data.decode("utf-8").split(", ")
        if int(id_data) == 1:
            pos_right_hand.live = float(x_data), float(y_data), float(z_data)
        elif int(id_data) == 2:
            pos_left_hand.live = float(x_data), float(y_data), float(z_data)
        clst_pt_right = None
        clst_pt_left = None
        speed_right_hand = int(((((pos_right_hand.x_o - pos_right_hand.x)**2 +
                                  (pos_right_hand.y_o - pos_right_hand.y)**2 +
                                  (pos_right_hand.z_o - pos_right_hand.z)**2)**(1/2)) * 200) * 15)
        # *200 converts per-frame distance to m/s; *15 is the velocity multiplier
        if speed_right_hand > 100:
            speed_right_hand = 100
        speed_left_hand = int(((((pos_left_hand.x_o - pos_left_hand.x)**2 +
                                 (pos_left_hand.y_o - pos_left_hand.y)**2 +
                                 (pos_left_hand.z_o - pos_left_hand.z)**2)**(1/2)) * 200) * 15)
        if speed_left_hand > 100:
            speed_left_hand = 100

        # --- Brute force distances ---
        for zone in zones_right:
            zone.dist_to_right = ((zone.x - pos_right_hand.x)**2 +
                                  (zone.y - pos_right_hand.y)**2 +
                                  (zone.z - pos_right_hand.z)**2)**(1/2)
            if clst_pt_right is None or zone.dist_to_right < clst_pt_right:
                clst_pt_right = zone.dist_to_right
        for zone in zones_left:
            zone.dist_to_left = ((zone.x - pos_left_hand.x)**2 +
                                 (zone.y - pos_left_hand.y)**2 +
                                 (zone.z - pos_left_hand.z)**2)**(1/2)
            if clst_pt_left is None or zone.dist_to_left < clst_pt_left:
                clst_pt_left = zone.dist_to_left

        # --- Distance to center (0, 0, 0), for the reverb effect ---
        pos_right_hand.procent_to_center = ((center.x - pos_right_hand.x)**2 +
                                            (center.y - pos_right_hand.y)**2 +
                                            (center.z - pos_right_hand.z)**2)**(1/2) / 3.6
        # 3.6 m is approximately the max reachable distance
        # --- Pitch to apply in pitch_zone ---
        pitch = (((((pos_right_hand.z - christmas_special.z) - (-1)) * 16000) / 2) + (-8000))  # remapping them pitches
        if pitch < -8000:
            pitch = -8000
        elif pitch > 8000:
            pitch = 8000

        # --- Music ---
        for zone in zones_right:
            if zone.dist_to_right == clst_pt_right and zone.special == 1:
                note_on = mido.Message('note_on', channel=2, note=zone.note, velocity=100).bytes()
                pitch_effects = mido.Message('pitchwheel', channel=2, pitch=int(pitch)).bytes()
                midiout.send_message(note_on)
                midiout.send_message(pitch_effects)
            if zone.dist_to_right == clst_pt_right and not last_right == zone.id:
                if zone.special == 1:
                    note_on = mido.Message('note_on', channel=2, note=zone.note, velocity=100).bytes()  # REMEMBER CHANNEL 2!!
                    p_effects = mido.Message('pitchwheel', channel=2, pitch=int(pitch)).bytes()
                    midiout.send_message(note_on)
                    midiout.send_message(p_effects)
                    laster_right = last_right
                    last_right = zone.id
                    print(f"RIGHT! --- Zone ID: {last_right} --- MIDI Note: {zone.note} --- Effect: {int(pitch)} Pitchbend")
                else:
                    note_on = mido.Message('note_on', channel=0, note=base_note_right + zone.note, velocity=speed_right_hand).bytes()
                    effects = mido.Message('control_change', channel=0, control=91, value=int((0.01 + pos_right_hand.procent_center) * 100)).bytes()
                    midiout.send_message(note_on)
                    midiout.send_message(effects)
                    zone.play = 5
                    laster_right = last_right
                    last_right = zone.id
                    # x position selects the LED panel group over serial
                    if pos_right_hand.x > 0.55:
                        serial_comm.write(b'D')
                        print(f"D {pos_right_hand.x}")
                    elif 0 < pos_right_hand.x < 0.55:
                        serial_comm.write(b'C')
                        print(f"C {pos_right_hand.x}")
                    elif -0.55 < pos_right_hand.x < 0:
                        serial_comm.write(b'B')
                        print(f"B {pos_right_hand.x}")
                    elif pos_right_hand.x < -0.55:
                        serial_comm.write(b'A')
                        print(f"A {pos_right_hand.x}")
                    print(f"RIGHT! --- Zone ID: {last_right} --- MIDI Note: {base_note_right + zone.note} --- Effect: {int(pos_right_hand.procent_center * 100)}%")
        for zone in zones_left:
            if zone.dist_to_left == clst_pt_left and not last_left == zone.id:
                note_on = mido.Message('note_on', channel=1, note=base_note_left + zone.note, velocity=speed_left_hand).bytes()
                midiout.send_message(note_on)
                zone.play = 15
                last_left = zone.id
                print(f"LEFT! --- Zone ID: {last_left} --- MIDI Note: {base_note_left + zone.note}")

        # --- Note decay / note-off ---
        for zone in zones_right:
            if last_right != zone.id and laster_right == zone.id and zone.special == 1:
                note_off = mido.Message('note_off', channel=2, note=zone.note, velocity=0).bytes()
                midiout.send_message(note_off)
            zone.play -= 1
            if zone.play == 0 and zone.special == 0:
                note_off = mido.Message('note_off', channel=0, note=base_note_right + zone.note, velocity=0).bytes()
                midiout.send_message(note_off)
        for zone in zones_left:
            zone.play -= 1
            if zone.play == 0:
                note_off = mido.Message('note_off', channel=1, note=base_note_left + zone.note, velocity=0).bytes()
                midiout.send_message(note_off)

        pos_right_hand.previous_pos = pos_right_hand.x, pos_right_hand.y, pos_right_hand.z
        pos_left_hand.previous_pos = pos_left_hand.x, pos_left_hand.y, pos_left_hand.z
        timer.tick(fps)
except KeyboardInterrupt:
    # profiler.disable()
    # stats = pstats.Stats(profiler).sort_stats('tottime')
    # stats.print_stats()
    closedown()
    exit()
Newsynth.py - Synthesizer
import pyaudio
from pygame import midi
import fluidsynth

BUFFER_SIZE = 256
SAMPLE_RATE = 44100
NOTE_AMP = 0.1

# Multiple tracks for each rigidbody
fl = fluidsynth.Synth(1.0)
sfid = fl.sfload('/Users/jeanhonning/Desktop/2./MIDI/FluidR3_GM/FluidR3_GM.sf2')
# sfid = fl.sfload('/Users/jeanhonning/Desktop/2./MIDI/FluidR3_GM/ModWaves.sf2')
# fl.program_select(0, sfid, 0, 46)  # 46 is the harp
# fl.program_select(1, sfid, 0, 32)  # 32 is an acoustic bass
# fl.program_select(2, sfid, 0, 1)
fl.program_select(0, sfid, 0, 1)  # 91 space voice
fl.program_select(1, sfid, 0, 1)  # 114 steel drum
fl.program_select(2, sfid, 0, 1)

# -- INITIALIZATION --
midi.init()
default_id = midi.get_default_input_id()
midi_input = midi.Input(device_id=default_id)
stream = pyaudio.PyAudio().open(
    rate=SAMPLE_RATE,
    channels=1,
    format=pyaudio.paInt16,
    output=True,
    frames_per_buffer=BUFFER_SIZE,
)
chan_0 = 0
chan_1 = 1
chan_2 = 2

# -- RUN THE SYNTH --
try:
    print()
    print("Synth is activated...")
    print()
    notes_dict = {}
    effects = None
    bend = 0
    while True:
        if notes_dict:
            samples = fl.get_samples(BUFFER_SIZE)  # was fl.get_samples(note); `note` is undefined until the first MIDI event
            if effects is not None:
                fl.set_reverb_roomsize(effects / 100)
            fl.pitch_bend(chan_2, bend)
            samples = fluidsynth.raw_audio_string(samples)
            stream.write(samples)
        if midi_input.poll():
            for event in midi_input.read(num_events=16):
                (status, note, vel, _), _ = event
                if status == 0x80:
                    notes_dict[note] = fl.noteoff(chan_0, note)
                elif status == 0x81:
                    notes_dict[note] = fl.noteoff(chan_1, note)
                elif status == 0x82:
                    notes_dict[note] = fl.noteoff(chan_2, note)
                elif status == 0x90:
                    notes_dict[note] = fl.noteon(chan_0, note, vel)
                elif status == 0x91:
                    notes_dict[note] = fl.noteon(chan_1, note, vel)
                elif status == 0x92:
                    notes_dict[note] = fl.noteon(chan_2, note, vel)
                elif status == 0xB0:
                    effects = vel
                elif status == 0xE2:
                    bend = -8192 + (vel * 128 + note)
except KeyboardInterrupt as err:
    midi_input.close()
    stream.close()
    # print(" - Stopping...")

Stream_out.py - Dummy signal

import time, socket

x_1 = 0
y_1 = 0
z_1 = 0
id_1 = 1
counter = 0
x_2 = 0
y_2 = 0
z_2 = 0
id_2 = 2
stream = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
stream.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
stream.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
while True:
    # broadcast two simulated tracker positions, mimicking the MoCap feed
    print(f"{id_1}, {x_1}, {y_1}, {z_1}")
    print(f"{id_2}, {x_2}, {y_2}, {z_2}")
    msg = f"{id_1}, {x_1}, {y_1}, {z_1}"
    stream.sendto(msg.encode("utf-8"), ("<broadcast>", 5061))
    msg = f"{id_2}, {x_2}, {y_2}, {z_2}"
    stream.sendto(msg.encode("utf-8"), ("<broadcast>", 5061))
    print("Sending")
    x_1 += 0.0501
    y_1 += 0.0302
    z_1 += 0.0803
    x_2 += 0.0701
    y_2 += 0.1002
    z_2 += 0.0403
    time.sleep(0.17)
    if x_1 > 2:
        x_1 = -1.999
    if y_1 > 2:
        y_1 = -1.999
    if z_1 > 3:
        z_1 = 1.501
    if x_2 > 2:
        x_2 = -1.999
    if y_2 > 2:
        y_2 = -1.999
    if z_2 > 1.5:
        z_2 = 0.001
LED_serial.ino - Arduino script
#define A_LED 9
#define B_LED 6
#define C_LED 5
#define D_LED 10

int aGO = 0;
int bGO = 0;
int cGO = 0;
int dGO = 0;
int aBright = 0;
int bBright = 0;
int cBright = 0;
int dBright = 0;

void setup() {
  Serial.begin(9600);
  pinMode(A_LED, OUTPUT);
  pinMode(B_LED, OUTPUT);
  pinMode(C_LED, OUTPUT);
  pinMode(D_LED, OUTPUT);
  delay(2000);
}

void loop() {
  // Single-letter serial commands charge a countdown per panel group
  if (Serial.available() > 0) {
    char command = Serial.read();
    if (command == 'A') {
      aGO = 25500;
    }
    else if (command == 'B') {
      bGO = 25500;
    }
    else if (command == 'C') {
      cGO = 25500;
    }
    else if (command == 'D') {
      dGO = 25500;
    }
    else if ((command == 'a') && (aGO < 8000)) {
      aGO = 8000;
    }
    else if ((command == 'b') && (bGO < 8000)) {
      bGO = 8000;
    }
    else if ((command == 'c') && (cGO < 8000)) {
      cGO = 8000;
    }
    else if ((command == 'd') && (dGO < 8000)) {
      dGO = 8000;
    }
  }
  // Brightness follows the countdown, so panels light up and slowly dim
  aBright = aGO / 100;
  bBright = bGO / 100;
  cBright = cGO / 100;
  dBright = dGO / 100;
  analogWrite(A_LED, aBright);
  analogWrite(B_LED, bBright);
  analogWrite(C_LED, cBright);
  analogWrite(D_LED, dBright);
  if (aGO > 0) { aGO--; }
  if (bGO > 0) { bGO--; }
  if (cGO > 0) { cGO--; }
  if (dGO > 0) { dGO--; }
}
Pygame_test.py - Initial dummy
import pygame, pygame.mixer, rtmidi, mido

# Initialize
pygame.init()
pygame.mixer.init()
midiout = rtmidi.MidiOut()
available_ports = midiout.get_ports()
if available_ports:
    midiout.open_port(0)
else:
    midiout.open_virtual_port("Virtual Output")

# Creating screen window
WIDTH = 800
HEIGHT = 800
fps = 60
timer = pygame.time.Clock()
screen = pygame.display.set_mode((WIDTH, HEIGHT))

# Displays
pygame.display.set_caption("sone")
icon = pygame.image.load("/Users/jeanhonning/Desktop/2./MIDI/images/icon.png")
pygame.display.set_icon(icon)
b1 = pygame.image.load('/Users/jeanhonning/Desktop/2./MIDI/images/note_d.png')
b2 = pygame.image.load('/Users/jeanhonning/Desktop/2./MIDI/images/note_f#.png')
b3 = pygame.image.load('/Users/jeanhonning/Desktop/2./MIDI/images/note_a.png')
b4 = pygame.image.load('/Users/jeanhonning/Desktop/2./MIDI/images/note_e.png')

# Hit boxes
LU = pygame.Rect(0, 0, 400, 400)
RU = pygame.Rect(401, 0, 400, 400)
LL = pygame.Rect(0, 401, 400, 400)
RL = pygame.Rect(401, 401, 400, 400)

sectors = [RL, LL, RU, LU]
sector_ids = [3, 2, 1, 0]
sector_note = [2, 7, 4, 0]
note_play = [0, 0, 0, 0]
notes_total = len(sectors)

def speedcheck():
    # Mouse speed substitutes for tracker speed as MIDI velocity
    speed_pix = pygame.mouse.get_rel()
    x = speed_pix[0]
    y = speed_pix[1]
    speed = int((abs(x) + abs(y)) / 4)
    if speed >= 124:
        speed = 124
    return speed  # return was missing; the computed speed was silently discarded

last = 0

# Interface loop
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    # Background color
    screen.fill((255, 255, 255))

    # Mouse tracking
    mouse_position = pygame.mouse.get_pos()

    # Collision
    N = notes_total
    for i in range(N):
        if sectors[i].collidepoint(mouse_position) and not last == sector_ids[i]:
            speed = speedcheck()
            note_on = mido.Message('note_on', channel=0, note=62 + sector_note[i], velocity=speed).bytes()
            midiout.send_message(note_on)
            note_play[i] = 30
            last = sector_ids[i]

    # Display notes
    B1 = screen.blit(b1, (1, 1))
    B2 = screen.blit(b2, (401, 1))
    B3 = screen.blit(b3, (1, 401))
    B4 = screen.blit(b4, (401, 401))

    # Timer tick
    timer.tick(fps)
    for i in range(N):
        note_play[i] -= 1
        if note_play[i] == 0:
            note_off = mido.Message('note_off', channel=0, note=62 + sector_note[i], velocity=0).bytes()
            midiout.send_message(note_off)

    # Update screen
    pygame.display.update()  # was pygame.display.update without parentheses
Fundamentals of Computation
import rhinoscriptsyntax as rs
import Rhino.Geometry as rg

# GhPython component: `frame` (a closed rectangular boundary curve) and
# `points` (the Voronoi seed points) appear to be component inputs;
# `diagram` is the output.
sites = []
steps = 199

frameLines = rg.Curve.DuplicateSegments(frame)
frameLineLen = []
for line in frameLines:
    frameLineLen.append(rg.Curve.GetLength(line))
frameMax = max(frameLineLen)

for i in range(len(points)):
    site = rg.Circle(points[i], 0.002).ToNurbsCurve()
    breps = rg.Brep.CreatePlanarBreps(site)
    if breps:
        brep = breps[0]
        sites.append(brep)

# Sample the frame as a dense grid of points
row1 = rs.DivideCurve(frameLines[0], steps, True, True)
move_vecs = []
for i in range(steps):
    pt1, pt2 = rs.AddPoint(0, 0, 0), rs.AddPoint(0, (frameMax / steps) * i + (frameMax / steps), 0)
    move_vecs.append(rs.VectorCreate(pt2, pt1))
pts = []
for i in range(steps):
    row = rs.CopyObjects(row1, move_vecs[i])
    for point in row:
        pts.append(point)
pts = pts + row1

# Voronoi by brute force: each sample point joins the region of its closest seed
regions = [[] for _ in range(len(points))]
for i in range(len(pts)):
    clst_pt = rs.PointArrayClosestPoint(points, pts[i])
    regions[clst_pt].append(pts[i])

diagram = []
for i in range(len(regions)):
    reg_m = rg.Mesh.CreateFromTessellation(rs.coerce3dpointlist(regions[i]), None, rg.Plane.WorldXY, False)
    diagram.append(reg_m)