11
May/18

Scaffolding an introduction to Scratch

In anticipation of Scratch Day we have been experimenting in the Tinkering Studio with the awesome video sensing ability that is built into Scratch 2.0. As usual, we spent some time playing and prototyping ideas that make use of video sensing internally before trying it out with the public. One of the big hurdles in the context of the Tinkering Studio is how to get visitors engaged meaningfully, within a short time, with something quite complex: programming an interactive animation using an unfamiliar environment (Scratch) and advanced tools (video sensing from a webcam). Here is the way I approached it today.

I started with a simple program running: a parrot is flying on the screen, and when “captured” by the net it disappears. To make it reappear you have to shake the tree with the net. This is achieved simply by having the sprites respond to the color of the net (off-white). I first encouraged kids to play with the program as it was set up and pointed out the various parts, like the camera, the virtual sprites, and the code that was animating the parrot. After a while I asked if they would like to add their own character to the animation and make it do something. They all enthusiastically said yes! I encouraged them to make their own using the construction paper available.

Video sensing with Scratch

Once a character was created, I helped them bring it into Scratch using the camera function and the magic wand to get rid of the background. Having a camera already mounted, pointing straight down at the stage, made this super easy and fast. Having a character they created in physical form be transported into a virtual world was at once magical and meant kids were immediately invested in what happened to it. I asked a pretty open-ended question, “What would you like your character to do?”, and gently nudged them if necessary toward thinking about movement first. How should it move?

Video sensing with Scratch

From there I pointed out the Motion section of Scratch and had them drag a couple of initial blocks (like move 10 steps and turn 15 degrees) into the scripts area, click on them, and notice what happened to the sprite. I encouraged them to play around with the values and notice how the movement changed, then to snap two or more blocks together and see what happens when you string commands together. Finally I revealed the repeat and forever loops as a way to avoid clicking repeatedly on the code and to make the character move autonomously. This led to a more intentional phase of experimentation with values and blocks to see what kind of movement they could get out of their character. I found it very interesting that every kid had a clear idea of how their character should move, determined by the nature of the character itself, and that led to very different bits of code and behaviors. A butterfly moves very differently than a dragon, naturally!
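For readers curious about what those motion blocks actually compute, here is a minimal Python sketch of a strip of move/turn blocks inside a repeat loop. This is my own illustration, not Scratch's implementation; it only borrows Scratch's convention that a heading of 90 degrees points right.

```python
import math

def run(blocks, repeat=1):
    """Simulate a strip of Scratch motion blocks:
    ('move', steps) moves along the current heading,
    ('turn', degrees) turns clockwise."""
    x, y, heading = 0.0, 0.0, 90.0  # Scratch sprites start facing right (90 degrees)
    for _ in range(repeat):
        for op, value in blocks:
            if op == 'move':
                # in Scratch, heading 0 is up, so x uses sin and y uses cos
                x += value * math.sin(math.radians(heading))
                y += value * math.cos(math.radians(heading))
            elif op == 'turn':
                heading = (heading + value) % 360
    return x, y, heading

# repeating "move 10 steps, turn 15 degrees" 24 times traces a closed polygon,
# bringing the sprite back to where it started
x, y, heading = run([('move', 10), ('turn', 15)], repeat=24)
```

Tweaking the step size, turn angle, or repeat count changes the path, which is exactly the kind of experimentation the kids were doing by clicking on values.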

Video sensing with Scratch

Once the visitors were satisfied with the movement of their creature, I re-introduced the net from the beginning, asking them now what should happen to the character when it is captured by the net. Once again, every kid had a different idea in mind for what their character should do in that situation!

Jade’s butterfly moves erratically and very fast on the screen, and when captured by the net it disappears for 10 seconds, then reappears.

Jailen and Jayden’s dragon (the similarity between all the kids’ names is purely coincidental!) glides smoothly on the screen and when captured it breathes fire. Of course a dragon’s fire breath is blue, didn’t you know? In this case it also required a trip to the Costumes tab where the kids duplicated their sprite and hand-drew the flame, then we worked out how to switch costumes based on whether the net was touching the sprite or not.

The most interesting part to me is that when I introduced the net as an interactive device, the first thing kids said was something along the lines of “I want it to do X when the net catches it.” When I pointed out that the computer doesn't know about the net and can only detect color or motion, everyone came up on their own with the solution of having the sprite react when touching “white.” I think this is a good example of abstracting a high-level goal into a set of instructions that a computer can understand and work with.
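The kids' “touching white” solution boils down to a color-threshold test on camera pixels. Here is a minimal Python/NumPy sketch of that idea, run on a synthetic frame instead of a live webcam. This illustrates the technique only; it is not Scratch's actual sensing code.

```python
import numpy as np

def touching_color(frame, sprite_box, target, tol=30):
    """True if any pixel under the sprite's bounding box is within
    `tol` of the target RGB color (here, the off-white net)."""
    x, y, w, h = sprite_box
    region = frame[y:y + h, x:x + w].astype(int)
    # per-pixel distance from the target color (worst channel)
    distance = np.abs(region - np.array(target)).max(axis=-1)
    return bool((distance <= tol).any())

# synthetic 100x100 RGB frame: black background with an off-white "net" patch
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:60, 40:60] = 245

caught = touching_color(frame, (45, 45, 10, 10), target=(255, 255, 255))  # sprite over the net
missed = touching_color(frame, (0, 0, 10, 10), target=(255, 255, 255))    # sprite far away
```

The tolerance parameter matters in practice: an “off white” net under museum lighting is never a pure (255, 255, 255).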

Each of these interactions lasted around 20 to 30 minutes, and I think that for such a short engagement it resulted in meaningful and authentic exploration of programming, Scratch, and a fairly sophisticated technology like video sensing. This is definitely a more scaffolded and guided approach than we usually adopt in the Tinkering Studio with lower-threshold activities, but perhaps in this case it is the better one. I also noticed that many of the parents who were not previously aware of Scratch were very impressed with how easy it is to introduce programming concepts and practices, and mentioned wanting to continue playing with it at home. The fact that Scratch is free and this particular approach only uses a webcam and readily available materials certainly contributed to their feeling they could do so easily.

10
May/18

Weaving Roundup

Since my last post about weaving way back in December, we've been continuing to try different approaches to tinkering with textiles. It's been an interesting journey, with lots of discoveries (and challenges!) along the way. Here's an overview of some of the big ideas we've experimented with.

Incorporating Technology
My initial excitement for exploring weaving as a topic came from its connections to computational thinking. With that starting point in mind, I did some research into computational connections and stumbled across this post on Medium that linked to a Processing file for a "digital loom." With this code, you can change variables to adjust color and pattern. My knowledge of Processing is pretty limited, so most of my exploration was about linking changes in variables to noticeable changes in the pattern. I definitely made some mistakes along the way, but eventually got the hang of knowing which variable to change to get the outcome I was looking for.

width x plus 80 width x plus 10
These examples show some glitches in my first attempts at playing with the variables.

starting to understand - b 10 w 20 test three - block size 50
These show some before and after examples of more successful tests in playing with the code for more purposeful outcomes.
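The digital loom's variables essentially index into a repeating over/under grid. As a rough analogue in Python (my own illustration, not the code from the Medium post), here is a tiny pattern generator where changing `block` and `shift` changes the draft, much like the variable tweaks shown above:

```python
def weave_pattern(width, height, block=2, shift=1):
    """Generate a weave draft: 1 = warp thread on top, 0 = weft on top.
    `block` sets the size of the repeat; `shift` staggers the rows."""
    return [[(x // block + (y * shift) // block) % 2
             for x in range(width)]
            for y in range(height)]

# print an 8x4 draft; tweak block/shift and rerun to change the pattern
for row in weave_pattern(8, 4, block=2, shift=1):
    print(''.join('#' if cell else '.' for cell in row))
```

With `block=1, shift=1` this degenerates to a plain-weave checkerboard; larger values produce the blockier, twill-like patterns seen in the screenshots.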

While the digital loom was a fun foray into incorporating technology, the question remained: how could we turn these patterns into physical woven objects? For help on that, I turned to our longtime collaborator Stacy Speyer. In addition to designing playful ways to explore polyhedra, Stacy is a talented weaver and knows tons about making connections between math and the physical world. She introduced us to iWeaveit, a software platform that lets you create digital designs that can be easily translated to a loom. We spent a few afternoons tinkering with designs in the software, then using her table loom to turn them into a woven piece of cloth. Which brings me to our next phase of prototyping...

IMG_9645 IMG_9648

Playing with Real Tools
Having the opportunity to prototype with a real table loom was one of my favorite parts of the process. Using the loom makes creating fabric fast and smooth; it's amazing to me that simply lifting and lowering levers can make patterns appear so quickly. Our friends at MAKESHOP at the Children's Museum of Pittsburgh have a floor loom they keep out all the time for visitors as an exhibit; they use all sorts of interesting found materials for weaving and keep the experience very open ended. I'd love to be able to try something like this in the Tinkering Studio someday.

IMG_9653

Making the Materials Accessible
Since at the moment we're not equipped to have a workshop full of real looms for visitors, we brainstormed other ways of making the process more accessible by using simple, everyday materials. One idea we came across was that of a cardboard loom. By cutting evenly spaced notches into sheets of cardboard and looping threads of yarn into those spots, you can effectively make a simple loom structure for needle weaving. The photo below shows an early prototype where I sketched a pattern by hand, then wove it using a cardboard loom. There are sections of plain weave above and below the zigzag pattern.

IMG_6310

While the process of needle weaving is much more accessible in terms of tools and materials, once you've tried a real loom it feels a lot more...tedious. I tried to design a pattern in iWeaveit and then recreate it using the cardboard loom, but I only got a few rows in before deciding to take an extended break because it took forever.

IMG_6309

One material that helped solve this for me was chunky knitting yarn. It's really bulky, so the process goes a lot faster. We also discovered that the narrow looms work well for plain weave but are less helpful if you want to develop a pattern, since there's limited space to work with.

IMG_6308

As a next step, we'll be trying this process on the floor with visitors sometime this summer after Maker Faire. I still have lots of questions about this activity. How can we make it more tinkerable? Are there ways to make the connections between weaving and computation clearer without making it feel like a force fit? What's the line between exploring a process and following step-by-step instructions for this activity? How can we relate this practice to the cultural traditions of weaving found all over the world? Even though we don't have these questions answered yet, I think this puts us in a good place for future exploration with visitors to the Tinkering Studio because we'll have so much to learn!

06
May/18

Creating the app for SOUND SAFARI at Tinkering Studio

This is a guest post written by Tinkerer in Residence, Keina Konno. Keina is a Video & Device Engineer in the InterLab at the Yamaguchi Center for Arts and Media (YCAM) in Yamaguchi, Japan. Keina participated in a year-long residency at the Exploratorium, where she spent half her time with the New Media group in Exhibits and Media Services and half her time in the Tinkering Studio. During her time with us, Keina conceived and developed a computer app to record and play back sounds for a prototype activity, "Sound Safari."

I joined the Tinkering Studio for a fellowship last October. During the six-month residency, the Tinkering Studio team started to create a new sound-related activity called "SOUND SAFARI." The activity uses a piezo sensor (a contact microphone that picks up vibrations) to find new sounds in the world surrounding us. Many everyday materials can make great sounds.

26303061888_68841af8c3_k.jpg

First, we created a paper worksheet for sound archiving because we didn't have a way to record the great sounds participants made. The worksheet had spaces for writing down the name of the sound, what was used to make it, and what it sounded like, plus a space for drawing the setup for making the sound.

38282240655_11db0a243b_k.jpg

The updated worksheet was laid out like a trading card and overall looked much nicer than before!

The first version of the worksheet wasn't very attractive. But even so, it worked for saving and archiving the sounds. Our next step was to create an application to archive sounds on a computer or iOS device.

We found a device that makes this possible: the "iRig" connects a piezo sensor to an iOS device and records the sound from the piezo. The connection is a little tricky because the piezo sensor generates its own small electrical signal, and plugging it straight in could easily damage the computer, so we needed an interface between the piezo sensor and the computer.

We then started to create a computer app using the iRig. I used "openFrameworks," an open source C++ toolkit, to make the prototype app. We repeated the prototyping cycle of making and updating the prototype app, using it on the "floor" of the museum, and discussing what worked and what didn't.

This is the first prototype app.

40634605182_cd75ea54a1_k.jpg

Making "good affordance" in the app was a challenge for me. I realized the app should be as non-verbal as possible. When I create something, I tend to depend on the common sense of my native language (for example, in the names I would assign to categories in the UI). So I tried to use as little language as possible in the app's user interface. The risk with minimal text is that the app may lack sufficient explanation of how to use it. However, the lack of text was a design strategy that makes the app more universal and creates "good affordance" for everyone.

At first, we thought about adding a typing function to the app, but decided that it would not be a good user experience. Still, we learned that letting participants give a name to a sound is really valuable: when a participant assigns a color and name to a sound, they think, "It is my sound!"

26303060038_189f771032_k.jpg

Every participant approaches the app in a different way. Some people are really into finding a new sound with the piezo sensor and aren't that interested in recording. And some people are into making their own music like a DJ with the playing function in the app.

25043188377_63c42c5aa0_o.jpg

Touch displays and iOS devices still present a few challenges for controlling the app. But even so, using an iPad is very intuitive, and it seems easier for young kids than a mouse and keyboard on a normal computer.

Through creating the app, we realized there are many possibilities for the app and the activity. But it is still a work in progress and not a final product.

39356547845_ff6a25ba35_o.jpg

The Tinkering Studio is a great place for participants and creators to "keep making." This way of working was quite new for me because I always have deadlines when making something. At first, it felt easy because I could take as long as I wanted to make something. Now I realize it is great because this approach lets us focus more on the "user experience": we can keep finding new problems and new points of view each time we test.

I learned many things from my experience spending time with great tinkerers who work at the Tinkering Studio. I hope I can visit again soon!

From last October to the end of this February, I visited the Tinkering Studio at the Exploratorium in San Francisco as an intern and took part in creating "SOUND SAFARI," a new workshop then under development. In SOUND SAFARI, participants use a contact microphone, a device that converts physical vibration into sound (an electrical signal), to discover the "new sounds" of the things around them. Through these discoveries, the activity aims to get participants thinking about how sounds are produced and paying attention to the characteristics, such as texture, of everyday objects.

First, we created a worksheet for writing down a "recipe" for each sound: the name of the discovered sound, how it was discovered, and a diagram of the setup that produced it. The goal was that by reading a worksheet, people could deepen their understanding of everyday materials, and others could imitate a sound or add their own variations. However, the worksheet made it difficult to capture "the moment of discovering a sound" as it happened, so we needed to find a faster, easier way to archive the sounds.
Next, we set up a system for recording discovered sounds on a computer. One slightly tricky point is that the piezo element in a contact microphone generates a weak electrical signal when it senses vibration, and connecting it directly to an iOS device or computer risks damaging the device. We found a product called the "iRig," a device for recording and amplifying the sound of acoustic guitars and similar instruments on a computer, and decided to use it. On top of that, I built a recording application using openFrameworks, a programming toolkit, which completed the recording system.

From there, we repeated a cycle of using the system with visitors on the "floor," the space where workshops are run, bringing the feedback back to the team, discussing it, and making updates. Designing "good affordance" in the application was challenging for me. In particular, making the app understandable non-verbally, so that it could be used smoothly in a language environment different from my own, sounds obvious when put into words, but actually implementing it properly was far harder than I imagined.

Implementing a feature that lets participants assign a title and color to a sound made them feel attachment and ownership toward their sounds. Through many rounds of updates, various features were added and revised, and different relationships with participants emerged.
One participant used the simple playback function to immerse themselves in making new music like a DJ; another listened to their recordings over and over in order to discover different sounds in the same material. Each participant's distinct approach to the workshop was endlessly interesting and became the best possible material for further updates. We also built a version that uses a touch display, and SOUND SAFARI will surely keep evolving.

The Tinkering Studio is a place where both the staff who create workshops and the participants practice "keep making." For someone like me, who had always been chased by deadlines at work, this was fresh and stimulating. At first I thought, "With no deadlines, anything is easy," but now I keenly feel both how much this way of working achieves and how demanding it is. Continuing to discover new problems and to take on new challenges may be the most honest approach to putting the participant's experience (the user experience) first.

I look forward to applying what I learned at the Tinkering Studio to my own work and workplace, and to creating "something new" together with these wonderful members again.

23
Apr/18

Sound Safari - experiments with environment and activity design

IMG_9707

IMG_9658IMG_9657

After our initial Sound Safari sessions with visitors we decided to make adjustments to the workshop environment, tools and materials to contextualize and support the explorations we want to see. Making small changes and carefully observing how they affect the tinkering experience is a core part of our ongoing iterative activity design process.

Environment - Introducing visitors to pick-up microphones
IMG_9665

We always like to work with artists, using their art pieces as inspiration and to immerse visitors in the exploration. For this environment we were lucky to have one of Bryan Day's sound sculptures on loan to the Tinkering Studio. We placed the interactive sculpture near the entrance to the workshop area to introduce visitors to the idea of carefully investigating objects through sound, as well as sound harvesting with pick-up microphones. By playing with the sound sculpture, visitors grasped the concept of amplifying and investigating otherwise inaudible sounds with pick-up microphones without much explanation from facilitators.

IMG_9731
mother and daughter's first exploration of hidden sounds

IMG_9681

A table with interesting and unusual materials and a handheld pick-up mic with a speaker, set next to the art piece, invited visitors to explore more, guided by their own interests. The starter station and art piece helped visitors become comfortable with the tools and materials and open up to the idea of working at their own recording station for a longer amount of time.

Software tools - shaping the sound exploration towards quick recording and collecting sounds
During the first visitor tests we found people engrossed in exploring sounds, but not interested in recording them. New custom software by our collaborator Keina Konno, set up on a recording station for two people, significantly changed the way visitors explored sound.

IMG_9715

interface for representing and arranging recorded sounds on the screen

IMG_9676

young visitors quickly took ownership over the software tool

Just like the environment in the physical world, the software environment guides the investigation. In this case the software was set to record short five-second sound samples, and each sound was represented by a circle on the screen. This setup encouraged visitors to collect sound snippets and jump back and forth between investigating the object and recording.
We saw a few visitors thoughtfully naming and arranging sounds on the screen, and some made discoveries about combining sounds by playing them back at the same time.
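To make the mechanics concrete, here is a small Python sketch of the data model just described: named, colored, fixed-length clips whose simultaneous playback amounts to summing their samples. This is my own illustration, not the actual openFrameworks app.

```python
import numpy as np

RATE = 44100       # assumed sample rate for illustration
CLIP_SECONDS = 5   # the app records fixed five-second snippets

class SoundClip:
    """One recorded snippet: a named, colored circle on the screen."""
    def __init__(self, name, color, samples):
        self.name = name
        self.color = color
        # trim anything longer than the fixed clip length
        self.samples = np.asarray(samples, dtype=float)[:RATE * CLIP_SECONDS]

def mix(clips):
    """Playing clips back at the same time amounts to summing samples."""
    length = max(len(c.samples) for c in clips)
    out = np.zeros(length)
    for c in clips:
        out[:len(c.samples)] += c.samples
    return out

# two hypothetical harvested sounds, combined by simultaneous playback
squeak = SoundClip("squeak", "red", np.sin(np.linspace(0, 880, RATE)))
thump = SoundClip("thump", "blue", np.sin(np.linspace(0, 110, RATE // 2)))
combined = mix([squeak, thump])
```

Representing each clip as a named, colored object is what lets the interface stay non-verbal: the circle's color carries the identity, and naming is optional.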

We also saw more complex experiments when visitors worked in pairs: four hands helped arrange sound makers and position the microphone carefully to make small adjustments to the sounds.

Next steps
-Creating with sound
We would like to see more visitors create sound arrangements, or even personal stories and sound collages. Our current tools and software don't quite support creative work with sound, and we observed that most visitors couldn't sustain their engagement once they were done discovering new sounds. In our next phase of prototyping we will put effort toward creating context and tools for visitors to use their sounds in personally meaningful creations.

-Engaging young kids in sound explorations
One of the strong suits of Sound Safari that has persisted through the different iterations is that younger visitors are immediately engaged and motivated to explore, often driving the exploration while collaborating with their parents. We would like to build on this and revisit the idea of sound harvesting, maybe with a simple software tool that lets visitors capture video or images together with sound recordings.

This project was made possible through the generous support from the LEGO Foundation and the Simons Foundation.

The LEGO Foundation
Simons Foundation

05
Apr/18

Coding and storytelling with video sensing

Video sensing is a type of augmented reality that allows software to "see" objects in front of the camera and respond to them on-screen with animations, sounds, or any code one can imagine. We use a video feed instead of a Makey Makey or other keyboard input device. Interacting with a computer in such a playful and creative way seemed like a rich tool for computational tinkering activities. We decided to familiarize ourselves with this aspect of Scratch and have been exploring video sensing with Scratch for about a month now (see here and here).

Real world paper objects merged with computer animations on a computer screen.

I created Day to Night as an example of a starting place for visitors. The grass was handmade and imported as a sprite, and the Deanna sprite was left over from a Scratch project that AiR Tim Hunkin made.

video-sensing-setup

Techniques and Scratch Assets
We learned that Scratch can detect motion as well as color. The background color changes to a 50% opacity blue when the video sensing picks up a specified color (in this case, the "moon" on a stick that is light yellow). The code block below is assigned to a large, transparent rectangle sprite. It also triggers the bat to come out and removes the sun.

Screen Shot 2018-04-09 at 4.24.08 PM
This code is assigned to the transparent sprite.

Screen Shot 2018-04-09 at 4.24.29 PM

The sun sprite's appearance also depends on whether the transparent sprite detects the moon color.

Screen Shot 2018-04-09 at 4.24.40 PM
The bat sprite's appearance is also dependent on the presence of the moon color.
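In plain terms, the three scripts share one piece of state, whether the moon color is visible, and that state drives the overlay, the sun, and the bat. A compact Python restatement of that logic (an illustration only; the actual behavior lives in the Scratch blocks in the screenshots):

```python
def scene_state(moon_detected):
    """Day-to-night logic: when video sensing spots the light-yellow
    'moon', the 50%-opacity blue overlay appears, the sun hides,
    and the bat comes out."""
    if moon_detected:
        return {"overlay_opacity": 0.5, "sun_visible": False, "bat_visible": True}
    return {"overlay_opacity": 0.0, "sun_visible": True, "bat_visible": False}

night = scene_state(True)
day = scene_state(False)
```

Framing it this way also suggests how to extend the scene: any new sprite (an owl, crickets) just reads the same moon-detected state.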

We believe that video sensing can be added to the library of computational tinkering projects. It encourages visual storytelling along the same lines as the work of some of our collaborators, like Becca Rose.
As we think about bringing Scratch video sensing out on the floor, I'm interested in creating inviting set-ups for visitors that encourage a wide variety of possible outcomes.

Where Do We Go From Here?

Sounds. After sharing this prototype during a team share-out, we discussed what visitors (or Tinkering Studio members) might want to add to the set-up. Sounds were mentioned as a must-have, including daytime sounds (chirping birds, feet walking on grass) and nighttime sounds (crickets, owl hoots).

Characters. With more sounds comes the opportunity to add more characters, or sprites in the Scratch vernacular, populating the scene with more animals and environmental details.

Half-baked Ideas

Scenes. A walking sprite could transition between scenes, like Eric Rosenbaum's Big Tape Adventure. Perhaps the Deanna sprite could go on a walking tour of San Francisco or the Exploratorium, or even go on a safari or to outer space.

Collaboration. Inspired by the drawing-style of exquisite corpse, each participant could contribute an object and code what happens when the object is placed on the screen. In a museum setting, we could add to the same project throughout the day and different visitors could contribute to a larger, collaborative story.

Simons Foundation

This project was made possible through the generous support from the Simons Foundation.
