Hold on to your hats and glasses!
Virtual Reality • Augmented Reality • Winter 2016
Hello Everyone! Who moved my cheese?
VEJ Vol. 5 Issue 1
As I look out our VEJ window, I realize the landscape has changed, and so too . . . the view! There are four reasons for my comment: (1) Virtual Reality Headsets (e.g., Oculus Rift, HTC Vive, Samsung Gear VR); (2) Augmented Reality (e.g., Microsoft HoloLens); (3) Low-end Virtual Reality (e.g., Google Cardboard, View-Master); and (4) Other Innovations yet to be released (e.g., Amazon patents). Twenty-one-year-old Palmer Luckey created a Kickstarter campaign with a goal of $250,000. He raised over $400,000 the first day, skyrocketing this technology forward. Two years later, Mark Zuckerberg's Facebook purchased the company for roughly $2 billion. And thus, the race began to create the first consumer virtual reality headset. The Oculus Rift opened for consumer preorders during the first week of January 2016. With the arrival of Windows 10, Microsoft touts an entirely new technology, Windows Holographic. This augmented reality superimposes digital information on real-world objects. The HoloLens headset is a complete computer, unlike the Oculus Rift, which requires a capable desktop computer, likely to set you back at least $1,000. At the extreme low end, Google has created Cardboard, a fold-up template that holds a smartphone like a hand-held mask. Users insert the phone and use a variety of free Cardboard apps to access virtual experiences. For example, the New York Times, ABC News, Yale University Virtual College Trips, and the Seattle Space Needle have jumped onboard by providing virtual experiences. Likewise, with the help of 3D and augmented reality, businesses such as IKEA are using apps to create virtual shopping experiences that allow consumers to place furniture items in their own spaces before purchasing them. As soon as these new products come online, however, some believe they are already obsolete. For example, Amazon is already working on a device that will no longer require a headset or goggles to immerse users in an augmented reality world. Microsoft purchased Minecraft maker Mojang in 2014.
Already Microsoft is demonstrating Minecraft on HoloLens, opening up a realm of amazing experiences for players that is difficult to even imagine. All this begs the question . . . as schools begin to use Minecraft to teach core curricula, 21st century skills, and digital citizenship, how will this transform the traditional brick-and-mortar classroom? How will this impact student learning? Hold on to your hats and glasses . . . this is the wildest ride in the metaverse.
Virtual Education Journal – Winter 2016

In This Issue
• While You Were Sleeping by Bob Vojtek
• VWBPE16 Thinkerer Award Nominations Open
• Bandit Sailboat Racing
• Save The Date – Enchanted @ VSTE
• Augmented Reality by Bob Vojtek
• Virtual Reality Discussions by Kae Novak
• Cardboard, Google & iOS Apps by Bob Vojtek
• A Micro-History of Augmented Reality in Film, Television, and Pop Culture by Bluebarker Lowtide (sl), Vasil A. Giannoutsos (rl)
• 9th Annual Virtual Worlds Best Practices in Education Conference: Woot, y'all by Scott Merrick OH (sl), Scott Merrick (rl)
• Virtual Reality by Bob Vojtek
• The Virtual Pioneers: A Report from the Field by Andrew Wheelock (rl), Spiff Whitfield (sl)
• Introducing Renaissance Gallery by Roxie Neiro (sl), Rosie Vojtek (rl)
• Mobile App Development Using International Collaboration by Bill Schmachtenberg (rl), Dae Miami (sl)
• On Walkabout, Vol. 6: An Archeological Discovery by Cyrus Hush (sl), Matt Poole (rl)
• You Snooze . . . You Lose: Machinima Showcase Submissions Due
• We Love Hangin' With Friends! Pics by Scott Merrick
• Where in the Metaverse is AvaCon Grid?
• EDTECH 532 Part 3 by Chris Haskell, Ed.D.
• Write for VEJ
To Read VEJ online visit: http://www.virtualeducationjournal.com/
We know you will want to devour this issue – so don’t wait, take your first byte! You will discover, like always, VEJ is “Out of this World!”
For more information about ISTE SIGVE/VEN or to join the fun, visit: http://sigve.iste.wikispaces.net/
Keep Smiling ☺ Roxie Neiro (SL), Rosie Vojtek (RL)
Cover by Bob Vojtek (rl), BJ Gearbox (sl)
Follow us on Twitter @VEJournal or #VEJournal
©VEJ is an Edovation Publication
While You Were Sleeping By Bob Vojtek (rl), BJ Gearbox (sl)
Robotics and software capabilities are encroaching into areas that most people would have guessed were “beyond the reach of robots.” As the capability, capacity, and processing speeds continue to increase, some believe that robots and software will take over jobs that were previously considered something only
a human could do. If the predictions are correct, a significant number of jobs will become extinct. This trend will have a huge impact on K-12 and higher education. While we have been debating the best way to get teachers engaged and using technology in the classroom, a whole wave of technological advancements has occurred. It was cute to paraphrase the statement, “it took 20 years to get the overhead projector out of the bowling alley and into the classroom.” The reality is that robots, software, and algorithms in cloud computing are excelling at tasks that have previously been the sole purview of humans. Tasks such as moving boxes, creating reports, and driving cars were always assumed to be the jobs of humans. Robots have difficulty determining the edges of boxes, especially when multiple boxes are askew. They also have difficulty telling the difference between several boxes side by side and one large box. This is about to change, as companies now report that their new robots not only recognize these edges but can organize and move boxes faster than a human. Since education is so focused on incremental change, our current educational paradigm may implode when high school, bachelor's degree, and graduate school graduates become unemployable. Just a few years ago the buzz was, “we need to educate students for the 21st century,” meaning we need to teach for jobs that do not yet exist. What if the reality is that jobs we have been confident would continue become extinct? The skills required for these new jobs can't easily be assessed with high-stakes tests. Students will need to contend with significant ambiguity and the fact that there may be multiple solutions for a given problem. We are living in a world where every multiple-choice question on an assessment puts a student one step closer to unemployability.
Priority school districts that believe using scripted lessons is an answer to closing the achievement gap may find that even their graduating students lack the ability to qualify for these new jobs. This likelihood is not just for high school graduates, but also for those who are graduating from college. It is not simply about graduating from a top school anymore; but rather, how prepared are students for the kind of thinking required by the new technologies? Wired Magazine looked at LinkedIn
data to see where west coast high-tech employees graduated. They wanted to see, “if non-Stanford grads have a chance at Silicon Valley firms.” It turns out that San Jose State is as good a bet as Stanford, at least “if you're just looking at the sheer number of alumni connections to Cupertino. . . . UC-Berkeley, UT-Austin, and Cal Poly-San Luis Obispo also appear to be Apple favorites. . . . As for the Ivy League, not one of the ancient eight makes the list for any of the tech companies under consideration.” Microsoft unwittingly created a technological avalanche. It created a gaming device called Kinect to use with its Xbox gaming console, with the intent of creating a quantum leap in game navigation. The Kinect system “identifies individual players through face recognition and voice recognition. A depth camera, which ‘sees' in 3-D, creates a skeleton image of a player and a motion sensor detects their movements. Speech recognition software allows the system to understand spoken commands and gesture recognition enables the tracking of player movements.” The resulting technology, a mere $130 add-on to an Xbox system, revolutionized robotics. Although it was not the intent, Kinect created a cheap solution for robots to be able to “see,” catapulting robotics technology forward. In agriculture, for example, “robots that can sow seeds in a farm, robots that can remove weeds, robots are used for fertilizing crops and robots that can even pluck fruits from trees and accumulate these at one place —
these were some of the finalist entries at the e-Yantra Robotics Competition 2013,” as reported in the Times of India. Fast food workers are clamoring for a living wage. Does it seem that $15 an hour is unreasonable for the work food service employees perform? I think not. However, just as this debate escalates, a company in California has a robot that can custom-create gourmet hamburgers at the rate of 360 hamburgers per hour. The manufacturer, Momentum Machines, explains that its robot does not call in sick, does not qualify for workers' compensation, and requires no medical benefits. According to the Momentum Machines website, a food service worker working a line currently costs the company $135,000 annually in salary and benefits. Its prediction is that within two years its robotic system will be on par with that figure. The newly revised machine has the capacity to grind three types of meat. For example, you may request one-third bison and two-thirds beef. It grills the meat to your liking before assembly. Then, after the grilling process, your burger is assembled to your specifications. Tomatoes, lettuce, and pickles are all sliced immediately after the grilling. The machine places these selections on top of your burger before adding the ketchup, mustard, and mayonnaise to the top of your bun, giving you the freshest burger possible. McDonald's is preparing to add kiosks to its restaurants, having already deployed them offshore. Chili's Grill and Bar as well as Applebee's are introducing kiosks and mobile ordering. The Wall Street Journal labels this a
“minimum wage backfire.” Under the guise of improving the customer experience for McDonald's, the move is also “a convenient way... to justify a reduction in the chain's global workforce,” claims the Journal. It continued, “The result of their agitation will be more jobs for machines and fewer for the least skilled workers.” It was one thing to program a computer to win at chess. It is another to program a supercomputer to blow away the top Jeopardy contestants. But what if you subscribed to a company that would take all of your raw business data and convert it into a clean, well-written narrative for your end-of-year report? Narrative Science claims to do just that. The Chicago-based company began as a class project. Kristian Hammond and Larry Birnbaum taught a class at Northwestern's Medill School of Journalism, Media, and Integrated Marketing Communications. The students taking the course were a combination of programmers and journalism students. One team of students created prototype software, Stats Monkey. According to Wired Magazine the software was “designed to collect box scores and play-by-play data to spit out credible accounts of college baseball games.” Automated Insights, a competitor to Narrative Science, was used in a test comparing a software-generated game recap with one written by a journalist. In his paper, Enter the Robot Journalist, Christer Clerwall found that “the text written by a journalist is assessed as being more coherent, well written, clear, less boring,
and more pleasant to read. On the other hand, the text generated by software is perceived as more descriptive, more informative, more boring, but also more accurate, trustworthy, and objective.” The result was essentially a tie when all factors were evaluated. Narrative Science, in addition to creating news stories from vast amounts of data, has a product targeting business communications. Quill is the name of its business-centric solution. The Narrative Science website explains, “Quill leverages natural language generation software to produce content which meets your communication goals, business rules, and overarching stylistic preferences, such as tone, style and formatting.” Quill automatically applies natural language to the most relevant information and assembles a narrative intended to be indistinguishable from a human-written one. Tesla Motors has just released a significant software update for existing Tesla Model S automobiles that provides a suite of new and updated features. Autopilot provides several autonomous-like features for the driver, or owner, since some features do not require anyone to be inside the vehicle. The website explains that the Model S “is designed to keep getting better over time. The latest software update, 7.0, allows Model S to use its unique combination of cameras, radar, ultrasonic sensors and data to automatically steer down the highway, change lanes, and adjust speed in response to traffic. Once you've arrived
at your destination, Model S scans for a parking space and parallel parks on your command.” This means Tesla has a near-autonomous driving mode. The company recommends a driver be in the car and ready to take the wheel at any time. The new features include Autosteer (Beta), Auto Lane Change, Automatic Emergency Steering and Side Collision Warning, and Autopark. Autosteer “keeps the car in the current lane and engages Traffic-Aware Cruise Control to maintain the car's speed. Using a variety of measures including steering angle, steering rate and speed to determine the appropriate operation, Autosteer assists the driver on the road, making the driving experience easier. Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.” When Autosteer is enabled, Auto Lane Change allows the driver to simply engage the turn signal, and the car will change lanes when it senses that it is safe to do so. Automatic Emergency Steering and Side Collision Warning alerts the driver if something is detected too close to the side of the vehicle. Autopark, as the name implies, allows the car to parallel park without engagement of the driver. Elon Musk, Tesla's CEO, said the parking feature is a “baby step” toward his eventual goal: letting drivers summon their self-driving, self-charging cars from anywhere using their phones. “I actually think, and I might be slightly optimistic on this, within two years you'll be able to summon your car from across the country. This is the first little step in that direction.” This is important because these updates are for Teslas that are already on the road. Google has been working on an autonomous driving car, but it does not intend to manufacture the automobiles; rather, it intends to license the
technology. Both Google and Tesla believe that fully autonomous cars, what Musk describes as “true autonomous driving where you could literally get in the car, go to sleep and wake up at your destination,” will be available to the public by 2020. Uber has a significant interest in autonomous vehicles. Uber currently keeps a mere 25 percent of each fare, with 75 percent going to the driver. Uber could regain the advantage if it could eliminate the driver. Indeed, Travis Kalanick, Uber's CEO, recently stated that Uber will eventually replace all of its drivers with self-driving cars. A recent Columbia University study suggests that with a mere 9,000 autonomous cars, “Uber could replace every taxi cab in New York City – passengers would wait an average of 36 seconds for a ride that costs about $0.50 per mile.” At this year's Consumer Electronics Show in Las Vegas, General Motors CEO Mary Barra said, “We're going to see more change in the next five to 10 years than we've seen in the last 50.” To that end, Barra announced a $30,000 Bolt that goes 200 miles on a charge and does 0-60 in 7 seconds, beating Tesla to the electric car for the masses. For more about this, read the February 2016 Wired Magazine article. During the same show, Elon Musk stated that in two years Tesla would have the technology ready for fully automated Level 4 self-driving cars. The U.S. Department of Transportation's National Highway Traffic Safety Administration (NHTSA) defines Level 4 as Full Self-Driving Automation: “The vehicle is designed to perform all safety-critical
driving functions and monitor roadway conditions for an entire trip. Such a design anticipates that the driver will provide destination or navigation input, but is not expected to be available for control at any time during the trip. This includes both occupied and unoccupied vehicles.” The elimination of jobs because of automation is a trajectory we are already on. The Gartner Group claims software and robots will replace one third of all workers by 2025. “New digital businesses require less labor; machines will make sense of data faster than humans can,” claims Gartner research director Peter Sondergaard. This time around, the jobs that become extinct will include many high-skilled jobs. This should be a wake-up call for education. Discussions about training students for jobs that don't yet exist don't usually consider that many of these new jobs will offer far fewer positions, or how many existing jobs will become extinct. Teaching students in the same way we have always taught will make it exceedingly difficult for graduates to enter college or the workforce. Independent of transforming education, the future is likely to eliminate a significant number of summer jobs that currently allow high school students to learn the basic values of the workplace. It is time to look at what is truly happening in the 21st century and abandon the perspective, formulated in the 1990s, of what schools needed to teach. The 21st century is not evolving in the way we envisioned. If we do not seriously rethink what and how we teach, our students will be ill-prepared for the future they will inherit.
This article has been edited by permission from “Virtual Reality Magazine” by Bob Vojtek (release date February 2016).
Virtual Worlds Best Practices In Education
VWBPE16 Thinkerer Award Nominations Open
The Thinkerer Award is presented to an individual whose deeds and actions have shown consistent selfless service towards the promotion of learning, community, and educational practices, and who exemplifies the spirit of cooperative development within immersive environments. The award is for lifetime achievement rather than for a single contribution. Recipients of this award are not simply outstanding professionals in their field. Award recipients must demonstrate transformational leadership qualities to:
• envision and guide change;
• enhance the motivation, morale, and performance of both peers and pupils;
• promote best practices and continuous improvement; and
• inspire others through their words and actions.
The Thinkerer Award is presented in honor of Dr. Selby Evans, Professor of Psychology at Texas Christian University in Fort Worth, Texas from 1964-1990, specializing in research design, statistics, and computer applications. Dr. Evans has been an inspiration to many within the field of immersive environments and continues to this day to inspire, guide, and enhance opportunities for change-seeking educators to meet, collaborate, and experiment.
If you know of someone who deserves to be recognized for their contribution to the field of education and immersive environments, we encourage you to submit them as a nominee for the 2016 Thinkerer Award. Nominations are open until February 7th, 2016. Nomination criteria and the submission form can be found on our website at http://vwbpe.org/at-a-glance/thinkerer-award. Final selection of the award recipient will be conducted by the VWBPE Organizational Committee by Sunday, February 21st, 2016, and the recipient will be announced publicly during the conference closing ceremonies on March 12th, 2016.
Watching the Bandit Sailboat Races in the Fruit Islands on January 16, 2016. To join the fun check out Virtual World Sailing.
SAVE THE DATE! Mesdames and Messieurs! It is with deepest pride and greatest pleasure that we welcome you to be Enchanted @ VSTE! The VSTE VE PLN presents its Annual Fundraiser, with a magical space that will be sure to stir your wildest imaginations and grandest fantasies. A night in this mystical castle will leave you breathless. Join us for fine friends, fine dining, and dancing as we celebrate the past year with photos and hourly prize giveaways. Mark your calendars for January 29, 2016, from 5 – 8 PM SLT as we invite you to become Enchanted @ VSTE! http://maps.secondlife.com/secondlife/VSTE%20Island/62/102/22
Augmented Reality By Bob Vojtek (rl), BJ Gearbox (sl)
Augmented Reality (AR), an exploding technology, is gaining converts quickly with its ability to superimpose content over a real environment. Microsoft has added an entirely new application category, Microsoft Holographic, to the Windows 10 operating system. It superimposes holograms onto the real-world environment. Augmented Reality is creating paths that have significant potential for education. Microsoft launched Windows 10 on July 29, 2015, and the application creating significant fanfare is Windows Holographic, which will require HoloLens, a headset computer. It will come to fruition first for developers: the HoloLens will cost approximately $3,000 and should ship in the first quarter of 2016. The HoloLens was a hit at the launch and, like the Kinect technology for the Xbox gaming console, has the potential to be a disruptive technology. HoloLens is a sleek, flashy headset with transparent lenses. You can see the world around you, but suddenly that world is transformed -- with 3D objects floating in mid-air, virtual screens on the wall, and your living room covered in virtual characters running amok. Unlike VR, where you are enclosed in the environment, HoloLens uses a transparent lens, allowing content to be superimposed over the real-world environment. HoloLens maps the environment to know where walls (vertical surfaces) and tables (horizontal surfaces) are located. This invisible map allows the user to “attach” items to walls and tables.
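To get a feel for what that invisible map involves, here is a minimal sketch of the kind of surface classification such spatial mapping implies: label each scanned surface patch as wall-like or table-like by the angle between its normal vector and "up." This is an illustrative toy (the function name, the y-up coordinate frame, and the tolerance are assumptions; the real HoloLens pipeline is far more sophisticated).

```python
import math

# Assumed "up" direction in the headset's world coordinate frame (y-up).
UP = (0.0, 1.0, 0.0)

def classify_surface(normal, tolerance_deg=15.0):
    """Label a surface patch by the angle between its normal and UP.

    A near-vertical normal means a horizontal surface (table, floor);
    a near-horizontal normal means a vertical surface (wall).
    """
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    cos_up = abs(nx * UP[0] + ny * UP[1] + nz * UP[2]) / length
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_up))))
    if angle <= tolerance_deg:
        return "horizontal"   # table- or floor-like: holograms can rest here
    if angle >= 90.0 - tolerance_deg:
        return "vertical"     # wall-like: virtual screens can attach here
    return "slanted"          # neither: skip when anchoring content
```

With surfaces labeled this way, an app can offer walls as anchor points for virtual screens and tables as resting places for 3D models.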
The company is not trying to transport you to a different world, but rather to bring the wonders of a computer directly to the one you're living in. Microsoft is overlaying images and objects onto our living rooms. As a HoloLens wearer, you'll still see the real world in front of you. You can walk around and talk to others without worrying about bumping into walls. The goggles will track your movements, watch your gaze, and transform what you see by blasting light at your eyes (it doesn't hurt). Because the device tracks where you are, you can use hand gestures -- right now it's only a midair click made by raising and lowering your finger -- to interact with the 3D images. These images are opaque and superimposed in front of, or on top of, the mapped version of the real world. So, in the case of one demo, a designer is creating a new gas tank and fairing for a motorcycle. The design appears on top of the real-world motorcycle frame such that the eye can't tell the difference between the real motorcycle and the design elements. The view through the HoloLens makes it appear that the designer is building directly on the real-world motorcycle frame.
Microsoft has partnered with several major corporations to utilize 3D design tools to natively create architectural, mechanical, medical, and entertainment models. Microsoft believes that superimposing digital data on top of real-world content has a multitude of uses. A second Microsoft demonstration superimposes graphics and instruction on a real-world environment, in this case demonstrating how to repair a clogged kitchen sink drain. The view through the HoloLens displays graphical instructions for the user to remove the p-trap to eliminate the clog. The instructions appear directly on top of the real-world drain in the user's home, showing the direction to unscrew the p-trap fittings. Another demonstration video shows a digital augmentation of a Minecraft build. The design appears to be sitting on top of a coffee table in a living room. Imagine how this will help in architectural design. The image could just as easily be a 3D architectural model on a conference room table. The ability to see a building design set in the real-world environment is also a possibility. This technology will give individuals who don't have well-honed visualization skills the ability to see the finished product before the start of construction. Microsoft believes that mixed reality can be used to create new experiences that will contribute to advances in productivity, collaboration, and innovation. To that end, Microsoft held an academic research RFP. It received over 500 qualified proposals. The five winning proposals each received $100,000 and two Microsoft HoloLens development kits. The award recipients were:
• Open-Source Investigations in Mixed Reality: Interactive Art, Visualizations, and Expressive Interfaces on HoloLens, Golan Levin, The Frank-Ratchye STUDIO, Carnegie Mellon University • Augmenting Reality for the Visually Impaired with Microsoft HoloLens, Emily Cooper, Wojciech Jarosz, and Xing-Dong Yang, Dartmouth College • Collaborative Analysis of Large-Scale Mixed Reality Data, Joseph Gabbard and Doug Bowman, Virginia Tech
• Immersive Semi-Autonomous Aerial Command System (ISAACS), Allen Yang, Claire Tomlin, and Shankar Sastry, University of California, Berkeley
• HoloLens Curriculum for Trade-Based Education, Andy Mingo, Tawny Schlieski, Nikki Dunsire, Shelley Midthun, and J Bills, Clackamas Community College and Intel
Let's recap: 500-plus qualified entries, with winning entries from Carnegie Mellon University, Dartmouth College, Virginia Tech (although Doug Bowman has since been recruited by Apple), the University of California, Berkeley, and Clackamas Community College with Intel. What? Clackamas Community College? Way to go, Clackamas! Beyond the need for expensive headsets and displays, there is a new wave of augmented reality that simply needs a smartphone or iPad and something called a marker. A marker is usually a sheet of paper, business card, or catalog page that has a unique set of graphics. It is similar in function to a QR code. The marker must be unique so that the correct model associated with that marker displays through the phone or iPad. The possibilities are amazing. IKEA uses this technology with its print catalog. IKEA explains that, “by scanning selected pages in the printed IKEA catalog or accessing the pages in the digital publications you can view images, films and 360° room sets, and get to know the stories behind the products. You can also place selected furniture in your own room with the help of 3D and Augmented Reality!” The procedure is to put the marker, or in this instance the catalog page, on the floor where you want to place the furniture. Load the application and view the marker on the phone using the camera. The application interprets the marker, and looking through the screen displays a full-size piece of furniture right in the room. The camera feature allows a photograph to be taken with the furniture in place. If there are color options, for example, you can slide your finger across the screen and switch among them. By sliding or twisting your fingers on the screen you can manipulate the furniture piece. Sliding two fingers across the
screen moves the furniture piece in the room, and twisting two fingers on the screen rotates the item. In this way you can “move” the furniture into place in your room to get a sense of size. IKEA has found that 14 percent of its customers end up taking home furniture that turns out to be the wrong size for its intended location.
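Under the hood, marker-based AR of this kind works out where the marker's known square sits in the camera image and uses that mapping to draw the virtual furniture in place. A minimal sketch of that geometry in plain NumPy: estimate a planar homography from the four detected marker corners using the Direct Linear Transform, then project points from the marker's plane into the image. The function names are illustrative, and real apps use computer-vision libraries (such as OpenCV) and full 3D pose estimation rather than a flat homography.

```python
import numpy as np

def homography_from_marker(marker_corners_img, marker_size=1.0):
    """Estimate the 3x3 homography mapping the marker's own plane
    (a square of side marker_size, corners listed counterclockwise
    from the origin) onto its four detected image corners, via DLT."""
    src = np.array([[0, 0], [marker_size, 0],
                    [marker_size, marker_size], [0, marker_size]], float)
    dst = np.asarray(marker_corners_img, float)
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of the 8x9 constraint matrix.
    _, _, vt = np.linalg.svd(np.array(rows))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, point):
    """Map a point on the marker plane into image pixel coordinates."""
    p = H @ np.array([point[0], point[1], 1.0])
    return p[0] / p[2], p[1] / p[2]
```

Once the homography (or full pose) is known, every frame the app re-detects the corners and redraws the furniture model anchored to the marker, which is why the sofa appears to stay put as you walk around it.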
From a slightly different perspective, Intel says your next laptop or tablet may have 3D sensors that let it recognize gestures or augment a real scene with virtual characters. Intel has developed two types of depth sensors. One is designed for use in place of a front-facing webcam, to interpret human movement such as gestures. The other is designed for use on the back of a device, to scan objects as far as four meters away. Both sensors allow a device to capture the color and 3D shape of a scene, making it possible for a computer to recognize gestures or find objects in a room. Amazon, on the other hand, has received patents that focus on gestures to provide input in an augmented reality environment. Amazon has submitted patents for technology it believes will eventually eliminate the headset, displaying content
right in your room. Amazon hopes to render Microsoft's yet-unreleased technology obsolete.
With so many tech giants jumping on the VR and AR bandwagon, it would make sense for Apple to have a presence. Although Apple has not formally announced specifics, according to Piper Jaffray analyst Gene Munster, the company is preparing to enter a whole new space, making acquisitions and poaching employees from other firms to do it. “Based on recent acquisitions of augmented reality companies, hiring of a key Microsoft HoloLens employee, and conversations with industry contacts within the virtual and augmented reality spaces, we believe Apple has a team exploring the AR space,” explained Munster. This article has been edited by permission from “Virtual Reality Magazine” by Bob Vojtek (release date February 2016).
Virtual Reality Discussions!!!!!! By Kae Novak (rl), aka Kavon Zenovka
What better place to have discussions about virtual reality than in a virtual environment!
Every third Monday of the month, the ISTE Mobile Learning Network and the ISTE Games and Simulations Network will host a discussion on virtual reality in Second Life. We'll be discussing emerging technology . . . well, as it emerges.
We'll also talk about what is doable for you as an educator. Is it Oculus Rift, a Samsung mobile device, or Google Cardboard? This technology has so much potential. So, let's start exploring! Take a look at the Flickr site we just started and feel free to join it: https://www.flickr.com/groups/2873206@N25/pool/with/22985571561/ Have your headset and mic handy to join in the discussion with your avatar in Second Life! Register and download the Second Life software here: http://secondlife.com IM Cyndyl Enyo or Kavon Zenovka for a teleport to our location or follow the SLurl here: http://bit.ly/istemln_SL
Cardboard, Google & iOS Apps
By Bob Vojtek (rl), BJ Gearbox (sl)
Google Cardboard is the ultimate low-cost version of VR. It is literally a cardboard fold-up with cheap plastic lenses that encases a smartphone to view stereoscopic images and video. It looks like a hand-held mask, similar to a snorkeling mask. A phone app splits the screen so each eye sees its own view, producing content in 3D. The fold-up templates are available online, so anyone could theoretically create their own Google Cardboard by downloading the template and cutting out the cardboard pieces. This free availability has enabled organizations to print their own branded versions of Cardboard. The easiest way for the average person to get Cardboard is to order a kit from Amazon or one of several manufacturers. Probably the best way to visualize the Cardboard concept is as a 21st century version of the iconic View-Master. In an amusing twist, Mattel has given the View-Master its first facelift in 30 years. Google and Mattel have partnered to update the iconic View-Master for the VR generation. The original View-Master was a small hand-held device that allowed you to insert a disk of stereoscopic photographs to display an image in 3D. So it should not be a surprise that the new View-Master VR, a $29.95 device made of plastic, cradles a smartphone and works the same way as Cardboard.
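The split-screen trick Cardboard-style apps rely on is simple to sketch: render the scene twice, once per eye, into the two halves of the phone screen, with the virtual cameras offset by roughly the distance between human eyes. A toy illustration in Python (the function names and the 64 mm interpupillary default are assumptions for the sketch; real SDKs also correct for lens distortion):

```python
def stereo_viewports(screen_w, screen_h):
    """Split the phone screen into left- and right-eye viewports,
    one half per eye, as Cardboard-style apps do.
    Each viewport is (x, y, width, height) in pixels."""
    half = screen_w // 2
    return (0, 0, half, screen_h), (half, 0, half, screen_h)

def eye_positions(head_pos, ipd=0.064):
    """Offset the virtual camera left and right by half the
    interpupillary distance (~64 mm) so each eye gets a slightly
    different view, which the brain fuses into depth."""
    x, y, z = head_pos
    return (x - ipd / 2, y, z), (x + ipd / 2, y, z)
```

The lenses in the cardboard holder then magnify each half-screen image so the two views fill each eye's field of view.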
Google Cardboard is catching on rather quickly and gaining traction with support from such heavy hitters as ABC News, the New York Times, and Verizon. Even the Space Needle in Seattle sees Cardboard as a way to jettison its 1960s persona and encourage people to have a Space Needle VR experience. Colleges and universities are creating virtual reality tours to enhance their recruiting efforts. Yale University used a dozen Oculus Rift headsets on campus to provide prospective students a virtual tour of the university. Next year (2016) it plans to mail out Cardboard headsets with assembly instructions and a link to the files to view, allowing anyone with a smartphone to take the virtual tour on a Yale-branded viewer. Google-owned YouTube announced that it would support VR video. Additionally, it will make every one of its existing videos viewable through a VR headset. Google has been creating its own content for Google Cardboard and is providing the specifications and software through Jump. Jump is a platform that uses 16 specifically oriented GoPro cameras to capture immersive scenes; its software stitches the content together into a visual surround experience. As the technology matures, the cost of creating engaging immersive content may soon become affordable. Currently Jump is one of the most cost-effective solutions on the market. Or you could go with a Nokia OZO virtual reality camera for $60,000. ABC News is creating VR news stories that utilize Google Cardboard. Its first immersive story is a 360-degree virtual reality experience that transports viewers to the streets of Damascus. The site, http://www.abcnews.com/vr, explains how to download the application and view the inaugural story. ABC News VR intends to bring virtual reality storytelling to the news division and beyond.
“Our inaugural project is a special one, an immersive experience captured while in Syria,” announced ABC News President James Goldston, “from the Damascus Citadel and Souk to the Umayyad Mosque and the National Museum, Alex [Marquardt] transports viewers into the story, providing a depth of reporting -- and a personal guide-- unlike anything we’ve done before.”
So as not to be outdone, The New York Times announced a virtual reality project in collaboration with Google, which included the distribution of more than a million cardboard VR viewers, one for every print subscriber. The New York Times has made a commitment to create 100 VR stories using New York Times-branded viewers. Its inaugural venture is a VR story called “The Displaced,” about children uprooted by war. Dean Baquet, the executive editor of The New York Times, said the magazine had “created the first critical, serious piece of journalism using virtual reality, to shed light on one of the most dire humanitarian crises of our lifetime.” Jake Silverstein, the magazine’s editor, explained, “The power of VR is that it gives the viewer a unique sense of empathic connection to people and events.” It has huge potential, he said, to help bring subscribers news and stories from the most inaccessible places. [Editor note: As we go to press there are currently 8 stories on their website at http://www.nytimes.com/newsgraphics/2015/nytvr/.] Google has a pilot program for education called Expeditions: education-oriented videos recorded in immersive 3D. This pilot project is creating self-contained adventures, in the form of short videos of famous places around the globe. For example, Travel & Leisure announced that its readers can explore Buckingham Palace using Google Cardboard. Google has launched a worldwide tour allowing schools to experience the closest thing to a virtual field trip in the classroom. Google brings Cardboard, smartphones, and a wireless network into the school so that no school resources are required, not even an internet connection. The pilot demo is self-contained. Guides lead students through an immersive adventure to places students may never be able to go. For more information, visit https://www.google.com/edu/expeditions/ .
The teaser for Expeditions states, “Imagine visiting the bottom of the sea or the surface of Mars in an afternoon. With Expeditions, teachers can take their classes on immersive virtual journeys to bring their lessons to life… Expeditions is a virtual reality platform built for the classroom. We worked with teachers and
content partners from around the world to create more than 100 engaging journeys - making it easy to immerse students in entirely new experiences.” The availability of Cardboard-centric content and the affordability of the Cardboard device still leave one element to grapple with… access to the smartphones that drive them. Google suggests that when families in your community consider trading in an old smartphone to buy a new one, they instead donate the phone to the school. The phone does not need a cellular account or use cellular data; rather, it just needs a wireless internet connection to view Cardboard content in 3D.
If schools can obtain the smartphones and use Cardboard in the classroom, we have the possibility of a fundamental leveling of the educational playing field where students that would never have the opportunity to travel could be immersed in virtual travel and have experiences that could closely replicate “being there.”
I remember my first teaching assignment in inner-city Los Angeles… I took students in my scale model club to my beach apartment in Hermosa Beach on the Strand. Three of the dozen junior high students in this group had never been to the beach, had never seen the Pacific Ocean, just miles from their inner-city homes. Imagine the possibilities if they could walk through the streets of Paris with the help of an inexpensive device. While attacking the low end with Google Cardboard, Google is also among a small group that invested more than $500 million in a company called Magic Leap. The South Florida Business Journal is reporting Magic Leap is “close to inking a $1 billion financing round.” Magic Leap has made very little information public about
what it does, but by making a $500 million investment, Google clearly believes Magic Leap is working on something worthy in the world of augmented reality. So why isn’t it “game over”? Google appears to be gaining world dominance in both virtual reality and augmented reality. Google Cardboard is an affordable solution for some aspects of virtual reality, and smartphone apps are bringing usable augmented reality to the masses. The issue is motion-to-photon latency. The difference between a $500 headset attached to a $2,000 computer and Google Cardboard is twofold. First is latency, the time needed for a user’s movement to be fully reflected on the screen. Google recommends that users not strap Google Cardboard to their heads. When the device is attached to the head, head movements tend to happen quite quickly. If the Cardboard has to be held by hand against the face, there is a tendency to twist from the waist, which happens more slowly than twisting at the neck, thereby lowering the effects of latency through slower movements. The second aspect of Cardboard that currently limits the immersive experience is that experiences are somewhat predefined: the user is stationary, with the ability to “look” around but not to walk through a scene. This is in contrast to a fully immersive experience such as the Oculus Rift, where additional inputs may allow you to interact more fully within the experience. An input device could propel you through the experience, such as using the arrow keys to move forward, backward, or sideways. In a simulation with Google Cardboard you would likely be in a fixed position with the ability to turn around to see what is behind you or look up to see the vertical extent of the environment. With more sophisticated immersive experiences, you would likely be able to explore on your own.
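The latency problem described above is usually discussed as a budget: the delay from head movement to updated pixels is the sum of several stages. A back-of-the-envelope sketch follows; every stage timing is an illustrative assumption, not a measurement of any real headset. The commonly cited comfort target for head-mounted VR is about 20 ms or less.

```python
# Rough motion-to-photon latency budget with assumed, illustrative
# stage timings (not measured values for any actual device).

stages_ms = {
    "sensor sampling":  2.0,   # reading the head tracker / IMU
    "sensor fusion":    1.0,   # turning raw samples into a head pose
    "app + render":    11.0,   # one frame of CPU/GPU work at ~90 Hz
    "display scan-out": 5.0,   # pixels actually changing on the panel
}

total = sum(stages_ms.values())
print(f"total motion-to-photon latency: {total:.1f} ms")
for stage, ms in stages_ms.items():
    print(f"  {stage:16s} {ms:4.1f} ms ({ms / total:4.0%})")
```

The point of the sketch is that slower head movement (twisting from the waist, as Google suggests) does not shrink any of these stages; it just makes the fixed delay less perceptible.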
An Oculus Rift demonstration of an Italian villa allows you to explore the building and grounds by using arrow keys to glide through the entire space as you see fit. You can even ascend a staircase by moving forward at the base of the stairs. For examples of other Oculus Rift demos, see VR Circle’s list of top-rated Oculus Rift VR demos.
Still, looking at Cardboard with a cost-benefit lens, there are strengths to being able to see around a setting without moving through the space. Clearly the New York Times and ABC News VR believe that the new opportunities of telling VR stories are worth the investment. The reality is that traditional movies or television shows inhibit our ability to move. We count on the production to “show” us what we need to see rather than determining on our own what is important. The new freedoms of VR storytelling have the potential to be game changers! This article has been edited by permission from “Virtual Reality Magazine” by Bob Vojtek (release date February 2016).
If You Had Fun Last Year . . . You won’t want to miss the VWBPE Social activities this year!
Save The Date!
A Micro-History of Augmented Reality
in Film, Television, and Pop Culture By Bluebarker Lowtide (sl) Vasil A. Giannoutsos (rl)
Contained herein are examples of Augmented Reality featured in movies and television. They say some of the things you see on TV will one day be real. Augmented Reality is one of these all too familiar science fiction ploys or fantasy gimmicks being made real.
One of the more recognizable forms of Augmented Reality (AR) is JARVIS from Marvel’s “Iron Man” and “The Avengers” films. JARVIS is Iron Man’s personal artificial intelligence computer that responds to Tony Stark’s commands and requests. JARVIS takes the form of a heads-up display within the Iron Man helmet, or his sunglasses in the later movies, communicating with him through voice and through detailed graphs and meters of the super suit’s important statistics. As early as the 1960s, Augmented Reality hit television screens with Hanna-Barbera’s “The Jetsons,” with its holographic television and special tabletop consoles that altered the configuration of rooms, furniture, and clothes. Live-action shows such as “Doctor Who,” “Star Wars,” and “Star Trek” featured many graphical displays that were always added in post-production, which the actors couldn’t see but still had to interact with. The famous “Star Trek” Holodeck is a perfect example of virtual reality and augmented reality coming together to help the characters run battle simulations, hold diplomatic conferences, and (even, in Data’s case) jump into books. The next real emergence of AR came in the 80s during the development of
experimental and avant-garde music videos. But two great science fiction films, some of the last before everything went computer-generated, were intensely action-packed. I recommend both if you haven’t seen them yet. “Blade Runner” and “RoboCop” involve high-technology displays on computers and heads-up displays in the cars. Seriously, I am convinced the scene where they are playing back camera footage in “RoboCop” is where they picked up the hand movements for operating touch-screen smartphones. Approaching the 90s, we find a burst of movement in the gaming industry as game developers started to move away from 16-bit arcade-style games and third-person platformers to 3D CG first-person shooters with advanced displays to help you in the art of war. Though not actually technologically available at the time, the concept of the player being the game character, in a commutative-property sort of way, makes those displays Augmented Reality for that game. Without those displays giving us data and changing the environment so we could see around corners, look for clues, or find health packs in No Man’s Land, many of us would not have made it past the first level. Nintendo even did a couple of games that allowed a person to hook up their Game Boy Advance and have it function as an additional map, sensor, or controller that interacted with the environment in the game. This may be considered a peripheral, but the transfer of data, information, and environment interaction is what separates AR from a regular computer mouse or printer. This input and output of data back and forth is more beneficial for the user than just an output or just an input. Having the data and information react in real time is another key feature of AR that many forget. If it is not happening now and altering the way you perceive reality, then it probably isn’t AR.
Now I am not saying that an altered state of reality through chemicals, medicine, or other means is the same thing. Augmented Reality comes mainly in one of three forms: it can be classified as Low-Tech, Mid-Tech, or High-Tech. When we say Low-Tech, it really means little to no technology is
involved to accomplish the effect of performing Augmented Reality. An example would be a role-playing game such as “Dungeons and Dragons,” or pretending that your students are scientists at Jurassic Park identifying dinosaur DNA. Such primitive illusions are an effective, Low-Tech way to perform a simple form of Augmented Reality. These scenarios or adventures change the surroundings and the framing for doing certain things that affect what is actually happening. In D&D, you roll dice to determine actions and movements. Using the dinosaur DNA scenario, students take what they have learned and apply it in a fictional, yet practical, way that such information could be used in the real world if dinosaurs still existed.
Mid-Tech covers the majority of all those headsets, such as Oculus Rift, Google Glass, Google Cardboard, Microsoft HoloLens, and various other technological devices being constructed to project an AR to the user. Even apps can be downloaded to a phone or tablet that, with special markers or cards, make interactive images appear on the device – such as “Letters Alive” by Alive Studios for preschoolers up to the third grade (http://alivestudiosco.com/). This mild interactivity and the ability to choose the inputs and outputs are what separate the Mid-Tech from Low-Tech
categories. With Low-Tech, the user has to provide the output and work from there; in Mid-Tech, both the input and output of information are provided and constructed. The bleeding edge is considered High-Tech – where we might be seeing the dawn of such impressive technologies! With the advancement of Augmented Reality and Virtual Reality technologies, users have the ability to create a whole new experience that encompasses and/or tricks all the senses, incorporating ‘close-to-full-immersion’ with machines that put you, for all intents and purposes, in that reality. The only thing close to this would be “The VOID,” and they are still in the alpha stages with hopes to open a facility in Utah in the summer of this year. Forgive me, but I feel that many biology teachers are screaming at me that such a thing is purely fictional. So allow me to address the dinosaur DNA a bit differently. We could pay for student mini-kits to determine the sex of birds, but we would need to take that out of the budget because it’s expensive. Surely, any teacher in public education understands the woes of budget constraints. So instead of using kits, students can receive a folder of pictures of X and Y chromosomes, and by pairing and counting them we can determine the sex of a bird without even having to get our hands messy collecting a specimen sample. Heaven forbid a bird bit a student and the parents sued the school. It’s much safer to pretend to be ornithologists than to deal with all the red tape. Anyhow, I believe I jumped off topic; where was I? Oh, yes, the turn of the century and the ringing in of the new year and millennium, when everyone thought the Y2K bug would be the end of all computers. Oh, good times! By the 2000s, technology was gaining a heavy spotlight as newer and better cell phones appeared and the technology for computer-generated imagery and video compositing was getting faster and stronger.
Both animated and live-action TV shows and films were aiming towards a veritable bright and shining future. “Men in Black”, “Minority Report”, “A.I.
Artificial Intelligence”, “The Matrix,” and more all satisfied our needs for some science fiction bedlam while showing us fascinating worlds of the good and bad of technological progression. The Matrix trilogy alone was set with the premise that the world we currently live in is an augmented reality projected to humans to hide the apocalyptic reality of a world run by machines, a very deep concept. Even the whole plot of James Cameron’s “Avatar” involves the main character living in an augmented reality as his mind is transported through technology into the body of a Na’vi (the most extreme form of Augmented Reality, in my opinion).
All of these amazing things over the years have given a little spark to all who have seen them. It is amazing to see how films, television, and pop culture can inspire and influence such marvelous things that were once considered works of fiction. Who would have thought that we would see such inventions and technological advancements in our lifetimes? In roughly 50 years we are seeing things from “The Jetsons” and “Star Trek” become reality. Maybe one of these days I will get to see robot maids and flying cars; won’t that be something?
Heck, there is already talk about commercializing space travel and the colonization of Mars and Venus! If “The Martian” can survive on Mars by himself on a movie screen, then just imagine what 50 years down the road will bring. Makes you wonder what kind of things will exist. Well, looking at the films, television, and pop culture of today, there have been some interesting ideas: a pill that makes you smarter; robots with human emotions; and so much more, including trailers for upcoming films and TV series in 2016. We are living in a future that only a few have dreamed of. It is, however, what we dream and do today that will shape our future. I am sure I heard that quote somewhere; at least I am probably paraphrasing Walt Disney. LOL I’ve watched practically all of his clips.
VEJ satellite office at VSTE!
Be sure to visit us at: http://maps.secondlife.com/secondlife/VSTE%20Island/19/169/22
It’s the 9th Annual Virtual Worlds Best Practices in Education, March 9-12, 2016. 2016 Theme: Horizons. The VWBPE Conference is a completely virtual conference conducted using simulated environments. Participants experience the conference through a virtual reality setting including conference rooms, theatres, exposition halls, meeting spaces, and other venues similar to a brick-and-mortar conference. The conference is free to attend. The cost of the conference is covered by sponsorship and donations. Register and see all the latest news from VWBPE!
Woot, y’all.
There’s lots going on at AvaCon Grid! A walking tour of ISTE VEN’s New Virtual Presence in the Metaverse! By Scottmerrick Oh (irl Scott Merrick)
Several months ago, after discovering the Open Simulator project that is the brainchild of longtime pioneers Chris Collins and Joyce Bettencourt, along with their capable team members at AvaCon, LLC, I approached them to ask whether they would be interested in hosting some space on the grid for the ISTE Virtual Environments Network.
They couldn’t have been more gracious. Before I knew it, I was able to teleport to a region--not a parcel, but a full region-sized property--set aside for ISTE VEN. Back in 2011, when ISTE decided to pull out of Second Life, they continued to support our community by providing rent for a parcel on Eduisland 9, where Serena Offcourse (Mary O’Brian, rl) designed and built a really nice headquarters for us. Faced with this new and relatively vast expanse of flat green earth under bright blue sky, we called Serena into service once again. She set about terraforming a rolling land surface and installing some gorgeous landscaping, while I planted a temporary freebie building in one corner to serve as a meeting space, and Mary Howard (irl Mary Howard) jumped in to help as well. We had some fun with that, but everyone was aware that it wouldn’t be enough as we grow. Enter Helena Kiama (Barbara Seaton, rl). Helena offered to create a new headquarters, and of course we said “go for it.” Little did we know how wonderful the final product would be! Let’s take a little walking trip and get the lay of the virtual land, shall we? For this series of pics I logged into AvaCon Grid (instructions for getting a free educators’ account are up at https://www.avacon.org/blog/avacon-grid/ and you will be glad you did).
I’ve landed in AvaCon Grid at the welcome/tutorial area. This is AvaCon’s Community sim landing zone, and to get to our beautiful new HQ within it, I can do one of a number of things:
• Click “World/Teleport to my home” since I’ve set my home to HQ
• Use the OpenSim slurl in the map, or
• Open up the map and double click where I want to go. I think I’ll do the first one:
There. I’m home. Landed. Let’s zoom in and pan around to see me, Scott Merrick (clever alias, eh?).
I look a little dubious, don’t I? My general modus operandi
is to work on my avatars to make them look as much like I do in my “meat space,” my real life, as the viewer will allow. I generally look much more buff and younger, but the general design direction is pretty constant. I don’t have my Second Life ponytail in AvaCon Grid, but I’m content for now. Let’s zoom out to see more of the building:
Walking into the right-side (house left) wing, the visitor comes upon blank screens.
Clicking on each one brings up the html-on-a-prim. They load pretty fast, in 5-7 seconds each for me, and each screen then becomes a fully usable web browser.
From left to right the current configuration displays:
1. The Virtual Environments Network Padlet with all the volunteer mentors listed. You can join this list at http://padlet.com/spiffwhitfield8/venmentor
2. The Virtual Education Journal, presented regularly since 2011! http://virtualeducationjournal.com
3. The Massive Multiplayer Online Calendar, with events from all over the metaverse--http://bit.ly/moocalendar
4. The ISTE VEN Weebly, a wealth of info in a neat and tidy package: http://venetwork.weebly.com/ , and
5. ISTE, the organization that brought us together in Virtual Worlds and kept us together lo this long: http://iste.org
Popping over to sit at the conference table, the first of the many meeting places at ISTE VEN AvaCon. As of this writing, the building has only been in place around two weeks, lowered down from the sky where Helena worked on it. I’m sure that we will continue to add to its contents now that it is in place, but I surely do love the openness and the flow of it just as it is.
The other meeting spaces are settings for conversations between two avatars or many. Let’s take a walk to see those!
We approach a guiding sign. Every 4th Tuesday of each month we present Machinima Night. We’ll go there, with maybe a couple of side trips on the way.
Here’s our first little side-trip. We’ll follow the paving stones to a cozy little spot for two to four (more if they sit on the ground) avatars.
I love me a little fireplace. I have my sun settings at mid-day, but a few clicks can make this a very cozy little nighttime meeting place.
Back on the path and down it, there’s another sign.
But, let’s go straight instead and see what this inviting-looking space past the bridge could be. Ah, it’s a party space. It may need a dance floor and some groovy colored lights. Note to self.
Next time I want popcorn, a hot dawg, or ice cream, I know where to come, though. We’re back along the trail, past that “machinima” sign, and what will we see? Ah, the movie area! I’ll have a seat. VERY comfy.
Panning around to the screen:
and zooming in, it looks like a Guild meet-up in World of Warcraft. Holiday in theme!
This will definitely benefit, on movie nights, from setting the environment sun settings to midnight. But for now, let’s make a quick run around to see other meeting areas.
Up those stone stairs…
The Adirondack chairs, familiar seating from ISTE VEN in Second Life. Do we need our open campfire here?
Here’s a treehouse space for a one-on-one chat.
And a lovely pagoda with turquoise Adirondacks and carpet.
Feeling nautical? It’s only stationary, docked at the edge of a lovely pond, so the trip won’t be long, but it’s a fun place to chat. I’m thinking midnight for this one, too, ya think?
A beautiful and peaceful waterfall with cozy seating, conducive to reflective discussion and quiet dialogue. Come and visit. Once you explore ISTE VEN in AvaCon Grid, don’t be a stranger. We hold Open Office Hours the second and third Tuesdays in Second Life or in AvaCon Grid, as the whimsy, or the topic, takes us. Then, as mentioned, the 4th Tuesday of each month our own Gridjumper hosts Machinima Night. All of these events occur at 5 pm SLT/Pacific Time (8 pm ET). Come hang out with us and talk about ways we can leverage the power of virtual worlds for education and community!
Thanks so much to leadership team members Helena, Serena, and Mary Howard for their contributions to this enchanting and at the same time utilitarian build. Of course, it could not have even been begun, much less realized, without
the immense generosity of AvaCon, LLC. And finally, and always, thanks to Roxie and BJ (Rosie and Bob) for the long and continuing story of our work, alive and documented, in Virtual Education Journal! Additional Links: Join ISTE and the Virtual Environments Network: http://www.iste.org/lead/become-a-member If you already have an OpenSimulator account that is Hypergrid enabled, you can Hypergrid jump to AvaCon Grid using http://grid.avacon.org:8002 . Visit ISTE VEN in Second Life: http://maps.secondlife.com/secondlife/EduIsland%209/46/90/22 Life(s) is wonderful!
Virtual Reality By Robert Vojtek (rl) BJ Gearbox (sl)
After many false starts, Virtual Reality (VR) now has promise with the backing of some significant players, including Facebook, Sony, Samsung, and HTC. In essence, VR immerses the user in a 3D world with stereoscopic “glasses.” Google is attacking the VR possibilities on multiple fronts. On the low end is Google Cardboard, a cardboard fold-up that uses a smartphone to display stereo images. On the high end, Google has invested $542 million in Magic Leap. Magic Leap intends to take virtual reality to a level beyond the current projections of Oculus Rift. The Oculus Rift is a headset that looks much like a diving mask with a split-screen computer monitor inside. Immersive Virtual Reality has been around for decades. The reason for a spike in enthusiasm has much to do with faster, cheaper processing, which creates significant gains in quality at a lower cost. The basic elements of these products include a twin set of monitors, one for each eye, inserted into a goggle or helmet; an input device such as a gaming controller, keyboard, or other hand-held input device; and a sensor system to determine the location and orientation of the headset (which way you are looking). One other element that all of these systems have in common is that the immersion is created by shutting out the outside world. The helmet or goggles completely eliminate the ability to see anything other than what is being displayed on the monitors in front of each eye.
There are several companies vying to be THE solution. The most notable, discounting Google Cardboard, is probably the Oculus Rift. The Kickstarter campaign for Oculus VR set a goal of raising $250,000 and surpassed that by raising more than $400,000 on the first day. This feat catapulted Oculus forward, and within two years Facebook bought the company for $2 billion. The social network paid $400 million in cash plus 23,100,000 Facebook shares for the company, with a further $300 million in incentives if it hits certain milestones in the future. This started a veritable arms race to create the first consumer virtual reality headset. Although Oculus may be the most recognized headset, it is not the only player by far. Other major players include Sony PlayStation VR, HTC Vive Pre, Samsung Gear VR, FOVE VR, Zeiss VR One, Avegant Glyph, Razer OSVR, and the Freefly VR headset. A field this crowded ensures faster refinement. Like the VR of the 1980s, early versions of the Oculus Rift Developer’s Kit created a user experience referred to as the screen-door effect: the resolution and refresh rate were adequate for developing content, with the lenses and monitors to be refined separately. Apple refers to its displays as Retina, the idea being that the pixels are so close together that the brain no longer sees individual pixels. When Steve Jobs launched the iPhone 4, and with it the first Retina display, he described it as having a screen with so many pixels packed closely together that they were imperceptible to the human eye at a distance of twelve inches. He
went to great lengths to explain that, because the iPhone 4’s screen packed in 300 pixels per inch, most people wouldn’t see them at all when the phone was a foot from their eyes.
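Jobs’s 300-pixels-per-inch claim can be sanity-checked with a little trigonometry, assuming the standard figure of roughly one arcminute of visual angle for 20/20 acuity (an assumption, not anything from Apple’s marketing).

```python
import math

# Rough check of the "Retina" claim: a 20/20 eye resolves about one
# arcminute of visual angle, so pixels packed tighter than that
# threshold at a given viewing distance blend together.

def retina_ppi(distance_in):
    """Pixels per inch at which one pixel spans one arcminute of view."""
    one_arcmin = math.radians(1 / 60)            # 1 arcminute in radians
    pixel_pitch = distance_in * math.tan(one_arcmin)  # inches per pixel
    return 1 / pixel_pitch

threshold = retina_ppi(12)    # the iPhone 4's quoted 12-inch distance
print(f"~{threshold:.0f} ppi is the 1-arcminute limit at 12 inches")
```

The threshold works out to roughly 287 ppi, so the iPhone 4’s 300 ppi does clear the 20/20 limit at a foot, which is exactly the argument Jobs was making.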
The Oculus Rift has two displays, and even a high-resolution display, once divided in half, often has pixels that are detectable by the human eye. Oculus is striving to have the best possible display within the price point of its retail product. Some companies are now thinking that having the display in front of the eyes in a headset will be a short-lived solution. Magic Leap purports, based upon patent information, to beam content directly into your eyes so that the brain cannot differentiate the digital from the real.
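The halved-display problem can be illustrated with pixels per degree: each eye gets half the panel, and the lenses stretch that half across a wide field of view. The panel width and field of view below are assumed round numbers for illustration, not the specifications of any particular headset.

```python
# Why a phone-class panel shows a "screen door" inside a headset:
# each eye sees half the panel magnified over a wide field of view.
# Panel width and FOV are illustrative assumptions.

PANEL_W_PX = 2160    # assumed total panel width shared by both eyes
FOV_DEG = 100        # assumed horizontal field of view per eye

per_eye_px = PANEL_W_PX // 2
pixels_per_degree = per_eye_px / FOV_DEG
print(f"{pixels_per_degree:.1f} pixels per degree per eye")
```

At roughly 11 pixels per degree, versus the ~60 pixels per degree a 20/20 eye can resolve, individual pixels and the dark gaps between them remain visible, which is the screen-door effect described above.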
Virtual reality developers are primarily looking at three uses in the short term: gaming, simulations, and movies. Movies have transitioned significantly since the early days without sound. Initially movie cameras were stationary, but soon, at the director’s discretion, they panned from side to side. Once the cameras started moving, the ability of the movie to tell a story changed significantly. Today we benefit from 3D movies, where the viewer can see a depth of image that was not possible with traditional 2D movies. Additionally, we now have surround sound, where the viewer can hear the direction of a sound, making the experience much more engaging. The 1996 movie Dragonheart, which featured Sean Connery as the voice of Draco the dragon, is still used as an example to demonstrate surround sound. As the Fernby Films website explains, “Draco’s wing beats and deep voice throb from every channel as he flies about the sky; it was the moment during which I realized just how surround sound could – and should – be used.” This added auditory layer can give an incredible sense of space and is sometimes referred to as 3D audio hacking your brain.
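One of the cues that makes “3D audio” feel like it is hacking your brain is the interaural time difference: the tiny delay between a sound reaching the near ear and the far ear. A minimal sketch using Woodworth’s spherical-head approximation follows; the head radius is an assumed average, not a measured value.

```python
import math

# Interaural time difference (ITD) for a distant sound source, using
# Woodworth's spherical-head approximation. Head radius is an assumed
# average; 0 degrees azimuth means the source is straight ahead.

HEAD_RADIUS_M = 0.0875    # assumed average head radius, ~8.75 cm
SPEED_OF_SOUND = 343.0    # m/s in air at roughly 20 °C

def itd_seconds(azimuth_deg):
    """Delay between the two ears for a far-away source at this azimuth."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

print(f"{itd_seconds(90) * 1e6:.0f} microseconds for a source at 90 degrees")
```

A source directly to one side produces a delay of roughly two-thirds of a millisecond; the brain turns differences that small into a vivid sense of direction, which is what spatialized VR audio reproduces per-ear in headphones.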
With surround sound capabilities added to an immersive visual experience, it is no wonder that Oculus has created a new creative team called Oculus Story Studio. The team is tasked with utilizing VR to create a film experience unlike any other: one that completely immerses the viewer in interactive films with its revolutionary technology. The utilization of VR in movies will create a complex new way to tell stories and view content. Moving freely through a digital environment is very different from watching characters framed in a scene. When we watch a movie or television show, the director dictates what we see and when. You hear a sound, and the scene zooms in on what makes the sound. With VR, the viewer may decide to meander around the scene, choosing what is important and what to concentrate on. When hyperlinks were first introduced, many felt that people would no longer be able to have a linear experience with text: having links embedded in the story that were previously notes at the end would leave the reader disoriented. Allowing the viewer to move through an environment probably makes more sense for a football game than for an adventure movie, but one has to admit the possibilities are intriguing. Gaming was the first focus of virtual reality, overlapping with simulations. The military has been utilizing 3D virtual simulations for years. Army researchers at the U.S. Army Edgewood Chemical Biological Center are using 3D software solutions to supplement more traditional training and simulate real-life scenarios without the danger in the field. Over the last two years, as the developer’s kit has been made available, a multitude of games have been created for the Oculus Rift; the website https://share.oculus.com/ lists pages and pages of games that are already available. This article has been edited by permission from “Virtual Reality Magazine” by Bob Vojtek (release date February 2016).
The Virtual Pioneers A Report from the Field By Andrew Wheelock (rl); Spiff Whitfield (sl)
Virtual Pioneers Gala
Introduction The Virtual Pioneers have continued their educational adventures and hijinks in 2015. Started in 2006, the Virtual Pioneers have held bi-weekly tours and events exploring history and culture, and this year we continued our explorations. In past issues I have reviewed the sims we visited, and I have created a quick reference link for those who would like to explore them. This time I would instead like to offer my view of the future of history and culture explorations using virtual environments.
History Education and Virtual Environments I’ve always felt that students don’t appreciate history and its important implications until they can visualize it. Historical biographies (e.g., Anne Frank’s The Diary of a Young Girl) and historical fiction (e.g., My Brother Sam Is Dead) have been our traditional pathways to this understanding. Next came film and video. John Huston’s The Red Badge of Courage was the first movie that stirred the deep longing for history in my young mind. And here we are… at the next frontier: video games and virtual environments. As I have found through my work with virtual environments, our students are thirsting for this next frontier in learning. They have been learning about history tangentially through gaming for the past decade or more. Civilization, Medal of Honor, and Assassin’s Creed are just a few of the high-powered video games with powerful historical narratives embedded in them.
Houston, We Have a Problem The only problem we have now in social studies education using virtual environments and gaming is that these technologies are not being embraced by a large majority of educators. It’s a BIG problem, but an understandable one, for two reasons: 1. Educators have been bombarded with heavy-handed reform measures that have kept them scrambling to meet the demands of curriculum, assessment, and evaluation procedures rather than exploring innovative ways to work with students. 2. Technologies like virtual environments and gaming are beyond many teachers’ technology comfort zones. These two issues have largely kept virtual environments and gaming out of the schools. However, “the powerful play goes on,” and teachers are starting to realize the potential that awaits . . . over the rainbow. One of the major factors in this is the explosion of Minecraft. Parents are finally getting the message from their children that real, substantial learning is going on in a virtual environment.
And while I am a fan of Minecraft, I see the clunky builds that students create and think... OpenSim, Unity, and Second Life building tools can push the boundaries of student learning much further (see my article with Mary Howard). So let’s bring this back to the Virtual Pioneers. Last year’s tours have shown that Second Life and other graphically intense programs can offer amazing ways to teach students history and culture. With the possibility of Oculus Rift on the scene, we may have hit a motherlode of possibility. Our tour of Luxembourg 1867 and the ever popular 1920 Berlin Project are leading the way in highly detailed, historically accurate, and visually stunning builds that can create an amazing learning adventure. Please join us as we begin our next season of highly educational and entertaining tours and events. Education and possibility await!
1920 Berlin Project Tour
Luxembourg Tour (above).
Buckingham Palace Tour (above and to the right)
Introducing Renaissance Gallery By Roxie Neiro
On Saturday, January 16, 2016, I attended a Renaissance Gallery event at the Chilbo art museum. The Renaissance Gallery is owned by Rachel Corleone. She is standing on the stage in the previous picture with pianist Arisia Vita.
It was a lovely afternoon performance. Attendees were able to enjoy the music as we strolled through the gallery looking at the amazing collection of paintings from the Renaissance period. Rachel told me that she “opened [the museum] to show paintings that I really like, and that were from the Renaissance period. They illustrate both pagan and sacred themes.” Maggie Lamore, an attendee, said “I'm so excited she's doing this I hope she does more; the gallery is reintroducing itself to the neighborhood.”
Rachel said that the “paintings are on permanent exhibit.” However, in another part of the museum, The Chilbo Road Press, there is currently an exhibit of photographs by Bill Brandt (1904-1983). Rachel said that the photo exhibits will change from time to time. Be sure to check out the Renaissance Gallery at Chilbo. Rachel gives personal gallery tours to explain the different pieces of art in her collection. If you are interested, IM Rachel (and tell her VEJ sent you!). Thank you, Rachel and Ari, for a beautiful Saturday afternoon!
Mobile App Development Using International Collaboration By Bill Schmachtenberg (Dae Miami sl)
INTRODUCTION On the news, it is frequently reported that politicians are worried that students in the USA are simply not keeping up in terms of their education, and that as a result we will not be able to compete in the future with our European or Asian counterparts. For me, these discussions miss the point about true twenty-first century skills. It is not about competition, but rather about online collaboration. For several years now, I have worked by myself to crank out apps for Apple’s App Store as an Apple developer. At the beginning of July, I felt that I needed to practice what I preach and try to create an app with Marianne Hellberg, a teacher and app developer in Sweden, and she agreed. I found that there were advantages and disadvantages to working with another teacher on an app development project. ADVANTAGES AND DISADVANTAGES OF COLLABORATION I had always been a fan of the Learning Company and Broderbund series of interactive books, so Marianne and I chose to develop an interactive book for our app. We found a forest model that had already been built on the Unity Asset Store, so we purchased that asset to act as the backdrop for the app. We decided to call the app 3D Forest Explorer (see picture below). I also suggested some other assets for the app, and Marianne got Bo “Dixie” Albinsson to create the
music for the app. Combining resources really helped improve the app and reduce the cost of developing it. Marianne also provided technical expertise. My earlier apps were first-person educational games, but Marianne showed me how we could introduce avatars into the app, and even how to let the user choose a boy or girl avatar. She also improved the app’s navigation system. When it came time to create the book for the story, I was thinking of using Unity’s legacy GUI code since I had used it before, but Marianne convinced me to go with the new UI introduced in Unity 4.6. This UI allows all the graphics, including the book and its pages, to scale automatically to the screen resolution of the user’s device without a lot of coding. She even showed me how to add animations and sounds to the app, and how to add animated NPCs (non-player characters). Another big advantage of developing an app with another teacher was being able to share the project files and exploit time zone differences. This allowed work to continue even when one of us was busy or sleeping! My biggest contribution to the app was proofreading the story in the book and fixing grammatical errors such as punctuation. Even though Marianne speaks fluent English, my writing was better than hers since English is my primary language. We also agreed that my company could better market the app on the App Store, while she would market it on Google Play and Amazon. By collaborating, we could get better distribution for the app than either of us could achieve individually.
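The automatic scaling described above comes from Unity 4.6's new UI system (the CanvasScaler component in its "Scale With Screen Size" mode). As a rough illustration of the idea, here is a sketch in Python rather than Unity's C#; the logarithmic blending follows Unity's documented behavior, but the function name and the reference resolutions are invented for the example:

```python
import math

def ui_scale_factor(screen_w, screen_h, ref_w, ref_h, match=0.5):
    """Blend width-based and height-based scaling logarithmically,
    as a scale-with-screen-size canvas does.  match=0 scales purely
    by width, match=1 purely by height."""
    log_w = math.log2(screen_w / ref_w)
    log_h = math.log2(screen_h / ref_h)
    return 2 ** (log_w * (1 - match) + log_h * match)

# A UI authored at 800x600 shown on a 1600x1200 screen doubles in size,
# so the book and its pages stay proportioned on any device.
print(ui_scale_factor(1600, 1200, 800, 600))  # 2.0
```

Every UI element is multiplied by this one factor, which is why no per-device layout code is needed.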
3D Forest Explorer is an interactive book adventure produced as a result of international collaboration.
The downside of collaboration was that each of us had to give up a little creative freedom; compromise was essential. For example, I had a great animated dinosaur that I wanted in the app, but Marianne felt it did not mesh with the app’s cartoony feel. So I removed the dinosaur, vowing to make him appear in a later app. Marianne wanted the player to explore the app freely with little direction, but I felt that putting in paths provided structure to the story. After all, most books are linear in the way the reader follows the story line. I got my paths, but we did put a few objects off the paths for the player to find. We also had to agree early on about the perspective from which the player experiences the app. I wanted the story written from a first-person perspective, but Marianne wanted it
from the viewpoint of the Mage, a magician in the story. We eventually went with the Mage (see picture below). We also had to agree on some technical details. I wanted simple clicks on objects, but Marianne wanted raycast-based interaction. Raycast interaction requires the avatar to be facing the object it is interacting with and to be close to it; my simple clicks did not. Marianne made the argument that in real life you cannot touch an object 20 feet away, so we went with raycasting. Normally it takes me about a month to develop an app, and because we had to negotiate the story line, it took about the same amount of time to come up with 3D Forest Explorer.
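The facing-and-distance rule described above is simple to sketch. Here is a minimal illustration in Python rather than Unity's C#, with the function name, distance, and angle thresholds invented for the example, of gating an interaction on the avatar being near an object and roughly facing it:

```python
import math

def can_interact(avatar_pos, facing_deg, obj_pos,
                 max_dist=3.0, max_angle_deg=45.0):
    """True if the object is within reach and inside the avatar's
    forward-facing cone -- the rule raycast-style interaction enforces."""
    dx = obj_pos[0] - avatar_pos[0]
    dy = obj_pos[1] - avatar_pos[1]
    if math.hypot(dx, dy) > max_dist:
        return False  # "you cannot touch an object 20 feet away"
    angle_to_obj = math.degrees(math.atan2(dy, dx))
    diff = abs((angle_to_obj - facing_deg + 180) % 360 - 180)
    return diff <= max_angle_deg

# Facing east (0 degrees): a nearby object ahead is reachable,
# but the same object is out of reach when the avatar faces away.
print(can_interact((0, 0), 0, (2, 0)))    # True
print(can_interact((0, 0), 180, (2, 0)))  # False
```

In Unity itself, the engine's raycast facilities would do this test against actual geometry; the sketch just shows why the approach feels more physical than a screen-anywhere click.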
The Mage appears to help the player in the story.
RESOLVING CULTURAL DIFFERENCES Another issue that arose while we were developing the app involved cultural differences. In one part of the app, the player collects wood and starts a fire with matches. Toward the end, I was concerned that since we were pitching this app to elementary school students, we might be encouraging young children to play with matches and fire. I considered removing the fire scene, or putting liberal disclaimers at the beginning of the app and in the app description on the App Store. Marianne told me that in Sweden, young children are taught how to start fires with matches at an early age, and she felt that such disclaimers were not necessary. To resolve this, I discussed the issue with our very own RoseAnne Vojtek (aka Roxie Neiro sl, an elementary principal in rl), who reminded me that fire is in Minecraft and has not been a problem. So the fire scene was left in. Our app was submitted to Apple at the end of July and was approved in the first week of August.
You can try it by searching for 3D Forest Explorer on the App Store, or by clicking this link for iPads: https://itunes.apple.com/us/app/3d-forest-explorer/id1024690730?mt=8
For iPhone users, click this link: https://itunes.apple.com/us/app/3d-forest-explorer/id1024690730?mt=8
3D Forest Explorer
On Walkabout by Cyrus Hush (aka Matt Poole) Volume 6 -- An Archeological Discovery! We are far from finished with our treks across all of the original Linden continents, but the exciting thing about exploring is stumbling across something quite unexpected that requires a detour. Such is the case with our next destination -- stumbled upon quite by accident, although some of you will undoubtedly recognize it! Lying to the south and east of the proper continents of Nautilus and Satori is a strip of contiguous sims. Too small for a proper continent, and not on any of the old continent maps anyway, this strip of land lacks the uniform style and theme to be one of the many McRental communities that dot the seas... so what is it?
To answer this knotty question, let's take a look at an old Linden Map... a map of the grid, but not a map of the Second Life grid!
That's right... this is a map of the Teen Grid! We have found the lost Teen Grid, or what's left of it! Once upon a time there was a separately-maintained Second Life grid for people aged 13-17 that had precise safety controls in place and was used quite extensively by educators --The Teen Grid! TG was opened on February 14, 2005, and was unfortunately shut down as a separate entity on December 31, 2010 because of what Linden Labs described as confusion with development and improvement for two different systems. (http://en.wikipedia.org/wiki/Teen_Second_Life)
However, the "physical" Teen Grid mainland was ultimately imported into Second Life, and it now exists as a virtual historical monument off the eastern coast of Satori. Let's check it out! We make landfall at the far northern tip. Hyperion is a commercial/residential area much like any mainland sim, but with a more cohesive structure -- reminiscent of a planned community. As a matter of fact, it reminds you a lot of the northwestern Sansara colony of Bay City, which, as it turns out, was also once part of the Teen Grid and was in fact directly attached to Hyperion! There are plenty of avatars about, but they do not appear to be either teenagers or educators. We move on... As we work our way south, from time to time we pass what appear to be disused welcome centers. Several sims appear to be named for English towns. Brighton and Camden sport some impressive architecture and a quaint little cafe.
http://maps.secondlife.com/secondlife/Camden/199/83/25
http://maps.secondlife.com/secondlife/Brighton/177/123/26 Opera Sim boasts an impressive bridge, among other things. This sim was primarily occupied by The United Federation of Sims when it was part of the Teen Grid (from http://secondlife.wikia.com/wiki/List_of_Teen_Grid_Regions). UFS was/is an experiment with virtual government in Second Life.
http://maps.secondlife.com/secondlife/Opera/94/181/40
http://maps.secondlife.com/secondlife/Dougall/81/159/55
Next stop is Dougall Sim, described as "the public stage of Teen Second Life, where any member can hold a free event without reservation or having to own the land" (from http://secondlife.wikia.com/wiki/List_of_Teen_Grid_Regions). At the southeastern end the land changes to white in a group of sims that evoke the Winterlands of Sansara. Here we come across a ski resort!
http://maps.secondlife.com/secondlife/Sierra/199/181/87
Of course we have barely scratched the surface of exploring even the ghost of the Teen Grid, but those of you who remember or used the Teen Grid in its heyday might find this an interesting field trip. You might even find some useful resources still in existence! As always, thank you for your company, and I'll see you on our next expedition! Your friend, Cyrus
*********************************************
VWBPE 2016 Machinima Showcase Submissions Due January 31, 2016 Submission Information at:
http://vwbpe.org/presenters/guidelines/machinima
Please note this is NOT a contest. The purpose of this event is to share, enjoy, and learn from each other. We are proud to be able to recognize the effort and talents of educators and students, and we appreciate the opportunity to view cutting-edge projects and films. Thank you for submitting your machinima to VWBPE 2016.
We Hangin’ With Friends!
ISTE SL Headquarters - Office Hours January 12, 2016. Picture by Scott Merrick.
Where in the Metaverse is AvaCon Grid?
Be sure to check out the new ISTE VEN Headquarters on the AvaCon Grid! This beautiful building was built by Helena Kiama (sl), Barbara Seaton (rl). For more information about the AvaCon Grid, visit https://www.avacon.org/blog/ and register for your free account at https://www.avacon.org/blog/ The Virtual Worlds Best Practices in Education conference (VWBPE16) will hold events on the AvaCon Grid March 9-12, 2016. Looking forward to seeing you at ISTE Headquarters and VWBPE16. For more information about the conference, visit www.vwbpe.org.
EDTECH 532
Educational Games & Simulations
Part 3
Chris Haskell, Ed.D. Clinical Assistant Professor Dept. of Educational Technology Boise State University
SL: Dalai Haskell | Skype: dalaihaskell | AIM: haskellboise
When last we left Episode Two of Boise State University’s EDTECH 532 Educational Games and Simulations course [see April 2015 VEJ, pages 30 – 34 & June 2015 VEJ, pages 102 – 104], a mysterious green orb had turned Rapha into an alien.
VEJ is excited to present Episode Three in this issue in serial form. You will be viewing the actual screenshots and the dialog captured from chat in their Second Life classroom. This episode concludes the 16-week course in splendid detail. It does, however, end with “to be continued,” so stay tuned for more!
ENJOY!
Write for VEJ We are especially interested in how educators are using virtual environments and/or digital game environments to motivate students from early education through college. What’s working? Not working? What are the best practices when using virtual environments and/or digital game environments as teaching tools? How can teachers and administrators harness the power of these environments to motivate students? How can these environments be used to promote local and state standards and curricula, including the Common Core standards? How are virtual and game environments/worlds being used in teacher prep programs? How can we best promote the effective use of virtual and digital game environments to enhance and engage teachers and students in quality learning experiences?

Please submit articles of approximately 500 – 1800 words (in Cambria 14pt). Please include pictures and graphics with the highest resolution possible (png, tiff, or jpeg) to: rvojtek@edovation.com. Be sure to put “VEJ” in the subject line. If you have questions, email rvojtek@edovation.com or give Roxie Neiro (sl) a notecard in Second Life.

You can find more information and see previous issues of VEJ at www.virtualeducationjournal.com. Be sure to visit us at http://maps.secondlife.com/secondlife/EduIsland%209/21/39/22 in Second Life, our website at http://virtualeducationjournal.com/, and follow us on Twitter @VEJournal and #VEJournal. To learn more about ISTE SIGVE events, visit http://sigve.weebly.com/.