Since I began working with Science and Math teachers this summer using iPads for technology integration, I have been hearing about this micro-photography technique.  Well, I was finally able to track down the video and the info on where to buy the pieces to make it happen.  Here’s the link to the lighted jeweler’s microscope on Amazon.

I plan on doing this with my iPad2 within the next couple of weeks.  I’ll let you know how it goes and how it works.

Anybody out there tried this with your iPad or iPhone?


Today, through the Zite app on my iPad, I learned about QR Voice from Marcel Duran, a tool that turns QR codes into spoken messages. Here’s how it works:

QR voice encodes a given text message into QR code that once scanned by a QR scanner smartphone application reproduces the message with a synthesized voice.

Currently, the message is limited to 100 characters or less, but I’m betting that will be extended to at least Twitter-speed (i.e., 140 characters). The FAQ page recommends the following iOS apps for QR code readers.

For iPhone, free tested apps

I’ve been playing with a QR code for use with early childhood learners, such as preschool or kindergarten learners who can’t read yet. I had been recording some audio with the web app iPadio, but its playback on iOS devices has been inconsistent since it relies on Adobe Flash. So, here’s a code I made using QR Voice that played just fine from my iPad inside my current QR code reader of choice, Qrafter. See if it works for you. I did still have to press “Go to URL” in the app in order for it to “read” the QR code aloud. I’m interested in hearing how yours might work, though.
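Under the hood, a QR Voice code is just a QR code whose payload is a URL pointing at a text-to-speech service. Here’s a minimal Python sketch of building such a payload and enforcing the 100-character limit; note that the host and parameter names below are my own illustration, not QR Voice’s actual URL scheme.

```python
from urllib.parse import quote

MAX_CHARS = 100  # QR Voice's current per-message limit

def tts_payload(message: str, lang: str = "en") -> str:
    """Build a text-to-speech URL to embed in a QR code.

    The endpoint and parameter names are hypothetical; QR Voice's
    real URL scheme may differ.
    """
    if len(message) > MAX_CHARS:
        raise ValueError(f"message exceeds {MAX_CHARS} characters")
    # Percent-encode the message so it survives as a URL query value.
    return f"http://tts.example.com/speak?tl={lang}&q={quote(message)}"

# Any QR generator can then turn this URL into a scannable code.
print(tts_payload("Welcome to our classroom!"))
```

Once you have the URL, any QR generator (or a library like python-qrcode) can render it as an image for your slide or handout.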

I really like the potential for this kind of application. It could be really good in early childhood classrooms, letting students have some elements read aloud to them. It could also be a good tool for visually impaired students in combination with other assistive tools. The novelty for teachers and students certainly can’t be denied either. Having the QR code results read to you is pretty cool. Some other fun, novel ideas the FAQ page suggests are:

  • Leave voice messages in holidays/greetings cards
  • Attach printed qr-voices next to sculptures/arts describing what it is, useful for accessibility
  • Send funny messages through e-mail by inserting the generated qr-code as image attachment

Here’s a slide with a QR code that I often use to demonstrate self-checking for students. Students can calculate the division then scan the QR code to see if they got the right answer.

I’m piggy-backing on this same idea with this QR code below. The answer now gets read to you. How awesome is that?
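For the curious, here’s a small sketch of how prepping one of these spoken answers might look: format the result of the division as a short sentence and make sure it fits under QR Voice’s 100-character cap before generating the code. The phrasing here is just an example, not what my actual slide uses.

```python
def spoken_answer(dividend: int, divisor: int) -> str:
    """Format a division answer as a short message for a QR Voice code."""
    quotient, remainder = divmod(dividend, divisor)
    message = f"{dividend} divided by {divisor} is {quotient}"
    if remainder:
        message += f" remainder {remainder}"
    # Stay within QR Voice's 100-character message limit.
    if len(message) > 100:
        raise ValueError("message too long for QR Voice")
    return message

print(spoken_answer(144, 12))  # → 144 divided by 12 is 12
```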

If you try this out with your Android phone or Blackberry or other apps, please let me know what works for you and what didn’t. I’m really interested in finding out which QR code readers are working for folks and which ones are problematic. Also, if you try out this technique with students, let me know how it goes, too.


My colleague Kyle Pace mentioned to me on Twitter that because you can change the language on this through Google Translate, he had mentioned this tool to a number of foreign language teachers.  That’s another great idea.

As I continue to explore ways to make mobile teaching and learning work, I was extremely excited earlier this week when our Tennessee Board of Regents Assistant Vice Chancellor for eLearning gave me an AppleTV and said, “Use this.”  I watched good friend Tim Blais demo how to mirror your iPad2 onto an HDTV or HD projector with the AppleTV.  So, a couple of nights ago, I decided to give it a go.

After I (1) updated my iPad2 to iOS5 and (2) plugged in the AppleTV, I couldn’t get it to work.  I searched the web and was pretty sure I was missing something simple.  What I found out was I wasn’t the only one having a little trouble.  Below are the instructions for making this work.  I knew Tim was making it look easy.  Plus, all of this was confirmed by my friend Daphne Brown who got it all working with help from Tim. 🙂

Updating Your iPad2

  1. Your iPad2 (or iPhone 4S) has to be running iOS5.
  2. Backup, then update.

Updating Your AppleTV

  1. There is a glitch here: My AppleTV was running version 4.3 (and it says this is up to date). Unfortunately, this is not true.  It needs to be updated to version 4.4.x and it can’t do this over wifi right now.
  2. [Image: close-up of a Micro-B USB plug, via Wikipedia]

    So, you have to plug your AppleTV into a computer running iTunes for it to update.  A couple of things to note here:

    • Launch iTunes first, then plug in your AppleTV.
    • You do not need the power cable for your AppleTV.
    • You do need a micro USB cable (that’s the plug pictured above) to connect your AppleTV to your computer running iTunes.  Unfortunately, this does not come with your AppleTV, so I had to look around to find one.  Check your cellphone (if it’s not an iPhone) or another small electronics device to see if it uses a micro USB cable.  I was able to hijack one from my digital camera (I think it’s the digital camera cord; I can’t confirm since I just keep them all in a drawer) and plug it in. I was also going to check my wife’s Nook if this didn’t work. Make sure it’s the right cord, though.
    • The plug on the back of the AppleTV is small and just above the HDMI slot.
    • Your iTunes should immediately recognize the AppleTV and offer for you to “Restore and Update.” This is what you want to do.
    • When complete, unplug the USB cable from your computer.  You do not need to eject or trash the AppleTV.  Just unplug the cable. (I know. How inconsistent?!)

Plugging in Your AppleTV

  1. Now that you have updated the device, you will be able to plug it into your TV.  Follow the onscreen prompts.
  2. You will need to enter your wifi network name (e.g., 2Wire406 or your school’s network) and the password, if it is required.
  3. Turn on Home Sharing, and the AppleTV will ask for your Apple ID (your iTunes user name) and password. This has to be the same Apple ID you use on your iPad.
  4. Check to make sure AirPlay under Settings >> General is on as well.  It should be by default.
  5. You can also check under Settings >> About the version of AppleTV you have.  It should now say 4.4.x, which is what you need.

Mirroring Your iPad2

  1. On your iPad, double-click the Home button and swipe from left to right. You should be able to see the iTunes player controls.
  2. You will also see an icon that looks like a rectangle with a triangle. This is the AirPlay icon.
  3. Touch the button and choose AppleTV. A new option for Mirroring will appear. Slide this to the on position. Your iPad will appear on the screen!

Take Note

  1. I also found out that YouTube videos on my iPad2 now have the AirPlay icon added to the playback bar by default.  So, if I am on a web page with a YouTube video, I can choose AirPlay and it will automatically play directly onto my HDTV or projector instead of playing on my iPad2.  You don’t have to go through the mirroring option at all.
  2. Of course, AirPlay will also allow you to play your iTunes music and videos directly to the external video source.

If you haven’t seen this great/awesome/fun/cheap tip for making your own iPad stylus for about 10 cents I encourage you to take a look.  Great ingenuity!  Here’s the synopsis and the link:

I bought a pack of Scotch Brite sponges, cheap, penny pens from a local office store, and a small roll of craft wire. We made about 50 pens for $6.00!

via iLearn Technology » Blog Archive » Make your own iPad Stylus for less than 10 cents!.

One of my fave sites, Lifehacker, also has a description of this DIY stylus trick.


Beginning this Sunday, June 19, I will be attending and presenting at the Tennessee Education Mobilization Summit hosted by Walters State Community College.  This is a program sponsored by the Tennessee Board of Regents eLearning initiative, Walters State Community College, the Mid-East Tennessee Regional P-16 Council for Excellence in Education, and the Hamblen County Department of Education.  In addition to a focus on mobile computing devices/technologies, there will also be an emphasis on Google Tools and the move to the cloud.  Here’s an abbreviated list of the topics and some of the presenters.

  1. James Kelley, Education Technology Consultant Higher Education Leadership & Creative Markets Apple Education Group
  2. Kevin Roberts & George Saltsman, Abilene Christian University (ACU)
  3. Dennis Bega, United States Department of Education, Atlanta Office
  4. Wade MaCamey & Lori Campbell (WSCC)
  5. Dale Lynch, HCBOE
  6. Glen Clem, Griffin Technology
  7. Tristan Denley, APSU Provost and Vice President for Academic Affairs
  8. Gerry Hanley & Cathy Swift, MERLOT – Multimedia Educational Resources for Learning and Online Teaching
  9. Bill Hughes & Debra Volzer, Pearson
  10. Scott Nance, GALE & Cengage
  11. Aimee Tait & Todd Svec, McGraw Hill
  12. Margaret Askew, Elsevier
  13. Terry Countermine & Carolyn Novak, ETSU Emerging Technology Center
  14. Karen Dale, Mobile Music Composer, CSTCC
  15. Terri Blevins, Practical Nursing Director, TTC Elizabethton
  16. Mohan Vasanth, MOBL21, Web Base App Development
  17. TBR Library Deans/Directors: TBR Mobile App Library

I’m really excited about hearing from Abilene Christian University’s team (their CIO and others) about their initiative. In addition, I’ll be presenting on a number of topics with Google Docs, QR codes, and my MOBL 21 pilots.  I think I’m also giving a hands-on iPad training … but I don’t have an iPad 2 yet.  Eek!  Gotta figure that one out.  What would you like for me to pay attention to and bring back?

CourseSmart this past spring released their mobile eTextbook readers. I have been using my textbooks for my graduate courses on my iPad through CourseSmart. (The texts I’ve adopted automatically appeared in my bookshelf.)

Just a few days ago, CourseSmart informed me by email that they have added offline reading and annotations for their eTextbooks.  Funnily enough, I hadn’t noticed that the CourseSmart app previously required a network connection for access. I guess I have always been at home or at the office when I was using the app and texts.  The instructions they sent me are below. Note that there are limits: you can only check out three sections or chapters at a time for offline use.

  1. From your CourseSmart bookshelf, select the desired eTextbook.
  2. Click the button to enable offline access.
  3. Select up to three chapters or sections/parts at a time to “check out” and make available for offline use. This task can be repeated multiple times, and the entire eTextbook can be made available for offline access, if desired.
  4. Once the chapters are successfully made available for offline use, you can access your chapters in your internet browser – even while offline.
  5. The offline Reader supports key features like notes, highlights, copy, and searching the current page.

Be sure to let me know if you’ve tried this and how it works for you.

ProfCast is a lecturecasting tool that is elegantly simple in its design.  Want to give a presentation and record the audio and slides synced together for playback?  That’s what ProfCast does easily. (It’s Mac and Windows, too.)

Now, they are going mobile. Well, playback has been mobile for a while; now they are adding recording on mobile devices. Here’s the description that Humble Daisy sent me in an email:

ProfCast Mobile is a full presentation suite for creating, delivering and recording presentations from your iPad, iPhone or iPod Touch.

Users can create presentations adding text, images, and transitions. When they are ready to deliver their presentation, they can simply plug their device into a VGA or Digital AV adapter and the presentation will be displayed on the external display. The best part is, ProfCast Mobile records the presenter as they give their presentation. The result will be a movie with the slides and audio in sync. Users can then share the recordings via email, download to their computer or post them on YouTube all from their iOS device.

ProfCast Mobile is using a crowdfunding model to pay for the development.  I encourage you to check it out.

Sometimes I think I’m the last person in the world to find out about something.  Case in point: Shakespeare in Bits. The folks at Mindconnex Learning have taken Shakespeare’s most famous plays, Macbeth and Romeo & Juliet (read: those on high school required reading lists), and made them fresh.  I think what they’ve done is create a new equation.

Graphic novel  + Cliffs Notes + multimedia + some everyday language = Accessible Shakespeare

The Mactrack blog describes Shakespeare in Bits like this:

‘Shakespeare In Bits’ is a new, integrated approach to Shakespeare Studies that promotes quicker learning, deeper understanding, and greater appreciation of Shakespeare’s plays. Leveraging the full capabilities of iOS devices and the Mac platform, it presents an interactive, unabridged version of the play’s text alongside a fully animated presentation. It also includes a full range of innovative study features, including dynamic translations of difficult terms, full synopses and study notes for each section, and a character map highlighting the relationships between the characters.

The iPad prices are what I would consider steep: Macbeth for iPad is $14.99. However, there is a free Romeo & Juliet iPad Lite version for you to check out. They also have Windows and Mac desktop versions in their catalog, and the apps are in the Mac App Store as well.  Still, if you’re going to be replacing a book, mobile is the way to go. For their desktop versions, they do have institutional pricing; I couldn’t tell whether their apps are part of Apple’s institutional app pricing or not.

I’ve decided to let my guilty-self off a little.  As I perused the Shakespeare in Bits blog, I found out Macbeth was new in January 2011.

Note: This is a cross-post from a guest blog post I authored for Next Gen Learning Challenges.

Mobile devices, like cell phones, smart phones, and tablet computers, stand to make significant changes in the ways in which teaching and learning can happen.  But there are some significant obstacles that have the potential to make mobile learning’s impact nearly nil.  Here are five reasons that mobile won’t matter.

1. The “Off” button.

Cell phones have become ubiquitous. Tomi Ahonen’s keynote at mLearn Con documents just how much this is true.  The most startling viral message from this presentation: Worldwide, more people have mobile plans than toothbrushes. In the US, 75% of American teens have cell phones and almost 30% have smart phones with Internet capabilities.  In college, the numbers appear to be much higher (e.g., cell phones, smart phones). Plus, we’ve recently heard that the iPad’s adoption rate is faster than even that of DVD.

Unfortunately, in schools the most common response to cell phones is to turn them off.  One of the most pervasive and persistent barriers to meaningful technology integration with learning has continued to be a lack of access to technology.  However, with cell phones and smart phones, educational institutions finally may not have to foot the bill for a one-to-one technology initiative.  In fact, there is growing evidence that data use, such as access to the Internet and text messaging, is becoming as important or more important than voice capabilities.  Also, a substantial number of white and minority students use smart phones as their only home access to the Internet. If our singular response to these devices is to require they remain off, we have missed a substantial opportunity.

2. Assessment tunnel-vision.

Remaining solely focused on objective assessments limits the possibilities for teaching and learning. Sticking with multiple choice test questions on mobile devices, that is strictly using the device as a delivery mechanism for assessment questions, promotes the devices as one-dimensional. Using the multimedia capabilities of mobile devices, such as photo and video capture, offers learners the opportunities to represent their knowledge in multiple appropriate ways.

Moreover, this moves us away from a one-directional vision of teaching, where the teacher pushes instructional content out to the student, the student responds, and the teacher evaluates the response. Instead, we are moving toward a project-based learning approach, where students are able to create artifacts to demonstrate their learning.  Using mobile computing devices as creation tools can open the possibilities for representing students’ learning.

3. Drill and practice persists.

Using mobile devices for drill and practice, pejoratively referred to as drill-and-kill software, is limiting as well.  Much of the technology integration research from the 1990s and the first decade of this century has reported that drill and practice software remains prevalent in classrooms today. While drill and practice applications offer learners opportunities to memorize factual knowledge, reducing cognitive load and improving response rates, simply using apps for practice is insufficient.

One of the significant advantages that one-to-one initiatives offer is the ability to adapt instruction for individual learners.  Similar to the selective release options available in course management systems, providing specific students with remedial content and practice, or advanced students with additional learning opportunities, is possible by delivering the learning content directly to the individual’s device.  MOBL 21 is one of the first applications of this I’ve seen, as it lets you publish content specific to individual learners.  However, as more CMSs become mobile-ized (for example, Blackboard Mobile Learn), the selective release features of these systems will also be able to be leveraged.

4. Believing it’s only for instruction.

While potentially powerful, believing that mobile computing devices are strictly delivery mechanisms to access learners outside of class time will also make mobile learning less meaningful.  Mobile technologies and the apps that are available on them have the potential to significantly change the way people work — beyond communication.

These devices and apps are mobile electronic performance support systems. Moving beyond the uber-obvious world of corporate business, mobile computing offers frontline vocational and trades workers access to significant support.  These apps provide just-in-time scaffolding to help individuals perform at higher levels and more efficiently. For example, ICD-9 On the Go offers quick access to medical billing codes and a number of other features, and Builder Pro provides over 400 formulas used by contractors and access to building codes as well.

5. Textbooks = mobile content.

In the early days of the web and the early days of elearning, many believed we could simply publish existing legacy documents.  In elearning, we referred to these as “shovelware,” because all folks did was use a big shovel to pick up as much content as possible and shove it onto the Internet.  However, we have learned that learning content must be designed and segmented appropriately.  We’ve learned that content should take advantage of the medium.  I hope we’ve learned from these mistakes.  We cannot simply take our textbook content and port it to mobile devices.

In a recent post, I questioned the current definitions of mobile learning.  A comment from Dr. Chuck Hodges, a colleague at Georgia Southern University, suggested, “Mobile learning requires certain design elements and certain context of use elements.”  Inkling (iTunes app page) is one of the first examples that has taken this to heart. Taking advantage of the iPad, Inkling provides both content and embedded media along with navigation and annotations. Moreover, textbooks and instructional content should be available in a variety of formats available for reuse across multiple mediums and devices.

This list is not intended to be exhaustive.  What other elements must change for mobile computing devices to matter?  What have I not considered?  Please contribute to the discussion in the comments below.


[Image: the Hubble Space Telescope, via Wikipedia]

This week is Vacation Bible School at my church, Bartlett Methodist.  The exciting and fun theme for the kids is Galactic Blast!, which has been a blast.  I have been leading the Discovery Time, which is focused on science, particularly earth sciences and physics.  So each night during the week, we have been working through individual experiments and then having a whole-group time, too.  On Tuesday night, though, the individual experiment didn’t take long, and I wanted to show the kids some of the most recent images from space, especially some of those from the Hubble Space Telescope.

I decided to bring my iPad and project some of the images for the kids.  Like others have discussed, you can’t just mirror whatever is on your iPad.  In fact, each individual application has to support video out (check out this spreadsheet for the list). One application I’ve had a lot of success with for video out is GoodReader.  It’s a great application, and it worked really well for me at VBS. I was able to project an image onto the screen, then blow it up by “pinching.”  This was particularly effective when I was discussing the maria and craters on the lunar landscape.  Since my iPad is not a 3G model, I transferred all of the images and videos directly onto the iPad with GoodReader, too, while syncing.

In one session with kids, I was running a little ahead.  So, I also unplugged from the projector and used the Planets app to show the kids individual images of the planets.  I was able to walk around with my iPad among the kids and they got a closer view, too.  All in all, the iPad and the space images were both hits. Have you used your iPad, iPhone, or iPod Touch with less formal learning situations?  Let me know in the comments.