Digital Distractions & Digital Opportunities

The Academic Technology Committee at UHS has been engaging in a lively discussion about digital distractions and digital opportunities.

An article that highlights one end of the debate examines how some faculty at Northwestern University have started to ban technology in lecture courses as a result of the rampant distraction stoked, in part, by the ubiquity of digital technology.

As a counterpoint, an article by Sal Khan asks educators to rethink our assumptions about what a class or school should be. The neuroscience of attention reveals why lectures are ineffective and how digital technology can help facilitate more active learning.

Inevitably, technology – whether in education or in society at large – isn’t an either/or proposition. Grey areas abound. Which is why robust discourse is so essential if we are to figure out the best ways technology can enhance teaching and learning – and the ways it can derail them.

What do you think?


Technology Alone is Not Enough

In my conversation during the 20/20 Symposium with Matt Crowley, who works in the Manufacturing Design department at Apple, I was most interested in how he echoed Steve Jobs’ sentiment that technology alone isn’t enough.

When introducing the iPad 2, Steve Jobs said, “It is in Apple’s DNA that technology alone is not enough—it’s technology married with liberal arts, married with the humanities, that yields us the results that make our heart sing.” He seemed to be saying that the best ideas and most innovative products emerge from the intersection of technology and the humanities.

When I asked Matt in what capacity the liberal arts and humanities have impacted his role as a designer, he said that they’ve been essential. From his experience, you can’t design technology for people unless you understand the unique culture and history of the people you are designing for. One needs to start out taking an anthropological approach by asking questions and observing. An anthropological perspective, combined with training in the humanities, creates a synergy that isn’t possible through technological training alone.

With all the talk about the importance of coding, and the decline in arts and humanities majors, it seems vital to reiterate that one of the most iconic technology companies is a result of the convergence of technology and art. So, are there merits to learning coding? Absolutely. But your coding will be taken to new heights if it’s immersed in art history, literature, philosophy, and the performing arts.

Learning Computer Code

In a recent CNN article, Douglas Rushkoff wrote about the value of learning computer code. We are living in a world that is increasingly defined by computer programs. “Code is the stuff that makes computer programs work — the list of commands that tells a word processor, a website, a video game, or an airplane navigation system what to do.”

Computer code is a cornerstone of our information ecosystem. By learning to code we are developing an aspect of digital literacy and increasing our job prospects.
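To make Rushkoff’s definition concrete, here is a minimal sketch in Python (one of the languages taught in beginner tutorials like Codecademy’s) of a program as “a list of commands” the computer follows in order. The names used are illustrative, not from any particular lesson.

```python
# A program is just a list of commands the computer follows in order.
# These commands tell the computer to build and display a greeting
# for each name in a list.

names = ["Ada", "Grace", "Alan"]

greetings = []
for name in names:                      # repeat for each name in the list
    greetings.append("Hello, " + name)  # command: build a greeting

for greeting in greetings:
    print(greeting)                     # command: display it on screen
```

Running this prints one greeting per name – a small example of the kind of step-by-step instruction-giving that underlies everything from word processors to navigation systems.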

At Code Year, over 300,000 people, including New York’s Mayor Michael Bloomberg, have signed up to receive free interactive coding lessons each week from the web-based tutorial site Codecademy.

Another way to learn computer coding is with the iPad app Codea. Codea is a remarkable code-editing app that lets you create interactive simulations, games, and just about any visual idea you have.

“If you know how to code,” in Rushkoff’s words, “you can get a high-paying job right now, or make valuable stuff right now. You will understand more about how the world works, and become a participating member in the digital society unfolding before us.”


The Human Brain in the Digital Age

Cathy N. Davidson’s book, Now You See It: How the Brain Science of Attention Will Transform the Way we Live, Work, and Learn, is about the human brain and human potential in the digital age.

The heart of the book focuses on how the phenomenon of “attention blindness” shapes our lives. In order to focus and pay attention to any one task, we filter out many other things that are happening around us. As a result, we have blind spots. But we don’t all filter in the same way. Our focus is idiosyncratic. While attention blindness pigeonholes our perspective, Davidson argues that the digital age is providing new ways of seeing and learning based on multitasking our attention. Social media is allowing us to aggregate perspectives and generate a bigger and more accurate picture by seeing together.

While digital tools offer ways to mitigate the problem of attention blindness, our institutions of learning and work are still designed to meet the social and economic needs of the last century. How do we prepare students for the challenges and workplaces of tomorrow? Now You See It provides glimpses of the future by highlighting visionaries and pioneers who are helping to shape the nature and direction of education and work.

Social Media and Learning

We are witnessing the emergence of something profound: humans, historically divided by geography, culture, and creed, are beginning to connect and collaborate on a scale never seen before. The driving forces behind this creative wave are digital tools and networks that allow new forms of collaboration and knowledge creation.

What starts out as social networking is evolving into social production. We’ve witnessed how self-organizing groups, leveraging social media such as Twitter, Facebook, and Wikipedia, have launched revolutions throughout the Arab world and created the most important reference work in the English language in less than 10 years.

These are the opening paragraphs from my latest PBS article on the case for using social media in education.

Learning in a Digital Age

“Education,” scholar and writer Ralph Ellison once said, “is a matter of building bridges.” And perhaps no bridge is more important than the bridge to the future. As educators, it is our responsibility to prepare students for the world of tomorrow. Yet tomorrow isn’t what it used to be.

This is the beginning of an article I wrote for PBS’s MediaShift website on the importance of teaching a new kind of literacy in our emerging digital age. If you’re interested in reading more, click on the red word “article” above (which is a hyperlink to the article).

Child-Driven Education

Education scientist Sugata Mitra tackles one of the greatest problems of education—the best teachers and schools don’t exist where they’re needed most. In a series of real-life experiments from New Delhi to South Africa to Italy, Mitra gave kids self-supervised access to the Web and saw results that could revolutionize how we think about teaching.


The iPad is helping to launch a revolution in learning and education. While the portability and long battery life are great features, the real benefits are in the touchscreen interface and software. The App Store provides a broad spectrum of applications that appeal to a wide range of learners. We all have idiosyncratic ways of learning, organizing, and demonstrating what we’ve learned, and the iPad is one of the most powerful tools we have for creating a customized learning experience.

Historically, there has been a standardized way of learning, organizing, and demonstrating skills and knowledge. As a result, the diversity of human thought and expression was filtered through an educational sieve. While this may have been appropriate for the past, there is increasing evidence that it won’t be tenable for the future. Our digitally networked society requires people to apply their idiosyncratic forms of creativity, problem-solving, and collaboration to the challenges of our times. The iPad is emerging as one of the most powerful tools we have for learning about and shaping the world we live in.