Ed Tech #5: InsertLearning

What is it?

InsertLearning is a Chrome extension that you can use to insert your own educational content into a webpage. If you go to InsertLearning’s homepage, you can check it out before you download and install the extension.

How does it work?

Below is a screenshot from a sample lesson: an NPR article on the Declaration of Independence. If you want to see the teacher view, you can use this link (you will probably be prompted to install the extension and log in before you can view it):

Screenshot of InsertLearning that explains each of the icons/things you can insert


On the left, you can see everything you can insert into a webpage. I annotated what each of the icons means.

  • Assign Lesson. You can share the lesson with students, either by sharing it directly to Google Classroom or by sending students to insertlearning.com/signup and providing them with a class code.
  • Highlight Text. This is pretty self-explanatory.
  • Sticky Note. This allows you to type your own commentary on the webpage, or you can insert media, like a YouTube video, embed code, or (I like this a lot) a video you record yourself.
  • Assessment Questions. These can be either open-ended questions or multiple choice questions. You can give these questions point values and view student achievement in your InsertLearning dashboard.
  • Discussions. You can insert a discussion prompt; students can respond and then view their classmates’ responses.

Strengths, Limitations, and Applications

The free version of InsertLearning lets you store 5 lessons. Upgrading to a $40/year plan gives you unlimited lessons, which seems reasonable.

InsertLearning has a Google integration, so teachers and students sign in with their Google accounts. It also works well with Chromebooks, and lessons can be shared directly to Google Classroom. Because of this Google integration, you’re able to create “enhanced” Google Docs. In other words, you could create a worksheet with Google Docs, and then turn it into something more interactive by inserting sticky notes, video explanations, assessments, discussions, etc.

I really liked this tool and think it could have a number of applications in K-12, higher ed, and online contexts. When I taught political science, I would frequently assign students online articles, and they weren’t always easy reading. With InsertLearning, it would be simple to turn those reading assignments into something more interactive. If an article referenced a concept I thought students would find difficult, I could insert a YouTube video or record my own quick explanation. If it brought up something controversial, I could add a discussion question. If it was lengthy and I wanted to make sure they made it to the end, I could add assessment questions throughout. Overall, I love that this tool turns reading into a more interactive and social experience, which would undoubtedly enhance student engagement with and retention of the content.


Ed Tech #4: Packback

What is it?

This week, I chose to review Packback. Packback started off as an e-textbook rental company and received an investment from Mark Cuban on Shark Tank in 2014. (This article is an interesting read.) As the article discusses, Packback recently switched gears to something entirely different: its new mission is to encourage curiosity in college students, and it uses AI to “grade” curiosity.

How does it work?

Professors create a professor account and then invite students to join their online community by providing them with an access code. Students pay $18 per semester. This is a screenshot of a sample community:

Screenshot of packback community


Students post and respond to questions. Students get a certain number of “sparks” to use; sparks are similar to “likes,” but they are meant to indicate which questions “sparked” your curiosity.

Packback’s algorithm grades student posts and assigns them a curiosity score. Students’ curiosity scores are displayed on a learner leaderboard. The curiosity scoring system is based on three criteria: presentation, credibility, and effort. For presentation, the algorithm looks at formatting, legibility, and supplemental materials (like videos or images). For credibility, the algorithm looks to see if the post contains reliable sources, and it also checks for “behaviors” that often go hand-in-hand with credibility, like the time of the post and the depth of the post. Lastly, for effort, the algorithm looks to see if the user added new insight or just provided a straightforward answer. Packback claims its algorithm was derived by identifying what high-quality posts had in common, and that it performs quite similarly to how a human grader would score and rank posts.
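Packback doesn’t publish the details of its algorithm, but a rubric-style scorer like the one described above can be pictured as a weighted blend of the three criteria. The sketch below is purely hypothetical: the criteria names come from Packback’s description, while the weights, thresholds, and helper checks are my own placeholders for illustration.

```python
# Hypothetical sketch of a rubric-style "curiosity" score.
# The three criteria (presentation, credibility, effort) come from the
# description above; every weight and check here is an invented placeholder.
import re

def presentation_score(post: str) -> float:
    """Reward formatting, legibility, and supplemental material."""
    has_paragraphs = "\n\n" in post
    has_media = bool(re.search(r"youtube\.com|\.jpe?g|\.png", post, re.I))
    readable_length = 100 <= len(post) <= 3000
    return (has_paragraphs + has_media + readable_length) / 3

def credibility_score(post: str) -> float:
    """Reward cited sources and depth (a stand-in for the 'behaviors' mentioned)."""
    has_source = "http://" in post or "https://" in post
    has_depth = len(post.split()) > 75
    return (has_source + has_depth) / 2

def effort_score(post: str, prompt: str) -> float:
    """Crude proxy for 'new insight': words that aren't just the prompt restated."""
    novel_words = set(post.lower().split()) - set(prompt.lower().split())
    return min(len(novel_words) / 50, 1.0)

def curiosity_score(post: str, prompt: str) -> float:
    """Combine the three criteria into a 0-100 score (weights are arbitrary)."""
    return round(100 * (0.3 * presentation_score(post)
                        + 0.4 * credibility_score(post)
                        + 0.3 * effort_score(post, prompt)), 1)
```

A real system would presumably use trained models rather than hand-set weights, but the rubric-plus-weights structure is the essential idea.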

Strengths, Limitations, and Applications

Packback claims it can create high-quality discussions by: (1) “scaling personalized feedback,” (2) “analysis of posts,” and (3) “managing a large number of students.” Packback also claims that, “While the Learning Management System forum serves the purpose for in-class logistics (ex: Where do I find the case study? What’s on the exam?), it’s not possible to conduct a high quality academic discussion (or grade for it) due to a lack of quality control, lack of feedback delivery, and lack of a technological capability of assessing quality.”

I disagree with this statement (except maybe in the case of a very large class). In smaller or average-sized college classes, I do think it is entirely possible to design and conduct high-quality academic discussions via an LMS. Instructors are very capable of performing quality control, giving feedback and guidance on student responses, and assessing quality (without an algorithm) in the context of an LMS discussion board. I also think instructors can achieve the higher levels of Bloom’s Taxonomy using well-designed traditional assessment methods, and even large classes can potentially use TAs or group work to encourage high-quality discussion.

That being said, I do think Packback could be useful in a very large class (50+ students). Class discussion is virtually impossible in classes that size without a tool like Packback, so in that regard, Packback fills an important gap.

Many of the other educational technologies we’ve talked about in this class are virtual “aids” – they help us do what we’ve always done, just more efficiently (hopefully). I think Packback is a little different in that it has a unique starting point: get students asking questions. And not just any questions; the goal is to guide students toward asking good questions. This represents a shift in the conversation. Instead of being focused on what students can answer or retain, we focus on teaching them how to question their world, and how to be curious, critical thinkers.

I’ll leave you with this TEDx video by the founder of Packback:


Ed Tech #3: Nearpod

What is it?

Nearpod is a tool that lets teachers create and share interactive lessons. These lessons can be live or student-paced. Teachers can sign up for a free account, and paid accounts and site licenses are available, too. You can search for, modify, and use already-created Nearpod lessons, or create your own. To make your own, you can upload Google Slides, PowerPoints, PDFs, or Sway presentations. With the free account, you can add interactive quizzes, open-ended questions, and polls to these slides. To get the more advanced features (like student-paced mode, “virtual field trips,” fill-in-the-blank questions, etc.), you have to upgrade to a paid plan.

How does it work?

In the live lesson option, teachers give students a code. Then, students open the Nearpod app or website (the app/website works on iOS, Android, etc.). When students type in that code, it syncs their device to the teacher’s presentation. In other words, if the teacher flips to the next slide, the presentation on the student’s device goes to the next slide, too. You can have assessment activities (multiple choice questions, polls, etc.) built into the presentation, so teachers get feedback on student comprehension as they progress through the lesson. In student-paced lessons, students still get the code, but they can advance through the presentation on their own, and teachers get the assessment data after the student completes the lesson.
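Nearpod doesn’t document how its syncing works under the hood, but the teacher-paced model is easy to picture as a small piece of shared state: the teacher advances a slide index stored under the session code, and every joined device simply renders whatever index is current. The sketch below is a conceptual illustration only; the class and field names are mine, not Nearpod’s.

```python
# Conceptual sketch of code-joined, teacher-paced syncing (not Nearpod's actual design).
from dataclasses import dataclass, field

@dataclass
class LiveSession:
    code: str                       # the join code students type in
    slide_count: int
    current_slide: int = 0
    roster: set = field(default_factory=set)

    def join(self, student: str) -> int:
        """A student enters the code and lands on whatever slide is current."""
        self.roster.add(student)
        return self.current_slide

    def next_slide(self) -> int:
        """The teacher advances; every joined device re-renders this index."""
        if self.current_slide < self.slide_count - 1:
            self.current_slide += 1
        return self.current_slide

session = LiveSession(code="KQXTD", slide_count=12)
session.join("student_a")
session.next_slide()  # all joined devices now show slide 1
```

A student-paced lesson is essentially the same idea with each student holding their own slide index, which is why the assessment data only arrives after the student finishes.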

This is what the teacher dashboard looks like:
Screenshot of the teacher dashboard in Nearpod


You can see the roster in the bottom left, the code (for students to join the presentation) in the top left, and then in the top right, the +Add Activity button. You can use that button to insert an assessment into the presentation.

Strengths, Applications, and Limitations

One drawback to Nearpod is that it is primarily geared toward K-12. I know some college faculty would be turned off by the cutesy graphics, and even though it let me select “higher ed” when searching for already-created Nearpod lessons, all the ones it found were labeled (and clearly intended for) grades 9-12. I think Nearpod could be very useful in higher ed applications, so I would like to see it expand its target audience.

If you’re using Nearpod in an in-person class for a live lesson, I imagine one of the biggest concerns will be keeping students on task. As you can see in the screenshot above, the teacher view will show which of your students are logged in. If a student opens another app, the Nearpod app will close and the teacher will see that student is no longer logged in. (This Nearpod article explains it). I think that accountability feature could come in handy!

Lastly, I do think it would be nice if the student-paced option were available in the free account, since a student-paced lesson would work best for an online course. This is especially true because (according to this Nearpod article) you can add audio to a slide. I think this could make a particularly engaging online lesson video. Essentially, students could progress through a video presentation of PowerPoint slides, but have built-in breaks throughout the lesson for assessment questions or other forms of engagement.


Ed Tech #2: Padlet

What is it?

Padlet describes itself as somewhere between a doc and a website builder that can be used for things like a bulletin board, a blog, or a portfolio. It reminds me somewhat of a collaborative Pinterest board. It gives you a simple, intuitive platform to share content with others. The basic account is free, but you gain access to even more features if you are willing to pay for an account (such as the “Padlet Backpack” education account). It has iOS, Android, and Kindle apps.

Creating a Padlet

To create your padlet, you choose a title, theme, and layout. You can also choose your “reaction options.” People can like, vote up/down, give 1-5 stars, or grade (which assigns a numeric score to a post within a padlet).

Here’s what some of the layouts look like:

Screen shot of the 5 different Padlet layouts available

And here is a screenshot that lists the type of content you can add to your padlet:

List of options to add to Padlet, including links, video and photos, drawings, and more.


I particularly like Padlet’s robust privacy options: you can make your padlet private, password protected, accessible by link/QR code only, or public. You get even more privacy and security features if you upgrade to Padlet Backpack. Padlet also has export options (CSV, PDF, image, Excel spreadsheet, etc.) that could come in handy.

Sample Padlets

I think one of the best ways to get a feel for all the different things you can do with Padlet is to just click through their gallery. I also found this particular padlet useful: it compiles various educational padlets. I also created a sample padlet, which I embedded below. You can also access my sample padlet through this link, or, if you download the Padlet app, you can scan a QR code that I provide. Check it out and feel free to try out the collaboration features! I used a variety of images, videos, links, and a few GIFs so you can see how Padlet displays the different types of content.

Made with Padlet


Limitations, Strengths, and Applications

I think Padlet has a variety of applications in both online and in-person classes. For instance, I could see it being used for individual or group presentations or projects. Groups could be assigned a topic and then work to create a padlet that represents that topic. Students could then view and comment on each post within the padlet. You could also use Padlet to organize a classroom survey or contest (kind of like I did with my sample padlet), create a flow chart with the “canvas” layout, or tell a story using the “stream” layout. I think Padlet is the type of tool that would work well for open-ended assignments; I bet students could come up with some creative ways to use it that we wouldn’t even have considered. I also like that Padlet makes it easy to share and link to other padlets, which encourages collaborative learning.

Padlet addresses accessibility in that padlets can be read with most screen readers. However, keyboard access is only available for logging in and navigating the dashboard (it’s not possible to create or edit posts with the keyboard alone). As for other limitations, a few times my padlet wouldn’t load a GIF or a particular image, which was a little annoying. I also wish more of the features were available in the free version.

I’m looking forward to hearing, in the comments, your ideas on how Padlet could be incorporated into both in-person and online courses!


Ed Tech #1: Quizlet

What is it?

Quizlet is a website that allows you to create “study sets,” and then gives you various options for practicing and mastering those study sets. Simply put, it’s an online flashcard creator (although you can do quite a bit more with it). Anyone can set up a free account, but you can upgrade to an annual membership ($19.99/year) if you want more features.

How does it work?

Quizlet offers video tutorials, although it is intuitive enough that mastering the basics without viewing tutorials is definitely possible. I created a sample set, using cooking terms. Here’s a screenshot of what it looks like when you enter terms:

Screen shot that shows Quizlet interface with terms on the left, and definitions/images of those terms on the right.
Creating sets in Quizlet


You can see that Quizlet lets you create a title and change visibility settings. You can also import your list of terms from Word, Excel, Google Docs, and other sources. You can easily rearrange the terms, flip terms and definitions, use languages other than English (helpful for foreign language courses), or add pictures or audio to the definition column (you can see the small thumbnails of my images to the right). Quizlet made it particularly easy to add images to my set, since it will search CC-licensed images for you. As you can see at the top, Quizlet also has a “diagram” feature so that you can learn terms associated with a diagram. Another feature I like is that you can search for sets people have already created. No need to re-invent the wheel if Creative Commons resources are available!
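The import feature accepts pasted plain text, with one separator between each term and its definition and another between cards. As a small, hedged illustration (assuming a tab between term and definition and a newline between cards; adjust to whatever the import dialog is set to), here is how you might assemble a paste-ready list for a set like my cooking terms:

```python
# Hypothetical helper: turn a term/definition mapping into paste-ready import text.
# Assumes a tab between term and definition and a newline between cards;
# change the separators to match whatever the import dialog expects.
cooking_terms = {
    "braise": "to cook slowly in a small amount of liquid in a covered pot",
    "julienne": "to cut food into thin, matchstick-sized strips",
    "deglaze": "to loosen browned bits from a pan by adding liquid",
}

import_text = "\n".join(f"{term}\t{definition}" for term, definition in cooking_terms.items())
print(import_text)  # paste this block into the import box
```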

Once I’ve created my set, Quizlet gives me a number of ways to practice and master those terms:

Screen Shot of Quizlet study options

The “Learn” study option combines the flashcards, write (which is fill-in-the-blank), spell, and test (multiple choice) options. “Match” and “Gravity” are simple games that also help students practice the terms. They can be used collaboratively through Quizlet Live or a Google Classroom integration. Students compete with one another on a classroom leaderboard.

Check it Out

You can share your Quizlet set via Facebook, Twitter, email, or through a link. You can also embed your set directly within a website, as I’ve done here with my sample set (click “Choose a Study Mode” to see what some of the other study modes are like):

Limitations, Strengths, and Applications

Quizlet is designed to help students master and practice terms. As such, it will never move up past the “remember” and “understand” levels of Bloom’s Taxonomy, and it would not be my choice for helping students learn difficult content. But what it does, it does well. I also like that this is a tool students can easily use for their own studying; it doesn’t have to be teacher-directed or integrated into a course for students to take advantage of it.

If instructors do wish to use Quizlet in online learning, Quizlet can be integrated directly into Canvas (the LMS that my institution uses). Since the Quizlet app in Canvas doesn’t link up to a Quizlet login, you have to make your set publicly available and then search for that specific set. In Canvas forums, some instructors complained that they couldn’t find their sets because of this limitation. Consequently, if I had a faculty member who wanted to use Quizlet in their online course, I’d probably recommend bypassing the Canvas integration and just embedding their Quizlet set directly into a page in Canvas or sharing the link with students.

As for its practical use, obviously, instructors could simply create and offer Quizlet sets for students to use, ungraded, as they study for an exam or review content. Actually assigning assessment value for Quizlet activities could be more challenging. A few ideas: instructors could ask students to create and share their own sets, or take advantage of the classroom leaderboard feature.

Overall, I think this is a great tool for online instructors who want to give students a more interactive option for independent review of terms or diagrams. Quizlet would be particularly useful in online learning modules that depend heavily on terms or vocabulary. For example, nursing faculty could use it for a medical terminology course, or Spanish instructors could use it for vocab practice. Otherwise, its application is more limited to individual student study and review.


Remixes and Mashups

My Remix/Mashup for ED 677



Reflection on my Remix/Mashup

Before I completed a couple of readings on remixes and mashups, I already had an idea in my head about what they were. If you had asked me to define each of these terms, I would have said that remixes take an existing artifact and present it in a new way, while mashups take multiple artifacts and combine them into one. The readings I completed supported this simple distinction. For example, Gil (2017) explains that “a ‘mashup’ combines services from different websites into a single website.” According to the Wikipedia mashup article that focuses specifically on education, mashups draw on two or more data sources, while remixes re-create an artifact in a way not originally intended by its creator. So, in my mind, in the most basic sense, this is how remixes and mashups differ.

So what do they have in common? The key to both remixes and mashups is the evolution of an artifact’s meaning. Murray (2015) describes that, “artists have consistently challenged the idea that meaning ascribed to objects is permanently fixed. All cultural artifacts are open to re-appropriation. As with much else, technology has made this process easier and more visible.” The tools we have available now make it easier to assign new meanings to existing artifacts to create remixes, and to combine old artifacts with new meanings to create mashups. With both remixes and mashups, we can use existing content as a springboard for new ideas and new content. We can create with old creations.

I think the animation I created for this assignment is somewhere between a remix and a mashup. It’s a remix in the sense that I took a lot of my own work and presented it in a new way; it’s a mashup in that I took all of the semester’s work and condensed it down into a 2 minute animation. Rather than being a commentary or a parody or an artistic process, I envision it more as a curation or chronological display. This adds meaning to the pre-existing artifacts because it puts them together in one place where they were previously disparate.

Reflection on the Process

Even though this video ended up being less than 2 minutes long, it took me a really long time to create! I wanted to be brief and to really drill down to the heart of each of the past semester’s assignments. I wanted to create a mashup of the content and ideas that I will remember and take with me after the end of this semester. Consequently, deciding what content to include and how to present it was a fairly time-consuming process.

Moovly was also a new tool for me, so it took a little while to learn it. I initially tried Video Scribe (which was used by a student in the 2016 cohort), but decided it was a little too complicated for my purposes and the learning curve a little too steep. Moovly allowed me to create my moving infographic and ended up being a great tool for what I had in mind.

The reason I wanted to create what I would call an “animated infographic” was twofold: (1) I hadn’t had a chance to use an animation tool yet this semester, so I wanted to pick a technology that would challenge me, and (2) I wanted to challenge myself to tell a “brief” story instead of my usual long story. I always have a hard time being concise, and the participatory storytelling project in particular really reminded me of this tendency. So I chose a different approach to storytelling (brevity) than I have used in previous assignments. My story, then, is a story of small epiphanies. Each assignment gave me at least one “a-ha!” moment, and I think that in the future, it will be useful to me to have all of these discoveries curated into this brief chronological display.

References

Gil, P. (2017, April 21). What exactly is an internet mashup? Lifewire. Retrieved from https://www.lifewire.com/what-is-an-internet-mashup-2483413

Mashup (education). Wikipedia. Retrieved from https://en.wikipedia.org/wiki/Mashup_(education)

Murray, B. (2015, March 22). Remixing culture and why the art of mashup matters. Tech Crunch. Retrieved from https://techcrunch.com/2015/03/22/from-artistic-to-technological-mash-up/


Participatory Storytelling

Media Presentation

Click on the image below to visit my media presentation of our participatory story.
Participatory Storytelling


Deconstructing the Process

Choosing the Form of Media

I knew before I began that I wanted to pair our Twitter story with some form of images. I think that, in part because it was Twitter-based, the cohort spent a lot of time “setting the scene” or developing descriptive passages. Consequently, I wanted the opportunity to create a graphically-rich representation of the story. I imagine that we were all picturing some sort of scene or item or object in our heads when we read or wrote these descriptive passages. Taylor and Williams (2014) explain that, “McCall Smith… cites the critical role of the reader’s imagination in bringing these miniature tales to life,” and so I wanted to try and capture the imaginative scenes we were thinking about as we wrote the descriptive passages.

I wasn’t quite sure how to do that, so I started by brainstorming what kinds of tools would allow me to neatly integrate images into the already-crafted narrative. I considered doing some sort of “timeline” feature (such as the one here), as I remember a classmate used one in her posts in my summer Digital Citizenship course and I thought it was very visually appealing. However, it did seem more appropriate for a project with a clear chronological organization, so I kept looking. I found that another student in last year’s ED 677 cohort used a very interesting tool: Adobe Spark. Since I’d never used it before, I thought it looked like the perfect tool for what I was hoping to accomplish.

I found Adobe Spark very easy to use, and actually pretty fun! It was very simple to integrate Creative Commons licensed photos, although I probably spent more time than I should have browsing through the images to find ones that matched what I was picturing in my head. I experimented with the different themes as well as the different ways to display text so that the end product offered visual variety.

Reflection on the Twitter Storytelling Process

I had a hard time with this project initially; this blog post and the media presentation through Spark came together much more easily for me than the weeks of tweeting leading up to it. Logistically, I struggled to fit in two tweets per week because I primarily work on classwork over the weekend. I would log in with the intention of completing my second tweet, but wouldn’t be able to because I would be breaking the “no consecutive tweets” rule. I also often found myself confused about what was happening in the story, which made it difficult to write tweets that I was confident wouldn’t derail the storyline further or cause even more confusion. Thus, the process of coming up with two tweets a week was sometimes frustrating.

In general, I also think it was difficult for me because storytelling as a whole is outside of my wheelhouse, and this form of storytelling even more so. According to Taylor and Williams (2014), “Mitchell says crafting stories for Twitter requires a completely different approach to novel writing. Above all, he says compression is the key, and modification of the narrative is often required.” If you look at any of my previous blog posts (this one included), compression and brevity are really not my strengths. I always prefer to use more words rather than fewer, so the shortened nature of each Twitter contribution was a challenge for me. In addition, I am much more comfortable with nonfiction than I am with fiction. I do think I’m a creative person, but not when it comes to telling stories or creative writing. I liked what Alexander (2011) said about this type of storytelling: “The social media world has made the outer frontier of stories porous. Where a story begins and ends, what the container is that holds a narrative: these questions are more difficult to answer than before” (p. 125). I found Twitter to be a challenging “container” in which to hold a story; I like continuity, order, organization, and a plan… which is tough with 140 characters and 10+ authors!

I do like what Alexander (2011) said, though, regarding “collaborative spaces.” He explained that, “One model for understanding storytelling in a social media world, one where content and audience interaction is distributed over multiple sites and across time, is that of the networked book,” and that we should think of the networked book “…as a platform, whereupon visitors build materials in a collaborative space” (p. 127). Twitter became a platform for collaboration, and it is kind of neat to look at my Spark presentation above and consider that it all started with one tweet and was created solely through collaboration.

References

Alexander, B. (2011). The New Digital Storytelling: Creating Narratives with New Media. Santa Barbara, CA: Praeger.

Taylor, A. F. and Williams, M. (2014, Sept. 30) Alexander McCall Smith on the art of Twitter fiction. ABC.net. Retrieved from http://www.abc.net.au/radionational/programs/booksandartsdaily/beta-nav/alexander-mccall-smith-on-the-art-of-twitter-fiction/5777056.


Augmented Reality

Augmented Reality in Action

According to Davis, “Augmented reality, at its most basic form, is defined by the incorporation of something virtual into something pre-existing, thus amplifying the experience.” Augmented reality offers us an opportunity to enhance natural experiences or static images with virtual experiences and additional information. In other words, the Auras below allow us to link digital content to something physical, which enables the information to be displayed and the story to be told in ways that were previously not possible (Mills, 2012). 

Disclaimer: make sure you follow vmw1925 on Aurasma so that you can view the following Auras! I also recommend you click on the image to enlarge it before viewing it with the Aurasma app.

Map of US national parks, indicated with tree icon

This first image is a map of the location of US National Parks. When you view the image with Aurasma, you’ll see two arrows appear: one for the past, and one for the future. Clicking on the arrows will take you to a YouTube video about the history of the parks, and a Ted Talk video about future potential for the parks.



Vintage poster advertising Grand Canyon National Park; depicts a sweeping canyon and sky.
A short video opens, showing a time-lapse of light coming up over the canyon.


Vintage poster advertising Yellowstone National Park; shows the geyser Old Faithful spewing up into the sky.
Links to “10 Things You May Not Know” about the park, and links to the park’s frequently asked questions.


Vintage poster advertising Great Smoky Mountain National Park.
Links to top attractions in the park, and opens a PDF with hiking trails.


Vintage poster of the American Southwest desert scape, advertising Arches National Park.
Opens a video of the park taken by a visitor to the park, using a drone.


Vintage poster advertising Glacier National Park, shows a mountain reflected in a lake.
Discusses the history of the park; opens images of the building of the Going to the Sun Road and a video with a Park Ranger explaining the road’s construction. Double tap the second icon to view the Park Ranger video.


Vintage poster advertising Mt. McKinley National Park, shows a Dall's sheep in front of a mountain.
Opens two news stories about the park.


As evidenced by the auras above, AR can be used to integrate all kinds of digital information into the physical world. User-generated content, news stories or current events, hiking guides or visitor information, historical facts, and fun trivia can all become part of an image with the use of AR technology. I will leave you with this one last aura to end your virtual experience of the story of US National Parks (click on the new image once it appears):

Image that says "Find Your Park."


Deconstructing the Process

Augmented reality is frequently applied to travel experiences (Graham, 2010), which is why I decided to explore that concept further in my AR post, just to see what kinds of materials could be integrated. My husband and I have traveled to quite a few US National Parks, and I appreciate their place in the broader story of US history. I’ve noticed that over the past few years, the iconic “vintage” posters of each national park have become more popular, and I think it’s partly because they capture a particular feeling about the park and/or convey the park’s particular story. Since education (along with preservation) was one of the original intentions of the park system, I thought it would be fun to use these vintage posters as the trigger images for my auras. I tried to attach different kinds of information to each trigger image, so that I could practice adding various overlays and actions.

While I think that AR has exciting potential, I struggled with the technology in its current form. As the ASTE Presentation (2012) describes:

The biggest drawbacks to AR, right now, are access to technology and complexity. While there are a handful of platforms that lower barriers for participation, many exciting new applications of AR may be out of reach for many educators due to the level of technical skill required to build on many platforms.

I tend to agree with this quote. I believe that good instructional design seeks to limit extraneous load, but the extraneous load for AR is quite high for both the creator of the content and the user. I know that I spent much more time learning the technology and trying to overcome glitches than I did developing the educational content, which is not ideal; the educational content, not the technology, should be the star of the show. Similarly, I think AR can also limit what kinds of materials we use. Trigger images have to have specific characteristics, so if the best image educationally doesn’t work technologically, you have to use Plan B (which, again, is not pedagogically ideal). I would also have some accessibility concerns, since it isn’t readily apparent to me how my auras above could be transformed into something equally accessible to all students.

Because of these experiences, I think there is a danger in AR becoming the use of technology for technology’s sake. As with all kinds of design, we must be intentional in our use of technology and in our technological choices to make sure that the technology serves to support and enhance the content and does not make it more difficult to access the content or distract from the objectives of learning.

References

ASTE 2012 Presentation (2012). Seeing more: Augmented reality. Retrieved from https://sites.google.com/site/disruptingtheinstitution/seeing-more.

Davis, M. (n.d.). Augmented reality. Retrieved Aug 20, 2012, from http://www.pages.drexel.edu/~mcd332/Augmented.htm.

Graham, S. (2010, Nov 12). 7Scenes: Augmented reality authoring for digital storytelling. Electric Archaeology. Retrieved from http://electricarchaeology.ca/2010/11/12/7scenes-augmented-reality-authoring-for-digital-storytelling/.

Mills, M. (2012, July 19). Image recognition that triggers augmented reality. Ted Talk. Retrieved from https://www.youtube.com/watch?v=frrZbq2LpwI.



Cultural Storytelling

The Curation


A Reflection on the Curation

The story above can best be classified as a “long form curation.” According to Content Curation Techniques, when you pull from multiple sources and tell a narrative or a story, you’re creating a long form curation. I’m not sure you can classify my curation as “storytelling,” because it doesn’t necessarily have the beginning, middle, and end that Content Curation Techniques describes, but it is also not a “short form curation,” given the abundance and variety of sources curated to create a narrative.

Kanter’s process of curation most closely resembles what I aimed for in the above curation. She describes content curation as:

“the process of sorting through the vast amounts of content on the web and presenting it in a meaningful and organized way around a specific theme.”

Rather than present my readers with a collection of links and allow them to draw their own conclusions, I worked to “cherry pick” the content that is “important and relevant to share,” and put the resources in context with “organization, annotation, and presentation” (Kanter, 2011).

I found that most of the sources I curated in the above story did something similar to my process and to Kanter’s process. They chose an “angle” from which to discuss the topic, and then pulled in sources and statistics and content around that theme. This is the “sense” that Kanter describes: they chose to leave some things out, and chose to include other things. Most of them also chose a “side” almost in an attempt to persuade their audience that either the millennial generation indeed deserves this reputation, or that the millennial generation is treated unfairly. A few also chose to tell the story of the middle ground, pointing out that while a few common traits can probably be identified, it usually doesn’t work to vastly generalize a large group of people into a monolithic identity. In all of these, I found that they utilized the element of conflict: they juxtaposed what is “said,” or what “people commonly believe,” against what they say is “true.”


References

Content Curation Techniques. (2013). Retrieved from http://curationtraffic.com/podcast/content-curation-techniques/

Fry, R. (2016, April 25). Millennials overtake Baby Boomers as America’s largest generation. Pew Research Center. Retrieved from http://www.pewresearch.org/fact-tank/2016/04/25/millennials-overtake-baby-boomers/

Hess, S. (2011, June 10). Millennials: Who they are and why we hate them. TEDxSF. Retrieved from https://youtu.be/P-enHH-r_FM

Hill, C. (2016, June 21). Millennials engage with their smartphones more than they do actual humans. Market Watch. Retrieved from http://www.marketwatch.com/story/millennials-engage-with-their-smartphones-more-than-they-do-actual-humans-2016-06-21

Kanter, B. (2011, October 4). Content curation primer. [Web log comment]. Retrieved from http://www.bethkanter.org/content-curation-101/

Main, D. (2013, July 9). Who are the millennials? Live Science. Retrieved from http://www.livescience.com/38061-millennials-generation-y.html 

Rose, F. (2011, March 8).  The art of immersion: Why do we tell stories? Wired. Retrieved from http://www.wired.com/business/2011/03/why-do-we-tell-stories/

Stein, J. (2013, May 20). Millennials: The me me me generation. TIME. Retrieved from http://time.com/247/millennials-the-me-me-me-generation/ 

Steinburg, S. (2015, August 21). Millennial vs. Boomers: Habits and characteristics. Parade. Retrieved from https://parade.com/417128/scott_steinberg/millennial-vs-boomers-habits-and-characteristics/

Tanenhaus, S. (2014, August 15). Generation nice. The New York Times. Retrieved from https://www.nytimes.com/2014/08/17/fashion/the-millennials-are-generation-nice.html?_r=0

Taylor, T.C. (2016, March 23). Workplace flexibility for millennials: Appealing to a valuable new generation. Thrive. Retrieved from https://www.adp.com/thrive/articles/workplace-flexibility-for-millennials-appealing-to-a-valuable-new-generation-3-324


Elements of Digital Storytelling

The Stories We Tell

What’s the point?

What is a story? Since that’s not an easy question, maybe it’s simpler to talk about what a story is not. A story is not a data point, an anecdote, a spectacle, or even a simple narrative (Alexander, 2011, p. 13; McLellan, 2007, p. 69). A story is not always fiction, nor is it always nonfiction. We see stories in literature, but also in business, journalism, marketing, and certainly politics. A common theme I found as I read and collected resources on this topic is that what separates “story” from “not-story” is simple: a story has a point or a purpose (Nick Montfort in Jenkins, 2010, Part 1; Alexander, 2011, p. 13).

This “purpose” and the progression of a story often take a similar form. In Kurt Vonnegut’s lecture (Comberg, 2010), he points out that stories usually take a character from a low point to a higher point, with some sort of critical juncture, tension, or problem along the way that must be resolved. Stories have a particular arc: exposition, rising action, climax, falling action, denouement (Melcher, 2012). I studied music in undergrad, and one feature of music theory that always stuck with me was the importance of dissonance. A chord expresses tension, and then when the tension is released, the music (the story) is propelled forward. Stories have a purpose or a point; non-stories lack this purpose or intention.

Stories Connect Us

Among their many purposes, stories serve a social purpose: they connect us to others. Considering music again, I had a vocal instructor who always said that you can’t sing “O Holy Night” without going for the high note (in the line “Oh night, divine”… you know which one I’m talking about), because that’s what the audience is waiting for. They’re waiting to see if you will make yourself vulnerable enough to attempt it, and then they feel fulfilled by your performance and their participation in your performance when they connect with your vulnerability. (There’s also a bit of “tension” in here too, as I think they are also waiting to see if you’ll mess it up).

McLellan (2007) explains that stories can become a conversation between the storyteller and the listeners (p. 69). In the video by Melcher (2012), the narrator explains how researchers identified specific brain chemistry changes that occur in response to a story: a researcher monitored the brain activity of people as they watched a story about a terminally ill child and his father. As people watched, the researcher observed the release of specific neurochemicals associated with compassion and empathy. People who watched the video were also more likely to donate money or take other action in response to the story. In other words, “stories transport us into other people’s worlds” and, in doing so, help us connect with others, even if we don’t know them and will never meet them (Melcher, 2012).

Digital Stories
Image of sunset that says, "Digital Storytelling: Intimate yet Participatory"

Like traditional stories, digital stories also have a point and connect us to others. In many ways, the purpose and point of stories remain the same, whether they are told orally, in writing, or using digital means. According to scholar Tom Abba, “Homo sapiens has always been a storytelling animal; so is homo digitalis” (A.C., 2015). Digital storytelling, then, is simply the art of telling stories with digital tools (Alexander, 2011, p. 3).

McLellan (2007) contrasts digital storytelling with digital spectacle, explaining that, in spectacle, the audience members are observers. She continues: “By contrast, digital storytelling is far more intimate and participatory, with less flamboyance, yet with deep and lasting power. Ultimately, digital storytelling seems to reach people more profoundly than spectacle” (p. 69). I think the core difference between digital stories and traditional stories, then, is twofold: digital stories have the potential to be more participatory, but also more intimate.

Reaching Up and Reaching Down

An oversimplified history of story and storytelling might be divided into three overlapping and co-mingling eras. The first era is perhaps the era before our reliance on the written word, when oral stories among small groups of people were the primary means of storytelling. In the second era, we got books, literature, mass media, film, and radio. Now, in this third era, we have digital storytelling, which, as described by Alexander (2011), relies on tools like blogs, Twitter, wikis, social images, Flickr, Facebook, podcasts, web videos, etc. (pp. 47-91).

So what separates this third era of digital storytelling from the previous two? In my second “era” of storytelling, most stories (films, books, etc.) were transmitted one-way, but this new form of digital storytelling has the potential to increase the participatory nature of stories. According to Dean Jensen, before, only a select few individuals had the means to craft a story and get it out. Now, “almost anyone can create a story and get it out to a potentially unlimited audience. The fundamentals of storytelling are beginning to change” (Jenkins, 2010, Part 1). Clay Shirky explains that, in digital storytelling, we have more stories that overlap, told at multiple times, with more users actually participating in the story (Jenkins, 2010, Part 1). Barriers that the typical “storyteller” used to face are being reduced, as “digital storytelling makes it possible to capture, archive, and retrieve stories with greater ease and flexibility than ever before. And digital storytelling techniques make it possible to present and share stories with exceptional power” (McLellan, 2007, p. 73).

I think that digital storytelling also reflects a desire to return to the first era of storytelling, when the experience between audience and storyteller was more intimate. Digital storytelling is an attempt to reach “up” to reach more people and participate with wider numbers of people across broader platforms and with fewer linear restrictions, but it is also an attempt to reach “down” in order to connect more intimately with specific personal stories on a human level. As Lambert explains, digital storytelling is an attempt to “bring back orality in the communal circle as a core human activity” (Jenkins, 2010, Part 1).

Digital Stories and Education

Storytelling is considered by some to be “the” original form of teaching, and it has numerous pedagogical merits. Stories have long been used to teach us lessons and beliefs, and digital storytelling boasts many of the same benefits (Educause, 2007, p. 1).

When students are the creators of stories, they’re empowered and have the opportunity to find their own voice. This fosters “a sense of individuality and of ‘owning’ their creations” (Educause, 2007, p. 2). According to McLellan (2007), digital storytelling helps students develop critical transdisciplinary skills such as mastery of technology, collaboration, self-direction, personal initiative, and visual literacy, all of which are necessary for students’ personal and academic success (p. 68).

When students are the audience of a digital story, they’re asked to participate, which taps into emotional learning as well as logical, sequential learning (Educause, 2007, p. 2). Students will be more profoundly impacted by a lesson or a topic if emotional learning is considered, that is, if both reason and emotion are engaged (McLellan, 2007, p. 71).

References

A.C. (2015, November 23). The real future of electronic literature [Web log message]. Retrieved Nov 26, 2015 from http://www.economist.com/blogs/prospero/2015/11/interactive-fiction?fsrc=scn/tw_ec/the_real_future_of_electronic_literature

Alexander, B. (2011). The New Digital Storytelling: Creating Narratives with New Media. Santa Barbara, CA: Praeger. 

Comberg, D. (2010, August 30). Kurt Vonnegut on the Shapes of Stories [Video file]. Retrieved from http://www.youtube.com/watch?v=oP3c1h8v2ZQ

Educause (2007, January). 7 Things you should know about digital storytelling. Educause Learning Initiative. Retrieved from https://net.educause.edu/ir/library/pdf/ELI7021.pdf

Jenkins, H. (2010, August 23). How new media are transforming storytelling in four minutes [Web log message]. Retrieved Aug 20, 2011 from http://henryjenkins.org/2010/08/how_new_media_is_transforming.html

McLellan, H. (2007). Digital storytelling in higher education. Journal of Computing in Higher Education, 19(1), 65-79. Retrieved from http://link.springer.com/article/10.1007/BF03033420

Melcher, C. (2012, October 3). Empathy, Neurochemistry, and the Dramatic Arc: Paul Zak at the Future of Storytelling 2012 [Video file]. Retrieved from http://www.youtube.com/watch?v=q1a7tiA1Qzo
