Archive for March, 2009

March 29, 2009

Digital Learning Environments Events Series Update-Seattle

DLEbanner2.gif
The second Digital Learning Environment Event, held on March 26 in Seattle, was kicked off with an inspiring keynote address by Dr. Barbara Grohe, Superintendent of the Kent School District in Kent, Washington. Her dynamic address highlighted the amazing things her district is doing in the area of technology.
Grohe began her address on the subject of 1:1 computing: private schools are implementing it much better than public schools, where leaders are still debating 1:1 adoption. Dr. Grohe says most of the world has moved past that conversation and already realizes we need 1:1 initiatives. It’s happening now in India and Africa with the One Laptop per Child (OLPC) project.
Dr. Grohe’s vision and leadership have brought her district national distinction as a leader in technology implementation and 1:1 computing. (In 1998, Dr. Grohe was honored as the National Superintendent of the Year.) Kent educators believe that preparing students for future success means making technology an integral part of K-12 education. Funding for several projects (presentation stations in every classroom, replacement of outdated student computers, teacher laptops, and one-to-one initiatives) was secured through the passage of a Technology Levy in 2006. Voters were promised that, with the passage of the levy, the funded projects would focus on helping students achieve 21st-century skill sets such as digital-age literacy, inventive thinking, effective communication, and high productivity.
Planning and implementing such an enormous initiative has been a tremendous undertaking. Dr. Grohe shared with the audience a few things the district has learned through the implementation process so far:
• Grohe says, “In times of crisis, you have to narrow the focus.”
• If you don’t focus on the learning, the best teachers won’t come along with you. Those teachers are concerned about how kids learn; they need to know how these new ways of teaching with the aid of technology will change teaching and learning for the better.
• Focus on the curriculum. Curriculum becomes the beginning of the discussion as a result of having the technology available. Look at the way technology can change the very nature of curriculum to help you teach better, not just differently.
• It’s not about the “stuff”; it’s about how to make the teaching and learning more effective.
• Talk quietly, do everything step by step.
• Optimize the teachers you have on staff. Help them understand what strengths they bring to technology.
• Identify your “maniacs”. These are the incredible people with a mission. Get out of their way and they will help you get where you need to be.
• Identify the others who are determined that your plans will NEVER succeed. Having an enemy is a true asset because they, as Grohe put it, “will ask the questions your friends won’t.” They become great assets when they are brought into the decision-making process.
• Chart your course. Know where you want to end up. Staying the course is the hardest part. The job of administrators and boards is to keep moving forward, never back.
• Understand your obstacles. The administrators’ job is to get those out of the way. Grohe refers to this as “rubble removal” – so the “maniacs” can complete the mission.
• And finally, Dr. Grohe suggested that you then have an obligation to share your lessons learned with your colleagues, and she directed us all to the Kent School District website for more information on the district’s technology program and implementation.
For the rest of the day, participants cycled through four 50-minute “classes” covering Science, Math/Language Arts, and Professional Development. The day culminated with a wrap-up presented by representatives from Smart Technologies and a drawing for some amazing prizes, including a Smartboard and an HP tablet laptop!
I’m looking forward to my next Digital Learning Environment adventure in Boston next week. If you live near Boston, Phoenix, or Pittsburgh, consider joining us at the DLE event in your city. Find out more here: http://www.guide2digitallearning.com/

9:38 PM Permalink
March 22, 2009

Digital Learning Environments Events Series Update

DLEbanner2.gif
The keynote speaker at the first Digital Learning Environments Event in Chicago on March 19 was Superintendent Steve Baule from District 201 in Westmont, Illinois. Baule began his address by asking what we can do more effectively with technology. Answers included extending the school day with things like online encyclopedias and virtual courses, and using standardized testing like MAP (Measures of Academic Progress) for higher reliability and faster results.
He mentioned these keys to successful technology use and planning:
• Focus on student learning and the district’s strategic needs
• Create robust infrastructure
• Train everyone with adequate ongoing multi-level training and support
• Constantly Review/Revise/Supervise/Inspect
• Support your decision with data
In the area of data, Baule stressed that quantitative data is important for technology planning as well as for new technology adoptions. He mentioned gathering data on the following:
• How often or long technology is used
• Changes in test scores or attendance rates
• Impact on family contacts
• Changes in graduation rate and grades
After the keynote address, participants left their “homeroom” and cycled through four 50-minute “classes” covering Science, Math/Language Arts, and Professional Development. With 5 minutes of passing time, these participants stepped into our students’ shoes for the day. In my class, Language Arts, their behavior was good – I had no referrals for the day!
Lunch was a nice sit-down meal at the hotel’s restaurant. I enjoyed dining with a high school teacher from northern Wisconsin. Although we teach in geographically different parts of the country, she in Wisconsin and I in California, we discovered our school districts had much in common and shared many of the same challenges, including declining enrollment, shrinking state resources, rising transportation costs (we are both in rural environments), and increased difficulty in meeting the rising requirements of NCLB. Although we were both feeling more challenged in our teaching careers than ever, we were optimistic that in spite of these challenges our students will still learn and grow, and that we and our colleagues will continue to give our all to help our students succeed.
I came away inspired, just as I am every time I have the opportunity to connect with professional, dedicated, and talented educators around the country. I look forward to my next Digital Learning Environment adventure in Seattle next week. If you are in that area, or if you live near Boston, Phoenix, or Pittsburgh, consider joining us at the DLE event in your city. Find out more here: http://www.guide2digitallearning.com/

7:47 PM Permalink
March 20, 2009

After Effects: From Fan to Feared to Favorite (plus tips)

We’ve all been there: watching a film when an amazing special effect blows our minds, leaving us to wonder, how did they do that? Well, several years back, I started asking fellow editors and educators this very question – and again and again I heard the same response: After Effects. Want to motion track? After Effects. Want to green screen? After Effects. Want color correction? After Effects. Want an intergalactic light saber fight scene with explosions and an amazing 3D camera move? After Effects.
I started to see a trend . . .
Satisfied with this answer, I happily downloaded the free 30-day trial of AE (that’s After Effects for short) from Adobe’s website. However, my initial enthusiasm soon waned – well, plummeted actually. Almost immediately after launching AE, I had a common new-user experience I will politely dub “After Shock.” To explain: as a full-time teacher of Adobe software for years, I had taught literally thousands of people how to use Photoshop, Illustrator, and/or Premiere Pro. Some would even say I’m a bit of an Adobe zealot: I’ve beta-tested new releases, done workshops all over, and even trained new Adobe employees through the Digital Media Academy. Indeed, when it comes to Adobe software, no mysterious button, workflow, or special effect is safe from my twisted desire to know everything an application can do.
But here was After Effects, and it appeared to be a different animal entirely. I must confess, I was a grown man . . . and I was afraid.
uh-oh.jpg
UH-OH: at first glance (especially with all the panels displayed), After Effects can look a bit, well, intimidating. But fear not: while AE takes some getting used to, learning this app is well worth the effort.
Like most who experience such After Shock, I did my best to poke around and bend After Effects to my stubborn will – but with little success. For those comfortable with other Adobe apps, there are some strange and downright spooky moments to be had when first looking at AE. For example, creating a new project does not involve a settings menu, there is no razor tool to cut clips with, and there are over 200 effects, each with a range of adjustments (allowing for literally millions of possible combinations) . . . and seemingly as many shortcuts. Clearly, this was not my beloved, intuitive Photoshop.
So, given the choice of abandoning my quest to reach AE special effects wonderland – or fighting back the fears and plodding on – I looked at every AE website I could find, read every book I could get my hands on, watched DVD tutorials, took a class with my fellow Adobe Education Leaders, and even bothered contacts at Adobe for more info. It was not always a smooth journey, my friends, but along the way I came to three important conclusions:
1) AE is just as amazing as they say.
2) AE can be easy to learn – with the right approach.
3) I could have realized #2 a whole lot sooner.
Essentially, in looking back at my AE travails, I am a bit embarrassed to admit that I slowed down my own progress by forming some common AE misconceptions. So for those of you just setting out with AE (or hoping to someday), I hope the list of “5 Tips for Learning After Effects” below makes your own AE quest much easier (and possibly saves you a few months of mind-bending toil):
5 Tips for Learning After Effects
1) Know your DV basics first. As a longtime editor, I had a good understanding of DV – it was the only thing I had going for me when faced with AE, and probably the only thing that kept me going early on. Basically, if terms like 24fps, interlacing, NTSC, or compression are entirely new to you, help yourself out by visiting some useful websites that define DV terms and concepts:
For just the bare bones of DV, you can start with:
http://en.wikipedia.org/wiki/Digital_video#Technical_overview
For the hardcore user, check out the extremely thorough DV primers on Adobe’s site: http://www.adobe.com/motion/primers.html
2) Know what After Effects is (and is not) for. Think of AE as a dedicated special effects application for individual shots and short animations – and here’s the critical part – you typically perfect these shots in AE and then export them to your preferred editing application (e.g., Premiere Pro). In other words, AE is a great enhancement to (but not a replacement of) your editing software. This paradigm shift is really important, because AE is not really designed to capture footage, make a bunch of tight cuts, work with transitions, etc., as you would with a dedicated editing application. Because AE is dedicated to special effects, it is appropriately different in many respects – and truly does have a logical structure and workflow (e.g., its object-based timeline). By embracing these differences and the rationale for them, you’ll be far less likely to fall into the common trap of thinking, “Why doesn’t AE work like my editing software?”
3) Know just enough of the AE keyboard shortcuts to be dangerous – and realize that this does not mean that many. While certain shortcuts are essential to AE, most are simply there to save you from a deep dive into the pull down menus and an extra click or two. Do not feel that you need to know a hundred shortcuts to be an AE editor. By learning just 10-20 of these clever little guys, you’ll soon adapt to a new way of editing – and find yourself having a much better time. To get you going, here are 10 shortcuts that I particularly like (and that took a while to discover):
When getting started on a project:
With a new blank project, import your master video clip (a.k.a. “plate”), and drag it straight to the comp timeline. This method is often preferable to creating a new composition first because, by dragging your video file to the comp timeline, AE creates a new composition that automatically matches the chosen video clip’s duration, scale, frame rate, and pixel aspect ratio.
When making edits in the composition timeline:
Page Down moves the current time one frame forward
Page Up moves the current time one frame backward
; toggles the view to a full zoom in or out at your current time.
Ctrl + [ trims the “in” point(s) of the selected layer(s) to the current time – and as you might expect it has a twin . . .
Ctrl + ] trims the “out” point(s) of the selected layer(s) to the current time.
Ctrl + D duplicates selected layers or effects.
Ctrl + Shift + D duplicates and cuts a layer at the current time. It’s as close to a razor tool as you will find in AE (and, for the scripting-curious, there’s a small sketch of this trick at the end of these tips).
When animating/keyframing:
U shows only the keyframed properties of a selected layer.
Alt + Drag selected keyframes stretches (or squeezes) the distribution of selected keyframe groups uniformly. This can save a ton of time when retiming a complex multi-layered effect.
4) Start simple, and I mean super simple. With all that you can do in AE, it’s tempting to make your first project something colossal. So while making an HD sequel to the movie “300” (green screen and all) is certainly doable in AE, it would lead to more than a little frustration for a newcomer. (Not that I’m speaking from experience . . . ahem.) Try experimenting with a standard-definition project with a few foundational elements – 3D space, keyframing, text animation, camera moves, etc. – and you will get a much easier and more fulfilling sense of what can eventually be done on the grand scale.
5) Use the wealth of AE resources – and take a class. The incredible range of AE means that its structure has a corresponding range of complexity – which can be tricky to figure out at first. To this end, I am all for books (e.g., Adobe’s Classroom in a Book series), web tutorials (e.g., www.videocopilot.net), DVDs (Total Training), etc., but when it comes down to it, there is nothing like project-based, hands-on training. When things get confusing, there’s simply no replacing a live instructor. A live instructor can not only answer individual questions that might take hours to look up online, but also show you techniques and workflows that simply translate better in person. Moreover, if you can take an AE class that is project-based, you’ll be able to incorporate your own special effects ideas into the training – and make it far more likely that your individual needs are met.
spped.jpg
In After Effects, you can get into the exciting world of 3D, motion graphics, and special f/x pretty quickly by using the range of resources out there. Couple these resources with a class – and you are on your way.
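One last aside for the scripting-curious: After Effects can also be driven with ExtendScript, Adobe’s JavaScript dialect for scripting its applications. The snippet below is only a rough sketch of two ideas from tip 3 – building a comp that automatically matches an imported plate, and splitting a layer roughly the way Ctrl + Shift + D does. The file path is a made-up placeholder and the split point (the comp’s midpoint) is purely for demonstration, so treat this as an illustration rather than a polished tool.

// A rough ExtendScript sketch – illustration only, meant to be run inside After Effects.
app.beginUndoGroup("Plate setup and split");

// Import a plate and build a comp that matches its settings –
// the scripted cousin of dragging the clip onto the comp timeline.
// (The path below is a placeholder.)
var plate = app.project.importFile(new ImportOptions(new File("~/Desktop/plate.mov")));
var comp = app.project.items.addComp(plate.name, plate.width, plate.height,
                                     plate.pixelAspect, plate.duration, plate.frameRate);
comp.layers.add(plate);

// Split the plate layer roughly the way Ctrl + Shift + D does:
// duplicate it, then trim the original's out point and the copy's in point.
var splitTime = comp.duration / 2;   // demo split point – the comp's midpoint
var original = comp.layer(1);
var copy = original.duplicate();     // the duplicate lands above the original
original.outPoint = splitTime;
copy.inPoint = splitTime;

app.endUndoGroup();

Nothing here is required for learning AE – it is simply another way of seeing what those two moves do.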
Looking back, I certainly took the long way to get there, but I am happy to say that After Effects is now my favorite application to use – and to teach. I am excited to have clients pleased with AE results – and students creating some of the same special effects I first fell in love with on the big screen. Hopefully, with these 5 Tips for Learning After Effects, you’ll be able to learn from my initial After Shock travails and get to where you want to go with AE a whole lot sooner.
All the best,
Kevin McMahon
Video Production and Graphic Design Instructor
Bellarmine College Preparatory

2:14 PM Permalink

Student Introductions to Real World Clients

DMOC
Two to three times a year, I teach a class for the Digital Media Studies (DMS) program at the University of Denver called “Web Building and Site Management.” In this class, undergraduates are introduced to the concepts of building (mostly) static websites with a strict, standards-based approach. The IDE used is Dreamweaver CS4, with “Design View” off-limits. There is a full introduction to pre-production planning using Fireworks, and final design work is done in Photoshop. We also touch upon the Flash Platform and the integration of audio and video within a website. This probably does not differ much from most introductory website creation and management classes offered at universities across the world.
One aspect of the class that I find to be unique (and the point of this article) is that these students are assigned actual clients in groups of two or three and are tasked with providing them a completed website as a final project. I work with an organization on campus called the Digital Media Outreach Center (DMOC) to put this all together. The mission of DMOC is to provide digital media services such as website creation to “Colorado-based non-profit and not-for-profit organizations in a manner that also gives students and faculty opportunities to apply and extend curriculum-based learning to community-based projects.” For my class, DMOC finds clients and manages the student-client relationship while I teach the skills and concepts necessary to fulfill those requirements through the course.
In general, this approach is beneficial for both the students and the clients. Students receive a hands-on educational experience in both the subject matter and client relations. Clients receive a simple yet fully functional, standards-based website to promote themselves and interact with their members and clients. A few problems do occur from time to time, but most are easily resolved, and the students come away from the class with a greater level of experience than they otherwise would have.

12:56 PM Permalink

THINKING Pt. 2 – Storyboards? Sort of…

If we are helping to build a non-linear world then why do we often insist on starting at step one?
This thought came to mind because I was coincidentally privileged to be a part of two different projects that were experiencing very similar problems at exactly the same time. It was a moment in which I found myself challenging my own basic assumptions, and the results were both eye-opening and freeing.
ideas3 web.jpg
Project number one – Roxana Hadad, an AEL in Chicago, was working with the students of Greg Hodgson, an AEL in England, to teach them game creation using Flash CS3. She was looking for volunteers to assist, and I offered to cover Flash’s drawing tools. The online session was fabulous, the conversation dynamic, and the students were wonderful. The prospects for success looked solidly positive. I followed up a few weeks later and learned that the students were apparently very engaged when actually building games but were opposed to planning and storyboarding. Hold that thought.
Project number two – my own grade eleven students were planning their first full Flash animations. With lots of support and more than a little “nudging” from me, they got their so-called storyboards done. There were a few well-done boards, but for the most part they were pretty haphazard – a poor showing with very little enthusiasm for this whole planning business. Hmmm – this sounded a lot like Greg and Roxana’s experience. Two entirely different groups of high school students were behaving in very similar ways when faced with a problem – was I missing something?
As I pondered this situation, I realized a few things. First, I have years of experience doing this type of work. The students have very little experience. I not only know the software, I also know the role and importance of the planning process, the problems these projects will likely encounter, and how to solve most of those problems. In other words, I have learned, on many, many levels, all about these projects. And I did a lot of that learning by doing projects. I did not learn by reading books, watching tutorials, or listening to podcasts; I learned by doing. I freely admit that these supports were used occasionally, but for the most part I learned the most by doing, by falling flat on my face, and by starting over – a little bruised but also a little smarter. It took years to learn the required lessons. If this was my process, then why should it be all that different for our students? Add to that the element of the creative process by which we come to our initial ideas, and this situation becomes even more complicated.
Many seem to like using logic to find solutions to challenges. I have used that from time to time, but I have also had many solutions simply “appear” in my imagination, from out of nowhere, and I am not alone in this experience. If I ask my classes, there will always be many who think logically and just as many for whom answers simply appear. What I love about this is the number of students who use both techniques – logic married happily to sudden insight. Shouldn’t we teach using both techniques, combined with a lot of hands-on experiences, so that our students can slowly but surely build the skill sets they require in order to properly plan and storyboard? Does everything have to be done in the “correct order”? Would this approach work equally well with teachers who are just starting to learn this technology?
Twenty years in the ad agency business taught me the full value of planning, storyboarding, scripting and everything that goes with that. It also taught me to respect the creative process, a process that can be contradictory to the logic of planning followed by doing. Perhaps in this awkward reality there lies a better approach, an approach which tests some of my baseline assumptions about proper work flow.
What if students were required to create an initial, rough storyboard as a broad guideline for their project, followed by an intense period of hands-on building, experimenting, and risk taking? Through this process they would generally follow their game plan while also trying, and therefore learning, new techniques and ideas. Because they had to create an initial, rough plan, the teacher should be able to avoid that dreaded student comment two weeks into a project: “I don’t know what to do, draw, write, build, create…” (The comment that makes teachers crazy when it is repeated many, many times!) This is the approach I am currently trying with my grade eleven class, and the results so far have been encouraging. They are following their early ideas, for the most part, but they are also allowed to revise those plans on the fly if they discover a better idea. The hands-on process puts them to work quickly, and they discover in very short order that their first plans may have sounded great but were not as great as they had hoped. There will be gaps, problems, and issues galore, but those will surface as the work unfolds. I will offer comments, support, and critiques as they continue to put their projects together. At the conclusion of the project they will be required to do a reflective review of their project, including a fully detailed storyboard for the “better version” of their project. That storyboard will not go into production. Instead, it will serve as an articulate statement of their learning and experience. It is a backwards way to teach storyboarding, but I am hopeful that it will be a more engaging and fruitful process.
What if we were to not teach teachers, but engage teachers, when presenting this technology? What if we were to get them to use Photoshop to work on personal pictures and then turn those images into posters they create for themselves? What if we were to show them this technology can be fun and engaging as well as productive before we told them how to use it as a teaching tool? Would having fun reduce some of the barriers thrown up by fear and worry? I suggest it would. Does getting people quickly engaged overcome other barriers to learning? My experience says yes, but I freely admit to being a hands-on person, so I know my answers are skewed by this.
I know I am not alone in considering this backwards approach. The idea of backwards design, by which you first define the goal and then figure out how you are going to get there, has been around for a long time. Thinking up a solution through sudden insight and then working backwards to confirm that the solution really does everything you need it to do is yet another approach like this. If it engages the learner, prompts them to move forward on their own, to ask new and old questions, and to actively learn in a positive environment, then I am for it. Whether it puts step one first matters less and less to me as I continue to test my basic, baseline assumptions about people, learning, and school. I invite you to challenge your assumptions… it can be engaging, revealing, and freeing. Do not just cut the grass.

9:48 AM Permalink

THINKING Pt. 1 – Not Just Cutting The Grass

“If it ain’t broke, don’t fix it.”
The fellow sitting next to me clearly did not agree with my comments. We were at the Adobe Summer Institute, it was the wrap-up session, and I had asked a question that tested an underlying assumption. Apparently that was not a good thing to do. To me, this phrase and this attitude are anathema to all that is creative and insightful. Being creative frequently means testing your most basic, fundamental baseline assumptions.
lawn mower shadowed final small.jpg
The value of this testing lies in the truths that the process can reveal. For example, you might use a classroom that appears to be attentive as a positive reference point. The trouble is, you are assuming attentive = learning. Experience shows that the equation does not always work. Beware, though: doing this testing is not easy. If you test these baseline assumptions, then you may lose your reference points, and that can leave you in what I call free fall. You have released yourself from the norm, and you cannot know where you will land. If you ask these questions, you may discover that the products you create so carefully are irrelevant. You may learn, as did one of our clients, that your assumptions are blocking your view. One of our designers and I were sitting chatting with this client. He proceeded to describe a situation in his factory and then offered us three different options to solve the issue. Which of these, he asked, would we use? Here is the funny thing – neither Danny, the designer, nor I knew in advance that he was going to ask this question. We could not have rehearsed our answers, and so it was rather dramatic when we both, simultaneously, replied, “Don’t do any of these. Remove that step from the process – you don’t need it.” Two designers using design thinking and no assumptions both arrived at this interpretation. It was not what our client had expected. His deep immersion in his processes had convinced him that there were no other interpretations available. We changed that, without even meaning to, and our client learned a valuable lesson. So – how does any of this relate to cutting the grass?
It is early springtime up here in southern Ontario – time to re-find the garage. That meant finding and moving the lawn mower… and that is what started my line of thought. Looking at this simple machine, I wondered just how often we take these activities and machines for granted. Do we ever ask if there could be a better way, or do we just cut the grass? As I thought about that, I recalled one situation in which the creators of a product had definitely not just cut the grass, and Flash CS4, their product, had become much stronger because of it.
It was at the 2008 Summer Institute, and we were being shown a preview of Flash CS4. The Flash team leader was on stage demonstrating several new features. These were fabulous, powerful new features that took my breath away. The Flash team had challenged their baseline assumptions about how a piece of software works, and Flash CS4 had grown because of that. Flash CS4 is so much more accessible that teachers should now consider it, even when experiences with past versions may have suggested otherwise. But what about that idea of testing your fundamental, baseline assumptions?
I see many applications for this thinking, especially in our school settings. Teaching can never be a problem with a single answer. Our students learn in so many different ways that we must always try to reach out to them using different approaches… but are our approaches really all that different? Use this testing to see if your “different ways” really are different and to see if they actually work. The answers may require you to revisit some very basic beliefs. Read on in THINKING Pt. 2 – Storyboards? Sort of… for those answers and more.

9:39 AM Permalink
March 17, 2009

Digital Learning Environments Events Series

DLEbanner2.gif
HP and Intel are the major sponsors of the 2009 Digital Learning Environment Event Series. Adobe, Microsoft, Smart Technologies, DyKnow, KNS, and PASCO are partners in the events. These are free, interactive, one-day events taking place in Chicago, Seattle, Boston, Scottsdale, and Pittsburgh over a 7-week period starting March 19. The purpose is to provide a hands-on experience for K-12 decision makers in the area of technology integration into the curriculum. Attendees will experience state-of-the-art technology solutions in lab environments in the areas of Science, Math, Language Arts/Literacy, and professional development. The major goal is to learn how technology-rich learning environments enrich students’ learning experiences and help them achieve.
As an Adobe Education Leader with over 10 years of experience integrating Language Arts and technology, I have been asked to provide the training for these events in the area of Language Arts/Literacy. I’ve just arrived in Chicago and am excited about the first event, which will take place Thursday. I will be demonstrating how technology can enhance the language arts curriculum using Photoshop Elements and Premiere Elements, along with other hardware and software solutions like Smart Technologies’ Smartboards and Smartsync.
The Language Arts standards typically have four major components: reading, writing, listening, and speaking. The tendency in most American classrooms is to focus on paper, pencil, and listening. This is understandable, since reading and writing are heavily tested on our high-stakes standardized tests. Teaching and learning with multimedia technologies can address the often-overlooked standards of listening and speaking as well as deepen knowledge across the core curriculum. Further, technology can be the hook, the spark that draws a student’s interest into the learning process. By its nature, technology embodies “active participation.” Students learn by doing, by exploring, by creating, and in the end, their creations are authentic outcomes that are valued and can be shared.
As I journey through these five events in five cities, I will share my experiences and observations. If you want to attend one of these events, just register at:
http://www.guide2digitallearning.com/story/digital_learning_environments_2009_chicago

8:38 PM Permalink
March 6, 2009

Supporting Cell Phones in Schools

Bill Gates has been quoted as saying (before the iPhone), “The computer of the future will be the cellphone.” The implications for educators are profound and should have us re-thinking our attitudes toward, and acceptance of, cell phones in school. I am not blind to the fact that there are sometimes problems associated with cellphones in schools, but we should address those by addressing the behavior, not the object. We don’t take away a pencil the student is tapping; we address the tapping behavior.
As an administrator for highly at-risk students at a Cincinnati charter high school, I found it much easier to have students use Google SMS to look up words and definitions when they were struggling with reading than to have them use a book. Very few of these students would be caught carrying books home, but they would use their cell phones to help complete assignments.
As we look at HOW cellphones may be implemented today, we also look at Adobe and its role. Captivate lets us easily create micro-content with quizzes, saved in Flash. Flash itself lets students see, create, and engage with interactive simulations and games that can have a profound effect on learning. Many Web 2.0 sites are built in Flash and extend the capabilities of the cellphone beyond what we would have thought possible a few years ago.
The typical smartphone has camera, video, keyboard, and voice inputs. Through Web 2.0 apps, it has text (Jott), voice (Gabcast), and picture (Flickr) outputs. It starts sounding a lot like a computer, doesn’t it? Where will that lead us?
To read more, see an article I wrote for the Florida Education Leadership magazine.
http://www.homepages.dsu.edu/mgeary/vita/cell_phones.pdf
Mark Geary

2:45 PM Permalink