Frank Catalano is a prolific writer of analysis, commentary and other forms of journalism. His regular columns and contributions have appeared in a wide range of online and print publications, and he has co-authored two Dummies books on digital and web marketing.
Frank was a founding columnist for the tech news site GeekWire and, at various times, a GeekWire podcast host, contributor and interim deputy editor (2011-19).
I’ve worked from home as a remote executive or consultant for 25 years. Precautions for novel coronavirus and the disease it causes, COVID-19, are spurring others — often without another option — to work from home. Here are five brief tips for those doing it for the first time.
1) Get dressed for work. You may be tempted to wear PJs or lounge pants. Wear what you’d wear to the office. It’ll put you in the right frame of mind.
2) Communicate. Check in with your team, even briefly, at the start of each day. This is not just to let them know you’re online (a green Slack dot does that) but that you’re available to them. It doesn’t have to be lengthy. It just has to happen.
3) Take breaks. You’re still human, even if isolated. I take a brief walk or exercise every single day, and do flexibility stretching. It keeps my body and mind nimble. Do it in the middle of the day and you’ll be productive longer.
4) Set expectations. #Remotework is not 24-hour work. Physically move away from the laptop or home desk at the end of your work day, whenever that is, and let others know you’re available for emergencies, but need to wrap up. You’ll burn out less and be respected more, too.
5) Eat & hydrate healthy. It’s easy to pack on pounds if you don’t leave home. Stock up on low-calorie, high-nutrition snacks and drinks (I like baby carrots, rice cakes, bananas, apples & sparkling waters). It’ll also keep your energy levels stable.
There is nothing like going through an interview process to help crystallize your thinking about the practice of a profession. Especially if you have to look at it through the lenses of eight different people.
Recently, I went through an intense day of interviews for a strategic marketing executive position. The company is well established and known, so this was not a case where I had to provide Marketing 101 instruction. However, being grilled about your thinking by more than a half dozen intelligent pros, each with their own perspective, makes you realize the guiding principles by which you’ve worked aren’t necessarily thought about or communicated the same way by everyone.
I don’t write much about marketing—I prefer to analyze technology trends—but it’s been no secret that for most of the last three decades I’ve been primarily a marketing and business strategy exec or consultant.
So after a day of pop quizzes, here’s how I concentrate five essential marketing concepts into catch phrases to help explain them concisely to co-workers and clients.
There’s no trendy wording (I’m looking at you, “growth marketing”) or suddenly fashionable tactics masquerading as strategy (I’m looking in the mirror, co-author of the turn-of-the-century’s Internet Marketing for Dummies). But these are shorthand for what I think is good marketing that is both ethical and effective.
Good marketing is unique, believable and true.
If you want to have a positive, lasting impact, your marketing approach and messages need three qualities: they must be unique, believable and true.
Unique, in that you’re positioned in a meaningful way to customers in the marketplace and against your key competitors. Believable, in that what you claim is, or easily can be, backed up by convincing proof points. And true, in that what you say about yourself and others is factually true. This last is more than a critical ethical consideration: If you want repeat business, you can’t mislead customers.
You don’t get to choose one or two out of three. Unique and true don’t work well if what you claim as unique can’t be believed. Believable and true in combination can be replicated by competitors. Unique and believable alone… well, they underlie many scams.
Unique, believable and true. Use them as touchstones as you develop marketing strategies and messaging.
Marketing is applied psychology.
This often gets overlooked by quants fixated on manipulating data. Marketing is a human activity. It is about convincing thinking beings to make a choice that you, representing the organization, would like them to make.
Yes, data can steer you to the preferences of groups as a whole. But taking the next step in a process in your firm’s direction requires a decision on the part of a human being. Marketing can’t just herd. It must persuade.
Marketing is not a democracy.
Some are startled when I say this; others knowingly smile. Inside a company, good marketing decisions are not made by broad consensus or by focus groups or by an internal poll. As a whole, non-marketers tend to approve of approaches that are similar to what they already know, or what they subconsciously feel is “safe.”
Put another way: Just because everyone has appreciated a good ad doesn’t mean they understand the background and tradeoffs it took to create it. It’s similar to the fiction that just because everyone has gone to school, they know how to “fix” education problems.
I’ve seen companies pick internally popular taglines or product names so close to those of competitors that they’ve had to change them later—or deal with a legal challenge. In other cases, the result is a game of follow-the-leader, in which the company winds up mimicking a competitor who originated a brilliant idea. The follower will always be in the leader’s shadow.
Yes, a marketing executive absolutely should solicit and listen to input from a wide variety of internal and external sources, depending on the issue. We all have blind spots. Then that leader, or a very small team of knowledgeable execs, needs to make the final decision. But vote? It almost ensures marketing mediocrity.
Marketing is not a single tactic.
One of my pet peeves is the startup that claims, “We’re really successful and we do no marketing.” Wrong. You do marketing. What it usually means is that you do no paid advertising.
Marketing isn’t a single arrow. It’s a quiver. I often get into conversations, even with other marketers, in which it’s clear “marketing” is only defined by what the other knows. Paid display or search engine ads. Direct mail or email. Public relations. Social media outreach. Trade shows and events. Brochures and flyers.
Not all arrows are appropriate for every company’s target. But if someone claims they are successful and do no marketing, I first ask if—at a minimum—they encourage word of mouth recommendations from their customers to then promote to prospects. That, and any other tactic to influence purchase behavior, is indeed marketing.
By the time you’re bored, customers are finally aware.
Good marketing, especially good branding, requires consistency and persistence. You want to ensure efforts across types of tactics leverage each other, and are in place long enough that they break into prospect consciousness.
However, that takes time and discipline. If you’re inside a company, you’ve likely heard a new slogan or seen a campaign so often that you’ve tired of it. I’ve repeatedly seen execs who want to “freshen” marketing approaches that aren’t old. They’re just overexposed to them internally. Unneeded changes are then made, and the campaign doesn’t work because the customer awareness timer has been reset to zero.
When I was a chief marketing officer, a metric commonly accepted among my peers was that company or product rebranding takes about two years for customers to finally “get.” I got that.
Why? Because, very early in my career, I was a Top 40 radio disc jockey and we had to play the current number one song at least once per hour. By the time we were sick of hearing it in heavy rotation, our listeners began to call in and request it nonstop. (This was especially a problem if you were a young DJ when “Disco Duck” was a hit. I was.)
This doesn’t mean a bad marketing or branding effort shouldn’t be killed if it isn’t working. But organizations shouldn’t confuse internal over-familiarity with external failure. Customers are busy and bombarded with other messages. Often, they just need a reasonable amount of time to catch up and latch on.
So these five are my handful of interview insights. The day-long exercise reinforced, for me, that good marketing is accurate information about a product or service presented in such a way (yes: unique, believable and true) that it gets a potential customer to take action.
Of course if something is crap, none of this will help. Because heavy promotion followed by lots of negative word of mouth from dissatisfied customers can bulldoze any marketing tactic the company thinks it might control. As an executive of an early client of mine observed, “Nothing will kill a bad product faster than good marketing.”
And that is a marketing mantra that will outlive all others.
In 1994, I got a call from an editor I knew at a Seattle-area newsweekly. Computers for personal use—and the companies that made them possible—were getting a lot of attention due to this newly accessible Internet. I’d been a full-time journalist and now worked in tech. Would I be interested in writing a snappy regular column explaining computer industry developments to mere mortals?
Sure, I said. It needs a name, he said. I first suggested “Dispatches from the Digital Frontier.” And then I offhandedly added, “Or you could just call it ‘Byte Me.’”
He eventually stopped laughing. That set the tone for more than 200 weekly “Byte Me” columns that ran in Eastsideweek and later Seattle Weekly through 1998, starting with one titled, “The Internet as Goat Trail.” It bemoaned the difficulties of using this largely free and open so-called “information superhighway” while still noting it “sparks the imagination of what could be.”
After the Seattle Weekly column ended came a several-times-per-week on-camera tech analyst slot at KCPQ-TV Seattle with a column on its website, then columns or regular contributions for Puget Sound Business Journal, TechFlash, NPR/KQED’s MindShift and finally almost eight-year-long stints with both GeekWire and EdSurge.
My current work with GeekWire and EdSurge wrapped up earlier this year. That’s led me to reflect on two-and-a-half decades of a side career analyzing and writing about technology developments and the industry for a general audience (after reporting on the original personal computer boom when I was a full-time broadcast news reporter).
Yes, a lot has changed from a technology standpoint. What was nerdy fringe is now geeky mainstream. What was desk-bound and disconnected is now portable and persistently linked. What was an annoying buzzword…is still that.
Yet as the major technology hype has shifted from personal computing and the internet to mobile devices, I’ve discovered that three universal truths endure when it comes to explaining tech to a general audience.
Under-the-hood interest is limited—but it is there. People do want to understand why a new technology, product or service is useful. But they do not want to be buried in technical detail.
Sometimes stories about what’s new go so deep in the weeds of the tech underpinnings (I’m looking at you, blockchain) there is no clear tie back to the practical benefits.
That doesn’t mean a writer should stick to describing what’s cool on the surface. Cool doesn’t always mean useful. There are many examples of “cool” products and services that required such contortions in customer behavior that much of the benefit was lost. This goes way back, and includes everything from early Timex Datalink smart watches to the Radio Shack precursor of today’s laptop computers. I owned both.
So surface or weeds alone don’t serve the non-expert, tech-interested audience. There needs to be enough factual foundation for understanding and a clear connection as to how that tech makes it possible for the useful stuff to be delivered.
It’s easiest to focus on the giddy change-the-world surface. It is oddly also easy (for the nerdy) to give an expert tutorial on the tech. What is hard is figuring out just how much of the latter is required to help people understand the former without going too far in either direction. For the good explainer/educator, it’s a constant challenge.
The lure of the bright-shiny is eternal. Similarly, people want to hear about what’s new and exciting in tech. But they don’t want to be served a diet of empty-calorie hype.
This is when context and comparison help. Sure, that foldable smartphone or virtual currency is really neat. But what will it replace, or will it simply be another service or gadget that requires care and feeding? What else is needed to make it work well? What behaviors (see above) will have to change for it to succeed?
Often, unfortunately, the cool overwhelms the useful in gushing coverage of new digital developments. Sometimes, the stories are only about the wonderful assumed benefits of new tech with zero explanation as to the how. While the companies may like that, it’s a disservice to anyone truly curious about what new tech may mean.
Writing forces understanding. Good tech writing helps two audiences: the reader and the writer. I’ve found this to be personally the most valuable part of what I’ve done as a columnist and contributor over the past two-and-a-half decades.
It’s very hard to accurately explain a development in part of the tech industry without understanding it first yourself. Of course I can write a concise tweet based on something I’ve quickly read, or pen a brief story on specific spot news. But to fully explain a development to others in a column or other analysis requires digging in, excavating the most important parts and then building a coherent narrative upon that foundation.
This kind of writing is a wonderful discipline. It also gives me permission to dive into topics that seem interesting but that I’d likely never explore unless I had to write about them.
Twenty-five years ago, people needed to be educated about digital technology before they were even interested in using it. Today, the interest is already there, but the need for education and explanation remains. And it may be that tech has so overtaken our lives that—from personal data collection to remote social interaction—it’s the tech itself that now has the explaining to do.
Seventy pages of data is a lot of data. But that’s the fascinating substance of “The Common Sense Census: Media Use by Tweens and Teens,” a new report from the nonprofit organization Common Sense. I’ve summarized the findings about digital homework for EdSurge.
But there is so much more.
It is, as far as the report authors themselves know, “the only nationally representative survey tracking media use patterns among a truly random sample of U.S. 8- to 18-year-olds.” The 2019 report also follows up on a similar 2015 report, so trends can be spotted.
A few observations that didn’t make it into my EdSurge summary:
Smartphone ownership is way up, and at earlier ages. From age 11 onward, a majority of kids own a smartphone in 2019. In 2015, the 50 percent mark wasn’t reached until age 13.
Kids’ use of screens doesn’t mean a lot of them are content creators. Screen time largely means game play, social media, or passive video consumption. Only about 2 to 3 percent of kids’ screen time is spent creating their own writing, art or music.
Smartphones aren’t being used more for homework. Despite their near-ubiquity, overall time spent on smartphones for doing schoolwork at home is essentially flat between 2015 and 2019 while computer use time has increased. However, those in lower-income families spend more time than those from higher-income homes doing homework on smartphones (this last noted in my EdSurge piece).
There is much more in the report itself, including what kids think about multitasking with entertainment media while doing homework. I recommend browsing the full Common Sense report.
I’ve written and tweeted about Minecraft’s prevalence in education, a presence potentially more far-reaching than its huge popularity as entertainment. So it was with that perspective I walked into the new “Minecraft: The Exhibition” at the Museum of Popular Culture (MoPOP) in Seattle.
Not everything I saw or heard at MoPOP made it into that EdSurge piece, though. That included some additional salient observations by Brooks Peck, co-curator of the exhibit and MoPOP senior curator. For gameplay as well as education, Peck finds Minecraft has “unlimited possibilities.”
“Once people get the hang of it, they really want to push the boundaries. They’re like, ‘How can I mess around with this? What happens if I do this?'” Peck told me. “So there’s a ton of experimentation that goes on. I see that every day in the gallery where I’ve set up these little experiences for people and they ignore them completely and just mess around. But you can’t break it. So that’s great.”
While there’s a lot that makes the digital tangible to fans, the exhibition also helps non-players — like parents and me — understand the appeal and reach of Minecraft. Think of it as Minecraft 101.
For example, there’s a physical crafting table for making things with natural resources that reproduces the virtual one inside Minecraft’s world. Yes, it’s cool for knowledgeable players. But it also serves an educational purpose.
“This is totally an opportunity for kids who are players,” Peck said. “It gives them the chance to be the expert, and they can sit down and show their parents or whomever what they know. They love that. They get so few opportunities to do that, to be the one in the know.”
That was pretty obvious when I visited the exhibition. Even though it was midday on an October Monday, there were lots of kids and parents, and the former led the latter.
Sure, there are those who may wonder about the real-life benefits of playing in a world where you’re manipulating meter-high blocks made of pixels. Yet Peck sees a different kind of learning going on.
“Among other things Minecraft has a really interesting connection to the real world in this idea: You live in the world you make,” Peck said. “That’s a literal fact in Minecraft. You make your house; you might landscape it; you live in this world that you built.”
If there’s a place for conspicuous tech consumption in education, it’s ISTE. The annual event — named for the association that runs it, the International Society for Technology in Education — is the world’s second-largest edtech conference and trade show, behind Bett in London.
This year’s ISTE was in Philadelphia, drawing more than 20,000 participants. If you’re in or interested in K-12 edtech and didn’t get a chance to walk the ISTE exhibit hall, here are five easily digestible observations. In, of course, tweet form.
I’d hoped to write this as an essay, but time likely won’t permit. So here’s a Twitter thread you might call ‘5 Exhibit Hall Trends You May Have Missed at #ISTE19.’ From which #edtech vendors didn’t show up on the exhibit floor, to the delay of the STEMpocalypse. Illustrated. pic.twitter.com/DDbofMCPQj
1: Size, or showing up, no longer matters. McGraw-Hill, Blackboard and others did not exhibit. Renaissance and still more reduced their booth size. Companies seem to finally realize that with lots of teachers looking for free stuff, this is an awareness show, not a lead gen show. pic.twitter.com/1k7kGP1mMW
2: Crowds indicate nothing, unless it’s not free. Packed booths for free tools are easy. Want to know what’s really trending? Look for packed booths for fully paid products. (@LEGO_Education was one.) Side note: there were also fewer nice, free promotional items this year. pic.twitter.com/MADwYjhtyT
No, these are not scattered arrangements of alphabet blocks from an especially precocious early learning classroom. They’re abbreviations for major conferences for companies in the education industry.
I’ve been attending, and occasionally speaking at, these events and their predecessors for more than two decades. What sets them apart from purely teacher-focused conferences is that industry players aren’t viewed mainly as vendors who pony up for a sponsorship or exhibit booth. Instead, in nearly every case, they’re primary conference attendees.
The last time I took a detailed look at the industry conference landscape for EdSurge was in 2014. That was early in the days of low-cost cloud computing enabling a multitude of new edtech startups, really fast internet beginning its spread to schools, and Chromebooks starting their climb to ascendance in U.S. K-12 classrooms as a 1:1 mobile device.
Five years later, these developments didn’t just upend the classroom. They have upended the industry and how it’s reflected in the events that cater to industry companies, investors, and executives at every level from preschool to workforce learning.
If you work inside, or are deeply interested in, the industry, here’s a new assessment of a handful of the most prominent U.S. events. Yes, ISTE, EDUCAUSE, CoSN and other educator-focused conferences remain vitally important to the industry, too, and have gone through some of their own changes (for example, the “F” in FETC no longer means “Florida” but “Future of” Education Technology Conference). There are also other highly specialized and invitation-only industry conferences. But the general, high-profile events that follow are where more industry deals are done and difficult business decisions are discussed.
Here’s how the alphabet blocks are currently stacked: who’s on top, who’s fallen off the pile, and why you might want to play. Or at least go to watch the players.
ASU GSV Summit
If there was any doubt, this year’s event put it to rest: ASU GSV Summit is the must-attend conference for education technology investment, with business deals and policy issues buttressing the financial focus. Held in early April, ASU GSV’s 10th annual event in 2019 drew an estimated 4,900 people to San Diego with celebrity and political headliners providing a star-studded umbrella under which rainmakers prospered.
Its growth has been stunning. From 240 attendees a decade ago when first held by Arizona State University and the investment firm GSV (Global Silicon Valley) Advisors, ASU GSV Summit blew past the 2,000-attendee mark in 2014, outgrew its Scottsdale home, and now is bursting at the seams in San Diego. The event tweeted that it was “sold out” a week before it began this year and started a waiting list. And it’s not exactly the lowest-priced conference option: Last-minute walk-up registration was a few bucks under $3,200.
Even if you didn’t attend keynotes featuring Tony Blair, Common, and—yes—Sesame Street, the programming provided measured insights on the state of the industry, ranging from “preK to Gray.” Everyone from large-company CEOs and major investors to startups and nonprofit institutions attends. I found that if you simply stood in a hallway, it was virtually impossible to not see someone you knew (or wanted to know) pass by within five minutes. It’s sessile serendipity.
Deborah Quazzo, co-founder and managing partner of the Summit and managing partner of the GSV AcceleraTE Fund, credits much of the event’s popularity to how it covers learning at every level, from early childhood up. “I do think the overall market has moved to a position of seeing the critical integration of education and talent/workforce innovation,” Quazzo said. “I believe ASU GSV was ahead of the curve on that position and we are thrilled with its embrace.”
Draw: Raw financial horsepower and speaker star power.
Difference: GSV’s deep understanding of the financial end of “preK-Gray” learning.
Go: If you want to do a deal, raise money, or simply understand the current investment-and-education landscape, this is the one conference to attend.
Next: ASU GSV Summit, San Diego, March 30 – April 1, 2020.
As part of a 30-year history of SXSW conferences, SXSW EDU is one of the younger siblings in the South by Southwest family. But it’s grown from its birth in 2011, and gotten a little, well, more festive, as befits its family ties.
Like longer-running SXSW conferences and festivals, SXSW EDU’s home is Austin, Texas, and it’s held in March. It precedes the more famous interactive and music events and is a bit more sane, at least in terms of size and late-night public behavior. (In the past, the city didn’t start shrink-wrapping parking meters and telephone poles to protect them until EDU was well underway.)
Yet SXSW EDU has tilted more toward a global teaching and learning festival as it has grown, even though it still attracts a healthy number of industry and government attendees from K-12 up through higher ed: about half educators, based on a breakdown of 2018 stats, and a third business and industry. In 2019, there were 8,300 registered for the conference, slightly up from 2018 but significantly up from the 6,000 who attended in 2014.
As EDU has aligned itself more closely with other South-by events over the past few years, conference keynotes, panels and workshops have been joined by new programming such as a hands-on Playground, and a free one-day education expo that’s open to the public and adds thousands more beyond the conference total.
For the industry, it’s less a setting for doing business and more a place to see what educators are excited about without the us-them distinctions of an exhibit hall, mingling in an atmosphere that encourages different types of attendees to mix. Much networking gets done at the many meetups, receptions, and parties.
“SXSW EDU’s mission is to advance teaching and learning and we work to assemble as broad an array of stakeholders as possible, believing the more diverse the community the more impactful the conversations,” said Ron Reed, SXSW EDU’s founder and executive producer, and director of emerging events for South by Southwest. When EDU was launched, Reed told me, it was one of the few education events not tied to a specific association or membership group. He sees the event landscape continuing to evolve. “We believe it is a reflection of education becoming an increasingly interdisciplinary conversation,” he said.
Draw: Festival! And the SXSW reputation for informative and interesting events.
Difference: Educators, industry, entrepreneurs, nonprofits and wonks attend as equals in large numbers.
Go: To see what’s exciting educators without the restrictions of an exhibit hall, or feeling as though industry types are intruding into a sacred educator-reserved space.
Next: SXSW EDU, Austin, March 9-12, 2020.
SIIA Ed Tech Industry Conference
If anyone is playing the long game in edtech industry conferences, it’s the Software and Information Industry Association. The SIIA Ed Tech Industry Conference may be part of a larger “code and content” trade association, but it’s an organization in which the education division—technically, its Education Technology Industry Network—has traditionally been very strong.
SIIA’s long-running annual conference was once the only industry education conference with a technology emphasis. It’s undergone a few modest identity changes. The last time I wrote about it, the name had changed to “Education Industry Summit” from “Ed Tech Industry Summit.” But it’s generally been held in San Francisco, most recently every June. Attendance numbers more in the hundreds than the thousands, owing to SIIA’s tight focus on edtech industry companies.
What’s remained consistent—and is being emphasized even more this year—is that SIIA is a place where the industry, from K-12 through higher education, discusses the hard issues of building and maintaining an education technology business. It also has an established innovation showcase for startups, popular one-to-one business connection meetings, and the annual education CODiE Awards, which are now in their 34th year.
Jill Abbott, the new senior vice president and managing director for SIIA’s ETIN, says she’s seen “a larger focus on start-ups in the event landscape” over the past five years, as well as a more thoughtful approach to programming. “Questions such as, ‘What are the educational trends?’ (and) ‘How can we help companies evolve their business model?’ are now being addressed,” she said.
As to where SIIA fits in, Abbott said: “SIIA views its role to focus on essential problems or questions that we need to address as an industry.” That includes making sure there’s a “call to action” in its conference programming, beyond exchanging ideas and networking. “Providing a call to action—whether it’s for diversity, equity, and inclusion, new marketing approaches, or how AI is impacting your business—brings the conversation outside of the event and into the organization,” she said.
Draw: Peer-to-peer conversations about the business of edtech, plus the CODiE Awards celebration marking the edtech industry’s highest honors.
Difference: Long-standing association event with a tight focus on industry and company needs.
Go: To discuss hard business issues, learn from what other companies have done, and connect.
Next: SIIA Ed Tech Industry Conference and CODiE Awards, San Francisco, June 10-12, 2019.
[UPDATE 6/17/19: The dates for the 2020 SIIA Ed Tech Industry Conference have been set, May 18-20, 2020, in San Francisco.]
EdNET, Content in Context and others
No industry-focused conference I highlighted five years ago survived completely unchanged. Some didn’t survive at all.
In 2014, Content in Context was a thriving conference for educational content publishers going digital, drawing hundreds to D.C. each June. Formerly called the Association of Educational Publishers (AEP) Summit, the event name changed to CiC and AEP was acquired by the Association of American Publishers. But its 2017 event, after a move to Philadelphia, was CiC’s last gasp.
I reached out to an AAP spokesperson who said the organization had recently hired a new leader for education, and anticipates future events that might be open to members and nonmembers alike.
Aging was somewhat kinder to EdNET: at least the event name survived. But both emphasis and length morphed. What had been the longest-running and broadest-based standalone K-12 industry conference—a fixture for three days each September that drew hundreds—also wrapped up in 2017. MDR, the marketing data and services company which ran it, moved to a new format in 2018. EdNET now is a set of three smaller one-day regional events across the country, with a narrower education marketing focus and fewer than 100 attendees each.
That cellular division and regeneration appears to have worked for MDR. Kristina James, who’s responsible for the EdNET events, said two of its three 2018 events had waitlists. Part of that success, she said, was in “minimizing the time and financial commitment necessary to attend and presenting content that was localized for attendees.” This year, EdNET plans to host similar one-day events in Boston, New York City and San Diego.
Industry reflected in a not-so-funhouse mirror
So you may now be thinking: What was it about 2017?
Here’s where it becomes clearer that the traditional edtech industry reached a Titanic tipping point of sorts: maybe not exactly in 2017 when certain high-profile events ended, but during the past five years. The conferences reflect that upending in three different ways (and it’s not simply that ASU GSV has sucked all the money people out of the industry conference room).
Consolidation. Not only is “edtech” no longer separate from “education,” but K-12, higher education, and workforce learning are less distinct and moving to more of a seamless continuum. (I keep returning to Deborah Quazzo’s delightful term “preK-Gray.”) Some individual industry-focused conferences may no longer make sense when companies are trying to cross boundaries and business models.
Delivery. The double-edged sword of better internet bandwidth enabling more education technology uptake is that people can now get the equivalent of live conference panels and keynotes in high-def, streamed remotely and sometimes in real time. As MDR’s Kristina James noted, “The education industry event landscape has really evolved over the past few years with more people connecting via webinars.” So the better conferences are evolving: fewer rote presentations and more face-to-face time, rapid-fire discussions, and unique experiences.
Generations. As an Exec of a Certain Age, this might ring more true to me than to others. But many startups that have flooded the industry over the past few years may prefer real-time remote over in-person interaction. That means attending fewer events, while making sure the ones they do attend have the most bang for the buck. In addition, the driving forces behind the longest-running edtech industry events are getting older; the analysts and organizers that put on the original-format EdNET for many years, for example, have retired. Newer blood has new approaches.
When you make plans to attend education technology industry events, remember that the conferences themselves can change as much as the industry does. As they say in the finance world—and perhaps more frequently at ASU GSV—past performance is no guarantee of future results. Or of edtech success.
Three years from now, 15-year-old high school sophomores are going to be college freshmen. And their expectations about the tech that surrounds them in 2022 will have been shaped by both what they experienced in school as K-12 students and outside of school as teenaged consumers.
At CAMEX 2019 in San Antonio, held by the National Association of College Stores, I explored what that combined expectation of edtech and consumer tech exposure might mean. While the slides of my thought leadership session by themselves aren’t that useful without narration or detailed notes (I favor lots of images with any vivid words coming from me, not crowded bullet points), I did summarize my trends take in a series of a dozen tweets. Of course.
I gave an hour-long original talk at #CAMEXshow on what 15-yr-old high school students today are going to expect of tech as incoming college freshmen in 2022, based on their teen #edtech & consumer tech experience. I’m going to try and summarize in a tweetstorm with images. 1/12 pic.twitter.com/xXhWcqPZuO
First, teen tech landscape. It’s a cloudscape. Since iPads & Chromebooks and more pervasive campus broadband, the decade-long lag from consumer tech to #edtech adoption has compressed to 2-3 years. That means a more seamless tech life, but also more danger of schools buying fads. pic.twitter.com/AuGPxPmW3z
Teens also are more savvy of how tech is affecting their behavior this decade. @CommonSense found 89% of teens have a smartphone. But @PewResearch found 54% of teens also think they spend too much time on smartphones. And nearly a third say it distracts them in class. pic.twitter.com/qX1LhUiAYE
So what K12 tech trends now will persist in 3 years, to 2022, setting teen expectations? Early school experiences with AR/MR/VR (extended reality), mostly in computer labs. Coding leading to a fascination with robots. Dominance of cheap Chromebooks and Windows devices in school. pic.twitter.com/fAmnhVm96e
Now to teen consumer tech expectations. Big story here (or hear) is #smartspeakers. Adoption is huge, rapidly leading to virtual assistants with screens. Some hotels and higher ed dorms are already making smart speakers standard. This is a key 3-year teen expectation trend. pic.twitter.com/YdWWXN0nGx
Other key consumer tech trends teens will internalize: Smart homes. Internet of things devices (standalone). More advanced wearables like smartwatches and fitness trackers, and wireless headphones (Apple shines in both, and revenue is increasing faster than unit sales). pic.twitter.com/nnXWSBJeMV
Final top teen consumer tech trend is being social through deeply engaging tech. Extended reality through group AR/VR experiences and arcades, and eSports as competitors or audience. Both are together-yet-isolated teen tech trends that will see strong growth by 2022. pic.twitter.com/R1oXom7KEH
Teens, as consumers coming into college, will expect tech everywhere in 2022, but will want it to integrate well with physical life and (my take, based on some of the behavioral research) not simply a case of losing themselves in screens. pic.twitter.com/VHDDTVbATx
Combining K-12 #edtech trends with consumer tech trends affecting today’s 15-yr-old: In 2022 smart speakers will turn out to be transitional, XR will be common for group/social interaction until hurdles are overcome, and Apple may be hurt by the lack of edu exposure it once had. pic.twitter.com/WTjSk28lAO
Key issues for teens over the next three years in tech products? Expectations of “free” software, potential for a major privacy breach and attendant backlash, and increasing concerns by teens themselves of having too much screen time. pic.twitter.com/qweeXpD6qk
I’ve done my best to summarize an hour of @CAMEXshow detail. But it may give you something to think about as you consider how today’s in-school K-12 #edtech trends and out-of-school consumer tech trends are shaping the expectations of a 15-year-old between now and 2022.
Many people don’t have a clue how journalism works. Journalists may have less access to events and their newsmakers than the general public. All this for a career choice that has limited job options.
Those are the headlines from my recent temporary return to full-time journalism after a several-decade hiatus. The full story I lived through as a fact-chasing Rip Van Winkle is more nuanced. Yet dramatic cuts in journalists’ ranks and an apparent increase in attempts to control what’s produced not only make the work more challenging; together, they may undermine the good journalism the public gets, especially at the local level.
In 2018, I decided to step up my journalism game. After leaving an executive position in education technology at the end of 2017 following a corporate ownership change, I took the next year to rediscover my reporting chops. I shifted from long-time contributor to the tech news site GeekWire to regular columnist and then, for an intense four-month period at the end of the year, filled in as GeekWire’s interim deputy editor. All on a freelance basis.
It was eye opening.
It wasn’t that I was taking a financial risk. Much like “gentlemen farmers” of an earlier era who made their living elsewhere, I was a “gentleman journalist.” I expected to be paid — this was a profession, after all — but I didn’t expect to have to live only off of that income.
I discovered much has changed since I left full-time journalism 30 years ago, back then as part of a good-sized, all-news radio station newsroom in Seattle. The rise of digital was the least of it. The bigger shift was in the public’s apparent understanding, even appreciation, of journalism, coupled with a precipitous decline in the number of professionals in the craft since the turn of the century.
As earlier in my career, I thought I could do some good. At the very least, I knew I could explain the inside workings of tech to those outside, or give those in the industry a different perspective.
But I wound up getting that revised perspective, too. My top three takeaways:
Credentials limit access as much as they grant it.
Generally, an event issues credentials to members of the press to spur coverage. The implicit bargain is that the event will waive admittance fees or criteria in exchange for exposure — good, bad, or neutral — as long as those being credentialed really do represent the news media.
Yes, there’s an element of control here: the event gets to decide who to credential. But reporters usually get access at least on par with regular attendees.
That was my experience for many years as a freelance columnist. But when I dove in deeply in 2018, I witnessed a shift toward control at the expense of access.
There was the Amazon Web Services booth at a major education technology conference where staff were freely talking with anyone who walked up, including me, until one marketing employee glanced at my badge and immediately clammed up.
When I asked why, she said, “I don’t know if I should be talking to you.” I mentioned I was just looking for information she’d share with any attendee (and which she had just shared with the person who had been in the booth before me). She inverted the Amazon smile into a frown, and walked away.
At other technology trade shows, where a few years ago exhibitors would have pulled someone wearing a media badge into their booth to pitch their product, company representatives shied away. At one, I finally flipped my badge over so the “media” wasn’t visible; at another, I replaced my press badge with a regular attendee badge. Both approaches worked better to get, again, public information.
Then there was that instance at an otherwise-excellent and well-run major edtech conference where I was barred from a keynote simply for wearing the press badge it had issued.
Control has always been part of credentialing press. But the negative aspects seem more pronounced now. News media badges prevent conversations and observations that normally would occur with no problem — even when anyone with any attendee badge could quickly “cover” an event on social media or a blog.
My takeaway: If you want the real experience and full access, register as an attendee, unless it’s truly a limited-access event that you can only get into with press credentials.
Journalism as a career is in trouble.
Briefly in 2018, I considered returning permanently to writing and journalism. Sure, I’d heard that pursuing a “traditional” news or writing career was hard now, but I wasn’t aware of how bad the situation was.
It’s really, really bad.
First, there’s the number of jobs. While specialty digital news organizations like GeekWire are growing, reporting positions overall are in decline. In mid-2018, Pew Research Center released its analysis of federal job stats.
The analysis finds that from 2008 to 2017, newsroom employment in the U.S. dropped from about 114,000 to 88,000, a loss of some 26,000 jobs. Newspapers were hit the hardest. The only significant increase in employment was seen in “digital-native” news organizations, nowhere near enough in number to make up for the decline.
A separate Pew analysis found about a third of large U.S. newspapers and digital-native news outlets have seen layoffs between 2017 and 2018.
These cuts and outright news organization failures have led some observers to fear a growing number of “local news deserts,” where there are no daily local news outlets at all.
Then there’s pay. Despite some politicians’ claims, no one gets rich in journalism unless you’re one of those rarefied celebrity news figures. That was true in the 1980s, and seems more true today.
And freelance? Never mind. A recent Authors Guild survey, the largest U.S. survey of published authors ever, found the median income of published writers in 2018 was $6,080, down from $10,500 in 2009. This includes book authors.
Part of the blame lies in how digital platforms like Google and Facebook have upended advertising that news organizations used to rely on to pay staff and other bills. Another lies in the lure of “free” news pulled together from various sources by aggregators, giving those who don’t want to pay for a subscription a no-cost alternative.
Together, the takeaway is that it’s harder than the last time I worked in a newsroom to make a living as a full-time journalist or writer — if you can find a job.
People don’t understand how journalists work.
Perhaps the most troubling of the three takeaways is that much of the general public doesn’t seem to understand what journalists do and how they do it. That’s anathema to independent journalism’s role in a democracy: providing good information and holding the powerful accountable.
Let me be very clear about this: As long as there is a US Constitution and a First Amendment, you can't stop #journalism. Journalists have NEVER been well-loved. Those in power historically don't care for independent messengers. But if you want democracy, you need accountability.
I received politely haranguing calls from public relations people asking me to “re-frame” an already published story — not because any facts were wrong, but because it didn’t match the company’s preferred slant. (I smiled.)
Yes, there are still many good public relations practitioners who realize where their jobs end and the journalists’ begin. Still, even wearing a marketer’s hat, I was surprised by the barrage. It must work with some writers, because it happened frequently. (To be clear: It didn’t work at GeekWire.)
I am still amazed by the number of PR people who send me details for @GeekWire they say are under embargo, without asking first if I'll honor an embargo.
Without advance agreement, there is no embargo.
Note I said PR 'people.' Not 'professionals.' PR professionals know better.
All of this appears far more blatant and — dare I say it — clueless than it was three decades ago.
Plus, there’s the issue of trust in the news media by the general public, which Gallup shows is lower than it was 30 years ago.
Maybe it’s because three decades ago, memories of Watergate and journalists’ key role in exposing a presidential coverup were still fresh. Reporters were celebrated in popular culture in films like Broadcast News, All the President’s Men, and The Killing Fields. When we had more local journalists, we more likely knew someone who was a reporter and better understood what they did.
Or, perhaps, maybe today some journalists are so overworked, underpaid, and fearful for their jobs it’s considered easier to push them and see what happens.
Whatever the reason, the lack of public understanding is a bad thing. Being directly on the receiving end of it didn’t make it better. Even if I’m just a sample of one.
After a year of increased intensity, I have a better appreciation for those who choose to be journalists in the current news environment. It’s more of a gutsy choice than when I practiced journalism full-time until the late 1980s, and very different from what I’ve experienced as an external columnist and contributor to various news outlets over the past 25 years.
Sometimes, you have to be inside to realize how much the view from outside diverges from reality.
I’ll keep writing — I can’t not write — and submit that writing to GeekWire and other outlets as I do other work. I’ll continue to support credible for-profit and nonprofit news organizations with my subscription and donation dollars. I’ll proudly stay a supporting member of the Society of Professional Journalists (anyone can join).
At the same time, I’m more aware that getting the occasional benefits from a farm is far different from planting and working the fields every day. To be more than a gentleman farmer, you have to be willing to regularly rake the muck. The same is true of being a real journalist.
(My personal thanks to the professional team at GeekWire, which has allowed me to work with them and contribute since the site’s 2011 start.)