Sunday, April 13, 2014

Doctoral Dissertation Defense

Some friends and colleagues have followed along in blog posts and social media as I pursued an Ed.D. (a doctorate in education) at the University of Pennsylvania over the course of the last two years. I've now completed the journey, successfully defending last week, and I've decided to use two blog posts to share what the defense experience was like. In this first installment, I'll post my opening remarks. 
 ____________________________

Introductory remarks by Deke Kassabian on Wednesday April 9th, 2014, 
at the defense of his dissertation entitled 
"MASSIVE OPEN ONLINE COURSES (MOOCS) AT ELITE, EARLY-ADOPTER UNIVERSITIES: GOALS, PROGRESS, AND VALUE PROPOSITION"

Thanks to all in attendance today. It’s early in the morning and I appreciate the support and the effort it took. Most of all, thanks to the members of my committee for their guidance, helping me to arrive at this moment.
My story starts not in 2008 at the birth of Massive Open Online Courses, but in 2011, as MOOCs burst onto the higher education scene. When Stanford University opened two of their Artificial Intelligence courses to the public, over the Internet and at no charge, more than 100,000 people showed up for each course. Leaders across higher ed took notice.
During 2012, which the New York Times called “the year of the MOOC,” Massive Open Online Courses went from being virtually unknown to among the hottest topics in higher education. 2012 was the year when higher education insiders formed three major MOOC platform and distribution companies: Udacity, Coursera, and edX. Soon after, MOOCs began to appear from some of the top universities in the country. Since MOOCs first came to public attention, more than 800 open online courses have been made available through the three largest MOOC providers, featuring faculty and course content from more than 200 of the best-known universities in the world. Many millions of students have by now taken these courses. The numbers of courses, universities, and students all continue to grow.
MOOCs quickly became the subject of hyperbolic claims ranging from how they would “save” higher education through improvements in student throughput and cost efficiency, to how they would “doom” higher education through the casualization of the faculty labor force or even lead to the closing of universities crushed under disruption and unbundling effects previously experienced in a range of industries when the Internet profoundly affected their business models.
Through those early days, MOOCs appeared as feature stories in the higher education and popular press almost daily, while only slowly becoming the topic of research papers in scholarly journals.
Some of the early MOOC evangelists described the potential for MOOCs to help with higher education’s biggest challenges. MOOCs clearly make education more available to more people. Whether MOOCs also help with cost control by scaling up some classes, or with completion by leveraging MOOCs for advanced placement or other credit, is a matter that continues to be debated. Interestingly, those latter challenges do not appear to be the focus of the early-adopter universities. As I will describe shortly, they have other goals.
If enthusiasm for MOOCs in the early days was high, MOOC skepticism has grown just as rapidly. Past large-scale online education efforts have failed, so it is reasonable to wonder whether MOOCs will turn out to be merely a higher education fad that will fade away.
Certainly, recent reports of low completion rates have shifted the MOOC narrative. In the parlance of the Gartner “Hype Cycle,” 2011 and 2012 may have been the “peak of inflated expectations,” and 2013 and 2014 may be the “trough of disillusionment,” in which MOOCs don’t turn out to be the silver bullet that neatly solves higher education’s problems. A worthy question is whether MOOCs will then follow the classic hype cycle and rise through a “slope of enlightenment” toward a “plateau of productivity.”
The research that this dissertation describes is not a study of the MOOC phenomenon generally. It is not a study of students or learning outcomes, and it is not a study of higher education disruption. Those are all interesting topics, and all of them influence and intersect with this research. But this research had a specific focus: the goals, progress, and value proposition of the elite early-adopter universities.
Since 2012, many elite universities have developed MOOCs, but their motivations have not been entirely clear. What do they hope to achieve and learn through their early efforts? How will they assess success? Do they plan for MOOCs to play a long-term role in their education mission, and if so what is that role? For those early-adopter universities that plan ongoing MOOC programs, what is the value proposition that they seek? The purpose of this case study research was to explore these questions in some depth.
Qualitative, case study methods were used for this research. For study sites, I looked to elite U.S. universities that have offered multiple MOOCs through major MOOC providers. My selected study sites were Columbia, Duke, and Harvard.
For each site studied, I requested interviews with those involved in strategy development and decision-making regarding MOOCs. I also requested interviews with faculty members involved in planning or teaching in the MOOC format as well as faculty members who were willing to share concerns and skepticism about MOOCs.
In all, 41 people were interviewed across the three study sites. All of those interviewed reviewed and signed a consent form in which the intent of the study was described. The consent form also made it clear that I reserved the right to directly attribute quotes, and I used this ability throughout the dissertation.
Triangulation was pursued through a combination of documents, observations and multiple interviews per site. Documents available to the public were collected, and relevant internal documents were requested during site visits. 
At two of the three studied sites, I was fortunate to have the opportunity to directly observe relevant activities. At Columbia, I was invited to attend an early “flipped classroom”, while at Harvard I observed a group meeting on research approaches to MOOC data. These observations contributed to my understanding of the evolving university culture around MOOCs. In each case, I listened for elements of discussion that suggested short and long term goals from decision-makers, and also used what I heard to adjust the questions I later asked during interviews.
Access to top decision makers was challenging at each of the studied sites. Fortunately, many were generous with their time. Where interviews were not granted, I sought their comments in public documents.
At one of the study sites, a key faculty leadership voice declined to comment on the record. This faculty member spoke with me off the record and had very relevant things to say, which helped to shape my thinking. The faculty member’s direct comments, however, could not appear in this study.
Here is some of what I learned. The sites in this research study do not expect either of the dramatic outcomes mentioned earlier. They don’t speak of MOOCs saving or disrupting higher education. Instead they are interested in the more modest and reasonable potential that MOOCs have to contribute to their mission plans in the areas of education and outreach, and to study the ways in which higher education may evolve in the Internet age. 
A key finding of this research was that the goals of the elite, early adopter universities studied do not fully align with the public narrative found in the press. The studied universities are interested in expanded access to education, but may be even more interested in teaching innovation and benefits to on-campus education. Other goals include providing more visibility for some of their educational programs and faculty, and enabling more evidence-based education research, toward a better understanding of how students engage with online course material and how they learn. Improvements to completion or cost control were not goals.
Officials at all three sites admitted that it was still challenging to measure progress toward their goals, saying that it was too soon to know exactly how to do so. They say that they will continue their involvement in MOOCs for at least another few years in order to pursue their goals and to develop maturity in their ability to measure progress toward meeting them. Along the way, these universities demonstrate higher education leadership through a new educational form at a time when higher education may be facing pressures to change.
While MOOCs may not yet – and may not ever – be the “game changers” for higher education that some predicted, neither are they disappearing. Instead, they appear to be poised to play an important role in the higher education picture for at least the next few years, and perhaps beyond, with a strong value proposition for elite early adopter universities.
This study concludes that the value proposition for these early adopter universities is the ability to simultaneously pursue the goal of improving on-campus teaching and learning while also promoting the university and its faculty and connecting through educational outreach with the public – all while showing leadership in an emerging higher education learning technology.

I’ll close there and turn things back to my chair for the question and discussion portion of the hour.


Later this month in part 2, I'll post a portion of the Q&A with my committee. As always, thanks for reading! -Deke 

Wednesday, March 26, 2014

Net Neutrality, Part 2

Locking the Little Guys Out?


Back on March 5th I blogged about net neutrality basics, in order to provide background to support a discussion of some recent events making news.  Today, I’ll talk about those recent events.

Image Credit: Huffington Post
In the simplest terms, Net Neutrality advocates believe that all Internet traffic should have equal footing when moving through the network, without being blocked or slowed down. A commonly used example is that an ISP should not be permitted to artificially slow down the traffic of one streaming video service in order to make some other streaming video service (perhaps even its own) look better by comparison.

Net Neutrality is surely more complex and nuanced, but that example helps to convey the main idea.

During the first few months of 2014, a few interesting things have developed and Net Neutrality may be a useful lens through which to understand them. The following is my best current understanding of the situation, but I’m not a lawyer or a communications policy wonk! I welcome corrections, arguments, and differing opinions!

Let’s look at each of these developments a little more closely, beginning with…

The Weakening of the Open Internet

The January US Court of Appeals ruling on Net Neutrality actually maintained the expectation that ISPs should be transparent regarding their traffic handling and special business arrangements, in order to avoid opaque anti-competitive practices. The court ruled, however, in a way that may allow broadband providers to charge for expedited services. Back in 2010, I argued that something like this could be a sensible model: if transparency were maintained, different service levels could reasonably be offered at different price points. In 2014 I find myself a little less persuaded by my 2010 thinking, in part because…

The Broadband Market is Coalescing

The number of viable options for consumer Internet access to the home is small and shrinking. In many markets, it’s a choice between telephone company Internet access (such as Verizon FiOS) and cable television company Internet access (such as Comcast Xfinity Internet). Certainly, more options could be better for consumer choice in terms of features and service offerings, and could also provide more pricing pressure.

A merger of the two largest cable companies in the US to create a behemoth cable TV and broadband Internet company further reduces options. When asked about the impact of this proposed merger, Comcast says that it isn’t anti-competitive because there is virtually no overlap in the current Comcast and Time Warner markets served. I believe them. The fact that they are not in the same markets means that competition for consumer choice is not being directly reduced by this purchase bid. It seems likely that the two companies have intentionally stayed out of each other’s markets – a practice that saved them both money but was not good for consumer choice. Nevertheless, I think a Comcast acquisition of Time Warner Cable could still hurt consumers in a less direct way. The combined company would serve such a large portion of the home broadband market that it would wield disproportionate leverage with companies who provide bandwidth-heavy online services. That could lead to more …

Charging for Special Arrangements

There are many popular streaming video services that get lots of attention, including Apple’s television and movie rental and purchase services through iTunes, Amazon’s Instant Video, Hulu, and several others. But none are as popular, and account for as much prime time network traffic, as Netflix.

When you try to stream an episode of House of Cards from Netflix to your TV using your Comcast broadband Internet access, and the connection seems a little bumpy, do you curse under your breath at Netflix or at Comcast? How did you pick which one to blame?

Comcast and Netflix tell different versions of this story, and I’m sure each has a defensible position. But the “solution” they arrived at together seems to be a new business arrangement in which Netflix agreed to pay Comcast for network traffic handling that assures a better viewing experience for viewers. This particular arrangement seems to be more like a fee-based non-transit network peering arrangement than an expedited service arrangement. But either way, Netflix is paying the bill. At some point, I suspect Netflix customers will pay that bill.

Netflix CEO Reed Hastings does not sound like this was the approach he wanted. In a Netflix blog post on March 20th, Hastings said:

“Without strong net neutrality, big ISPs can demand potentially escalating fees for the interconnection required to deliver high quality service. The big ISPs can make these demands -- driving up costs and prices for everyone else -- because of their market position.” -- Netflix CEO Reed Hastings

Quite recently, rumors have begun to appear that Apple is negotiating a different arrangement with Comcast, but in the same general space. Apple, according to these rumors, wants assured high performance for their video content on the Comcast network. Let's watch this space closely to see if we are witnessing the start of a trend.

Parting thoughts

Comcast is a corporation with capabilities to sell, and the current and changing legal and regulatory environment makes it possible for them to sell preferred access (whether through peering or traffic “management”) to video providers like Netflix (and maybe soon to Apple). They have a responsibility to their shareholders to try to maximize revenue. Comcast is not a villain. Comcast is a for-profit corporation.

The questions that I believe are worth considering are these: Is Net Neutrality hopelessly lost, and if so, what are the implications of life on the Internet in a post Net Neutrality world?

A hypothetical might help: What becomes of the young, upstart company with the fresh new idea and technology? How can they break into the streaming video market? If they can’t pay the top dollar that Netflix and Apple can pay, their service on the Comcast-Warner (probably not their real name!) network might not look so good. Netflix and Apple will pay for fast access to your TV; the upstart company will look slow by comparison, and as a result its business will probably have a hard time succeeding. You might argue that that’s how capitalism works, but I’d argue that a level playing field is key to preserving a healthy environment that benefits us all. Without it, we all may miss out on the advantages of competition and innovation.

Time for you to weigh in

Is Net Neutrality an important protection? If so, how should we maintain Net Neutrality while recognizing the legitimate business interests of broadband providers? Are broadband providers "information service providers" or "telecommunications carriers?" It turns out that the distinction matters for this debate.

If there are more developments, or if this subject turns out to be interesting to enough RapidGroove readers, there may be a Part 3 at some point.

Thanks for reading! A blog works best with active participation. If you enjoy this blog, please give it a +1 and leave a comment. Share it on Twitter, Google+, or Facebook. More readers will drive more discussion.

Wednesday, March 5, 2014

Net Neutrality as Fighting Words

[This piece originally appeared as a guest column in Network World in 2010. The discussion seems relevant today as big cable companies and content providers figure out what's fair and how to conduct business together.]
The term network neutrality has been used lately to refer to a number of different ideas. One is that networks should be operated without any protocol filtering. Another is that the one and only business model for an ISP is one in which there is a flat fee for unlimited access at the specified line rate. And still another is that networks should be available to all, equally, regardless of their geographic location. There may even be more ideas wedged uncomfortably into this single term's common use.

No wonder we're fighting!

So, does net neutrality prevent ISPs from managing their networks? Does it mean that an ISP cannot favor some traffic over other traffic? Does it mean that some towns or homes, perhaps in rural areas, are guaranteed equal access to networks available in more heavily populated or wealthy areas?
First, I don't know anybody who argues that an ISP cannot manage its network. Monitoring for things such as link utilization and how heavily taxed packet forwarding components are over time is a normal part of operating any large network. Responding to problems found in such monitoring by adding capacity, upgrading software, or even re-designing networks, all are normal parts of network management.
Image credit: http://mathcurmudgeon.blogspot.com
The real touchy point, when it comes to network management, is whether an ISP can decide that some application traffic does not get the 'hands-off' treatment that the user expects -- that the ISP can instead slow some traffic down or stealthily terminate some sessions based on the application protocol or the user involved, in the interest of keeping resources more available to all. If the ISP does this without transparency to its users, that isn't network management. It's false advertising.

Then there's the subject of whether net neutrality allows for a business model in which some traffic is expedited. Those who oppose net neutrality because it would appear to preclude differentiated services are combining two issues in an odd way. Though I think the dishonest favoring described in the previous paragraph is ultimately a problem for users and for the development of new network services, I believe that expedited network traffic handling as a business arrangement, articulated in a service offering and an SLA and available to anyone willing to pay for it, can be a reasonable and fair business model.


Bandwidth and the consumer

The ISPs have traditionally operated more as bandwidth providers than as content providers (though some clearly want to play in both spaces going forward). The business model of being a bandwidth provider has its real challenges. There are ISP costs that really do scale with user load, but also a user community that much prefers flat-rate pricing to usage-based pricing. And as these users become consumers and producers of more rich media, global IP traffic is growing rapidly while ISP revenue, linked more closely to the number of users, is growing much more slowly.

ISPs naturally want their network investments to serve large communities in a cost-effective way, and so count on significant statistical multiplexing. Many ISPs become concerned, quite reasonably, when network use by a small number of resource-hungry users accounts for more than their 'share' of the finite resource, while those users maintain the reasonable belief that they paid for a certain amount of access bandwidth and just want to make full use of it some of the time.
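As an aside, the economics of statistical multiplexing are easy to see in a toy simulation. The numbers below are entirely my own, purely for illustration: 200 subscribers, each sold a 100 Mbps line, sharing a 2 Gbps uplink (10:1 oversubscription), with each user actively pulling full rate about 5% of the time.

```python
import random

# Toy illustration (my own numbers): 200 subscribers, each sold a 100 Mbps
# line, share a 2 Gbps uplink -- 10:1 oversubscription.
random.seed(42)
USERS, LINE_MBPS, UPLINK_MBPS, P_ACTIVE = 200, 100, 2000, 0.05

TRIALS = 20_000
congested = 0
for _ in range(TRIALS):
    # Count how many users happen to be pulling full line rate right now.
    active = sum(random.random() < P_ACTIVE for _ in range(USERS))
    if active * LINE_MBPS > UPLINK_MBPS:  # demand exceeds the shared uplink
        congested += 1

# Despite selling 20 Gbps of access on a 2 Gbps pipe, congestion is rare.
print(f"Fraction of sampled instants with congestion: {congested / TRIALS:.4f}")
```

On average only about 10 of the 200 users are active at once, so the shared 2 Gbps uplink almost always suffices; the provider's bet only goes wrong in the rare instants when demand clusters, which is exactly the finite-resource tension described above.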

In their bandwidth-provider role, ISPs have paid much attention in recent years to "file sharing" applications and users, and the response has sometimes been to manage their networks to limit such use without making it crystal clear to users that this is what they do. This part of the argument often gets emotional, probably in part because of the perception of illegal or immoral use of the network, and also because there are third parties with financial interests in some of the content being shared who are motivated to apply legal pressure. But from a strict capacity standpoint, it's still really just a matter of finite resources and significant diversity in demand.

Can't users just select an ISP that will treat their traffic as they expect? Unfortunately, time has shown that the market is not organically providing this solution, at least in the consumer area.

Many neighborhoods and regions cannot support two or more bandwidth providers competing on the strength of service and price, given the investments required to operate. Providers recognize this and either select their areas of operation accordingly or "compete" only half-heartedly in some spaces. Perhaps this situation will improve over time when high bandwidth wireless options become more available as an alternative, but I'm not holding my breath.

Bandwidth and the content providers

In other cases, bandwidth providers are looking closely at legitimate content providers -- businesses whose services have become popular enough to account for larger percentages of traffic on a network. These might be media companies streaming television or movie content, or gaming services supporting Massively Multiplayer Online Role-Playing Games (MMORPG).  In these cases, we are seeing increased interest in additional charges on the content providers -- but these content providers also have the reasonable belief that they paid for a certain amount of access bandwidth and just want to make full use of it (but perhaps more than "some of the time").
It's important to recognize that while the content provider's network access will be provided by one set of ISPs, those accessing the service are likely using many other ISPs. The content, though, will travel over both networks (and likely some others in between). All of those ISP networks feel that strain. And all want to know whether there isn't some new arrangement that can help them to cover their costs.

The response to these problems so far has been wars of words, clumsy technical responses, and poorly informed false starts in regulatory bodies. But there are real issues and they deserve serious handling.

A service I would pay for

To the ISPs, my input is please don't "manage" your network by trying to decide which application protocols are good and which are bad, or under what conditions I can no longer use the bandwidth I think I paid to have available. If your network needs that kind of management, that should be a very clearly articulated part of the service offering. Tell users, in the form of an SLA, what the expected use is in straightforward terms. Perhaps that would involve not only access bandwidth numbers, but also permitted frequency of heavier use or even which specific protocols you will not always tolerate. Better yet, respond to conditions of congestion as a capacity problem, and in a protocol neutral way if possible.

I think I can describe what I personally want as a consumer. I don't think I need full line rate 24/7. But when I want major bandwidth infrequently, I want to know that the bandwidth is there, without limitations not previously expressed to me. I want to know that the application protocols I use are up to the end stations that make up the connection, not the hops in between. I want to know that there are no artificial barriers to my using that bandwidth: no artificial 'smoothing' of my packet rates, no resetting of TCP sessions, and no changing of IP addresses through NAT in ways that might break some application protocols.

If I'm unlucky enough to be asking for network resources when others have already grabbed them, I lose at that moment -- and I'm OK with that. To me, this is a lot like when I try for city street parking. If I get there and there are no available spaces, I understand. But I don't want to drive down the street and see dozens of empty parking spaces that I cannot use because I've already parked a few times earlier this month or because my car is the wrong color.

Here are a few options for network service level agreements that I think many of us, whether individuals or businesses, could understand and live with:


  • Bandwidth is bandwidth is bandwidth. An access bandwidth is provided, and best effort service is provided. Any minimal filtering that's done is described in plain English. Example: 10Mbps service, best effort, all protocols and ports permitted except port 25 is blocked in an effort to reduce spam.
  • Managed bandwidth. An access bandwidth is provided, but the consumer or corporation can expect some bandwidth limits to be imposed. Example: 100Mbps service, best effort, all protocols and ports permitted except port 25 is blocked in an effort to reduce spam. No more than 2GB per calendar week, Sunday through Saturday. Charges beyond 2GB may apply.
  • Differentiated/expedited services. This one is more complex. It's like "bandwidth is bandwidth", but you can mark some small percentage of your traffic as priority, and the ISP will expedite handling at congested points in its own network. No guarantee beyond the ISP network. Example: 100Mbps service, best effort, all protocols and ports permitted except port 25 is blocked in an effort to reduce spam. Up to 10% of all packets offered per hour can be marked for expedited handling. Beyond 10%, charges may apply or markings may be ignored.
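To make that third option concrete, here is a minimal sketch of the edge-marking logic it implies. The class name, quota accounting, and "downgrade when over quota" behavior are my own hypothetical choices, not any real ISP's implementation:

```python
# Hypothetical sketch of the differentiated/expedited SLA above: honor
# "expedited" requests for at most 10% of packets seen, and silently
# downgrade the rest to best effort, as the SLA permits.
class ExpeditedMarker:
    def __init__(self, max_fraction: float = 0.10):
        self.max_fraction = max_fraction
        self.total = 0       # packets seen so far
        self.expedited = 0   # packets granted expedited handling

    def classify(self, wants_expedited: bool) -> str:
        self.total += 1
        # Grant expedited handling only while the running fraction stays
        # within the contracted quota.
        if wants_expedited and (self.expedited + 1) / self.total <= self.max_fraction:
            self.expedited += 1
            return "expedited"
        return "best-effort"  # over quota: marking ignored, per the SLA

marker = ExpeditedMarker()
results = [marker.classify(True) for _ in range(100)]
print(results.count("expedited"))  # → 10: the quota caps greedy marking at 10%
```

Even a customer who marks every packet as priority gets expedited handling on only one packet in ten; the rest fall back to best effort rather than being dropped, which is one reasonable reading of "markings may be ignored."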

In that last case, a clear concern is that special handling probably ends when the traffic leaves the ISP network with which the user has the business arrangement. When these ideas were first seriously considered about a decade ago, some of us imagined that initially the service would apply only in that limited scope. Later there could be ISP alliances in which their service models and terminology matched, and still later, there might be more complex peering arrangements, with or without settlements, that would allow expedited handling to be preserved as traffic crossed network boundaries. None of this materialized at the time, which I've taken to mean that adequate demand from the businesses of the time did not exist. Perhaps it exists now, as more voice, video, and online game services have come to the network.

Many additional SLAs, clear and simple and able to meet real user goals, are possible.

My preference is to solve most of these resource scarcity problems with big bandwidth when possible. If that isn't always possible, some network complexity is a necessity, but I think it should be accompanied by clear and understandable terms of service. If I have to "pick sides", I'm in favor of net neutrality. But unlike some zealots, there's room in my understanding to allow for tiered services and expedited services if done in a fair, approachable, clear way, available to all.

Well, that was my thinking in 2010. I mostly still agree with myself, but later this month I hope to have more to say on the emerging stories making net neutrality news. -DK
