Mar
10
2015

Should I outsource my school’s Wi-Fi network?

Picking up from our last discussion around cloud wireless, let’s take a look at the specifics around when a school district should move forward with either an on-premises or managed Wi-Fi solution.

The Wi-Fi market continues to see incredible growth, projected to reach $6.7B by 2018, per Dell’Oro Group. Nowhere is this growth more profound than in the education market. School systems continue to roll out new BYOD and 1:1 computing programs, resulting in more mobile devices on the network and putting incredible strain on both infrastructure and IT staff.

Schools need a top-performing wired and wireless network for high-bandwidth applications like YouTube in classrooms and cloud applications such as Moodle or Blackboard. They also require centralized user and security policy administration to support learning initiatives from online testing to flipped classrooms and beyond. Finally, staff and students need quick resolution of connectivity issues. Any downtime can result in lost productivity and frustrated students and teachers.

With all this in mind, when does it make sense for a school district to make the switch from an in-house Wi-Fi solution to a managed offering?  Key to this decision is to ask the following questions:

  • Single-site, Multi-site, and Multi-tenant Facilities – does the district have difficulty managing multiple buildings?
  • Wireless network expansion driven by BYOD – is the district able to keep pace with the number of new devices coming onto the network?
  • Limited IT resources – does the district have enough resources to manage the network?
  • Day-to-day maintenance/management – if staff is limited, would the district benefit from offloading IT burdens and augmenting its existing resources to focus on what it does best: serving students and faculty?
  • Wi-Fi Security – does the IT staff have real-time and historical reports and trends into network health, users, connected devices, capacity and usage? Can they make informed decisions on network investments to improve the user experience?

Critical to all of these questions is ensuring the district also has the flexibility to move from an outsourced model back to in-house management – especially since E-rate funding is not guaranteed year after year. Most school districts also do not want to be locked into a long-term subscription model.

Schools that receive E-rate funding this year and choose ADTRAN ProCloud Wi-Fi need not worry about leveraging their Wi-Fi investment in the future, since they have the ability to bring the solution in-house via Bluesocket vWLAN and manage it themselves – when and if they are ready.

A school system’s first priority is students. Giving them an education that facilitates long term academic success while instilling confidence and critical thinking skills requires a modern curriculum and a dependable network. That’s why the network solution needs to be one that enriches this experience and puts learning above all.

Be sure to visit ADTRAN at the upcoming CoSN 2015 conference in Atlanta from March 16-19 to gain a better understanding of managed Wi-Fi and its benefits to advancing K-12 goals and initiatives.

Jason King is the director of marketing for the Bluesocket Business Group at ADTRAN. With over 15 years of experience in the industry, he is responsible for the overall promotion and positioning of the company’s Wi-Fi solutions. Find him on Twitter: @jjking24

 

Permanent link to this article: http://blog.adtran.com/should-i-outsource-my-schools-wifi-network/

Mar
09
2015

The Benefit of Gigabit: Customer Insight

While the benefits of Gigabit deployments far outweigh any criticism, some skeptics question why companies would go to the trouble of rolling out ultra-fast broadband networks today. For those opponents, a recent announcement from Jackson Energy Authority (JEA), a municipal utility, should put to rest any concerns.

Just last week, the city of Jackson, Tennessee and JEA were named a US Ignite partner. The citywide fiber network provided by JEA supports the community’s evolution into a globally connected commercial center.

Thanks to the announcement, JEA is attracting national attention as a direct result of its decision to upgrade its network to deploy symmetrical Gigabit services to its 18,000 subscribers. Not only is JEA helping the local community prosper, it is also being recognized for having the technology infrastructure in place to accelerate economic, educational and community benefit.

Jackson Mayor Jerry Gist stated, “We compete against cities from across the country for economic development projects, and the recognition as a US Ignite city underscores the fact that Jackson is the best city for 21st century business and industrial recruitment due to the first-class fiber infrastructure that JEA has deployed.”

“The decision to deploy Gigabit services to our customers was easy. All we had to do was look at the incredible benefits, including stimulating economic development and how it could transform the community as a whole, to see that Gigabit was the way to go,” said Ben Lovins, senior vice president, telecom for Jackson Energy Authority. “The residential and business customers we talk to have high expectations for their Internet experience and this shows that we are on the right path to enabling future growth.”

Jackson Ignite – the city of Jackson and JEA’s partnership with US Ignite – delivers on the community evolution objectives of ADTRAN’s Enabling Communities, Connecting Lives campaign. So if, after reading about the Jackson Ignite national partnership, you’re still not convinced that Gigabit is valuable for the business case, keep checking with ADTRAN as more communities are set to be announced. Gigabit is the real deal, and service providers are cashing in on new customers and new revenue opportunities while staying ahead of the competition.

Kevin Morgan is the Director of Marketing at ADTRAN. He has over 25 years of experience in advanced communications technology, fiber optic systems, and business product marketing, and serves as FTTH Council Vice Chair.

Permanent link to this article: http://blog.adtran.com/the-benefit-of-gigabit-customer-insight/

Feb
25
2015

“And ponies! Free ponies for everyone!”

I had to laugh when I saw Marc Andreessen’s comment in a press release on “President Obama’s Plan to Regulate the Internet” issued by FCC Commissioner Ajit Pai’s office last week.  While most of the comments were along the lines of “If it ain’t broke, don’t fix it”, “global Internet will suffer”, lack of transparency and “an unusual attempt by a President to influence a legally independent agency”, I thought Andreessen’s “free ponies” comment pretty much sums up the current goat rodeo in Washington.

On Thursday February 26, we expect the FCC to agree to new rules that re-adopt and expand the previous Open Internet rules that were thrown out by the Court of Appeals in early 2014.   The previous rules were relatively simple and carried forward the policy of a “light regulatory touch” first articulated when Democrats controlled the FCC under President Clinton:

  • Transparency. Fixed and mobile broadband providers must disclose the network management practices, performance characteristics, and terms and conditions of their broadband services;
  • No blocking. Fixed broadband providers may not block lawful content, applications, services, or non-harmful devices; mobile broadband providers may not block lawful websites, or block applications that compete with their voice or video telephony services; and
  • No unreasonable discrimination. Fixed broadband providers may not unreasonably discriminate in transmitting lawful network traffic.

These rules were all subject to “reasonable network management.” The Court of Appeals vacated the discrimination and blocking rules, however, because the FCC claimed authority under Section 706 to adopt them, yet the rules were “common carrier”-type regulations, inconsistent with the FCC’s classification of Internet access services as an “information service.”

In response to the court decision, the Commission originally proposed to tweak the rules so they would pass Court review consistent with the 2014 court decision and the FCC’s authority under Section 706.  Unfortunately, succumbing to political pressure, including unprecedented intervention by the White House, the FCC will decide instead to reclassify Internet access services as “telecommunications services” under Title II, and to apply the new rules equally to fixed and mobile broadband.  Such a reclassification does allow the FCC to impose utility-like regulations on Internet access providers, but also carries with it regulations applicable to telephone services dating back to 1934.  The FCC is attempting to limit the application of some of these anachronistic regulations by “forbearing,” but many of the more burdensome provisions — including after-the-fact rate regulation and the specter of class action lawsuits — remain.

The new rules specifically adopted in the FCC order to govern Internet access service would provide:

  • No Blocking: broadband providers may not block access to legal content, applications, services, or non-harmful devices.
  • No Throttling: broadband providers may not impair or degrade lawful Internet traffic on the basis of content, applications, services, or non-harmful devices.
  • No Paid Prioritization: broadband providers may not favor some lawful Internet traffic over other lawful traffic in exchange for consideration – in other words, no “fast lanes.” This rule also bans Internet service providers (ISPs) from prioritizing content and services of their affiliates.
  • A Standard for Future Conduct: Because the Internet is always growing and changing, there must be a known standard by which to determine whether new practices are appropriate or not. Thus, there will also be a general Open Internet conduct standard providing that ISPs may not harm consumers or edge providers.
  • Greater Transparency: The proposal enhances existing transparency rules, which were not struck down by the court.
  • Reasonable Network Management: For the purposes of the rules, other than paid prioritization, an ISP may engage in reasonable network management. This recognizes the need of broadband providers to manage the technical and engineering aspects of their networks.
  • Interconnection: For the first time the Commission would have authority to hear complaints and take appropriate enforcement action if necessary, if it determines the interconnection activities of ISPs are not just and reasonable, thus allowing it to address issues that may arise in the exchange of traffic between mass-market broadband providers and edge providers.

The Good

The good news is that the Tier 2 and Tier 3 service providers who are accustomed to operating in a regulated environment should remain on track with their plans to leverage Connect America Fund (CAF) subsidies. These subsidies provide funds to build out broadband networks at ever-greater speeds. The recent CAF 2 Order increased the required broadband speeds from 4/1Mbps to 10/1Mbps, and the FCC recently changed the definition of broadband from 4/1Mbps to 25/3Mbps.

The Bad

The rules themselves create confusion because of their ambiguity.   An ISP contemplating a new offering cannot know whether the FCC will decide that the service “harms consumers or edge providers.”  For example, AT&T’s sponsored data plan, where the content provider pays the usage charges of the customer, has been challenged as violating “net neutrality,” despite the fact that it’s just like toll-free calls, which have been perfectly legal (and beneficial to businesses and consumers) since the 1960s.  In addition, the FCC would allow prioritization of “specialized services,” but that is another vague term.  In sum, the uncertainty of knowing what conduct/services may be deemed unlawful is likely to deter ISPs from making investments in new services.  The threat of after-the-fact rate regulation will also deter investment in new facilities.

The Ugly

The FCC’s rules are likely to be challenged at the Court of Appeals, as there are significant problems with this rulemaking. On a number of occasions, the FCC has examined how to classify Internet access services, and each time it decided they were an “information service” and thus outside of Title II. The FCC is reversing all of those previous decisions, even though none of the salient facts have changed. In addition, there is a very difficult statutory problem with the FCC’s application of Title II to mobile broadband, because it is not connected to the Public Switched Network. There may also be challenges to the forbearance decision, since the FCC is not following its own standard of undertaking a granular, market-by-market analysis. On top of those substantive problems are the procedural irregularities, including the influence of the President (which is also the subject of a Congressional investigation).

Following the release of the Open Internet Order, we anticipate that the Tier 1 telcos, MSOs and wireless industry will begin litigation. The FCC’s order will also be challenged by net neutrality advocates who want more extensive regulation, including unbundling. These industries will petition to stay the FCC order, but that tactic is unlikely to succeed in the courts because of the high hurdle for such judicial relief; even so, we believe the order will eventually be found unlawful. In theory, Congress has been studying a legislative fix that would apply “light touch” regulations, but passage of any legislation these days is difficult, and it would face a veto in any event.

The bottom line is that we expect the FCC decision on Network Neutrality to inject a lot of uncertainty and confusion into the broadband arena, although we won’t know important details of the new regulation until the text of the decision is released, which could take several months.  The long term problem we’re going to face as a telecommunications industry is that the FCC has, sadly, become just another pawn in the disconnected, partisan dog-and-pony show that Washington DC has become.

Gary Bolton is Vice President, Global Marketing for ADTRAN.

Permanent link to this article: http://blog.adtran.com/and-ponies-free-ponies-for-everyone/

Feb
06
2015

Super-Vectoring: Great for Central Europe, Less so for North America (3 of 3)

This is the third and final installment in my blog series on an enhanced DSL technology known as Frequency Division Vectoring (FDV). Operators are looking at this and other super-vectoring technologies to once again leverage the billions of dollars (or euros) invested in their Fiber-to-the-Cabinet, -Node and -Curb deployments. These super-vectoring technologies can double the performance of today’s vectored VDSL2, supplying up to 300Mbps of broadband service. This allows operators to stave off both Cable/MSO competition and the high cost of full-blown Fiber-to-the-Home (FTTH) deployments.

Now that we’ve looked at some of the challenges in the market and the DSL acceleration technologies that have emerged to address those challenges, today we’ll explore what this means for service providers and the business opportunities ahead. Before getting into the operational benefits of deploying FTTx advancements, whether deployed in lieu of or in conjunction with a full FTTH deployment strategy, we will look at the market drivers for expanding premium broadband.

Today, in North America at least, the average peak demand for a residential broadband connection is approaching 50Mbps. At first glance, this hardly seems worthy of an investment in FTTH or these innovative 300Mbps FTTx technologies. However, with disruptive cable operators investing in Gigabit-capable networks via DOCSIS 3.1, and other disruptive new entrants expanding their FTTH footprint, there is a need to compete on headline speeds. Many consumers simply pick the higher rate when comparing similarly priced offers.

Emerging consumer applications, as always, will drive the demand for bandwidth, and operators have to be ready to deliver or risk being marginalized by the market and competition. Cloud-based home backup and Ultra HD (4K TV) are poised to push bandwidth demand to 100Mbps or more per home in the next few years. The shift to cloud services, mobile broadband expansion, the growth of the Internet of Things and the demand for community development all play a part in moving the needle.

Great for Central Europe, not so much for North America

To understand the business case advantages of these new FTTx technologies, we need to understand an important FTTx metric: copper loop length. It is an important indicator of which geographic regions and residential markets would favor FDV, and therefore which are most likely to champion the industry standardization and network homologation of such technologies.

North America generally has longer deployed FTTx loop lengths, whereas central European operators have deployed their service cabinets much closer to the customer, often within the loop length ranges that could take advantage of super-vectoring service rate gains. As a proof point, since I last wrote on this topic in December, there has been an ITU-T standards meeting in Geneva where the European incumbent operator DT proposed creating a new annex for enhanced data rates compatible with the VDSL2 17a services deployed in Europe today.

It is important to note that for any super-vectoring technology to be usable, whether FDV or another, it must be standardized and available from all of an operator’s vendors, must be deployable into existing cabinets, and must run in parallel with existing vectored VDSL2 services. Of course, the technology must also improve upon today’s vectored VDSL2 service rates at intermediate FTTx copper loop lengths, generally 200 – 400m from the customer. This range is too far for typical FTTdp/G.fast deployment scenarios, but closer than the 500 – 700m distances that are perfect for vectored VDSL2. Super-vectoring is all about squeezing more bandwidth out of these copper loop lengths sitting in no-man’s-land – too long for FTTdp and too short for vectored VDSL2 to take full advantage of.
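The loop-length ranges above amount to a simple decision rule. A minimal sketch of that rule in Python (the function name and the exact thresholds are illustrative assumptions drawn from the rough ranges in this post, not operator planning guidance):

```python
def pick_technology(loop_m: int) -> str:
    """Map a copper loop length (in meters) to the access technology
    best suited to it, per the rough ranges discussed above."""
    if loop_m < 200:
        return "G.fast / FTTdp"    # very short drops near the home
    elif loop_m <= 400:
        return "super-vectoring"   # the 200-400m 'no-man's-land'
    else:
        return "vectored VDSL2"    # the 500-700m sweet spot

print(pick_technology(300))  # a mid-range loop falls to super-vectoring
```

A real deployment decision would of course weigh cabinet space, vendor support and parallel operation with existing services, as noted above, not loop length alone.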

The Overall Cost to Pull Fiber

How committed a service provider is to its large investment in FTTN/Cab comes down to the cost of the alternative, which is pulling fiber down or across the street, through a yard or wall, and throughout a residence. This cost is what is saved by leveraging next-generation DSL technologies like FDV and G.fast, since the premium-bandwidth medium is already installed. Newly constructed houses, condominiums and apartments, known as greenfield deployments, are strong targets for FTTH. We want to pull new fiber, not new copper. This means not just through the street but through the building as well. This latter part has been a challenge, as many new homes continue to be wired with CAT3 telephone wiring or 100Mbps CAT5/6 Ethernet cable. When service providers get involved in new residential development, they plan for these homes to be constructed with optical fiber through to the wall socket. But what about the cost of upgrading existing copper-wired homes to optical fiber?

The cost savings afforded by cabinet and distribution point FTTx deployments are based on the cost to pull fiber those last few hundred meters to and through the home. US$500 – 700 per home is the range seen most often in published business cases. The cost to pull fiber can vary: as low as $1 per foot ($3 per meter) where good quality ducts and conduits exist, or an order of magnitude higher when boring is required. Boring is needed when ducts are full, do not exist or are in disrepair – the case for most homes over 20 years old and for historic streets and buildings. North America has relatively newer buildings compared with Europe, so lower FTTH connection costs generally apply. That said, according to the 2013 American Housing Survey from the US Census Bureau, the average residential dwelling was built in 1974, and fewer than 15% of homes are less than 15 years old, increasing the likelihood that higher-priced techniques will be required to connect homes with fiber.
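The arithmetic behind those per-home figures is straightforward. A back-of-the-envelope sketch (the function, the $3/m duct rate and the 10x boring multiplier are illustrative assumptions taken from the numbers above; real business cases model far more variables):

```python
def fiber_drop_cost(distance_m: float, usable_duct: bool) -> float:
    """Rough cost to pull fiber the last stretch to a home:
    ~$3 per meter where good-quality duct exists, and roughly an
    order of magnitude more where boring is required."""
    per_meter = 3.0 if usable_duct else 30.0
    return distance_m * per_meter

# A 200m drop through existing duct lands at $600, inside the
# US$500-700 per-home range seen in published business cases;
# the same drop requiring boring jumps to $6,000.
print(fiber_drop_cost(200, True), fiber_drop_cost(200, False))
```

This gap between the duct and boring cases is exactly why older housing stock tilts the business case toward squeezing more out of the existing copper.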

Multi-dwelling units (MDUs) – apartments and condominiums – house a large percentage of the population and offer a great opportunity to leverage the fiber that often passes by these buildings. Connecting each suite can be costly, but with 25% of the population living in MDUs, and as much as 40% in many dense urban environments, MDUs must be a target for delivering premium broadband services. Leveraging FTTx technologies allows operators to quickly and cost-effectively add subscribers. And unlike DSL technologies of the past, the existing phone wiring delivering broadband is owned and maintained by the landlord, not the incumbent telco, which opens the door to competitive service providers.

Super-vectoring as an accelerant to Fiber rollouts

DSL acceleration technologies have sometimes been criticized for taking resources away from – and ultimately delaying – the deployment of FTTH. G.fast and FTTdp have been the most recent targets of that criticism. However, the prudent operator and careful observer of these technologies recognizes that when an FTTx solution allows new bandwidth-intensive applications to be rolled out, this in turn drives demand for FTTH. FTTx deployment models let operators maximize the fiber already deployed in the street, generating a return on that asset that raises money to be invested later in FTTH.

In the same way, super-vectoring should not detract from the momentum of other premium broadband deployment models like G.fast-fueled FTTdp and PON-fueled FTTH. FDV meets this important requirement: it is aligned with the DSL technology roadmap, leverages advances in G.fast chipset technology, and is therefore synergistic with the G.fast deployment plans being constructed by dozens of the world’s largest broadband network operators.

Super-vectoring technologies like FDV allow incumbent operators to deliver double the bandwidth currently deployable from their existing cabinet infrastructure, raising service rates from today’s 100 – 150Mbps up to 200 – 300Mbps. These rates provide important provider differentiation and, in the case of FDV, help accelerate the deployment of 500Mbps-and-greater FTTdp by seeding G.fast components into the network.

As handheld and home devices and appliances become more powerful and more connected, and as consumer and home automation applications become more cloud-based and sophisticated, your residential broadband connection must become a high-bandwidth, low-latency cloud services conduit. FDV is another tool broadband network operators can use to meet this network performance objective.

Kurt Raaflaub serves as ADTRAN’s senior manager of strategic solutions marketing, and has more than 20 years’ experience in telecom. He has global solutions marketing responsibility for SDN/NFV, Gigabit Broadband, Packet Optical, Carrier Ethernet-based Cloud Connectivity as well as managed/hosted ProCloud services delivery for residential, enterprise and backhaul markets. Prior to his current position, Raaflaub was responsible for directing ADTRAN’s Broadband, Carrier Ethernet and Packet Optical solutions marketing activities within ADTRAN’s Carrier Networks Division. In 2006, he joined ADTRAN from Nortel where for over a decade, he held various roles focused on marketing and managing new disruptive market opportunities.

Permanent link to this article: http://blog.adtran.com/supervectoring-great-for-central-europe-less-so-for-north-america/

Jan
29
2015

A Flexible Approach to Wi-Fi for E-rate

The meteoric rise of smartphones and tablets, combined with the transition from desktop computers to laptops, is putting substantial strain on Wi-Fi networks in today’s school districts. Not only are more devices connecting to these networks, but students and faculty expect to be always on and are increasingly streaming media and other high-bandwidth applications in classrooms.

With the new funding for E-rate making the news, school districts are looking to take advantage of this program to build out a world-class wireless network for their students and staff.

The demand for more bandwidth, more real-time multimedia and access to e-textbooks and educational applications such as Moodle and Blackboard, all via Wi-Fi, means that any drop in performance is immediately noticeable to users. In addition, as schools migrate from a primarily wired infrastructure to a wireless one, the growth in devices on the network also highlights the need for additional security.

IT staff at school districts have been trying to keep up with user needs and expectations, but the requirements are evolving fast. Today’s IT needs to be faster and more nimble, handle more devices, provide tighter security, scale quickly and be cost-effective. Achieving this requires a new approach to the problem.

The traditional Wi-Fi architecture has been based on a controller switch that becomes the central point of intelligence and control for all access points (APs). The controller becomes the choke point and bottleneck of the network, requiring IT to add more controllers as more users and devices inevitably come onto the network.

This traditional architecture is being replaced by a Cloud Wireless design in which the controller is eliminated, with management and control of the network virtualized in the cloud. This approach greatly increases the ability to scale the network to meet Bring Your Own Device (BYOD) demands, supporting 10X more devices than before.

The Cloud Wireless approach has also opened up the possibility of educational institutions taking advantage of a managed and hosted service for their Wi-Fi network, offloading routine network management burden from their strapped IT staff.

With so many choices for today’s K-12 IT staff when it comes to their wireless network and mobility needs, there are several questions that come to mind:

  • Should I opt for a hosted/managed service vs. managing it on-site?
  • Should I go with 802.11n or 802.11ac?
  • What kind of back-end switching network do I need?
  • What kind of security do I need to meet privacy mandates?
  • And on and on the list goes…

We will dive into these issues in later posts. The key point is that as IT managers wade through all the options to pick the right wireless solution for their situation, they need to ensure their network can stand up to the demanding needs of their users.

 

Jason King is the director of marketing for the Bluesocket Business Group at ADTRAN. With over 15 years of experience in the industry, he is responsible for the overall promotion and positioning of the company’s Wi-Fi solutions. Find him on Twitter: @jjking24

Permanent link to this article: http://blog.adtran.com/a-flexible-approach-to-wifi-for-erate/
