Network Implementation Archives - My TechDecisions
https://mytechdecisions.com/tag/network-implementation/

Designing and Implementing Technology at OUR New Office
https://mytechdecisions.com/project-of-the-week/implementing-technology-workplace-office/ (Thu, 31 Dec 2020)

We write a lot about implementing technology in the workplace. This time, the project was our own.

Emerald Expositions is the leading operator of B2B trade shows in the United States, operating more than 55 trade shows and numerous other events. The company also owns a number of trade publications, including My TechDecisions.

The Tech Decision

After being purchased by Emerald Expositions, the My TechDecisions team, along with colleagues in sales, marketing, audience development, and design, and the teams running sister sites Commercial Integrator, Security Sales & Integration, and Campus Safety, needed a new office space.

After creating a scatterplot of employee addresses to find a central location for the whole staff, the company settled on a brand-new office space at 100 Crossing Boulevard in Framingham, Mass.

The new space would serve as a fresh start for the newly acquired properties, but before moving in the entire office needed to be built from scratch.

That included hiring an AV integrator to implement technology up to the standards of the publications that cover the industry.

“I hired a project manager because this is across the country for me. They are great – they represent Emerald. The project manager bid it out,” says Stacy Smith, executive assistant and corporate operations manager for Emerald. “We were ahead of schedule the whole time.

“Lee Pongraphan, our senior director of infrastructure and services, and I worked closely for months on the spec we need on the technology side. Lee maps all that out, and then the project manager sources that out for AV,” says Stacy. “As soon as you sign the lease you’re working on the technology.”

In this case, the project manager chose Dnet Cabling & A/V Technologies to implement much of the technology alongside Emerald’s internal IT team.

“We know the project manager, so we knew clients would be moving in. We did the work downstairs (at the building). Mike Bogdan reached out and said they wanted to bring us in on the project,” says George Ziegler, AV engineer at Dnet Cabling & A/V Technologies. “We submitted a bid based on requirements from the client, they were satisfied with it, and we got awarded both the communication and AV installation.”

The Solution

For Dnet, the work started before the team even entered the building.

“Before the project took off we started with communication,” says Ziegler.

“We made sure everything was kosher with the client before any equipment was purchased, and that they fully understood what the scope of work was – networking, sound masking, TV installations, and connections at the tables.”

Ziegler and his team first came into the space after the walls were up. They began by cabling for the communications. They weren’t handling the phone system themselves, as it was a network-based system that the internal IT team would deploy.

They still needed to run white Cat6 cable to every location so that everything arriving at the rack was a single color. They implemented a Systemax solution with unloaded patch panels in case of issues – jacks or cables could easily be removed or replaced without taking down a patch panel completely.

Workstations could then be moved around as needed.

“It was a pretty straightforward installation for the communications side,” says Ziegler.

A more distinctive aspect of the space is the sound masking solution from Cambridge Sound Management. The solution is built into the ceiling with a look similar to the sprinkler system in order to blend with the office aesthetics.

The level of white noise coming from the solution can be adjusted through a corresponding interface. A relay inside the equipment mutes the sound masking during a fire alarm so that alarms can sound at the correct decibel level in the space, adhering to the electrical code.

Finally, two conference rooms in the new office are fitted with Samsung OLED displays. HDMI/USB extenders from Crestron were mounted under the tables, with a Cat6 cable running from each device to the TV, giving users additional connection options.

Additional technology implementations at the office include a keycard access system, wireless printing capabilities, and guest Wi-Fi access separate from the internal network.

The Impact on TechDecisions

Our coworkers are very happy with the space. Coming from a space the company had leased for fifteen years, a fresh start after the acquisition was well needed.

The new technology throughout the space is also a welcome addition for a company with multiple publications dedicated to the technology field.

“The new office space is pretty amazing with all its technological perks. The design is very modern, but inviting,” says Manuela Rosengard, creative services director, Emerald Expositions.

“The artwork for the office has a personality unique to Massachusetts with its Boston theme. It is a great environment to come to every day.”

This article was originally posted in January 2020.

Major Universities’ AV Experts to Discuss Networked AV Best Practices at Industry Roundtable
https://mytechdecisions.com/it-infrastructure/networked-av-best-practices-roundtable/ (Mon, 21 Oct 2019)

A roundtable discussion exploring networked AV best practices and the transformative effects networked AV has on learning and higher education will take place at 11 a.m. PT on Oct. 29.

The event includes an in-person discussion between multiple AV experts. A live Q&A, where webinar attendees can ask questions of the participants, will follow immediately after the discussion.

The roundtable event is hosted by Audinate, developer of the industry-leading Dante AV networking technology, and brings together experts from the University of Southern California, the University of Oregon, Arizona State University, the University of Colorado Denver, and Brigham Young University – Idaho.

“When we talk with our partners about how they are using networked AV we always come away impressed at their innovation and ingenuity,” said Joshua Rush, Senior Vice President of Marketing at Audinate.

“This is especially true in the higher education market. We realized it would be incredibly beneficial to bring together some of the leading minds utilizing networked AV in higher education – and then share that discussion with the industry at large.”

Networked AV best practices discussion panelists include:

  • Joe Way, Director of Learning Environments at the University of Southern California
  • Guy Eckelberger, Director of Information Technology at the University of Oregon
  • Sean Snitzer, Senior Systems Analyst at Arizona State University
  • Scott Burgess, Manager of Recording Labs and Live Sound at the University of Colorado, Denver
  • Will Davidson, Media Systems Engineer at Brigham Young University – Idaho

Registration for the webinar is open now at: www.audinate.com/ddm-edu

ADTRAN Addresses Connect America Fund Program with Fully-Managed, Cloud-Based Performance Test Solution
https://mytechdecisions.com/compliance/adtran-caf-connect-america-fund/ (Tue, 20 Aug 2019)

Adtran, a provider of open networking and subscriber experience solutions, announced its fully-managed, cloud-based performance test solution, built to streamline data collection and reporting for network operators that utilize the Connect America Fund (CAF) program.

Recipients of these federal broadband subsidies will be required to submit testing results as part of their annual compliance reporting in the first quarter of 2020, Adtran says.

Carriers that do not comply with the FCC speed and latency requirements will be subject to a reduction in support, commensurate with their level of noncompliance. In addition, providers will be subject to audit of all testing data.

To help ensure that the funds allocated are used for broadband service delivery in rural areas, the Federal Communications Commission (FCC) has adopted Performance Monitoring and Measurement requirements to ensure greater accountability for recipients of Connect America Fund (CAF) high-cost universal service support, including price cap carriers, rate-of-return carriers, rural broadband experiment (RBE) support recipients, Alaska Plan carriers, and CAF Phase II auction winners. — company press release

“We developed this one-stop, fully managed subscription service to give our customers peace of mind by offering a turnkey option that offloads the entire testing process, so they can focus on what they do best—providing a superior subscriber experience,” ADTRAN Services Portfolio Manager Derek Foster said.

“We know there is still some uncertainty with the FCC, which causes concern for our customers and that’s why we developed a simple and affordable compliance service.”

The FCC framework for CAF Performance Monitoring and Measurement includes measurements to be conducted over a minimum of two consecutive weeks during peak hours for at least 50 randomly selected customer locations within the census blocks of each state for which the provider is receiving model-based support.

Alternatively, service providers may deploy at least 50 white boxes to customers within the Connect America Fund Phase II-funded areas within each state and certify that 95 percent or more of the measurements taken quarterly during peak periods for a period of two weeks were at or below 100 milliseconds.

The service provider is responsible for the hardware and administrative costs of this testing.
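
To make the 95 percent / 100-millisecond criterion above concrete, here is a minimal sketch in Python, assuming a simple list of peak-period latency samples from one test location; the threshold and pass percentage come from the requirement described above, while the data layout and function names are illustrative assumptions.

```python
# Illustrative sketch only: checks a batch of peak-period latency samples
# against the CAF criterion described above (95% of measurements at or
# below 100 ms). Sample data and function names are hypothetical.

LATENCY_LIMIT_MS = 100.0      # FCC latency threshold cited above
REQUIRED_PASS_RATE = 0.95     # 95 percent of measurements must comply


def latency_compliance(samples_ms):
    """Return (pass_rate, compliant) for a list of latency samples in ms."""
    if not samples_ms:
        raise ValueError("no latency samples collected")
    passing = sum(1 for s in samples_ms if s <= LATENCY_LIMIT_MS)
    pass_rate = passing / len(samples_ms)
    return pass_rate, pass_rate >= REQUIRED_PASS_RATE


if __name__ == "__main__":
    # Hypothetical two-week peak-period samples from one test location.
    samples = [42.0, 55.3, 61.8, 97.2, 103.5, 48.9, 77.0, 88.4, 64.1, 99.9]
    rate, ok = latency_compliance(samples)
    print(f"pass rate: {rate:.1%} -> {'compliant' if ok else 'non-compliant'}")
```

A real deployment would aggregate results across all required test locations and reporting periods before submission, which is the portion of the workload the managed service described here is intended to absorb.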

The ADTRAN managed testing solution takes care of each aspect of FCC compliance from initiating the test to collecting data to report creation and submission to the FCC in the format needed.

It removes the administration burden related to Connect America Fund test compliance and utilizes the ADTRAN Speed Test Server Network as a cloud-based, managed service. The solution includes the SmartRG Device Manager Software to serve as the test controller and the SmartRG Gateways as the test clients and is available across the entire SmartRG portfolio of gateways. — company press release

By adopting ADTRAN’s solution, carriers can deploy a purpose-built, open, and flexible system that leverages existing hardware, removes testing of unmanaged LAN segments, reduces privacy concerns for end customers, and eliminates the unsightly white boxes used during the testing period.

Once deployed, the solution provides ongoing value by delivering network performance metrics and measurements that can help the operator better understand network performance over time.

Masergy’s New Intelligent Service Control Portal Experience Packages Analytics and Control
https://mytechdecisions.com/it-infrastructure/masergy-intelligent-service-control-portal-revamp/ (Wed, 14 Aug 2019)

Masergy, a software-defined networking company and provider of managed SD-WAN, cloud communications, and managed security solutions, announces the next evolution of its Intelligent Service Control portal that enhances the application user experience, transparency, and analytics.

Masergy’s new ISC portal simplifies and unifies network and application management with real-time visibility, analytics and service control purpose-built for the multi-cloud enterprise.

The portal offers a holistic view of clients’ global SD-WAN and Unified Communications as a Service (UCaaS) applications while enabling the ability to manage, secure, and optimize their network environments in real-time.

From private and public bandwidth to edge devices, public/private cloud instances, business applications, and voice/video communications, global enterprises now have the unified visibility and control they need to maximize their application performance. No more reconciling information across multiple dashboards.

Features of Masergy Intelligent Service Control portal include:

  • A single pane of glass delivers unified views of analytics for the customer’s global networks, UCaaS, WAN edge devices, and application performance.
  • Customizable dashboard views allow the customer to feature the information they need with views of top applications, security threats, network services, network usage, and open support tickets.
  • Real-time bandwidth controls provide the ability to modify port bandwidth globally across both public and private connections.
  • End-to-end visibility of application performance helps customers make faster, more informed decisions about bandwidth allocation and service improvement.
  • Comprehensive self-service features empower the customer to control the network with site and contact management capabilities, ticket tracking, alarm notifications, invoicing, device reports, change history, and escalations.

“When it comes to accelerating the pace of IT and building a multi-cloud environment, unified network visibility is everything–global enterprises need a single source of truth for information about their cloud application performance,” said Masergy’s Chief Digital Officer Terry Traina.

“Masergy’s Managed SD-WAN solution is leading the industry in providing customers with a unified framework for monitoring and managing their increasingly complex environments, so our customers can ensure cloud application reliability, streamline WAN management, and stay focused on their strategic initiatives.”

Setting Up O365 as A Unique Layer In Your Governance Hierarchy
https://mytechdecisions.com/it-infrastructure/o365-governance-hierarchy/ (Tue, 28 May 2019)

The Microsoft Office 365 environment boasts a spectrum of components, some of which can be managed by your organization’s existing frameworks. However, other components of the O365 environment may require the development of a unique governance hierarchy framework.

Organizations implement varying levels of governance hierarchy frameworks to align strategy, policies, and procedures. At the highest level, an organization’s Corporate Governance Framework defines the “systems of rules, practices, and processes by which companies are governed”.

Below this layer is a myriad of function-specific frameworks, such as the organization’s IT and Digital Governance, Records Governance, and Data and Information Governance Frameworks.

While these functional governance frameworks contribute to the overall governance of your new or existing O365 environment, gaps are likely to remain.

This article explores those gaps that your new O365 Governance Framework should address while acknowledging those components covered by your organization’s existing governance frameworks.

And while each organization requires its own unique set of governance frameworks, this article also argues that if your organization lacks functional governance frameworks, then those frameworks should be considered before creating a single governance framework to address all O365 components.

We’ve Got You Covered

The following Office environment components should be covered under your organization’s current functional governance frameworks:

O365 Application Implementation and Utilization Consistency

O365 components addressed: application selection and utilization, application integration, O365 security

Office application(s) involved: all

Contributing governance framework(s): IT Governance

The suite of applications associated with the O365 environment is not only large and slightly overwhelming to the average user, but the list of available applications is also constantly changing as Microsoft updates its technological objectives.

How do you know which applications are useful for your business, and how do you keep user adoption consistent across the applications chosen to meet those business needs?

A robust IT Governance Framework has been credited with enabling ROI realization for IT investments.

One major component shared by robust IT governance hierarchy frameworks is the alignment of IT objectives with the business’ requirements, objectives, and strategies, allowing for technology investments that support business needs.

So, if the business has performed an accurate needs analysis and its requirements are documented, knowing which O365 applications are necessary to enable those requirements is not nearly as difficult as guessing.

However, the micro-governance required for each of the implemented applications is vital to O365 ROI realization and is detailed in the section of this paper describing the O365 Governance Framework.

This alignment between the business and IT also ensures that ‘Shadow IT’ (the application implementation and utilization by a user outside of an IT Department’s list of authorized application usage) is kept to a minimum, reducing security risk, and increasing application adoption consistency across the user population.

Office Security

O365 components addressed: O365 security, permission levels, change management

Office application(s) involved: all

Contributing governance framework(s): IT Governance

It goes without saying that the security of the O365 environment is largely the responsibility of the IT Department, and its corresponding standards and processes should be outlined in the IT Governance Framework.

Additionally, the O365 environment allows for general security to be managed by the business through administrative functionality that does not require IT development intervention.

However, the organization’s typical user is one of the most significant (and uncontrolled) risks to a technology environment. The business therefore bears much of the responsibility for mitigating user-related risk through employee cybersecurity awareness training and detailed documentation of access and permission levels (including view, read, and change), followed by the necessary change management (as employees’ permission requirements change due to promotion, termination, or other personnel changes) and prompt collaboration with the IT Department when these changes occur.

This process is particularly efficient if the O365 environment pulls user information from existing Active Directory (AD) lists.
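
As a rough illustration of the change-management idea above, the sketch below maps directory group membership to O365 permission levels and flags accounts whose access no longer matches their role; the group names, permission levels, and input format are assumptions for illustration, not a real Active Directory or Microsoft 365 integration.

```python
# Illustrative sketch: reconcile desired O365 permission levels against a
# user list pulled from a directory export. Group names, permission levels,
# and the input format are hypothetical, not a real AD/Graph integration.

# Permission level implied by each directory group (highest wins).
GROUP_PERMISSIONS = {
    "O365-Editors": "change",
    "O365-Reviewers": "read",
    "O365-Visitors": "view",
}
LEVEL_RANK = {"view": 1, "read": 2, "change": 3}


def desired_level(groups):
    """Return the highest permission level implied by a user's groups."""
    levels = [GROUP_PERMISSIONS[g] for g in groups if g in GROUP_PERMISSIONS]
    return max(levels, key=LEVEL_RANK.get) if levels else None


def permission_changes(directory_users, current_permissions):
    """Yield (user, old_level, new_level) where access should be updated."""
    users = set(directory_users) | set(current_permissions)
    for user in sorted(users):
        new = desired_level(directory_users.get(user, []))
        old = current_permissions.get(user)
        if new != old:
            yield user, old, new


if __name__ == "__main__":
    directory = {"asmith": ["O365-Editors"], "bjones": ["O365-Visitors"]}
    current = {"asmith": "read", "bjones": "view", "cdoe": "change"}
    for user, old, new in permission_changes(directory, current):
        print(f"{user}: {old} -> {new}")  # a new level of None means access removal
```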

The business’ responsibility for ensuring security in their cyber work environments should be outlined in the Company’s Records Governance and/or Information Governance Frameworks and is detailed in the following section.

O365 Digital Records

O365 components addressed: records, correspondence threads, emails, logical topology

Office application(s) involved: SharePoint, Teams, OneDrive, Yammer, OneNote, Outlook, PowerPoint, Word, Excel, Access, Stream

Contributing governance hierarchy: IT Governance, Digital Governance, Records Governance, Information Governance

Digital Records is a grouping of Office components for which contributing functional governance frameworks carry a lot of weight. The term ‘digital records’ encompasses just about any formal and informal collaboration, communication, and work product.

For the public sector, a Freedom of Information Act (FOIA) request could send employees scrambling to locate an informal communique buried in an Outlook folder, while a federal or state regulatory agency could cause a private sector employee to search their personal OneDrive for an expenditure receipt.

The manner in which digital records are structured, stored, archived, and deleted could determine if a department fails an audit, or an employee spends a weekend reproducing misplaced deliverables.

Before a SharePoint environment becomes a rampant dumping ground for old documents or miscellaneous content from a terminated employee’s hard drive, the company’s Records, Digital, and/or Information Governance Frameworks should detail records retention and destruction policies as well as the records’ physical security and cybersecurity protocols, while empowering departments to create approved and socialized naming conventions and taxonomies.
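
To ground the retention-and-destruction point above, here is a minimal sketch of a retention-schedule check, assuming hypothetical record categories and retention periods; the real categories and periods would come from your Records or Information Governance Framework, not from this example.

```python
# Illustrative sketch: decide what to do with a digital record based on a
# retention schedule. Categories and retention periods are hypothetical
# placeholders for whatever the governance framework actually specifies.
from datetime import date, timedelta

RETENTION_YEARS = {
    "expenditure_receipt": 7,
    "project_deliverable": 5,
    "informal_correspondence": 2,
}


def retention_action(category, created, today=None):
    """Return 'retain', 'review_for_destruction', or 'uncategorized'."""
    today = today or date.today()
    years = RETENTION_YEARS.get(category)
    if years is None:
        return "uncategorized"   # flag for the records team rather than guess
    expiry = created + timedelta(days=365 * years)
    return "retain" if today < expiry else "review_for_destruction"


if __name__ == "__main__":
    print(retention_action("expenditure_receipt", date(2015, 3, 1)))
    print(retention_action("project_deliverable", date(2018, 6, 15)))
```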

Another vital component in the Digital Records category of Office components is the environment’s landscape or layout (logical topology) and how well it enables communication and data flow between the various O365 applications.

Without efficient data flow, not only is communication onerous and out-of-sync, but data contextualization and associated metadata may be lost, affecting data searches and retrieval. Efficient topology design will vary by organization and O365 environment structure.

You’re on Your Own

Now that your organization’s existing governance frameworks have addressed a majority of your O365 environment’s components, the remainder of this article will focus on those components that require the development of a unique O365 Governance Framework that easily integrates with the function-specific governance frameworks under the organization’s Corporate Governance Framework.

O365 User Adoption and Training

O365 components addressed: application usage

Office application(s) involved: all applications

Contributing governance hierarchy: O365 Governance

Among the many things to consider when migrating to the O365 environment or increasing user adoption and utilization of an existing O365 environment is employee training.

While some of the applications available in the Office suite are intuitive and commonly used (like Outlook email), other applications (like Power BI) may be less understood by the average user.

And even commonly used and understood applications may contain functionalities that are not widely implemented but could enable specific business requirements.

Formal O365 training programs, workshops, or lunch-and-learns can broadly socialize the value each implemented O365 application brings to the business.

A particular focus on how these applications mitigate or eliminate specific business process- or work-related pain points may drastically increase user adoption and overall organizational consistency.

This training should be managed, directed, and implemented by either an O365 Governance Committee, or a joint collaboration of O365 experts from both the IT Department and the business.

This governance entity also ensures all Office-specific governance needs are addressed and liaises with the owners of the organization’s function-specific governance frameworks to ensure integration of Office utilization with their corresponding standards and policies.

Jack Wilson is a Process Engineer with Optimum Consultancy Services, a software consulting firm specializing in business optimization through advisory services and modern software solutions. For more information, visit www.OptimumCS.com.

O365 Application-specific Governance

O365 components addressed: application selection and utilization, application integration, Office security, records, correspondence threads, emails, logical topology, O365 security, permission levels, change management

Office application(s) involved: InfoPath Forms, PowerApps, Office 365 Forms, Third-party Forms and Apps, Workflows, Custom Apps, Reports and Dashboards, Business Connectivity Services, Searches

Contributing governance hierarchy: Office Governance

Depending on your organization’s existing data transfer mechanisms (for team collaboration, communication, personal material storage, and digital records management), and their pending migration to an O365 environment (or their current implementation within an O365 environment), application-specific governance will be required.

Detailed in a previous O365 governance article, the applications requiring specific governance are:

  • InfoPath Forms
  • PowerApps and Office 365 Forms
  • Third-party Forms
  • Workflows
  • Custom Applications
  • Business Connectivity Services
  • Searches
  • Third-party Applications
  • Reports and Dashboards

A Layered Approach

Assessing those governance frameworks already in place within your organization will prevent your IT Department and business collaborators from creating an O365 Governance Framework that addresses components already governed by those frameworks.

Let existing governance hierarchies manage their intended components across all technological environments, while using your O365 Governance Framework to address those components that are unique to an O365 environment.

Additionally, beware the temptation of using an Office-specific governance framework to address similar components across non-O365 environments.

Canon Adds 10,000 Square Feet to Business Processing Centers
https://mytechdecisions.com/compliance/canon-adds-10000-square-feet-to-business-processing-centers/ (Mon, 27 May 2019)

Expansion by Canon offers more space to help enterprises with their digital transformation initiatives.

Businesses undergoing a digital transformation could all use a helping hand. It’s a huge undertaking that can eat up precious resources. Digital transformation spending will approach the $2 trillion mark in 2022, good for a 16.7 percent compound annual growth rate, according to IDC. Moreover, 30 percent of the global 2000 companies will allocate 10 percent of revenue to digital strategies by 2020. Obviously businesses see digital transformation as a long-term investment.

With the recent addition of a 10,000-square-foot Eastern regional facility in Scranton, Penn., Canon Business Process Services continues to expand the capabilities and reach of its Business Processing Centers. The facilities are designed to help clients with their digital transformation initiatives.

Together with Canon’s Western U.S. and Philippines-based operations, the Pennsylvania location extends the company’s national and global capabilities to better support client business objectives.

“Organizations increasingly are launching digital transformation projects because these initiatives enable them to better concentrate on their core business while containing costs and improving operational efficiency,” notes Joe Marciano, president and CEO, Canon Business Process Services. “With our extended business processing capabilities, Canon can help clients meet these important goals.”

Canon’s processing centers support business functions ranging from invoice and claims processing to document imaging and electronic discovery services for corporations and law firms. Highlights include: workflow and forms design; digitization of hard-copy documents; search-enabled images; historical or vital records preservation; indexing/coding and transactional processing.

Another feature of Canon’s Business Process Centers is security. The facilities underscore Canon’s commitment to ensuring that client data is protected with the highest levels of confidentiality, integrity and ease of access. To support this goal the Business Processing Centers utilize technology and processes—such as dual physical authentication and third-party application penetration testing—to meet relevant industry standards. The processing centers currently have accreditations that include: AICPA SOC 1 and SOC 2; HITRUST/CSF certified; PCI compliant; HIPAA compliant and GDPR Ready.

8 Open Source Digital Transformation Journeys from Red Hat Summit 2019 Keynote
https://mytechdecisions.com/it-infrastructure/digital-transformation-open-source-red-hat-summit-2019-keynote/ (Wed, 08 May 2019)

IBM, Delta, ExxonMobil, Lockheed Martin, Volkswagen, DBS Bank Unlimited, Deutsche Bank, and Microsoft executives explain how they achieved digital transformation at the Red Hat Summit 2019 keynote.

Perhaps I’m showing my ignorance of the industry here, but when I decided to attend Red Hat Summit 2019 in Boston, MA this week I wasn’t ready for the experience I was walking into. I expected a typical trade show – booths, sessions, a keynote at the end of the first day in a modest ballroom. I attended the show because it was local – and I’m glad I did. It turns out that our audience is crazy about open source.

I arrived for the keynotes on the first day, waited for the doors to open, and followed the crowd. We skipped through the tradeshow floor, now dark and roped off, and suddenly I was in a long, dark hallway, crossing beam-lights making "X" shapes for the crowd to walk beneath. The sides were lined with tables full of wine and beer for the crowd to enjoy. At the other end sat the keynote stage – seventy feet across at least, with the largest video wall perhaps forty feet long by ten feet high, flanked by two ten-by-ten video walls. A lone DJ on stage played everything from Cardi B to Stevie Wonder.

Thousands were in the crowd, and each seat held a dedicated bracelet that would eventually light up along with the show. Was I at a technology show? A rave? A rock concert? I wasn’t sure until Red Hat CEO Jim Whitehurst stepped on stage to discuss the theme of this year’s show – Expand Your Possibilities.

Clearly Red Hat Summit was much more than I expected, and that was proven further as the keynote sessions continued. Read on to learn more about the insights and understanding coming out of the keynotes from Red Hat Summit 2019, company by company – specifically how these companies used open source to improve their organizations:

Ginni Rometty, President and CEO, IBM*

*IBM is currently in the process of purchasing Red Hat

  • IBM has been working with open source for twenty years. In 1999, IBM invested $1 billion into open source and Red Hat. It plans on purchasing Red Hat for $34 billion.
  • The roots between the two companies are deep, and the beliefs in how important an ecosystem is to drive innovation are as well
  • Open governance is vital to open source, and if you want to take, then you have to give. You can use it, but you have to contribute.
  • Both companies saw that they had joint clients, and many non-joint clients as well. There is plenty of mission critical work that has yet to move to the cloud, public or private. You need to have a way to connect all of those pieces.
  • The timing is right between IBM and Red Hat because there is a fabric for a standard to do this across the globe. The opportunity is right in front of so many clients – they want to address this: moving data to the cloud, where data needs to be, and hybrid environments. It’s time for chapter two for many clients, and the demand for open source is stronger than ever.
  • IBM and Red Hat will work together, but perhaps not come together in the traditional sense. Preserving the value of open source – all welcome – is part of the mission as well. The mission is to scale open source.
  • Red Hat will stay an independent unit. The idea of having a platform that invites innovation for everyone means open source, open standards, friends and competitors alike. IBM can help power it to make it go further and wider. A broad horizontal scale. The work that Red Hat has done to create its culture is important, and so the companies will work together.
  • IBM will change, building on top of what Red Hat has done. Offering clients a secure mission critical, hybrid stack. A bigger and broader plate for innovation everywhere.
  • All companies are on a journey. There’s been plenty of experimentation, which is great. Now, however, we’re at the stage of scaling. We aren’t sprinkling in technology, we’re at the stage of true transformation. Exciting possibilities include cloud, all of the forms of artificial intelligence augmenting human work, blockchain fundamentally changing supply chains, and quantum computing.
  • This acquisition is a win for clients.

Rahul Samant, EVP and CIO, Delta Airlines

  • Delta has transformed IT in order to transform the customer experience. It is working to be one of the world’s most trusted brands. That means people first, but technology next.
  • Two years ago Delta had data scattered all across the company. The first goal was to gather that data into repositories. The company is building APIs and hosting them on Red Hat OpenShift, so all employees can gain insights on customers to improve service and personalize the customer experience.
  • Before flying, you get emailed a menu of what you will be able to select on your flight. From there, they have automated, single-click check-in. Delta is a pioneer in using RFID technology to track bags from gate to gate. Those experiences are being built off of a digital foundation.
  • On the operation side, they are doing machine learning oriented decision support. When they suffer irregular operation, such as bad weather, they make sure the decision support tools are in the hands of front-line people. That way the decisions on delays and cancels can be augmented, technology helping human decisions.
  • All of this technology sits on Red Hat’s scalable, reliable platform. Red Hat is under the hood, and the customers experience the outcome.
  • The technology vision is easy. It needs to be contextualized. In 2016, when this transformation started, the timing was off. They needed to focus on the basics of the platform. Fast forward a year and a half and they were ready for the transformation – hiring 600 new people with contemporary skills and abilities, and making sure they blend in with legacy employees. Delta focused on the human factors to make sure there wasn’t friction between the two groups.
  • There is a war for talent. Working with partners not only brings great products, but help to bring great talent to the digital transformation.

Austen Smack, Container Platforms Product Owner, ExxonMobil

  • ExxonMobil embraces technology to solve energy challenges. Geoscientists are using Open Shift to change the way they work through disruptive innovation.
  • Disruptive innovation is the idea of introducing new technology that might not have short-term effects, but will pay off in the long-term. ExxonMobil watched customers as they did their work to learn what needed to change. Specifically with container platforms.
  • They started with a small sandbox cluster, and continued to deploy more non-production clusters. As of today, there are over 130 development teams consuming resources on ExxonMobil’s platform.
  • They keep up with these developer teams by surrounding themselves with people that are willing to disrupt, and bringing vendors with them throughout the process. Red Hat and other vendors have been on the team since day one.
  • Everything they are doing is automated. Everything is infrastructure as code. Investing in extra time for automation will reduce technical debt.
  • Containerization allows ExxonMobil to expand delivery options and enable quicker code iterations, then deliver those solutions quickly to customers. Customers might not know that at this point, however. Customers don’t want to put in the work unless it makes sense to them. Containerization expanded delivery options, allowed customers to better collaborate, and allowed for the deployment of solutions. ExxonMobil deployed a data science image, and customers only had to click on a URL to access the application. They could instantly visualize code, and because of that the customers could work with ExxonMobil to make better business decisions, faster. It took hours instead of weeks. The only limitation is how quickly the data scientists can code.

Michael Cawood, VP of Product Development, F-16/F-22 Integrated Fighter Group, Lockheed Martin

  • Lockheed Martin is under two threats – nation-states creating technology to render planes useless, and competitors fighting to take business away from Lockheed Martin.
  • Traditionally, development for the F-22 was a long process: two to three years to develop a plan, and five to seven years to develop new capabilities. They utilized a Waterfall method of development, creating hardware and software independently. Flight tests took over a year before certification. Supply chain processes and hardware development cycles sometimes took years. The organization was siloed, and timelines were longer due to working with secure government systems.
  • Six years ago, Lockheed implemented Agile methods, but didn’t achieve any improvement.
  • In 2017 the company partnered with Red Hat. They changed tools, facilities, process, and culture utilizing Red Hat Open Innovation Labs.
  • The approval of software tools has been sped up from months or years to a few days. They are using companies to build and deliver hardware in five to ten days instead of years. Forecasting improved 40%. They are now on track to deliver communications capability to the F-22 three years earlier than the original plan.

Michael Denecke, Head of Test Technology, Volkswagen

  • Michael’s job is to make sure all control units work properly together. The new challenge of autonomous driving falls under his scope. The normal way of doing tests was not enough.
  • Volkswagen got the idea to run all of the tests in virtual test environments. They used Red Hat to put these tests in virtual containers. They chose Red Hat because, after speaking to a number of IT vendors, Red Hat was the only one to say it could be done.
  • Volkswagen had a short timeframe to prove the viability of the idea. The Red Hat Open Innovation Labs allowed them to reach the goal in this short timeframe. The Open Innovation Labs showed Volkswagen to not worry about top-down innovation, but to put the onus on the team.
  • First they proved they could have virtual test environments. Then they proved they could mix virtual and physical testing environments. Finally, they showed that quality and success was dependent on changing the culture and collaborating.
  • No matter how crazy an idea seems, if you believe in it and it gives a business advantage then look into it. Trust your teams, and give them the responsibility to decide on their own.

David Gledhill, CIO and Head of Group Technology & Operations, DBS Bank Unlimited

  • Ten years ago they started their digital transformation. Five years ago, they saw what was going on in China with banking disruption. The traditional physical branching wasn’t going to work. They decided to drive a digital bank.
  • They were used to a traditional stack, and needed to change that as well. They needed to become a technology company, and looked at some of the biggest players to find out how to be like them.
  • They weren’t building technology like a bank. They moved to open stack and used Red Hat to help with that. They built out the cloud environment, and continued to scale.
  • The goal was to make banking invisible, which needed a lot of engineering. The customer toil needed to be taken out of it. They also needed to think about the ecosystems – how to create them and how to be part of others. They needed to be able to scale without concern, and build APIs (DBS has 350 and counting).
  • As they transformed, they wanted to take the whole bank along with it. Learning how to fail fast was difficult, but also become a culture of continuous learning and development. The cultural shift has been the biggest part of setting the company apart.

Tom Gilbert, Global Head of Cloud, Deutsche Bank

  • Deutsche Bank has created an Everything-as-a-Service platform, aimed at opening up new ways of working and allowing developers to get new ideas into production. They also wanted to use their own infrastructure more productively.
  • Fabric, Deutsche Bank’s EaaS platform, is built on Red Hat OpenShift. 25,000 production containers are held on the platform to date. They are being brought to market in just three weeks.
  • The technical capabilities have helped to foster communities in the bank so anyone can contribute to the platform. New iterations are introduced every few weeks. Rapid prototyping is available on demand. External developers and third parties can supply software, and test applications in the environment.
  • They are running Fabric in multiple premises through multiple suppliers. On-premise IT has multiple suppliers providing data center services, and off-premises in the cloud they are using Microsoft Azure. They can write once and run everywhere. If they need massive scale at short notice for a short period, for example deploying new products into new regions quickly, it’s now possible thanks to the combination of OpenShift and Azure.
  • This hybrid cloud platform on OpenShift let them build all of the controls and automation into the platform, enabling applications in the highly regulated bank to consume the public cloud while reducing risk. It has also reduced cost and increased agility.

Satya Nadella, CEO, Microsoft

  • Microsoft and Red Hat have made much progress together, and announced at the keynote the general availability of the Azure Red Hat OpenShift service, a collaboration between Microsoft and Red Hat.
  • Believers in distributed computing, hybrid computing, and edge computing will appreciate the flexibility and agility provided from this partnership.
  • One thing Microsoft is focusing on is getting data centers to meet the real needs of customers. Ensuring certifications needed don’t create friction for customers. Also the number of regions available – Microsoft has just launched its first data center region in Africa.
  • Nadella is most excited about the number of software engineers being hired outside of the tech industry. It means that software is becoming a new factor of production in every industry. That means computing and architecture is going to change – everything from storage, to cloud, artificial intelligence, quantum, and all the stops between.
  • The thing that truly needs to happen is to ensure the cost structure of raw infrastructure and the innovation platform is something that makes every business competitive. Support of open source and support for standards will reduce the friction of development. The second area is democratizing technology. Machine learning, DevOps, data science – those technologies need to be available anywhere out there so any company can take them and build on them. It’s up to platform vendors to reduce the barriers.
  • Five years ago Microsoft and Red Hat had an adversarial relationship. Moving forward, Microsoft is committed to the partnership because it’s driven by what customers expect of them – interoperability and commitment and contribution to open source.

Everything to Know About Setting Up Wi-Fi and Routers
https://mytechdecisions.com/it-infrastructure/everything-to-know-setting-up-wi-fi-routers/ (Thu, 25 Apr 2019)

Wi-Fi basics explained: everything you need to know about setting up Wi-Fi and routers for the best coverage possible, including positioning and more.

Wireless connectivity and the networks to support it have gone from luxury to necessity. The demand for reliable, continuous, high-speed wireless connections will only grow.

Users expect and depend on a seamless experience when moving about a facility. Whether reading emails, texting or watching video, your network is ‘required’ to keep up with multiple users without a blip or stutter. It would be fair to say that nearly all of your clients expect a Wi-Fi network to perform as well as, and often more reliably than, a wired connection.

It has been stated that wireless is the most convenient and influential method of communication yet invented; unfortunately it is also the most troublesome method yet implemented.

The idea that wireless communication could be more problematic than a wired system strikes many people new to implementing a facility-wide network as odd. A common misconception is that the wireless units are simply connect-and-go, a fallacy that, if followed, frequently results in failure.

There are some straightforward actions designers and systems managers can take to make the system more robust. A great deal can be done without diving deep into the esoteric configuration menus, simply by optimizing basic aspects of the installation.

Who’s On First

The 1940s comedy team Abbott and Costello had a classic bit about misunderstanding the names of baseball players. The premise is a team whose players have unusual names – Who is on first, What is on second, and I Don’t Know is the third baseman. Costello asks ‘Who’s on first?’ ‘Yes,’ responds Abbott. Even if you have never seen the act, you get where the joke is going.

The comedy bit can be used to describe how many wireless network frequencies are set up. If you are unaware of what is being broadcast and how, it leads to confusion and frustration on par with our rube Costello. Determining which Wi-Fi frequencies and channels to use is essential and depends on your user base needs, facility build, and environment.

For our purposes we will focus on the two main frequency sets available to Wi-Fi. These are nestled in the Industrial, Scientific and Medical (ISM) segments of the US frequency spectrum. There are a few others, but these are currently considered more esoteric and specialized than for general usage.

A large portion of Wi-Fi devices operate in either the 2.4GHz or 5GHz frequency range. It is important to note that the ISM band is unlicensed, which means nearly any manufacturer can make and use radios and distribute them in a fairly unregulated manner. What does this mean to your installation? Have you ever noticed that on the bottom of nearly all electronic devices is a sticker declaring compliance with the FCC Part 15 ruling? This little coda dictates that a device working in the unlicensed band essentially must not cause interference with licensed spectrum use and must accept interference. Yes, accept interference – a fun bit of rulemaking, that Part 15.

It’s the Frequency

It was well anticipated that Wi-Fi would have to handle multiple users and would need to have a way to accommodate a lot of traffic on a single RF signal.

The frequency sets are divided into smaller segments called channels. In the 2.4GHz range, the US allocation is broken up into eleven channels ranging from 2412 MHz (channel 1) to 2462 MHz (channel 11). Up until a few years ago nearly 90% of all Wi-Fi devices operated only in this frequency set. This is also where remote control toys, baby monitors, wireless CCTV cameras, microwave ovens and practically every consumer RF device work. It is not the quietest of places to operate in.

The 5GHz range provides more channels, nearly all of them non-overlapping, but the actual number available depends on intended usage. While the frequency set has been assigned for quite some time, its limited use was due to the radios being more expensive and consuming more power than the 2.4GHz models. Today nearly all devices have the ability to use this range, and it is still a relatively quiet spectrum. That sense of emptiness, traffic-wise, would seem to make it a no-brainer to use; however, some inherent issues limit its use.

Crossing the Streams

How do they get eleven-plus channels onto a single frequency range without interfering with each other? They do not. While the 2.4GHz Wi-Fi spectrum does allow for a number of channels, it only has three which do not overlap. Channels 1, 6 and 11 do not overlap each other, but they do have other channels which overlap them on either side.

A common mistake is to immediately set all of your wireless connection points to one of these three channels. The thought process is not without logic: it would seem to reason that non-overlapping channels would be the safest place with less interference. The problem is that this is exactly what every person installing a router or access point is going to do. It is also where a manufacturer of a device which broadcasts Wi-Fi will seat their device by default.

Rather than set up your system on these three channels as a knee-jerk reaction, it is of utmost importance to verify RF traffic with a spectrum analyzer. In many cases it may be best to place your connection points in the overlapping channels and leave a guard band of a channel or two between them. The benefit of avoiding the siren’s call of the set-aside channels of 1, 6, and 11 is knowing that most anyone setting up a rogue router, like in a dorm room or office space, will not overly impact your system.
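
As a rough illustration of the channel math behind that advice, the sketch below computes 2.4GHz channel center frequencies (channel 1 at 2412 MHz, spaced 5 MHz apart, as noted earlier) and flags whether two channels overlap, assuming the roughly 22 MHz occupied bandwidth of an 802.11 channel; it is a planning aid only, not a substitute for a spectrum analyzer survey.

```python
# Sketch: 2.4 GHz Wi-Fi channel spacing. Channel 1 is centered at 2412 MHz
# and channels are 5 MHz apart; each channel occupies roughly 22 MHz of
# spectrum, which is why only channels 1, 6, and 11 avoid overlapping.

CHANNEL_SPACING_MHZ = 5
CHANNEL_1_CENTER_MHZ = 2412
CHANNEL_WIDTH_MHZ = 22  # approximate occupied bandwidth per channel


def center_frequency(channel):
    """Center frequency in MHz for 2.4 GHz channels 1-11 (US)."""
    return CHANNEL_1_CENTER_MHZ + CHANNEL_SPACING_MHZ * (channel - 1)


def channels_overlap(ch_a, ch_b):
    """True if the two channels' occupied bandwidths overlap."""
    separation = abs(center_frequency(ch_a) - center_frequency(ch_b))
    return separation < CHANNEL_WIDTH_MHZ


if __name__ == "__main__":
    print(center_frequency(11))          # 2462 MHz
    print(channels_overlap(1, 6))        # False: centers 25 MHz apart
    print(channels_overlap(3, 6))        # True: centers only 15 MHz apart
```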

Size Matters

When it comes to wavelength, size matters a lot. Earlier in the article it was mentioned that the use of 5GHz was limited; this is due, in part, to its physical limitations. Compared to the signals used for terrestrial radio, or even digital off-air TV, Wi-Fi signals are tiny. Standard broadcast signals can ‘bend’ around objects and even the earth, getting a signal to your receiver. The 2.4 and 5GHz signals are not nearly so nimble.

Even the size difference, small as it may seem, between 2.4 and 5GHz can make a remarkable difference in whether a signal gets through or not.

What breathing room you get from using a higher frequency with less traffic is offset by its greater susceptibility to physical interference. With a smaller wavelength, the signal is much more likely to suffer from absorption by modern building materials and humans. It can also increase confusion from reflecting signal paths and nulls (where colliding reflected signals create a ‘dead zone’). It also means that a system designer must take extra care in the placement of antennas.
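
The size difference described above is easy to quantify: wavelength is simply the speed of light divided by frequency, and the quick calculation sketched below shows a 2.4GHz signal is roughly 12.5 cm long versus about 6 cm at 5GHz.

```python
# Sketch: free-space wavelength for the two main Wi-Fi bands.
# Wavelength (m) = speed of light (m/s) / frequency (Hz).

SPEED_OF_LIGHT_M_S = 299_792_458


def wavelength_cm(frequency_ghz):
    """Free-space wavelength in centimeters for a frequency in GHz."""
    return SPEED_OF_LIGHT_M_S / (frequency_ghz * 1e9) * 100


if __name__ == "__main__":
    for band_ghz in (2.4, 5.0):
        print(f"{band_ghz} GHz -> {wavelength_cm(band_ghz):.1f} cm")
    # 2.4 GHz -> ~12.5 cm, 5.0 GHz -> ~6.0 cm
```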

Objects May Be Closer Than They Appear

If there is one critical piece of hardware in a wireless system, it is the antenna.

Any business person can tell you, it is all about location, location, location. There are numerous causes of low or unreliable signal; many of them can be alleviated by thoughtful placement of antennas.

It is important to provide a clear path for the antenna; keeping the placement where the line of sight is clean will establish a solid base of connectivity. The optimal space would be clear of any and all obstructions, walls, etc.

An empty room is not conducive to business or education needs, so to keep the best-case line of sight, mount the antennas high up. This positioning not only lifts the receiving antenna above the fray, it also helps minimize multipath reflection and wavelength null spots.

The size of any physical obstruction is important, but its proximity is just as impactful, if not more so. A large column will block more signal when it is 10 feet from the antenna than when it is 20 or 30 feet away. A column blocking the ‘view’ of an RF transmitter may seem an obvious problem, like the proverbial stadium seat behind one, but the same applies to any object in a room. This ‘inverse square rule of obstructions’ comes into play with file cabinets, desks and even humans.

In addition, avoid placing antennas in corners or near walls, as this increases reflections. Make sure that they are not behind metal doors, in equipment racks, or in the ceiling (above tiles). While the first two may seem obvious, they are common mistakes. Ceiling tiles often contain materials which inhibit RF or reflect signal back. Installing antennas in any of these configurations will only increase the dreaded multipath interference and greatly diminish system performance.

It’s Not Polite to Point

Pointing an antenna at the desired coverage area will only cause grief. A standard antenna radiates energy in all directions, but it is designed to generate the strongest signal from its sides, with the greatest rejection at the tip and base.

Pointing a Wi-Fi antenna as one would a satellite dish will actually decrease the power delivered to the intended location. When setting the orientation of your antennas, consider just what you are looking to cover. Is it just the floor, or are there balconies and open-plan secondary floors? When installing antennas, first look for ways to expose the maximum transmission surface to as much of the space as possible; the pattern sketch below shows why the sides, not the tip, do the work.
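
The sketch below evaluates the textbook half-wave dipole power pattern, on the assumption that the typical whip or paddle antennas shipped with Wi-Fi gear behave roughly like dipoles: strongest broadside, with nulls toward the tip and base.

```python
import math

def dipole_relative_power(theta_deg: float) -> float:
    """Normalized power (0..1) at an angle theta measured from the antenna axis."""
    theta = math.radians(theta_deg)
    if math.sin(theta) == 0:       # straight off the tip or base: a null
        return 0.0
    return (math.cos(math.pi / 2 * math.cos(theta)) / math.sin(theta)) ** 2

for angle in (0, 30, 60, 90):
    share = dipole_relative_power(angle)
    print(f"{angle:3d} degrees off the antenna axis -> {share:.2f} of maximum power")
# 0 degrees (off the tip) -> 0.00; 90 degrees (broadside) -> 1.00.
```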

Use a Wi-Fi transceiver system which facilitates the use of MIMO (Multiple Input, Multiple Output) remote antennas. These systems allow the transceivers to choose the strongest signal while covering a larger area. Typically, enterprise-class systems include internal antennas plus the ability to add external units. The remote connections are made via reverse SMA terminations, giving you the ability to extend 50 to 150 meters from the receiver unit via relatively thin-gauge cable; just remember that cable loss accumulates over a run of that length, as the sketch below shows.
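
As a sanity check before committing to a long remote-antenna pull, this small sketch totals up cable and connector losses. The per-meter attenuation and connector figures are placeholders, so substitute the numbers from the datasheet for the cable and frequency you actually plan to use.

```python
# Assumed placeholder figures: check the cable datasheet for real values.
ASSUMED_LOSS_DB_PER_M = 0.22   # attenuation per meter at 2.4 GHz (assumption)
CONNECTOR_LOSS_DB = 0.5        # loss per connector pair (assumption)

def run_loss_db(length_m: float, connector_pairs: int = 2) -> float:
    """Total loss for a coax run of the given length with N connector pairs."""
    return length_m * ASSUMED_LOSS_DB_PER_M + connector_pairs * CONNECTOR_LOSS_DB

for length in (10, 50, 150):
    print(f"{length:3d} m run: roughly {run_loss_db(length):.1f} dB of loss")
```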

Even with the antennas centrally located and mounted from the ceiling there are situations where adding specialized versions to expand or limit coverage is necessary.

Adding high-gain antennas can increase both relative signal transmission and reception. High-gain antennas use specially designed elements which can double the distance covered. Most high-gain units are passive, meaning that they do not need an additional power source or amplifier to provide the improvement.

While it is tempting to simply add a very high-gain unit and call it a day, moderation is key. Very high gain can be problematic for some receiver units and could introduce noise which degrades signal integrity. Typically, an antenna rated at 4 to 8 dBi is sufficient for most installations; the link-budget sketch below shows how gain trades against distance.
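
The following back-of-the-envelope link budget uses the standard free-space path loss formula with hypothetical transmit power and distances to show how a few extra dBi of antenna gain trade off against range. Real indoor losses from walls and bodies come on top of this.

```python
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in meters, frequency in MHz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

TX_POWER_DBM = 20.0   # assumed transmit power
FREQ_MHZ = 2437.0     # 2.4GHz channel 6

for antenna_gain_dbi in (2, 5, 8):
    for distance in (10, 30, 60):
        rx_dbm = TX_POWER_DBM + antenna_gain_dbi - fspl_db(distance, FREQ_MHZ)
        print(f"{antenna_gain_dbi} dBi antenna at {distance:3d} m: "
              f"about {rx_dbm:6.1f} dBm at the receiver")
```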

There are some considerations in the placement of these units: keep them at standard ceiling heights. Mounting them higher can have the undesired effect of narrowing the transmission radius, effectively leaving portions of the floor uncovered.

There are times when an installation needs to ensure that certain adjacent areas of a facility or campus get no coverage. From device-free zones to preventing overlap with other installed systems, the reasons for preventing coverage are as varied as they are valid. Above we mentioned the effect of mounting high-gain antennas too high and the dead zones this can create. That technique is not recommended even when a null or dead zone is desired, since the ratio of transmitted power to available data throughput can be compromised. Instead, use directional units which focus transmission in one region, such as Yagi or parabolic antennas.

Combining antenna types, locations, gain and frequencies can increase a system's reliability and performance with a straightforward and fairly non-invasive installation.

Repeat as Needed

Quite often with larger campus installations just extending the antennas will not suffice.

One solution is to set up a number of individual routers, each connecting to the larger network. While this can work, it is time consuming and requires additional network setup along with IP address configuration and management. In general it is considered unwise to connect multiple masters to a network, even when special connection policies are implemented. In short, having too many masters on a network is just asking for trouble down the line; a quick way to check for them is sketched below.
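
One practical way to spot an accidental second 'master' is to look for extra DHCP servers answering on the segment. The sketch below uses the third-party scapy library, assumes root privileges, and assumes the interface name in IFACE; it is a generic diagnostic sketch, not a statement of how any particular vendor's gear behaves.

```python
# Broadcast a DHCP discover and list every device that answers. More than one
# responder usually means a second router or rogue AP is handing out
# addresses on the same network segment. Requires: pip install scapy, run as root.
from scapy.all import Ether, IP, UDP, BOOTP, DHCP, srp, get_if_hwaddr

IFACE = "eth0"  # placeholder: change to your wired LAN interface
mac = get_if_hwaddr(IFACE)
mac_bytes = bytes.fromhex(mac.replace(":", ""))

discover = (
    Ether(src=mac, dst="ff:ff:ff:ff:ff:ff")
    / IP(src="0.0.0.0", dst="255.255.255.255")
    / UDP(sport=68, dport=67)
    / BOOTP(chaddr=mac_bytes, xid=0x12345678, flags=0x8000)  # ask for broadcast replies
    / DHCP(options=[("message-type", "discover"), "end"])
)

# multi=True keeps listening after the first reply so every server shows up.
answered, _ = srp(discover, iface=IFACE, multi=True, timeout=5, verbose=False)
servers = sorted({reply[IP].src for _, reply in answered})

print("DHCP servers answering:", servers or "none found")
if len(servers) > 1:
    print("More than one device is acting as a 'master' on this segment.")
```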

Access points (APs) are devices which act as 'repeaters' with built-in security, access control and a host of other router-like features, but without the ability to act as a router themselves. Implementation generally requires only the main Wi-Fi credentials, making the APs invisible to the end user. Enterprise-class units provide tools to tweak antenna performance, restrict data types, and limit the number and type of clients that can attach. An AP can connect to the main router or to other APs via an Ethernet connection or wirelessly.

APs are often used as 'hot spots,' allowing general access for wireless users across multiple 'networks.' This application is often seen in street cafes providing free internet access to customers, who can move between hot spots without losing internet and without needing specific access to the network itself. When used as an adjunct to a facility's main wireless network, APs, or wireless access points (WAPs), provide a managed tool to extend service.

Mesh, no Mess

Adding wireless access points, rather than relying on simple antenna distribution alone, can also provide the backbone for a more sophisticated, self-healing network.

WAP units can be set to operate as a mesh network. Where standard network connections simply feed the received signal back to a main router, mesh network devices work collaboratively. Rather than depending on a client device jumping from antenna to antenna, an AP configured for mesh will automatically choose the strongest signal and route traffic between devices. There is no need to worry if an intermediate device goes down, as the mesh technology will automatically find an alternate path, ensuring that the data always gets through. The toy routing sketch below captures the idea.
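
The self-healing idea is easy to see in miniature. The following toy sketch, using the third-party networkx library and made-up access point names and link costs, models the mesh as a weighted graph and lets a shortest-path search reroute around a failed node.

```python
import networkx as nx  # third-party: pip install networkx

# Each node is an AP; each edge weight is a link cost (worse signal = higher cost).
mesh = nx.Graph()
mesh.add_weighted_edges_from([
    ("AP-lobby", "AP-hall", 1.0),
    ("AP-hall", "AP-office", 1.0),
    ("AP-lobby", "AP-cafe", 2.5),
    ("AP-cafe", "AP-office", 2.5),
])

print(nx.shortest_path(mesh, "AP-lobby", "AP-office", weight="weight"))
# -> ['AP-lobby', 'AP-hall', 'AP-office']

mesh.remove_node("AP-hall")          # an intermediate AP goes down
print(nx.shortest_path(mesh, "AP-lobby", "AP-office", weight="weight"))
# -> ['AP-lobby', 'AP-cafe', 'AP-office']
```

Real mesh firmware layers security, airtime metrics and roaming on top of this, but the rerouting behavior is essentially what is shown here.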

Spectrum Analysis

With an understanding of some fundamentals of Wi-Fi reception issues and solutions, the question becomes which tools can provide the information needed to plan.

To determine just what the radio frequency (RF) environment is at your location, you will need an off-air spectrum analysis tool. These devices were once bulky, standalone units priced in the thousands of dollars. Today, many are computer-based software packages paired with a USB dongle.

An off-air spectrum analyzer translates the ambient RF into graphical form, showing which frequencies are in use and how dense the traffic is on each. Depending on the sophistication of the software, a spectrum analyzer can also map a room to show the best positioning of receivers, display the radius of the coverage area and suggest the best solution.

Prices range from just over a hundred dollars to thousands, and the tools come as standalone handhelds or as software. It is important to review the specifications and reporting details of each to ensure it meets your needs. As an example, some lower-priced analyzers provide excellent graphical reports but no way to store or print them; a small script like the one below can help summarize any raw data they do export.
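
If an analyzer can export raw readings even though it cannot print reports, a few lines of Python can turn that export into a usable summary. The CSV column names below describe a hypothetical export format, so adjust them to match your tool's output.

```python
import csv
from collections import defaultdict

def summarize_scan(csv_path: str) -> None:
    """Summarize a (hypothetical) export with 'channel' and 'rssi_dbm' columns."""
    readings = defaultdict(list)                 # channel -> list of RSSI values
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            readings[int(row["channel"])].append(float(row["rssi_dbm"]))

    for channel in sorted(readings):
        samples = readings[channel]
        print(f"Channel {channel:2d}: {len(samples):3d} readings, "
              f"strongest signal {max(samples):6.1f} dBm")

# summarize_scan("site_survey_export.csv")  # the path is a placeholder
```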

If you are just starting out, Wi-Spy is an excellent beginner's tool with several options and upgrades available. Combining a simple USB dongle and a software package, it is a powerful tool which may suit many managers for years.

AirMagnet from Fluke Networks is the gold standard in Wi-Fi spectrum analysis tools. With direct-to-computer capture, remote network devices, handheld diagnosis and an outstanding reporting tool, it is the most complete option.

As with any tool, knowledge of its parts and repeated practice will make it a second nature solution.

Fraught no More

The design, implementation and maintenance of a distributed Wi-Fi network can be fraught with frustration. Using the building blocks described here, you can begin to take command of your Wi-Fi network and create solutions before your clients notice a need.

 

This article was updated; it was originally published in 2016.

The post Everything to Know About Setting Up Wi-Fi and Routers appeared first on My TechDecisions.

Kaleido Announces New Support for Microsoft Azure; Unveils Ability to Build Borderless Blockchain Platforms Across Clouds and Geographies https://mytechdecisions.com/it-infrastructure/kaleido-announces-new-support-for-microsoft-azure-unveils-ability-to-build-borderless-blockchain-platforms-across-clouds-and-geographies/ https://mytechdecisions.com/it-infrastructure/kaleido-announces-new-support-for-microsoft-azure-unveils-ability-to-build-borderless-blockchain-platforms-across-clouds-and-geographies/#respond Mon, 25 Mar 2019 18:00:20 +0000 https://mytechdecisions.com/?p=15186 New Blockchain Business Cloud enables clients to build networks across multiple clouds and geographies on a single platform.


Implementing a blockchain across multiple clouds and geographies has been simplified, thanks to a new solution introduced by Kaleido. Available on the Microsoft Azure Marketplace, the Blockchain Business Cloud supports Azure as well as Amazon Web Services (AWS). According to industry analysts, AWS and Azure comprise over 80 percent of the public cloud sector. Decentralized ownership of the network across clouds and regions is a foundational enterprise blockchain requirement, and Kaleido's new borderless blockchain capability uniquely fills this need, enabling an accelerated pace of adoption by global blockchain consortia.

Blockchain technology is sweeping across enterprises in all major industries. With its shared ledger and smart contract software, blockchain requires business network participants to collaborate in order to deploy and operate the technology. This collaboration can often take the form of blockchain consortia: groups of organizations that collaborate to define use cases and to develop and run blockchain applications for shared business utility. In fact, according to a recent study by Deloitte, 74% of respondents said their company was either already participating in a blockchain consortium or likely to join one.

Yet organizations to date have struggled to move blockchain projects forward, in part because each enterprise brings its own pre-existing investments in cloud, IT structures, operating preferences and data residency requirements. Participants starting from different entry points can find themselves siloed in one cloud when they want to graduate out of limited-size pilots into large-scale hybrid business networks where they can realize the full value of blockchain ecosystem initiatives.

“Blockchain with its shared ledger is fundamentally a distributed technology. The world’s transactional systems are being reinvented on blockchain and need an enterprise platform that also is broadly distributed,” says Steve Cerveny, Kaleido founder and CEO. “We’ve built our platform across the leading public clouds including Microsoft Azure and Amazon Web Services to give our clients the ability to create global, cross-cloud networks. Our customers’ digital ecosystems can now grow and scale wherever and however the participants require.”

Kaleido on Microsoft Azure is available today, and you can try it for free at kaleido.io.

The post Kaleido Announces New Support for Microsoft Azure; Unveils Ability to Build Borderless Blockchain Platforms Across Clouds and Geographies appeared first on My TechDecisions.

Are Enterprises Nearing the End of the Traditional Router? https://mytechdecisions.com/it-infrastructure/are-enterprises-nearing-the-end-of-the-traditional-router/ https://mytechdecisions.com/it-infrastructure/are-enterprises-nearing-the-end-of-the-traditional-router/#respond Fri, 08 Mar 2019 21:00:56 +0000 https://mytechdecisions.com/?p=14957 Volta Networks unveils cloud-native router solution to reduce cost of ownership for companies.  


]]>
How much are you spending to outfit your organization with the appropriate number of routers? Chances are, more than you want: as your business and network expand, so does your investment in routing technology. Network operators face the daunting task of building and operating networks that continually need to scale to support the explosion in bandwidth consumption from hyper-connected consumers and businesses. Industry estimates expect global IP traffic to reach 4.8 zettabytes by 2022, more than doubling in less than five years, and legacy router vendors' only answer for keeping pace is for customers to buy bigger routers. These are expensive, proprietary platforms that lock network operators into a specific vendor and raise compatibility concerns.

The Volta Elastic Virtual Routing Engine (VEVRE) from Volta Networks is a virtual routing platform that the company claims can reduce total cost of ownership by 90% compared to legacy routers. It can also scale to 255 virtual routers per switch and supports industry-standard routing protocols and carrier automation. Based on patent-pending technology, Volta's VEVRE is in lab tests and trials with network operators worldwide.

“With one of the largest networks in the world, we continually evaluate new technologies that can enhance the performance and economics of our network to better serve our customers,” says Dorian Kim, vice president of IP Engineering at NTT Communications Global IP Network. “In our tests, Volta’s virtual routing technology was completely interoperable with our existing routers while delivering potentially significant cost reductions.”

Cloud-Native Virtual Routing at Massive Scale

Designed on the virtualization and cloud-scale principles that helped disrupt the compute and storage markets, Volta's VEVRE separates the control and data planes and leverages the benefits of the cloud and white box switches built on Ethernet switching silicon to achieve its order-of-magnitude cost savings. Key components include:

  • Volta Elastic Virtual Routing Engine (VEVRE) – the control plane runs on any public, private or hybrid cloud so customers can optimize for the lowest cost. Cloud-native and built on container technology, it is designed for elastic scale, flexibility and resiliency.
  • Virtual Route Processors (VRPs) – the equivalent of a routing engine, VRPs are hosted on the VEVRE, are unique to a specific customer, application or service, and are assigned to a set of physical or logical ports on white box switches.
  • vAgent – Volta's software agent on each white box switch that communicates through standard APIs to the associated VRP. The lightweight and efficient software minimizes processing overhead on the switch while Volta's route compression optimizes its performance.

“Compared to the compute and storage markets, the networking industry is at least ten years behind in taking advantage of the benefits of virtualization and cloud scale – leaving network operators with little choice but to keep buying bigger, more expensive routers that are bloated with features they will never use. Volta was founded on the idea that it was time to entirely rethink routing,” adds Dean Bogdanovic, co-founder and CEO, Volta Networks. “With complete interoperability with any existing router in their networks, our VEVRE platform empowers network operators to begin scaling their network cost-effectively with virtual routing leveraging powerful white box switches and a cloud-based control plane.”

Carrier automation

Volta's VEVRE is designed to support network operator and carrier automation, including NETCONF and YANG. Volta's YANG model service library, powered by an API approach, complies with all the key standards for management, administration and network orchestration (MANO). The API provides a single point of connection for optimizing automation, as compared to appliance-based routing solutions that still require network operators to manage every single box.
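
As an illustration of what standards-based NETCONF/YANG automation looks like in practice, here is a generic query using the third-party ncclient library. It is not Volta's API: the host, credentials and filter are placeholders, and the target device must expose the referenced YANG models and support XPath filtering.

```python
from ncclient import manager  # third-party: pip install ncclient

# Connect to a (placeholder) NETCONF-enabled router and pull part of its
# running configuration. Credentials and the XPath filter are assumptions.
with manager.connect(
    host="router.example.net",      # placeholder management address
    port=830,                       # standard NETCONF-over-SSH port
    username="admin",
    password="secret",              # placeholder credentials
    hostkey_verify=False,
) as session:
    reply = session.get_config(
        source="running",
        filter=("xpath", "/interfaces"),  # e.g. the ietf-interfaces subtree
    )
    print(reply.xml[:500])  # show the start of the XML response
```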

The post Are Enterprises Nearing the End of the Traditional Router? appeared first on My TechDecisions.
