Breach Management: The Value of Outsourcing Data Security

According to most legal firms, managing cybersecurity is not a problem.

This mindset, unfortunately, is a much bigger problem.

A Law Department Operations (LDO) survey conducted by Legaltech News in 2015 found that only seven percent of respondents believed their law firms’ cybersecurity strategies could not protect their organization’s data. The consensus was that established cybersecurity policies were enough to handle potential breaches, despite FBI reports from 2011 identifying law firms as major targets of cybercrime.

One survey respondent even scoffed at the lack of caution shown by fellow LDOs: “Not only will big law firms be breached, but they have already been breached. They are just not talking about it.”

With cybercrime on the rise and many legal firms feeling overconfident about their cybersecurity policies, how can law firms be sure that they are keeping data safe?

Outsourcing Data Security

If the first step in correcting a problem is admitting that the problem exists, then legal firms must acknowledge their weaknesses in the areas of cybersecurity and data control. Legal firms are not experts in data security, despite the valuable information sent through legal servers each day. This makes most firms ill-equipped to handle cybersecurity on their own.

Big companies may have dedicated IT security teams, but not all firms enjoy this benefit. If a business lacks in-house expertise, working with third-party security professionals may be necessary. However, not just any security provider will do the trick—the data security team chosen should be able to handle a wide range of issues:

  • Compliance with federal guidelines for data security
  • Hardware security, including desktop computers, cloud storage, external hard drives, and server infrastructure
  • Software security, including updating versions, patching known vulnerabilities, and maintaining malware protection
  • Big data management by way of identifying redundancies, controlling user access to sensitive data, and creating incident response plans

All other considerations aside, there are three primary things to look for when selecting a cybersecurity service provider:

  1. The ability to monitor a system in real time and spot intrusions as they begin,
  2. The ability to stop attempted breaches before data is compromised, and
  3. Established response strategies in case a breach does occur.

Firms that have established protocols in these three areas will have a comprehensive system for detecting and responding to cybercrime. In addition, with hackers developing new strategies for data theft every day, legal firms do not have a second to waste in getting their data security frameworks up and running.


Written by Dean Van Dyke, Vice President, Business Process Optimization

Dean Van Dyke is the Vice President of Business Process Optimization for iBridge. He brings more than 18 years of customer relations, business process outsourcing, lean six sigma, program/project management, records management, manufacturing, and vendor management experience to iBridge. Mr. Van Dyke was the former head of Microsoft’s corporate records and information management team and served honorably for over fourteen years in the U.S. Navy and Army National Guard. He received his Bachelor of Science in Business Administration from the University of South Dakota and his Master’s in Business Administration from Colorado Technical University.


Big Impacts of Big Data: Why Analytics are Necessary to Drive Growth

Although businesses are learning the value of analyzing big data, the process can be difficult to manage at scale. Many enterprises begin with basic data analysis but become bogged down as their efforts progress, preventing them from realizing the full value of big data analytics.

Organizational adaptation is needed on a large scale to drive growth. This usually involves a change in priorities—businesses need to identify the areas most critical to success and apply analytics across the entire enterprise.

Barriers to Change

Investing in analytics can be a hard pill to swallow for bottom-line-focused executives, as early analytic applications don’t always deliver defensible returns. This fundamental fear, however, undermines the true value of big data analytics: innovation and meaningful insights can only be found by assessing data in context and with the necessary scope. Executives who fail to see how analytics can improve decision-making prevent this large-scale assessment from taking place by underfunding the analytic tools, training, and quality controls that quality data analysis requires.

On top of that, businesses without established analytic infrastructure have a tough time capturing the value that analytics provide. Industry-wide shifts are often necessary to move an enterprise toward better analytic governance, and yet top-level executives remain wary of this transition and its costs.

Scaling to Increase Impact

Despite the barriers to change common in companies beginning to leverage big data, new technologies are developing that help address the challenges of achieving scale:

  • Analytics software is improving in sophistication, allowing more targeted solutions that better address the specific needs of each business. This translates to a more direct and demonstrable impact on a business’s bottom line.
  • Users of analytic tools are gaining confidence in the value of analytics; this push towards adoption is a necessary part of analytics generating enough momentum to become a viable option at scale.
  • Aside from the improved tools on offer, businesses hoping to achieve scale must adapt their internal policies to reflect analytics’ increased role. Decision-making must become intertwined with analytic reporting, along with a general push towards a culture of data governance that involves redesigning jobs and placing faith in analytics.

Analytic solutions are becoming more accessible than ever, changing the way businesses handle big data. As the technology continues to mature, a foundational culture of analytic reporting and data management will become a necessity for businesses hoping to stay lean and profitable, and the time to accelerate that transformation is now.


Written by Desh Urs

Desh Urs brings more than 20 years of entrepreneurial, start-up and Global 500 corporate experience in sales, marketing, and general management to the customers of iBridge. He has led sales organizations as SVP at Qsent, Inc. and VP at Acxiom Corporation, and has focused on the usage of data in data distribution, direct marketing, fraud prevention, and law enforcement.

As a Vice President of Global Sales, Services, and Marketing at Silicon Graphics, Inc., Urs managed engineering and non-engineering functions, developing solutions in sciences, telecommunications, manufacturing, media, business, and defense intelligence, for companies with revenues of several billion dollars. During his tenure as Vice President at Think Tools AG and Brio Technology, Inc., he ran business development and alliances providing solutions in Business Intelligence and Decisions Cycle Management to Global 100 corporations worldwide. In the late 1980s, Urs founded Indus Systems, Inc., which he profitably sold to a systems integration company.

Urs serves on several Advisory Boards, as well as many company Boards, in the United States and India.


Two Sides of the Coin: The Inseparability of Process and Big Data

“Big data” is a buzzword in the online world. From an online pundit’s perspective, big data is the key to marketing success, business optimization, and overall project efficiency.

Is it that simple, though?

Unfortunately, data alone cannot provide the meaningful insights needed to enact organizational change. Data is just one side of the coin; the other is the process by which the data is created or acquired. People do not want to see a magic trick; they want to see how it is done.

This transparency is a logical step towards better overall information governance—knowing only the outcome does not provide the competitive insight that the process reveals. Process visibility is increasing in all markets, from customer service to sales to technological development. Industries are learning that the “journey” is just as important as the outcome, and big data is no exception.

Process Analysis of Big Data

Examining the comprehensive process of big data management involves three aspects:

  • Data Quality: Accurate and useful data is necessary to make improvements in any organization. Regardless of what type of data is collected, a system for data quality assurance must be implemented. Trustworthy and actionable data is the cornerstone of effective decision-making.
  • Data Extraction: Data is rarely confined to a single location. Big data aggregation involves collecting information from widespread and diverse sources, and is a more complex process than many people realize. This is where big data and process become intertwined—the variety of ways data is transformed and applied in databases can influence how it’s analyzed. Documenting these extraction methods is necessary to gain a comprehensive picture of how businesses arrive at meaningful results.
  • Data Analysis: After collection, data must be put through analytic algorithms that provide insight into where processes can be improved. Process documentation is essential here, as analytic sorting usually relies on mathematical formulas and suffers from an inherent lack of transparency. Knowing the process by which this data is assessed and how it is applied is an inseparable part of building trust in the data assessment process.

Successful outcomes rely on your ability to describe, define, and adjust your processes. Data is great—but it is not enough on its own. Including the process by which the data is found provides insight that translates to better business transparency, process visibility, and decision-making.
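To make the “data plus process” idea concrete, here is a minimal sketch in Python (hypothetical names and data, not any particular product) of a pipeline in which every step documents itself, so the resulting dataset carries a record of how it was produced:

    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass
    class Dataset:
        rows: list                                       # the data itself
        provenance: list = field(default_factory=list)   # the process log

    def step(data: Dataset, name: str, fn: Callable) -> Dataset:
        """Apply a transformation and record what it did to the data."""
        out = fn(data.rows)
        log = data.provenance + [f"{name}: {len(data.rows)} rows in, {len(out)} rows out"]
        return Dataset(out, log)

    # Example: a quality-assurance step that drops incomplete records.
    raw = Dataset([{"id": 1, "value": 10}, {"id": 2, "value": None}])
    clean = step(raw, "quality-check",
                 lambda rows: [r for r in rows if r["value"] is not None])
    print(clean.rows)        # [{'id': 1, 'value': 10}]
    print(clean.provenance)  # ['quality-check: 2 rows in, 1 rows out']

Anyone reviewing the result sees not just the outcome but the steps that produced it, which is exactly the transparency the three aspects above call for.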


Written by Dean Van Dyke, Vice President, Business Process Optimization

Dean Van Dyke is the Vice President of Business Process Optimization for iBridge. He brings more than 18 years of customer relations, business process outsourcing, lean six sigma, program/project management, records management, manufacturing, and vendor management experience to iBridge. Mr. Van Dyke was the former head of Microsoft’s corporate records and information management team, and served honorably for over fourteen years in the U.S. Navy and Army National Guard. He received his Bachelor of Science in Business Administration from the University of South Dakota and his Master’s in Business Administration from Colorado Technical University.


7 Barriers to Cloud-Based Big Data Adoption

The concept of “big data” is all the rage these days, and the healthcare industry is constantly on the lookout for new ways to aggregate and apply the wealth of information available to it. One of the most common solutions involves a shift towards digital enterprise and cloud-based applications. Unfortunately, adapting healthcare infrastructure to accommodate big data poses several challenges:

Transition to the cloud

As physical data centers become more expensive to maintain, many healthcare facilities are electing to move their data into the cloud. This is a costly move, and it requires top-down organizational changes to the way data is collected, stored, and analyzed.

Security compliance

While cloud-based infrastructure is convenient, it’s also more vulnerable to unauthorized access. Hybridizing physical data centers with cloud systems creates a nightmare of security challenges. The healthcare field is a prime target for hackers, and increasingly complex data storage centers create more openings to be exploited.

Job requirements

Reliance on cloud-based infrastructure demands different skills from healthcare employees: skills focused on applications are taking precedence over those centered on physical transmission and security. The changing data center is more agile and complex than ever before, meaning that employees must diversify their skill sets to stay efficient and competitive.

Global connectivity

Managing a local data center is challenging, but nowhere near as difficult as managing a cloud-based enterprise that spans multiple geographic regions. As technology improves and barriers to international business fall, healthcare facilities must be aware of the regional challenges, legal restrictions, and resource commitments that come with operating on a global level.

ERP applications

The changing digital landscape affects how all information is processed, including the resource planning applications that drive the healthcare industry. The transition to the cloud creates the need for ERP applications that aren’t just optimized for on-premises installation but are also integrated into virtual platforms.

Automation

Automating data analysis in the healthcare industry can streamline production, reduce complexity, and facilitate a more efficient use of resources. However, this process can be costly to implement and requires integration with unique software-defined networks.

Disaster recovery

While doomsday scenarios aren’t usually part of budget planning, the shift towards digital healthcare enterprise provides a level of protection that facilities relying solely on physical locations lack. Moving data to the cloud safeguards it in the event of a natural disaster or server malfunction, giving healthcare administrators a fallback option and better peace of mind.


Written by Dean Van Dyke, Vice President, Business Process Optimization

Dean Van Dyke is the Vice President of Business Process Optimization for iBridge. He brings more than 18 years of customer relations, business process outsourcing, lean six sigma, program/project management, records management, manufacturing, and vendor management experience to iBridge. Mr. Van Dyke was the former head of Microsoft’s corporate records and information management team, and served honorably for over fourteen years in the U.S. Navy and Army National Guard. He received his Bachelor of Science in Business Administration from the University of South Dakota and his Master’s in Business Administration from Colorado Technical University.


Holistic Improvement: Why You Need It and How It Helps

More often than not, the determining factors between the quantity and the quality of improvement are approach and implementation. Month-over-month gains that result from strategic layoffs, product streamlining, or entry into new markets are expected and easily assessed. Conversely, improvement issues, and the opportunities they present, can be difficult to understand and address if not evaluated holistically.

The Holistic Approach to Improvement

Too often, individual departments or branches operate in their own bubbles. Resource sharing and big-picture communication are seldom implemented to benefit the organization as a whole, and as a result, improvement quality can suffer.

Employing a holistic approach to improvement and treating your organization as an economic ecosystem means viewing each separate entity as a moving, living, contributing piece of the puzzle. Developing improvement strategies that not only benefit the organization but can also be implemented at multiple levels is essential to forward momentum.

Contributing Components of Holistic Improvement

It’s synergy and cooperation you’re looking for, and a holistic approach to improvement addresses several key components:

  • Leadership – Being in a leadership role means taking responsibility and initiative. Leadership must be a priority, not just a position. Often, quality improvement is made when leaders motivate employees and help direct paradigm shifts within an organization. 
  • Strategic thinking – Strengthening strategic thinking and focusing efforts on a few intelligently conceived improvement tactics over multiple sub-par methods can prove longer lasting and more impactful.
  • Vision and culture – Creating an operational roadmap or blueprint that outlines clear expectations and improvement goals is invaluable when trying to articulate an organization’s quality vision. A clear vision is needed to successfully cultivate a quality improvement culture.
  • Mission critical problems – Organizing multiple quality improvement projects that focus on specific-yet-complementary areas of an initiative is often the best way to solve complex, multi-faceted “mission critical” problems.
  • Big Data implementation – Use of Big Data resources to fuel a quality improvement culture offers insight into solving previously unsolvable problems.
  • Product/process robustness – The human factor remains the largest source of variation within a system. Developing product and process robustness to counterbalance human variance provides a safety net for an improvement culture.

It’s important to develop and apply holistic improvement methods when evolving your organization’s quality culture. While consumer tastes and employee competency can transform over time, improvement never goes out of style.


Written by Dean Van Dyke, Vice President, Business Process Optimization

Dean Van Dyke is the Vice President of Business Process Optimization for iBridge. He brings more than 18 years of customer relations, business process outsourcing, lean six sigma, program/project management, records management, manufacturing, and vendor management experience to iBridge. Mr. Van Dyke was the former head of Microsoft’s corporate records and information management team, and served honorably for over fourteen years in the U.S. Navy and Army National Guard. He received his Bachelor of Science in Business Administration from the University of South Dakota and his Master’s in Business Administration from Colorado Technical University.


How Big Data Complexity Is Redefining Legal Tech

The volume of data accumulated daily in any law practice is massive, and it continues to grow each year. This growth, combined with the accelerating pace of technological change in the legal industry, is reshaping the face of daily business within a law firm. But is technology changing the legal space, or is the legal industry driving the development of legal technology?

The Chicken or Egg Argument

The age-old question of which came first, the chicken or the egg, has a lot in common with the technology-versus-legal-industry debate. Unlike chickens, though, technology and transformation within the legal industry evolve simultaneously, with each informing the other.

New tech entrants into the marketplace focus on legal disruption, responding to the drive from clients. Changing the way resources are structured takes priority. This not only involves concerns over the sheer volume of data accumulating, but also the increased complexity in keeping up with regulations and compliance mandates.

Data does not exist in a vacuum. For law firms particularly, data must be filtered in a usable way, converting raw information to knowledge. This is just as true for gap analysis during contract management as it is for fact development for a case. A greater understanding must be reached about the best way to collect and manage data within an organization, adopting an approach that works for the daily needs of the business.

One Size Does Not Fit All

None of this should imply there is a “one-size-fits-all” solution for big data management. Instead, the solution lies in implementing various tools and platforms that are flexible enough to work for a range of different client needs. Applying the tactics to the concept, document or data at hand accomplishes clients’ goals, as disparate as they may be.

The traditional legal model is cumbersome and time-consuming, qualities that do not lend themselves well to today’s fast-paced working environment. Clients, boards and shareholders want more comprehensive results, and want those results faster. To do this, start by understanding client risks, then draft a solution designed to meet those specific needs.

The legal industry can no longer claim that technology capable of this level of flexibility does not exist. Instead, legal departments must adopt a culture of technology that lends itself to better process management regarding today’s reality of handling big data.


Written by Desh Urs

Desh Urs brings more than 20 years of entrepreneurial, start-up and Global 500 corporate experience in sales, marketing and general management to the customers of iBridge. He has led sales organizations as SVP at Qsent, Inc. and VP at Acxiom Corporation, and has focused on the usage of data in data distribution, direct marketing, fraud prevention, and law enforcement.

As a Vice President of Global Sales, Services, and Marketing at Silicon Graphics, Inc., Urs managed engineering and non-engineering functions, developing solutions in sciences, telecommunications, manufacturing, media, business, and defense intelligence, for companies with revenues of several billion dollars. During his tenure as Vice President at Think Tools AG and Brio Technology, Inc., he ran business development and alliances providing solutions in Business Intelligence and Decision Cycle Management to Global 100 corporations worldwide. In the late 1980s, Urs founded Indus Systems, Inc., which he profitably sold to a systems integration company.

Urs serves on several Advisory Boards, as well as many company Boards, in the United States and India.


The Proper Dose of Big Data

In the wired healthcare realm, it is common to hear the term “big data” tossed around. What does this term mean for the average patient and the average clinician?

The trouble with big data is that it is so big, both as a concept and in practice. The data sets involved are massive and complex, and traditional modes of analyzing and processing them may not be up to the task. Add security concerns to the existing difficulties of capturing, analyzing, sharing, and benefiting from big data, and it becomes tempting to set the whole enterprise aside.

However, some clinicians are advocating for a more proactive approach to healthcare big data, even implying that neglecting careful analysis of big data could violate the Hippocratic oath. So what are the benefits for doctors and patients, and how do we make sense of big data in a digital world?

First Do No Harm?

Dr. Bob Wachter, a nationally recognized hospitalist and advocate for doctors going digital, argues that big data is an absolute must for determining the best course of treatment for patients in need. While due diligence is required regarding treatment protocols, care settings, and the individual needs of each patient, there is another issue that many clinicians still ignore: cost. Wachter argues that doctors must be more invested in finding the best way to treat patients without ignoring the important mission of controlling healthcare costs, and that failing to do so is akin to “doing harm”:

“When we are profligate in our spending we don’t take advantage of the data we have to figure out the best way to treat patients, the best way to prevent bad things from happening, the cheapest way … to safely and effectively take care of a patient. Should that be in the hospital, should that be at home, should that be in a clinic? When we’re not doing that, I think we’re not following our Hippocratic Oath.”

Those are strong words, but Wachter makes a valid point. Snowballing healthcare costs in this country must be brought under greater control, and big data gives clinicians access to a huge amount of information that is useful for determining what treatment approach is best. When this data is ignored and costs are removed from the equation, we can end up with lives saved but ruined by financial crisis – on both a micro (individual patient) and macro (US healthcare system) level.

Where Do We Go from Here?

Wachter proposes a new equation for analyzing healthcare value: quality plus safety plus patient satisfaction, divided by total treatment cost. Ignoring the cost factor is no longer feasible in the modern age: the economy will not allow it, and big data is available to help physicians determine which treatments resolve health issues effectively and which are too costly and inefficient to bear. Along with addressing the huge hurdle of ineffectual healthcare security, new focus must be given to better capturing and applying big healthcare data. It is not only patients’ lives and wallets but also our nation’s economy that depend on it.
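Written out, the value equation Wachter describes amounts to something like this (a plain rendering of the components named above, not his exact notation):

    Healthcare Value = (Quality + Safety + Patient Satisfaction) / Total Treatment Cost

The implication is straightforward: with quality, safety, and satisfaction held constant, value rises as treatment cost falls, which is why cost can no longer be dropped from the equation.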

Written by Dean Van Dyke, Vice President, Business Process Optimization

Dean Van Dyke is the Vice President of Business Process Optimization for iBridge. He brings more than 18 years of customer relations, business process outsourcing, lean six sigma, program/project management, records management, manufacturing, and vendor management experience to iBridge. Mr. Van Dyke was the former head of Microsoft’s corporate records and information management team, and served honorably for over fourteen years in the U.S. Navy and Army National Guard. He received his Bachelor of Science in Business Administration from the University of South Dakota and his Master’s in Business Administration from Colorado Technical University.

The Top 5 Tips for Big Data Use in Healthcare

Like virtually every other industry, success in the healthcare sector these days relies on becoming more data-driven. Leveraged properly, big data can help deliver better patient care while reducing the per capita cost of that care. A targeted investment in healthcare analytics, combined with best practices from the big data space, can be a recipe for analytic success. Here are a few tips to help you get started.


1. Set Clear Goals

The first step in a successful big data analytics project is to define your business objective. Knowing exactly what you want to accomplish with big data at your back is a must before launching into a new idea. For example, are you trying to answer specific business questions, the scope of which exceeds traditional tools? Or do you want to make future predictions that could shape the way you make business decisions next quarter? Without taking the time to set definitive goals ahead of time, you run the risk of creating a very expensive failure.

2. Take a Comprehensive Approach

It’s natural to assume that analysis only applies to previously unstructured data, but don’t forget to take into account the answers that are probably hiding in data that’s already been processed and cleansed. You also need to include data from not-so-obvious sources, like social media and web logs. Any data analysis project has to be all-inclusive in order to establish a meaningful big picture.
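As a minimal sketch (hypothetical sources and field names, assumed purely for illustration), an all-inclusive collection step might normalize the obvious and not-so-obvious sources into a single tagged record set before analysis begins:

    import json

    # Hypothetical inputs: structured records, raw web-log lines, and
    # social media posts; none of these reflect a real system.
    structured = [{"customer_id": 1, "purchase": 49.99}]
    web_logs = ['{"customer_id": 1, "page": "/pricing"}']
    social = [{"user": "c1", "text": "comparing plans this week"}]

    # Tag each record with its source so the analysis sees the full
    # picture, not just the data that has already been cleansed.
    records = (
        [{"source": "structured", **r} for r in structured]
        + [{"source": "web_log", **json.loads(line)} for line in web_logs]
        + [{"source": "social", **s} for s in social]
    )
    print(len(records), "records from sources:", {r["source"] for r in records})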

3. Embrace Discovery Analytics

Big data doesn’t replace legacy evidence-based research, but effective analytics are essential for separating the wheat from the chaff. In practice, there is little difference between discovery analytics and big data analytics: neither is just about reporting; both help inform diagnosis and strategy. Through the use of new algorithms and data visualization techniques, big data can speak volumes, and far more clearly.

4. Simplify

Big data doesn’t have to be overwhelming if you take a simplified approach. Choose analytics technologies that help you connect using familiar tools, and that also support short-cycle iterative analysis. This helps open the analysis field to more minds than just a handful of highly paid specialists.

5. Engage Outside Experts

Managing big data is no small task; no matter how skilled your IT staff and existing analytics team, your big data project can surely benefit from some specialized support. Working closely with an experienced vendor can shorten the learning curve tremendously when it comes to figuring out new processes for big data analytics.

Written by Dean Van Dyke

Dean Van Dyke is the Vice President of Business Process Optimization for iBridge. He brings more than 18 years of customer relations, business process outsourcing, lean six sigma, program/project management, records management, manufacturing, and vendor management experience to iBridge. Mr. Van Dyke was the former head of Microsoft’s corporate records and information management team, and served honorably for over fourteen years in the U.S. Navy and Army National Guard. He received his Bachelor of Science in Business Administration from the University of South Dakota and his Master’s in Business Administration from Colorado Technical University.