Monthly Archives: November 2014

Business Improvement Through Process Redesign

Any business can benefit from process improvement; more efficient operations can improve customer service and satisfaction, drive additional business and goodwill and reduce costs, all leading to higher profits. Implementing a process improvement can be a disaster, however, if not done properly. A successful process improvement effort follows these steps:

  1. Identify and map the current process.
  2. Define the goal of the process improvement.
  3. Select a methodology.
  4. Recognize and address the risks that can derail the project.
  5. Plan and execute.
  6. Implement, evaluate and tweak.

Let's look at each step in more detail.

1. Identify and map the current process

Do not try to solve everything at once—select one process at a time. For your first attempt, choose a less problematic project that has a high chance for success. That way, you can go through the exercise, get your feet wet and be ready for tougher challenges.

When you have selected a target for your process improvement, look closely at existing methods and document the way they are done today. Without knowing where you are, you cannot move forward. Do not rely on existing process documentation, because your organization may have strayed from the written procedures. Be honest about where the gaps are.

2. Define the goal of the process improvement

To achieve your goal, you need a target that is specific and measurable. Instead of using “improve customer service” as a goal, try “reduce customer wait times to less than 1 minute on average” as an objective. Without a measurable goal, there is no way to determine if your redesign succeeds.
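
As a simple illustration, once wait times are being captured, checking the objective takes only a few lines. The sketch below uses purely illustrative sample data and a hypothetical 60-second target:

```python
# Minimal sketch: check a measurable goal such as
# "reduce customer wait times to less than 1 minute on average".
# The wait times and the 60-second target are illustrative values.

wait_times_seconds = [45, 30, 95, 50, 70, 40]  # sampled customer wait times

average_wait = sum(wait_times_seconds) / len(wait_times_seconds)
target_seconds = 60

print(f"Average wait: {average_wait:.1f}s (target: {target_seconds}s)")
print("Goal met" if average_wait < target_seconds else "Goal not yet met")
```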

3. Select a methodology

Numerous published methodologies have been developed for process improvements, including such well-known systems as Lean, Kanban and Six Sigma for manufacturing; Continuous Process Improvement for human resources processes; and Business Process Management for billing and collections. Do your homework and select the right methodology for your target process.

Invest in some outside help. Most process improvement methodologies have certified practitioners who can look at your organization and its processes objectively and guide you through the project. This requires time and money, but the investment should be weighed against the anticipated cost savings and increased revenue.

4. Recognize and address risks

Take a good, honest look at your organization and its culture, and identify the risks that can derail the project. Many organizations suffer from resistance to change, a “not invented here” mentality and other pitfalls. Each risk must be addressed and mitigated before getting too far into the project. Again, do research to find the best ways to deal with your organizational risks.

5. Plan and execute

Using your selected methodology, plan and execute your process improvement. The result should be a documented process that eliminates the redundancy, bottlenecks, unnecessary materials and activities, and other factors that plagued the old process.

6. Implement, evaluate and tweak

The final step is to put the new process into practice and evaluate it. Give it some time—few process improvements are overnight successes, because people must learn and grow accustomed to them. Evaluate the process against the goals you set at the beginning. If the goals are being met, congratulations! If not, you may need to revisit the design of the new process and do some tweaking to get it right.

When you have mastered one process improvement cycle, look for additional processes that can be improved. The experience you gain with each one will help you conquer the next.

Written by Ashok Kumar, Manager, Information Security

Mr. Ashok Kumar brings over 14 years of Information Technology and Information Security experience to iBridge. He has worked in Healthcare, BPO, Telemedicine, Remote IT Infrastructure Monitoring and Management, Software Development and Information Security Management. He has an understanding and knowledge of network routers, L2 & L3 switches, virtual Cloud infrastructure, Firewalls, UTMs, Server architectures and Server OS platforms including Novell NetWare, UNIX, Windows, Linux, and Solaris.

Ashok has played key roles in system design and capacity planning for enterprise-class, data-intensive applications for distance learning and diagnostics in healthcare. Recently, he was the lead architect for the design and deployment of a failover solution in healthcare for Patient Health Information (PHI) and demographics. He brings a well-balanced approach to budgets, requirements, and maintenance.

He leads the company's ISO 27001 process implementation and threat and risk assessments. He is responsible for all aspects of security at iBridge and for maintaining a best-in-class environment for internal users and clients.

New Report: Data Breach Threat Prep Improves but Falls Short

It is no secret that sensitive consumer data is a popular target for hackers and cybercriminals. While there is room for improvement, a new study from Experian and the Ponemon Institute shows significant increases in awareness of and preparedness for data breaches among executives and the companies they lead.

Just as important as being prepared for data breaches is having breach action plans in place so companies can take appropriate steps to respond and minimize damage. Rather than waiting for the other shoe to drop, companies should treat preparation as a crucial first step to lessening the effects of consumer-compromising data breaches and the corresponding blows to their reputations.

Positive Changes

The Ponemon study shows a 10 percent increase in companies who report they have privacy protection training programs in place. These initiatives train employees and contractors who may deal with sensitive consumer data on how it must be managed. In addition, nearly three-quarters of all companies have some type of breach action plan ready to go, and one-quarter of the companies surveyed say they currently hold some type of “cyber insurance” policy to help minimize the financial costs of devastating data breaches.

Tough Pills to Swallow

However, nearly one-half of survey respondents reported that their companies experienced a significant data breach (one involving more than 1000 records) last year. This number is up nearly 10 percent from the previous year’s data. More sobering is the 30 percent of respondents who reported that they believe their companies’ current response plans to be inadequate.

Merely having a plan is not enough. It must be airtight and regularly updated due to the rapidly changing nature of breach threats in a wired economy. From the report:

“Regular reviewing, updating and practicing a data breach plan based on changes in the threat landscape and a company’s structure are essential for properly managing a breach.”

That statement is a clear indictment of most companies’ data breach preparedness plans: they are not adequate to begin with, and they are not regularly updated to remain relevant. There is a long way to go still for proper data breach preparedness.

The Road Ahead

The Ponemon report makes a few key recommendations for companies hoping to make genuine good-faith efforts to secure consumer data and minimize their own financial risk should breaches occur:

  • Regular assessments and updates to existing breach response protocols
  • Involving the top brass – CEOs, boards of directors and others – in breach prep and risk assessment
  • Improved training for employees on how to properly guard sensitive consumer data

If companies hope to protect their most valued assets – their customers and their own reputations – from the devastating losses that may result from serious data breaches, it is time to get down to brass tacks. Having a plan is not enough; that plan must be regularly updated, tested and improved. Without a sincere effort at staying ahead of the threat, consumer data will remain ripe for the wrong parties’ picking.

Written by Ashok Kumar, Manager, Information Security

Watching the Fox Watch the Henhouse: Preventing Data Breaches by Employees

Businesses and governments spend a lot of money and effort preventing hackers from accessing sensitive data, and rightfully so: the major data breaches that make the headlines are usually the work of outside agents who use known system vulnerabilities, trickery, and “brute force” methods to get credit card numbers, passwords and other saleable data. It is less well known, however, that many data breaches are carried out by the employees who are entrusted with that data. Sometimes this is deliberate and sometimes accidental, but either way, a data breach can ruin a firm’s business plans, intellectual property strategy and reputation.

There are several steps a business can take to prevent these inside-job data breaches. No method is foolproof, but the right combination of strategies, tools, and enforcement can stop all but the most determined miscreants.

Administrative Measures

Every employment or contractor agreement should require the employee to read and abide by data access and distribution policies, and should include a non-disclosure agreement. These policies and agreements should be carefully written so there is no ambiguity regarding what is expected and what the penalties are for non-compliance.

But people forget, so it is also prudent to remind them through periodic, mandatory training. This training need not be lengthy or onerous, but it should be frequent enough to keep employees thinking about data security and their role in it.

Another important administrative step is to implement clear data classification definitions: what kinds of information are company-confidential, and what can safely be shared with the outside world. When employees understand what types of data are sensitive and need to be protected, they can follow the rules more easily.
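
As an illustration, a classification scheme can be as simple as a small set of levels with a handling rule attached to each. The levels and rules in this sketch are illustrative examples, not a standard:

```python
# Minimal sketch of a data classification scheme; the levels and
# handling rules here are illustrative, not an industry standard.
from enum import Enum

class Classification(Enum):
    PUBLIC = 1        # safe to share externally
    INTERNAL = 2      # for employees only
    CONFIDENTIAL = 3  # restricted to authorized roles
    RESTRICTED = 4    # e.g., customer PII; strictest controls

HANDLING_RULES = {
    Classification.PUBLIC: "No restrictions.",
    Classification.INTERNAL: "Do not share outside the company.",
    Classification.CONFIDENTIAL: "Share only with authorized roles; encrypt in transit.",
    Classification.RESTRICTED: "Access is logged; encrypt at rest and in transit.",
}

def handling_rule(level: Classification) -> str:
    """Return the handling rule an employee should follow for this level."""
    return HANDLING_RULES[level]

print(handling_rule(Classification.CONFIDENTIAL))
```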

System Tools

Another advantage of clear data classifications is the ability to control access accordingly. Every employee should have only the access needed to perform his or her job duties. Sensitive data should be kept in database systems with role-based access controls and audit trails that show who did what and when.
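
To make the idea concrete, here is a minimal sketch of a role-based access check that writes an audit record for every attempt. The roles, permissions, and log format are assumptions for the example, not a reference to any particular product:

```python
# Minimal sketch of role-based access control with an audit trail.
# Roles, permissions, and the logging format are illustrative.
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

ROLE_PERMISSIONS = {
    "billing_clerk": {"read_invoice"},
    "billing_manager": {"read_invoice", "export_invoice"},
}

def access(user: str, role: str, action: str, record_id: str) -> bool:
    """Allow the action only if the role permits it, and log every attempt."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    # Audit trail: who did what, to which record, and when.
    audit_log.info("%s user=%s role=%s action=%s record=%s allowed=%s",
                   datetime.now(timezone.utc).isoformat(),
                   user, role, action, record_id, allowed)
    return allowed

access("jdoe", "billing_clerk", "export_invoice", "INV-1001")  # denied and logged
```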

There are numerous software systems available that can monitor employees’ activities, and some can even prevent copying certain files or file types to USB flash memory devices, email attachments and web sites. These solutions can be pricey and take time to set up properly, but can be a good investment for businesses that have a lot to lose from data breaches.
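
The policy behind such tools can be illustrated in a few lines; real data-loss-prevention products are far more sophisticated, and the file types and destination channels below are purely illustrative:

```python
# Minimal sketch of a data-loss-prevention style policy check.
# The blocked extensions and destination types are illustrative only.
from pathlib import Path

BLOCKED_EXTENSIONS = {".csv", ".db", ".xlsx"}     # file types treated as sensitive
BLOCKED_DESTINATIONS = {"usb", "email", "web"}    # channels to restrict

def copy_allowed(source: str, destination_type: str) -> bool:
    """Block copies of sensitive file types to restricted channels."""
    sensitive = Path(source).suffix.lower() in BLOCKED_EXTENSIONS
    restricted = destination_type in BLOCKED_DESTINATIONS
    return not (sensitive and restricted)

print(copy_allowed("customers.xlsx", "usb"))  # False: blocked
print(copy_allowed("notes.txt", "usb"))       # True: allowed
```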

What about Employee Privacy?

Businesses have been using computers long enough that most employees understand that the computers and data they use at work do not belong to them, and that they have no reasonable expectation that their activities will go unmonitored on some level—which is another item that should be clarified in the initial employment agreement. Monitoring tools need not be obtrusive, creepy or threatening, and most employees will accept them as long as the software does not place unnecessary burdens on their ability to do their jobs.

You never know when even the most trustworthy people will make bad choices regarding their employers’ data. Every employee and contractor is potentially a fox watching the henhouse. But with well-planned policies and procedures, coupled with appropriate, correctly-configured tools, businesses can make inside data breaches difficult, if not impossible, and keep those hens safe.

Written by Ashok Kumar, Manager, Information Security

Stormy Weather in the Cloud?

With near-daily security breaches exposing celebrities’ private photos and compromising consumers’ bank accounts, it’s not too much of a stretch to say that when it comes to the cloud, it seems to be “raining all the time.”

There is no denying the convenience and efficiency of cloud-based storage and computing, but security remains lax and shows little sign of improving. However, it is not just the average Joe or Jane and his or her small business assets that are at risk when security breaches occur: the federal government plans to dedicate one quarter of its $80 billion IT budget to migrating government data to cloud-based systems.

Cloud First

The Obama administration’s “cloud first” policy has encouraged all federal agencies to transition toward cloud-based solutions to improve efficiency and capacity, increase IT flexibility and limit costs. With the ongoing conversion of federal records to electronic versions stored in the cloud and many federal agencies switching to cloud-based email services, consumers’ shifting preferences for cloud-based computing are mirrored at the federal government level. This is an A+ for efficiency and potential savings, but at what cost to security and privacy?

Despite the federal government’s forward-thinking virtualization strategy, its own IT experts do not seem on board. In fact, survey data from MeriTalk found that nearly 80 percent of federal IT pros lack confidence in the ability of their agencies’ cloud vendors to secure data or to achieve FISMA compliance.

Challenges and Change

Until the ongoing weaknesses that continue to plague cloud storage are managed, the transition to a cloud-based federal government will probably not be without significant pains. Beyond the security concerns, an array of other complex challenges makes cloud-based solutions not quite the answer they’re made out to be:

  • Inability of cloud solutions to meet strict federal standards for record keeping
  • Portability issues, including how to migrate cloud data when large government contracts expire
  • Incompatibility of different cloud-based software solutions, making universal access a difficult and risky proposition

While the government works with cloud solutions vendors to obtain and maintain their storage systems, it is the vendors themselves that must focus on improving security. In a competitive market, there must be greater focus on how to create secure systems that meet government security and recordkeeping standards while making data migration simple. Until these issues are resolved, private citizens and federal agencies can make a few changes to help secure their valuable data and financial assets: involve records professionals in selecting cloud-based vendors; define security goals and select cloud solutions that work within them; and investigate potential vendors before signing a contract.

As the great skyward migration carries forward, private citizens and government agencies must be more discerning about their selection of cloud-based providers to secure the future of private and public data.

Written by Desh Urs

Desh Urs brings more than 20 years of entrepreneurial, start-up and Global 500 corporate experience in sales, marketing and general management to the customers of iBridge. He has led sales organizations as SVP at Qsent, Inc. and VP at Acxiom Corporation, and has focused on the usage of data in data distribution, direct marketing, fraud prevention, and law enforcement.

As a Vice President of Global Sales, Services, and Marketing at Silicon Graphics, Inc., Urs managed engineering and non-engineering functions, developing solutions in sciences, telecommunications, manufacturing, media, business, and defense intelligence, for companies with revenues of several billion dollars. During his tenure as Vice President at Think Tools AG and Brio Technology, Inc., he ran business development and alliances providing solutions in Business Intelligence and Decisions Cycle Management to Global 100 corporations worldwide. In the late 1980s, Urs founded Indus Systems, Inc., which he profitably sold to a systems integration company.

Urs serves on several Advisory Boards, as well as many company Boards, in the United States and India.

The Proper Dose of Big Data

In the wired healthcare realm, it is common to hear the term “big data” tossed around. What does this term mean for the average patient and the average clinician?

The trouble with big data is that it is so big – both as a concept and in actuality. The data sets we are dealing with in big data are massive and complex, so traditional modes of analyzing and processing these data sets may not be up to snuff. Once security issues are added to the mix of difficulties of dealing with big data – how to capture, analyze, share and benefit from it – it is tempting to toss its use aside.

However, some clinicians are advocating for a more proactive approach to healthcare big data, even implying that neglecting careful analysis of big data could violate the Hippocratic oath. So what are the benefits for doctors and patients, and how do we make sense of big data in a digital world?

First Do No Harm?

Dr. Bob Wachter, a nationally recognized hospitalist and advocate for doctors going digital, argues that big data is an absolute must for determining the best course of treatment for patients in need. While due diligence must be exercised regarding treatment protocols and settings and the individual needs of each patient, there is another issue that many clinicians still ignore: the cost. Wachter argues that doctors must be more invested in determining the best way to treat patients without ignoring the important mission of controlling healthcare costs, and that not doing so is akin to “doing harm”:

“When we are profligate in our spending we don’t take advantage of the data we have to figure out the best way to treat patients, the best way to prevent bad things from happening, the cheapest way … to safely and effectively take care of a patient. Should that be in the hospital, should that be at home, should that be in a clinic? When we’re not doing that, I think we’re not following our Hippocratic Oath.”

Those are strong words, but Wachter makes a valid point. Snowballing healthcare costs in this country must be brought under greater control, and big data gives clinicians access to a huge amount of information that is useful for determining what treatment approach is best. When this data is ignored and costs are removed from the equation, we can end up with lives saved but ruined by financial crisis – on both a micro (individual patient) and macro (US healthcare system) level.

Where Do We Go from Here?

Wachter proposes a new equation for analyzing healthcare value: quality plus safety plus patient satisfaction divided by total treatment cost. Ignoring the cost factor is no longer feasible in the modern age. First, the economy will not allow it, and big data is available to help physicians determine what treatments will resolve health issues effectively and which are too costly and inefficient to bear. Along with addressing the huge hurdle of ineffectual healthcare security, new focus must also be given to how to better capture and apply big healthcare data. It is not only patients’ lives and wallets but also our nation’s economy that depend on it.
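
Expressed as code, Wachter's proposed equation is straightforward. The scores and costs in this sketch are illustrative placeholders, not real clinical data:

```python
# Minimal sketch of Wachter's proposed value equation:
# value = (quality + safety + patient satisfaction) / total treatment cost
# All numbers below are illustrative placeholders.

def healthcare_value(quality: float, safety: float,
                     satisfaction: float, total_cost: float) -> float:
    return (quality + safety + satisfaction) / total_cost

# Two hypothetical care settings for the same condition:
hospital = healthcare_value(quality=9, safety=9, satisfaction=7, total_cost=20_000)
home_care = healthcare_value(quality=8, safety=8, satisfaction=9, total_cost=5_000)

print(f"Hospital value: {hospital:.5f}, home care value: {home_care:.5f}")
```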

Written by Dean Van Dyke, Vice President, Business Process Optimization

Dean Van Dyke is the Vice President of Business Process Optimization for iBridge. He brings more than 18 years of customer relations, business process outsourcing, Lean Six Sigma, program/project management, records management, manufacturing, and vendor management experience to iBridge. Mr. Van Dyke formerly led Microsoft’s corporate records and information management team, and served honorably for over fourteen years in the U.S. Navy and Army National Guard. He received his Bachelor of Science in Business Administration from the University of South Dakota and his Master’s in Business Administration from Colorado Technical University.

To EHR Infinity and… Beyond?

The Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs provided hospitals and physicians with financial incentives to implement certified EHR technology and achieve “meaningful use.” To meet the requirements for financial incentives, healthcare providers must prove that they are meaningfully using the EHRs to record patient information, exchange care records, and meet other previously established thresholds for measurement.

Just last summer, the U.S. Department of Health and Human Services shared new data compiled by the Office of the National Coordinator for Health IT that showed “significant increases in the use of electronic health records.” So now what?

The Post-EHR Era?

Now that the vast majority of private practice physicians are using at least a basic EHR platform, and nearly half are using advanced functionality EHRs, is it safe to say that providers have met or surpassed the minimal requirements for meaningful use? If so, what happens now?

It’s exciting to see such broad and successful adoption of EHRs, but some prominent healthcare players are indicating that this is just the beginning. Adoption is a key first step, but to capture the full capabilities of EHRs and address major ongoing security concerns, it is important to acknowledge that the technology currently in place is but the first in a long series of steps. Dr. John Halamka, CIO of Beth Israel Deaconess Medical Center in Boston, was recently quoted as saying: “EHRs are bi-planes, not yet jet aircraft.”

What’s Next?

Putting EHRs in place was a monumental challenge, and the U.S. healthcare system seems to have risen to it rather successfully. But now that the basic infrastructure is there, it is time to take some crucial next steps:

  • Addressing serious security concerns
  • Improving compatibility, especially for rural or smaller critical access hospital systems
  • Increasing patient access to EHRs to compile a more accurate lifetime health timeline that is fully portable in a globalized world
  • Taking a broader IT approach to EHRs, including improving storage options to help organize and protect private patient data and imagery

With such major work still to be done, could it be that providers’ ongoing struggles to reach the thresholds required for establishing “meaningful use” have held back the process? It’s no secret that navigating through federal bureaucracy to establish certification is not exactly a walk in the park.

The Future

The future may hold an entirely different healthcare system in which the current iteration of EHRs plays only a small part. Imagine linking wearable health technology such as FitBits and even incorporating health monitoring information in an EHR “live stream” that may alert physicians to potential health troubles before the patient is even aware. Now that the meaningful use framework has been well established, the outlook is exciting for the future of EHRs and other healthcare technology – as long as major issues like security can be “meaningfully” addressed.
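
To make the “live stream” idea concrete, here is a minimal, purely hypothetical sketch of a wearable feed checked against an alert threshold; the readings and threshold are illustrative and not clinical guidance:

```python
# Minimal sketch of a wearable "live stream" feeding an EHR-style alert.
# The readings, threshold, and alerting behavior are illustrative only.

HEART_RATE_ALERT_BPM = 120  # beats per minute; illustrative threshold

def monitor(readings):
    """Yield an alert message for any reading above the threshold."""
    for timestamp, bpm in readings:
        if bpm > HEART_RATE_ALERT_BPM:
            yield f"{timestamp}: heart rate {bpm} bpm exceeds threshold, notify physician"

stream = [("09:00", 72), ("09:05", 75), ("09:10", 131), ("09:15", 80)]
for alert in monitor(stream):
    print(alert)
```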

Written by Dean Van Dyke, Vice President, Business Process Optimization
