Responding To The Cybersecurity Skills Shortage

An estimated 1 million information-security staff and managers are needed to fill current global demand, according to Cisco’s 2014 Annual Security Report. The shortage of IT security talent is real and likely to accelerate in the near future as the number of devices connected to the Internet continues to expand. The International Information Systems Security Certification Consortium, (ISC)², has calculated that more than 300,000 cybersecurity professionals are needed to maintain and manage increasingly complex business information systems. The consortium reports that a shortage of security experts with leadership and communication skills poses a direct challenge to global organizations.

According to a report by the Center for Strategic and International Studies, sponsored by information security firm McAfee, global cybercrime cost individuals, companies and governments an estimated $375 billion to $575 billion in 2013 and could be a factor in the loss of more than 200,000 jobs in the United States alone. It’s not as if the threat were unexpected; IT professionals have been warning for years about the pending shortage and its impact on business systems.

Many organizations are understaffed when it comes to qualified cybersecurity professionals, or employ teams that lack the experience and skill sets needed to combat an increasing number of security breaches. Most organizations need to carefully evaluate their existing IT security teams to identify weaknesses in individual skills, then provide the training and updated tools needed to bolster those teams’ capabilities. In some cases, outsourcing responsibility for security may help in areas where the internal team is particularly challenged. Organization leaders should also consider the pending IT security skills shortage when making decisions about business processes, applications and device support.

The magnitude of some recent breaches of company data has had a major impact on the financial well-being of the affected organizations. As a result, responsibility for decisions concerning a company’s security infrastructure is steadily gravitating up the management hierarchy. Implementing a culture of compliance, offering advanced training and increasing support for security functions will better prepare a company for the inevitability of the next cybersecurity breach.

A Hiring Strategy for Getting it Right

In today’s employment market, choosing the right candidate for an open position can be an arduous task for most businesses. Filling open positions with the right hires is time-consuming and costly even when the decision turns out well. Selecting the wrong candidate can raise turnover, damage workplace morale and reduce an organization’s overall productivity. The Harvard Business Review recently pointed out that as much as 80% of employee turnover is due to bad hiring decisions. Getting the decision right up front has never been more important, and getting it wrong has never been more costly. A hiring strategy for getting it right includes some simple but fundamental elements.

Reevaluate and verify requirements. Given the ongoing impact of a weak economy on business, few if any existing job descriptions reflect the reality of today’s workplace. Downsizing has expanded the scope of the positions that remain, making past job descriptions obsolete. Reevaluating and identifying the organization’s true needs and wants is a critical first step in an effective hiring strategy. As with any formula, flawed inputs can only produce flawed results.

Use all the tools and resources available. A hiring decision should not be the sole purview of any one individual or department. Consultation and collaboration will improve the outcome of the final decision. Involving others in the interviewing process helps flesh out a prospective candidate’s qualities and capabilities and identifies candidates who will better fit the organization’s culture and workplace climate.

Don’t rely on “seat of the pants” methods alone. The “he or she feels right to me, hire them” approach is still practiced, even at the trendiest companies. But using software and big data to objectively evaluate a candidate’s skills, experience and knowledge can be very effective in determining suitability, and it removes undesirable bias from the hiring process. Choosing the right data tool will enhance decision making and focus hiring efforts on the organization’s specific goals and objectives.

Don’t overlook the intangibles. A candidate’s skill set isn’t limited to functional “hard skills”; it also includes how well he or she works in a collaborative environment. Overlooking or downplaying soft skills such as leadership and communication may mean missing the opportunity to capture a very desirable employee.

Cultivate the personal connection. Identifying a strong resume or profile is just one aspect of a successful hiring formula. Establishing and cultivating a personal relationship during the interview cycle will reveal important character traits not readily apparent on paper. Casual conversation gives an interviewer the opportunity to ask open-ended questions and to gain a deeper understanding of the personal attributes that determine whether an applicant will fit the company culture.

Ask for the sale. In this competitive job market and economy, qualified people are in high demand, and the best candidates have multiple opportunities, making marketing an important part of hiring and retaining qualified talent. Selling your favorite candidate on the features and benefits of your organization may be the most important factor in a job seeker’s decision. Show them why they should choose your organization over a competitor. Present a desirable, competitive pitch, and ask for the sale.

 

Is a Breach of Personal Medical Information the Next Cybersecurity Calamity?

Recent data breaches at major retail chains have made everyone aware of just how much the loss of control over customer credit card data can affect a company and its valued customers. While the retail industry appears to be taking all the heat for cybersecurity problems lately, there is plenty of opportunity for disruptive intrusions into personal data in every industry sector. The loss of credit card information causes significant pain and inconvenience for retail consumers, but a comparable loss of personal medical information in the healthcare sector could result in a calamity that would dwarf the impact experienced by the country’s biggest retailers.

Recent studies show that health care companies continue to unnecessarily expose their networks and patient data to online thieves. Health care fraud costs the United States $80 billion, according to the FBI, and the U.S. currently spends more than $2.7 trillion on health care annually. The potential reward from such a huge economic sector is very tempting for criminals who steal patient records to commit medical identity theft, and with the implementation of the Affordable Care Act (ACA) the situation is sure to get worse. A new report from privacy and information security research firm Ponemon Institute estimates that these breaches will cost the industry about $5.6 billion a year.

Once breached, credit card information is relatively easy to replace and secure with new customer security information and new card numbers. But personal medical records, which can include Social Security numbers, personal health information, and healthcare provider and health insurance data, represent more of a genie-out-of-the-bottle scenario. Once the data gets loose, putting it securely back into the bottle is a much more significant dilemma.

The problem of securing medical records seems to stem from employee negligence, unsecured mobile devices and third-party contractors who have access to sensitive patient information at the health-care organizations they work with, a group that can include the medical center itself, ambulance companies, outside labs, doctors who don’t bill through the hospital, health insurance providers and possibly a debt collector.

A large majority of information security officials identify employee negligence as the industry’s biggest cause of data leaks. With the growing use of personal devices such as smartphones, laptops and tablets, the risk of exposing personally identifying medical information keeps increasing. Approximately 88 percent of medical care facilities permit employees to use their own mobile devices to access patient information, heightening concern that these personal devices are not secure. Now that medical records are being digitized, all that proprietary data is more portable and more accessible to more people, including criminals within the organization and outside hackers. Despite the risk and the impact that data breaches have on patients’ vital personal information, the truth remains that many healthcare providers are simply not prepared, or lack the resources, to deal with the combination of threats that can create significant liability for the caregiving entity.

In response to recent lapses in medical data security, the U.S. Department of Health and Human Services (HHS) has begun to crack down on organizations that put patient data at risk. Earlier this year, HHS fined two health care organizations, NewYork-Presbyterian Hospital and Columbia University, a combined $4.8 million for “failing to secure thousands of patients’ electronic protected health information.” Last year, managed care giant WellPoint agreed to pay HHS $1.7 million to settle potential violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules.

In a traditional environment where healthcare practitioners are often observed whispering private patient information in hushed tones, failing to contain the explosive scattering of personal medical data across worldwide electronic networks seems an unfathomable outcome.

Predicting The Impact Of The Second Machine Age On Employment

The discussion of how new technologies affect traditional jobs and future employment trends is nothing new. Anyone fortunate enough to have experienced the effects of technology on the manufacturing sector several decades ago can attest to the eventual success and desirability of the result for employment and the economy. Initially, the introduction of robotics and numerically controlled machining, with software that could mimic basic intelligence, sent panic through massive industrial machine shops operated by hundreds or even thousands of manual laborers. The prediction was that, with new skills training and huge gains in productivity, the transition, while mildly disruptive to centuries-long manufacturing traditions, would result in many more new and better jobs. The targeted blue-collar roles would see opportunities for higher-paying, higher-skilled work.

As we enter another era of technological growth, one of unprecedented, rapid advancement in everything digital, the discussion of technology’s disruption of present and future employment rages on. The new target is white-collar jobs, and the recurring prediction from many schools of thought is that as information technology automates more and more middle-class jobs, fewer workers will be able to find work.

A report released late last year by the Oxford Martin School’s Programme on the Impacts of Future Technology attempts to quantify the extent of the threat. It concludes that 47 percent of American jobs are at high risk of being taken by computers within the next two decades. The authors believe this takeover will happen in two stages. First, computers will begin to replace people in vulnerable fields such as production labor, transportation and logistics, and administrative support.

The rate of replacement may then lose steam because of logjams in harder-to-automate areas like engineering. The authors call this the “technological plateau” and see a second wave of computerization, dependent on the development of good artificial intelligence, following not far behind. At risk in this stage could be management, science and engineering, and even arts positions. But the pace of computerization could be influenced by issues such as access to cheap labor and regulation of new technology. “Our findings thus imply that as technology races ahead, low-skill workers will reallocate to tasks that are non-susceptible to computerization — i.e., tasks that required creative and social intelligence,” the authors write. “For workers to win the race, however, they will have to acquire creative and social skills.”

Massachusetts Institute of Technology professors Erik Brynjolfsson and Andrew McAfee take a more optimistic view. The authors of “The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies” agree that intelligent machines will increasingly displace many workers, but they believe that work will eventually shift to the new jobs technology creates. Workers will be displaced, not replaced. It is an argument first advanced decades ago, and one that proved more true than false, but an important difference arises with the arrival of the second machine age.

Computers have begun to infringe on white-collar jobs and increasingly automate even professional positions. Today’s advanced technology is challenging once-settled assumptions about the limits of what software can accomplish without human intervention and interaction. Artificial intelligence is far exceeding the realms once imagined: computers can diagnose conditions like breast cancer from X-rays and predict survival rates as accurately as radiologists.

At the center of the debate is this question: Will the experience of the past be replicated in the coming era, or will the speed and extent of technological advancement produce a profoundly different result? Past performance is no guarantee of the future. Will today’s extraordinary technology be different and eliminate jobs on a significant scale?

It appears that while the new digital era has ushered in profound changes in how society labors, recreates and interacts, only time will reveal how all this new technology will affect the way we make our living. But no matter which side of the debate you argue, one fact is clear. As Brynjolfsson and McAfee tell us, “There’s never been a worse time to be a worker with only ‘ordinary’ skills and abilities to offer, because computers, robots, and other digital technologies are acquiring these skills and abilities at an extraordinary rate.”