Thursday, August 23, 2007

Employers' Biggest Legal Mistakes

by Rob Gilmore
[Workforce Week August 19-25, 2007 Vol. 8 Issue 34]

Ten things that can explode into costly lawsuits, unionization and an unhappy workforce.

What are the biggest employee-related mistakes employers make these days? And how can you defuse these potential time bombs before they explode into costly disputes? Here's a quick overview of the top 10 employer mistakes and how to avoid them.

1. Failing to establish an effective sexual harassment policy.
Recent Supreme Court decisions hold employers liable for their supervisors' actions unless complaining employees fail to take advantage of company complaint procedures. In light of these rulings, implementing policies and procedures for dealing with sexual harassment is more important than ever. It is also essential that supervisors be trained on these policies and procedures. Finally, an employer must act in a timely manner to investigate all sexual harassment complaints that are brought to its attention.

2. Failing to pay overtime to nonexempt employees.
Many employers pay employees a salary regardless of the number of hours they work, without considering whether those employees are subject to the wage and hour laws. Unless employees are exempt as administrative, executive or professional employees, you must pay them time-and-a-half their regular hourly rate for all hours worked in excess of 40 per week. When in doubt about whether an employee is exempt, pay him or her hourly wages. This will avoid having to pay back wages if you're audited by the Department of Labor's Wage and Hour Division.
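
To make the arithmetic concrete, here is a minimal sketch of the time-and-a-half calculation described above, using a hypothetical non-exempt employee and made-up figures (an illustration only, not payroll or legal advice):

```python
# Hypothetical illustration of the time-and-a-half rule for a non-exempt
# employee. The rate and hours are made up; this is not payroll or legal advice.

def weekly_pay(hours_worked: float, regular_rate: float) -> float:
    """Straight time for the first 40 hours, 1.5x the regular rate after that."""
    regular_hours = min(hours_worked, 40.0)
    overtime_hours = max(hours_worked - 40.0, 0.0)
    return regular_hours * regular_rate + overtime_hours * regular_rate * 1.5

# 48 hours at $20/hour: 40 * 20 + 8 * 30 = $1,040
print(weekly_pay(48, 20.0))
```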

3. Failing to complete I-9 forms for new employees.
Many employers merely photocopy employee-produced documents without filling out the parts of the forms that describe the documents. This can be a costly mistake if the Immigration and Naturalization Service audits you. (One employer was reportedly fined $100,000.) You are not required to photocopy employee-produced documents, but even if you do, you must fill out the forms completely.

4. Failing to take and document disciplinary actions.
Supervisors, not wanting to be perceived as villains, hate to write up employees. Then, when the company can no longer tolerate the unsatisfactory performance, the file contains nothing that documents it and you have no grounds on which to justify the discharge. Worse, employees who have been discharged for poor performance often have glowing evaluations in their files. Either way, you are left exposed to lawsuits alleging discrimination.

5. Failing to quickly discharge poor performers.
Employers are advised to progressively discipline employees and to give one warning too many rather than one too few. But often a time comes when failure to act is as bad as overreacting. If you have retained employees for many years despite poor attendance records, multiple infractions and even several "final" warnings in their files, you are asking for trouble. These employees are the most likely to sue when finally discharged. The best course is to discharge a poor performer as soon as prudently feasible. The more seniority an employee has, the harder it is to justify discharging him or her.

6. Failing to ensure that a group layoff has no disparate impact on any protected group.
To avoid lawsuits, verify that the group doesn't contain a disproportionately high percentage of age-protected employees or employees of a particular ethnic or racial group or sex compared to the rest of the work force. The decision of who will be laid off should be based on objective criteria, such as qualifications, experience, and ability to perform certain work essential to the company. If the decision to lay off one employee as opposed to another is based on such criteria, make sure the file supports this decision.
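
As a rough illustration of the kind of check described above, one might start by comparing the layoff rate inside a protected group with the rate for everyone else; the numbers and helper below are hypothetical, and a real review calls for proper statistical and legal analysis:

```python
# Hypothetical sketch: compare the layoff rate for a protected group with the
# rate for the rest of the workforce. Made-up numbers; not legal advice.

def layoff_rates(laid_off_in_group, total_in_group, laid_off_others, total_others):
    """Return (rate within the group, rate for everyone else)."""
    return laid_off_in_group / total_in_group, laid_off_others / total_others

# Example: 12 of 40 workers age 40 and over laid off vs. 8 of 60 younger workers.
group_rate, other_rate = layoff_rates(12, 40, 8, 60)
print(f"protected group: {group_rate:.0%}, rest of workforce: {other_rate:.0%}")
# A markedly higher rate for the protected group is a red flag worth reviewing.
```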


7. Failing to get a signed release from a terminated employee.
As an employer, you may have a legitimate reason for terminating an employee. However, you fear a lawsuit if the employee is a member of a protected class. Many employers are reluctant to use releases because they fear the release may educate the employee about rights and litigation possibilities of which he might otherwise be unaware. But this may be a case of sticking your head in the sand. In light of media attention given to employment discrimination verdicts, employers should not rely on a hope that workers do not know their rights. The right approach to avoid litigation often is to get signed releases from departing employees, particularly if any severance or separation pay is provided to the employees.

8. Requiring medical exams before making employment offers.
The Americans With Disabilities Act (ADA) bars employers from asking applicants about their disabilities or requiring medical exams before offering employment. You can ask applicants to take job-relevant medical exams only after offering jobs. The burden is on you to establish the medical exam's relevance to job requirements. In addition, employers often fail to accommodate their employees' disabilities after they are hired. The ADA requires employers to reasonably accommodate their employees' disabilities.

9. Failing to take proactive steps to keep your work force union free.
Employers must constantly communicate with their employees to deal with their grievances. If employees do not believe their employer is interested in their issues, they may look outside the workplace for representation.

10. Failing to retain labor and employment counsel to avoid making the first nine mistakes.
The proliferation of complex statutes prevents most employers from keeping on top of employment law without professional help.

360 Degree Feedback

Evaluating performance based on just the manager's review isn't always the most effective way to grow people within global organizations. Looking for a more balanced assessment of performance? Look in every direction with 360 Degree Reviews.

Now you can receive performance feedback from a variety of different sources including self and peer review - and even external sources - to get more insightful and realistic assessments of employee performance, competency gaps and development needs.

The 360 Degree Multi-Rater shapes the course of each employee's progress with:

Multiple-Input Ratings. Get comprehensive assessments from every relevant source
Detailed Gap Analysis. Identify areas for improvement and development, 'hidden strengths' and 'blind spots' (see the sketch after this list)
Pre-Populated Forms. Save enormous amounts of time over a standard paper-based system by using the role-based competencies and behaviors built into our simple, web-based forms
Comprehensive Writing Assistant. Create concise and meaningful assessments using suggested review text created by experts
Legal Scan. Automatically spot non-compliant language, significantly reducing HR review time
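
To make the gap-analysis idea concrete, here is a small illustrative sketch, my own simplification rather than the product's actual logic, that compares an employee's self-ratings with the average of other raters to surface 'blind spots' and 'hidden strengths':

```python
# Illustrative only: flag competencies where self-rating and others' ratings
# diverge. "Blind spot" = self rates higher than others; "hidden strength" =
# others rate higher than self. Ratings use a hypothetical 1-5 scale.

ratings = {
    "communication": {"self": 5, "others": [3, 3, 4]},
    "planning":      {"self": 2, "others": [4, 4, 5]},
    "teamwork":      {"self": 4, "others": [4, 4, 4]},
}

for competency, r in ratings.items():
    peer_avg = sum(r["others"]) / len(r["others"])
    gap = r["self"] - peer_avg
    if gap >= 1:
        label = "blind spot"
    elif gap <= -1:
        label = "hidden strength"
    else:
        label = "aligned"
    print(f"{competency}: self={r['self']}, others={peer_avg:.1f} -> {label}")
```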

Which will become pervasive, Business Intelligence (BI) or Corporate Performance Management (CPM)?

Lately I have read articles and analysts’ reports discussing whether Corporate Performance Management (CPM) rather than business intelligence (BI) will become pervasive in the future. This is an apples-and-oranges discussion: we are talking about two entirely different things. CPM is a business application or process, while BI is a technology.

All CPM projects use BI, but not all BI projects involve CPM implementations. By this logic, BI might become pervasive while CPM may not; however, if CPM does become pervasive, then BI, as a technology used in CPM applications, will become pervasive as well.

Why the confusion and the discussion of an either/or situation? Too often in our industry we confuse tools and vendors with business applications or processes. Too often products become identified with a term and then everyone’s projects become identified with that product. Let’s look at the consumer (IT and business) and the supplier (software vendors and their partners).

On the consumer side, it is human nature to identify the product or vendor used in an IT project as the application itself rather than merely as the tool used for a particular application. Too often over the years I have heard that Oracle, DB2, Business Objects, Cognos or fill-in-your-own-IT-vendor was not very good for a particular DW, BI or reporting project. After asking a few questions during an assessment or project review, however, it usually turns out that the real problem was not the tool but the data (its integrity, quality, availability or timeliness) or the way the tool was implemented.

On the supplier side, some vendors follow every hype cycle or IT industry buzzword and label their products, or solutions built with their products, as the latest and greatest fill-in-your-buzzword! Since many consumers do not have time to separate fact from hype, the label sticks and the product becomes associated with the business application or process.

And the jumbling of products with business applications or processes is reinforced by industry analyst reports, articles in industry publications, white papers, webinars/seminars/podcasts and case studies. If everyone says it, then it must be true!

By Rick Sherman

Monday, August 20, 2007

Managing Performance in a Virtual World

VIRTUALIZATION CHANGES THE GAME for performance monitoring and management. Read on for a primer on optimizing virtual environments without compromising performance. Automation and adaptability are key.

Ah, virtualization…the promise of infinite flexibility and an example of software and hardware working together in perfect harmony to solve real problems within the data center. Without question, virtualization is a technology that is transforming the IT landscape, and the practice holds significant promise for those who are looking to improve availability and IT processes — and ultimately make IT more responsive to business needs.

Virtualization provides too many benefits to just stand by and watch. According to the Yankee Group, 9 out of every 10 enterprises will have implemented virtualization into their IT infrastructure by the end of 2007. While the business case for adopting virtual infrastructure technology is clear and compelling, it is important for companies to understand the performance characteristics of virtualization so they can first put the right management tools and business drivers in place. One area in need of attention is application and performance monitoring and how that process will be able to function effectively in a virtualized environment.

Most companies recognize the value that application and performance monitoring brings to the table. Keeping a close watch on all systems and applications to ensure they are available, and then having the technology in place to resolve the issues and report back to the business with a clean bill of service health provides a level of assurance that can't be quantified in dollars and cents. ("Priceless") But just when everyone gets comfortable, virtualization comes along and changes the IT infrastructure into a dynamic and fluid (and chaotic!) entity.

Given that virtualization implies continual change, managing performance in a virtual environment will require the ability to adapt constantly to changing behavior patterns. While it has always been a challenge to manage the performance of all the moving (and changing) parts that comprise a company's IT systems, virtualization technologies promise to make it all that much more complex.

Certainly, existing systems management tools have been good at understanding infrastructure availability, but they lack visibility into the behavior of virtual resources and application performance. With Gartner reporting that 27 percent of IT executives have no confidence in their current performance monitoring tools, how can companies have confidence in virtualized environments if there is no confidence in the physical world?

DON'T HATE - AUTOMATE!

The industry is going to continue to see an uptick in demand for virtualization technologies, so we'd better get used to dealing with added complexity and constantly changing environments. The bottom line is that enterprises will never be able to scale their virtual and physical data center environments without automation.

Why? Because we have reached a point where it is humanly impossible to keep track of all of the constantly changing components and events within IT that affect the quality of service and the user experience. However, through automation, and the availability of self-learning and continuously adaptive technology, we are able to achieve a level of business intelligence that allows us to understand system behavior, anticipate user interactions, and head off problems before they occur.

This approach makes performance management more intuitive and efficient. Using intelligence in this capacity, to automate the decision-making processes, represents a shift in thinking about managing IT, and the industry is following suit as more and more system administrators turn to automated, behavior-based tools in order to scale along with the increase in user demand.

What do you think accurate, real-time knowledge is worth to a system administrator who is in the IT trenches struggling to return a critical server to operation? Or a team in a data center where there is literally no physical room to house servers? Or maybe the question is better positioned as "what is the cost of downtime and will I have to pay Dr. Evil 'One Billion Dollars' to get up and running again?"

BEST PRACTICES: GET PREDICTABLE!

IT is supporting a mind-boggling number of increasingly complex and unpredictable user and technology interactions. Without the restrictions imposed by siloed, proprietary infrastructure platforms, performance has become difficult to predict. To restore predictability and bring performance consistency to virtual environments, IT management needs to adapt in the following ways:

SELF-LEARNING CAPABILITY: Heuristics and behavior analysis have been used in the IT security realm for years. Real-time behavior analysis brings the same benefit of self-learning anomaly detection to the data center. Rather than trying to model constantly changing performance variables, performance management should analyze behavior in real time and quickly correlate infrastructure performance to application performance and vice versa.
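
As one minimal sketch of what self-learning anomaly detection can look like (an assumed approach for illustration, not a description of any specific product), a monitor can maintain a running baseline of a metric and flag samples that deviate sharply from it:

```python
# Minimal sketch of self-learning anomaly detection: maintain a running mean
# and variance of a metric (e.g., response time) and flag large deviations.
# Illustrative only; real tools use far richer behavior models.
import math

class AnomalyDetector:
    def __init__(self, threshold_sigmas: float = 3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations (Welford)
        self.threshold = threshold_sigmas

    def observe(self, value: float) -> bool:
        """Update the baseline and return True if the sample looks anomalous."""
        anomalous = False
        if self.n > 10:          # wait for a minimal baseline before judging
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(value - self.mean) > self.threshold * std:
                anomalous = True
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        return anomalous

detector = AnomalyDetector()
for latency_ms in [102, 98, 105, 99, 101, 103, 97, 100, 104, 98, 102, 350]:
    if detector.observe(latency_ms):
        print(f"anomaly: {latency_ms} ms deviates from the learned baseline")
```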

AUTOMATED THRESHOLD MANAGEMENT: As part of the self-learning capability, thresholds should be adaptive. Performance management tools for virtual environments should be able to learn and build behavior profiles for servers, virtual machines (VMs) and applications and also adapt thresholds for changing behavior.
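
A hedged sketch of adaptive thresholding under the same assumptions: each monitored entity gets its own exponentially weighted baseline, and the alert threshold moves with that baseline instead of staying fixed. The smoothing factor and headroom multiplier below are arbitrary illustrative values:

```python
# Illustrative sketch of adaptive, per-entity thresholds: each VM or server
# keeps an exponentially weighted moving average (EWMA) baseline, and the
# alert threshold tracks that baseline rather than staying fixed.

ALPHA = 0.2          # smoothing factor: higher adapts faster to new behavior
HEADROOM = 1.5       # alert when usage exceeds 1.5x the learned baseline

baselines = {}       # entity name -> learned baseline (e.g., CPU percent)

def check(entity: str, cpu_percent: float) -> bool:
    """Update the entity's baseline and return True if it breaches its threshold."""
    baseline = baselines.get(entity, cpu_percent)
    threshold = baseline * HEADROOM
    breach = cpu_percent > threshold
    baselines[entity] = ALPHA * cpu_percent + (1 - ALPHA) * baseline
    return breach

for sample in [("vm-01", 30), ("vm-01", 32), ("vm-02", 70), ("vm-01", 55), ("vm-02", 75)]:
    if check(*sample):
        print(f"{sample[0]}: {sample[1]}% CPU exceeds its adaptive threshold")
```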

VISIBILITY INTO INDIVIDUAL VM AND SYSTEM BEHAVIOR: With so many moving parts in a virtual infrastructure, it can be nearly impossible to isolate the cause of an application performance issue. It is critical to have visibility into the health of each VM and the health of the overall system.
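
For example (a deliberately simplified, hypothetical scenario), when the host shows an overall symptom such as CPU saturation, per-VM visibility lets you rank the guests by their share of the resource to find the likely culprit:

```python
# Hypothetical example: the host is CPU-saturated, and per-VM visibility lets
# us rank guests by their share of host CPU to find the likely culprit.

host_cpu_percent = 95.0
vm_cpu_percent = {"vm-app": 12.0, "vm-db": 61.0, "vm-batch": 18.0}

if host_cpu_percent > 90.0:
    culprit, usage = max(vm_cpu_percent.items(), key=lambda kv: kv[1])
    print(f"host CPU at {host_cpu_percent}%: {culprit} at {usage}% is the first place to look")
```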

PROACTIVE CAPACITY PLANNING: Performance management tools need to offer resource allocation and capacity planning before deployment as well as in production. The value of virtualization is flexibility and resource optimization. Tools that can deliver that optimization from the onset are the most valuable.
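
One plausible, deliberately simplified way to make capacity planning proactive is to fit a trend to historical utilization and estimate when a resource pool will run out of headroom; the history and capacity figures below are invented for illustration:

```python
# Simplified capacity-planning sketch: fit a linear trend to weekly memory
# utilization and estimate when the pool will hit its capacity limit.
# Illustrative assumption, not a recommendation of any particular model.

weeks = [0, 1, 2, 3, 4, 5]
used_gb = [210.0, 222.0, 231.0, 245.0, 252.0, 265.0]   # hypothetical history
capacity_gb = 320.0

n = len(weeks)
mean_x = sum(weeks) / n
mean_y = sum(used_gb) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, used_gb))
         / sum((x - mean_x) ** 2 for x in weeks))
intercept = mean_y - slope * mean_x

weeks_until_full = (capacity_gb - intercept) / slope - weeks[-1]
print(f"growth ~{slope:.1f} GB/week; capacity reached in ~{weeks_until_full:.0f} weeks")
```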

CONCLUSION
The benefits of virtualization are compelling, but successful adoption depends on having the right skills, management tools and business drivers in place. One critical area is application and performance monitoring and how that process can function effectively in a virtualized environment.

Companies will need to get a better grip on managing performance in a virtual environment by implementing management systems capable of learning and adapting to constant change. Self-learning performance management technology provides visibility into the behavior and the interdependencies of the virtual and physical resources and, most importantly, the applications and business functions they support.

By Jean-François Huard