Tuesday, September 25, 2012




Another vast cost Obama has inflicted on the American people

Anybody who knows how often government computer systems fail to work properly would never mandate such a monstrosity.  Britain spent over 20 BILLION dollars on a nationwide health records system only to give up on it.  They never got it to work properly.  Where I live, just the PAYROLL system for the Health Dept. looks like costing a billion dollars, and it is still not working properly after several years of trying.  Small-scale computerized health records systems (covering one group practice, for instance) sometimes work well enough, but they take up a lot of the doctors' time -- time that could be spent with patients.  Another Left-inflicted disaster awaits Americans -- JR

In its early days the Obama administration avidly promoted financial incentives for the adoption of electronic health records by medical providers. According to Obama, the adoption of electronic health records would save $80 billion in health care costs. The use of electronic health records has been touted by the government (state and federal) for years. Financial incentives for the adoption of electronic health records by Medicare and Medicaid providers, however, were first funded through the administration’s trillion-dollar so-called “stimulus” program, rammed through Congress in February 2009.

At the time, Drs. Jerome Groopman and Pamela Hartzband (both self-advertised Obama supporters) criticized the promotion of electronic health records in the pages of the Wall Street Journal, essentially calling it a crock. Drs. Groopman and Hartzband were on to something. Yesterday’s New York Times reported that “Medicare costs rise as records turn electronic.”

The Times does not observe that we haven’t seen anything yet. We’ll be seeing a lot more of this as Obamacare is implemented.

UPDATE: A friend directs me to this WSJ article by Stephen Soumerai and Ross Koppel. My friend notes that the related letters to the editor are also interesting. Here is the text of the article:
In two years, hundreds of thousands of American physicians and thousands of hospitals that fail to buy and install costly health-care information technologies—such as digital records for prescriptions and patient histories—will face penalties through reduced Medicare and Medicaid payments. At the same time, the government expects to pay out tens of billions of dollars in subsidies and incentives to providers who install these technology programs.

The mandate, part of the 2009 stimulus legislation, was a major goal of health-care information technology lobbyists and their allies in Congress and the White House. The lobbyists promised that these technologies would make medical administration more efficient and lower medical costs by up to $100 billion annually. Many doctors and health-care administrators are wary of such claims—a wariness based on their own experience. An extensive new study indicates that the caution is justified: The savings turn out to be chimerical.

Since 2009, almost a third of health providers, a group that ranges from small private practices to huge hospitals, have installed at least some “health IT” technology. It wasn’t cheap. For a major hospital, a full suite of technology products can cost $150 million to $200 million. Implementation—linking and integrating systems, training, data entry and the like—can raise the total bill to $1 billion.

But the software—sold by hundreds of health IT firms—is generally clunky, frustrating, user-unfriendly and inefficient. For instance, a doctor looking for a patient’s current medications might have to click and scroll through many different screens to find that essential information. Depending on where and when information on a patient’s prescriptions was entered, the complete list of medications may be found only across five different screens.

Now, a comprehensive evaluation of the scientific literature has confirmed what many researchers suspected: The savings claimed by government agencies and vendors of health IT are little more than hype.

To conduct the study, faculty at McMaster University in Hamilton, Ontario, and its programs for assessment of technology in health—and other research centers, including in the U.S.—sifted through almost 36,000 studies of health IT. The studies included information about highly valued computerized alerts—when drugs are prescribed, for instance—to prevent drug interactions and dosage errors. From among those studies the researchers identified 31 that specifically examined the outcomes in light of the technology’s cost-savings claims.

With a few isolated exceptions, the preponderance of evidence shows that the systems had not improved health or saved money. For instance, various studies found the percentage of alerts overridden by doctors—because they knew that the alerted drug interactions were in fact harmless—ranging from 50% to 97%.

The authors of “The Economics of Health Information Technology in Medication Management: A Systematic Review of Economic Evaluations” found no evidence from four to five decades of studies that health IT reduces overall health costs. Three studies examined in that McMaster review incorporated the gold standard of evidence: large randomized, controlled trials. They provide the best measure of the effects of health IT systems on total medical costs.

A study from Regenstrief, a leading health IT research center associated with the Indiana University School of Medicine, found that there were no savings, and another from the same center found a significant increase in costs of $2,200 per doctor per year. The third study measured a small and statistically questionable savings of $22 per patient each year.

In short, the most rigorous studies to date contradict the widely broadcast claims that the national investment in health IT—some $1 trillion will be spent, by our estimate—will pay off in reducing medical costs. Those studies that do claim savings rarely include the full cost of installation, training and maintenance—a large chunk of that trillion dollars—for the nation’s nearly 6,000 hospitals and more than 600,000 physicians.

But by the time these health-care providers find out that the promised cost savings are an illusion, it will be too late. Having spent hundreds of millions on the technology, they won’t be able to afford to throw it out like a defective toaster.

It is already common knowledge in the health-care industry that a central component of the proposed health IT system—the ability to share patients’ health records among doctors, hospitals and labs—has largely failed. The industry could not agree on data standards—for instance on how to record blood pressure or list patients’ problems.

Instead of demanding unified standards, the government has largely left it to the vendors, who declined to cooperate, thereby ensuring years of noncommunication and noncoordination. This likely means billions of dollars for unnecessarily repeated tests and procedures, double-dosing patients and avoidable suffering.

Why are we pushing ahead to digitize even more of the health-care system, when the technology record so far is so disappointing? So strong is the belief in health IT that skeptics and their science are not always welcome. Studies published several years ago in the Journal of the American Medical Association and the Annals of Internal Medicine reported that health IT systems evaluated by their own developers were far more likely to be judged “successful” than those assessed by independent evaluators.

Government agencies like the Office of the National Coordinator for Health Information Technology (an agency of the Department of Health and Human Services) serve as health IT industry boosters. ONC routinely touts stories of the technology’s alleged benefits.

We fully share the hope that health IT will achieve the promised cost and quality benefits. As applied researchers and evaluators, we actively work to realize both goals. But this will require an accurate appraisal of the technology’s successes and failures, not a mixture of cheerleading and financial pressure by government agencies based on unsubstantiated promises.
The Journal’s author tag indicates that Mr. Soumerai is a professor of population medicine at Harvard Medical School and the Harvard Pilgrim Health Care Institute, and that Mr. Koppel is a professor of sociology and medicine at the University of Pennsylvania and principal investigator of its Study of Hospital Workplace Culture.

SOURCE

*************************

In Obama, Jimmy Carter marches on

by Jeff Jacoby

Jimmy Carter's reputation as a foreign-policy schlemiel can hardly be blamed on the Romney campaign. Americans came to that conclusion more than 30 years ago, having watched the world grow more dangerous -- and America's enemies more brazen -- during Carter's feckless years as steward of US national security.

"There was strong evidence that voters … wanted a tougher American foreign policy," reported The New York Times on November 5, 1980, the morning after Ronald Reagan crushed Carter's reelection bid in a 44-state landslide. By a nearly 2-to-1 ratio, voters surveyed in exit polls "said they wanted this country to be more forceful in dealing with the Soviet Union, 'even if it increased the risk of war.'"

In fact, Reagan's muscular, unapologetic approach to international relations -- "peace through strength" -- didn't increase the risk of war with the Soviets. It reduced it. Within a decade of his election, the Soviet empire -- as Reagan foretold -- would be relegated to the ash-heap of history.

Like all presidents, Reagan got many things wrong. But one thing he got very right was that American weakness is provocative. A foreign-policy blueprint that emphasizes the need for American constraint, deference, and apology -- what Obama's advisers today call "leading from behind" -- is a recipe for more global disorder, not less. Carter came to office scolding Americans for their "inordinate fear of communism"; he launched diplomatic relations with Fidel Castro's dictatorship and welcomed the takeover of Nicaragua by a Marxist junta. Only when the Soviets invaded Afghanistan in 1979 did Carter wake up to the dangers of appeasing communist totalitarianism. Moscow's naked aggression, he confessed, had made a "dramatic change in my opinion of what the Soviets' ultimate goals are."

Equally disastrous was Carter's reaction to the seizure of the US Embassy in Tehran following the Ayatollah Khomeini's Islamic revolution. Bernard Lewis, the dean of Middle East historians, writes that Carter's meek response -- from his letter appealing to Khomeini "as a believer to a man of God" to his abandonment of the overthrown Shah, a longtime US ally -- helped convince dictators and fanatics across the Middle East "that it was safer and more profitable to be an enemy rather than a friend of the United States."

Is it fair to compare Obama's foreign policy to Carter's? The similarities were especially vivid after the murder of four US diplomats at the American consulate in Benghazi. Even more so when the administration insisted that the outbreak of anti-American violence by rampaging Islamists in nearly 30 countries was due solely to a YouTube video mocking Islam -- a video the White House bent over backward to condemn.

But Obama-Carter likenesses were being remarked on long before this latest evidence of what the appearance of US weakness leads to. Obama was still a presidential hopeful when liberal historian Sean Wilentz observed in 2008 that he "resembles Jimmy Carter more than he does any other Democratic president in living memory." Barely a year into Obama's presidency, troubling parallels could already be detected: in January 2010, Foreign Policy magazine's cover story, "The Carter Syndrome," wondered whether the 44th president's foreign policy was beginning to collapse "into the incoherence and reversals" that had characterized No. 39's.

The Carter years are a warning of what can happen when the "Leader of the Free World" won't lead.  Jimmy Carter's legacy is still too timely to ignore.

SOURCE

************************

No Way to Run an Economy

One defining characteristic of cronyism is that politically connected businesses and industries get special favors from the government. Most people oppose it because of the inherent corruption, but economists also dislike it because it lowers living standards by hindering the efficient allocation of resources.

Unfortunately, the bailout craze in the United States is a worrisome sign cronyism is taking root. In the GM/Chrysler bailout, Washington intervened in the bankruptcy process and arbitrarily tilted the playing field to help politically powerful creditors at the expense of others. Not only did this put taxpayers on the hook for big losses, it also created a precedent for future interventions.

This precedent makes it more difficult to feel confident that the rule of law will be respected in the future when companies get in trouble. It also means investors will be less willing to put money into weak firms. That's not good for workers, and not good for the economy.

The bailouts in the financial sector are equally troubling. When the politicians intervened, poorly managed firms were given a new lease on life — even though they helped cause the housing bubble!

The pro-bailout crowd argues that lawmakers had no choice. We had to recapitalize the financial system, they argued, to avoid another Great Depression. This is nonsense. The federal government could have used what's known as "FDIC resolution" to take over insolvent institutions while protecting retail customers.

Yes, taxpayer money still would have been involved, but shareholders, bondholders and top executives would have taken bigger losses. These relatively rich groups of people are precisely the ones who should burn their fingers when they touch hot stoves. Capitalism without bankruptcy, after all, is like religion without hell.

And that's what we got with TARP. Private profits and socialized losses are no way to operate a prosperous economy.

SOURCE

**************************

ELSEWHERE

Honduras: Private city will have minimal taxes, government:  "Small government and free-market capitalism are about to get put to the test in Honduras, where the government has agreed to let an investment group build an experimental city with no taxes on income, capital gains or sales. Proponents say the tiny, as-yet unnamed town will become a Central American beacon of job creation and investment, by combining secure property rights with minimal government interference. 'Once we provide a sound legal system within which to do business, the whole job creation machine -- the miracle of capitalism -- will get going,' Michael Strong, CEO of the MKG Group, which will build the city and set its laws, told FoxNews.com."

Gaza: Israeli airstrike kills three:  "An Israeli air strike on a vehicle killed three Palestinian security officials in the Hamas-Islamist ruled Gaza Strip yesterday, Palestinian medics and Hamas said. The Israeli military confirmed it had launched an air strike in Gaza but had no further comment. Hamas and Palestinian hospital officials said a raid after darkness fell in the town of Rafah on Gaza’s border with Egypt killed three officials responsible for overseeing tunnels used to import goods from Egypt."

A Webb of lies:  "As implausible as the new Soviet man might seem, left-wing radicals in the West applauded the Soviet Experiment. They clearly believed Trotsky's description in Literature and Revolution: the 'average human type' under communism would be the equal of Aristotle and 'above this ridge new peaks' of humanity would rise. Among the loudest voices cheering were the prominent British socialist utopians, Sidney and Beatrice Webb."

Oklahoma challenges health care tax in federally-run insurance exchanges:  "Many employers under the ACA can be fined/taxed if they do not provide health insurance to individuals who qualify for the federal government’s subsidies. However, if a state does not build its own exchange, then no employee would qualify for the subsidy, and therefore employers in the state would not be subject to the tax because none of their employees would meet the criteria set out in the law. ... Not surprisingly, it was only recently that Washington woke up to this reality."

There is a new lot of postings by Chris Brand just up -- on his usual vastly "incorrect" themes of race, genes, IQ etc.

*************************

For more blog postings from me, see  TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, GREENIE WATCH,  POLITICAL CORRECTNESS WATCH, GUN WATCH,  FOOD & HEALTH SKEPTIC,  AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL, EYE ON BRITAIN and Paralipomena

List of backup or "mirror" sites here or  here -- for when blogspot is "down" or failing to  update.  Email me  here (Hotmail address). My Home Pages are here (Academic) or  here (Pictorial) or  here  (Personal)

****************************

The Big Lie of the late 20th century was that Nazism was Rightist.  It was in fact typical of the Leftism of its day.  It was only to the Right of  Stalin's Communism.  The very word "Nazi" is a German abbreviation for "National Socialist" (Nationalsozialist) and the full name of Hitler's political party (translated) was "The National Socialist German Workers' Party" (In German: Nationalsozialistische Deutsche Arbeiterpartei)

****************************
