Tuesday, July 11, 2017

Three things to know about North Korea's missile tests

With advances in its missile programme and the July 4 test, here are three technical milestones and why they matter.

Since Kim Jong-un's accession in December 2011, North Korea has accelerated its missile development programme, with the tempo of tests increasing considerably from that under his father, Kim Jong-il. After a string of failures in 2016, the country has this year made genuine advances.

On Tuesday, July 4, Independence Day in the United States, Pyongyang tested what it claimed was the country's first intercontinental ballistic missile (ICBM), which Kim Jong-un called a "gift" to the US.

This test is significant because the missile's projected range puts North Korea firmly in the select group of countries that have produced an ICBM. With Kim Jong-un threatening to reduce the US "to ashes", North Korea has now extended its missiles' reach to the point where they could potentially hit targets in the US.

Beyond the incremental improvements, here are three recent technical milestones that stand out, and why they matter.


1. Firing almost vertically and reaching higher altitudes

There are significant challenges to testing a long-range missile in a country that is too small to run test flights within its own border. Initially, North Korea's only option was to launch these flights over its neighbours. It did that in 1998 by test-firing the Taepodong-1 missile over mainland Japan, to instant international condemnation.

Now, North Korea has started launching longer-range missiles on what is known as a "lofted trajectory", firing the missile almost vertically. This allows the missile to land only a short horizontal distance from its launch site while still travelling a great distance overall. The higher altitudes reached are a strong indication of new, more powerful engines and of a greater capacity to carry a payload over long distances.

These launches enable Pyongyang to conduct realistic tests of longer-range missiles. They also allow engineers to gather data sent back from the test missile to better understand the challenges faced when a long-range warhead re-enters the Earth's atmosphere at hypersonic speeds, something that generates vast amounts of frictional heat. 

This is exactly what North Korea did when it tested the Hwasong-14 (Mars-14) on July 4.

According to the US military's Pacific Command (PACOM), which monitors these launches, it flew for 37 minutes, rising to a maximum altitude (known as the apogee) of nearly 2,800km, roughly seven times higher than the International Space Station, which orbits only some 400km above the Earth.

The steep, near-vertical launch allowed the missile to roughly simulate long-range flight: it climbed far higher than most missiles but splashed down only a short distance away in the Sea of Japan, limiting the diplomatic damage that would inevitably be caused by a projectile flying through a neighbour's airspace.

An ICBM, a clunky Cold War-era name for a long-range missile, is formally defined as one that can fly more than 5,500km.

According to David Wright of the Union of Concerned Scientists, if the Hwasong-14 were fired on a standard, flatter trajectory, it could reach a target more than 6,700km away. That puts the missile firmly in the long-range, or ICBM, bracket, and means it could potentially hit the US base on the island of Guam as well as Alaska, although the naval base in Hawaii and the continental US are still out of reach.
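
Those two numbers, a 37-minute flight and a roughly 2,800km apogee, are enough for a back-of-envelope check on Wright's figure. The sketch below is not the modelling he used: it assumes a vacuum, a flat Earth and constant gravity, recovers the burnout speed from the reported apogee, and asks how far that speed would carry the missile at a 45-degree angle. Those simplifications make it an underestimate at ICBM scales.

```python
import math

G = 9.81  # m/s^2: surface gravity, held constant (a deliberate simplification)

def crude_range_from_apogee(apogee_m: float) -> float:
    """Flat-Earth, no-drag range estimate from a lofted test shot.

    The near-vertical shot's apogee gives the burnout speed via
    v**2 / (2*G) = apogee; relaunched at 45 degrees, the flat-Earth
    vacuum range is v**2 / G, i.e. exactly twice the apogee.
    """
    burnout_speed = math.sqrt(2 * G * apogee_m)
    return burnout_speed**2 / G

apogee_m = 2_800_000.0  # the ~2,800km apogee PACOM reported
print(f"{crude_range_from_apogee(apogee_m) / 1000:.0f} km")  # -> 5600 km
# Even this pessimistic toy estimate clears the 5,500km ICBM threshold;
# proper spherical-Earth modelling raises the figure to 6,700km or more.
```

Even the crudest version of the arithmetic clears the 5,500km bar, which is why analysts were so quick to call the Hwasong-14 an ICBM.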


2. Solid fuel means faster launches

First tested by the North just over a decade ago, solid-fuelled missiles are faster to set up and easier to fire.

Unlike liquid fuels, which take time to load and are extremely toxic and corrosive to handle, solid fuels are more stable and easier to maintain. A crude analogy: a solid-fuelled missile is like a gunpowder-filled firework, ready to light at any moment, whereas a liquid-fuelled one must be filled with propellant every time you want to fire it. Solid fuel reduces launch preparation from hours to minutes.

Reducing the time a missile battery spends out in the open being readied for launch, and therefore exposed to enemy observation, makes it far less likely to be discovered and destroyed. Using solid fuel also cuts the number of support vehicles needed to transport volatile and dangerous liquid fuels, making a battery smaller and harder to spot. And because solid fuel is more stable, it can take a few knocks when moved around.


3. Toughening up missile batteries

Toughening a missile battery so it can travel anywhere on land, rather than only along North Korea's limited road network - the country has just 724km of paved roads and 24,830km of unpaved ones - gives it more places to hide.

North Korea has done this by ruggedising the missile transporter (formally called a transporter erector launcher or TEL). Tracks are used instead of wheels, allowing the heavy vehicles to cross rough ground off the road system, which would be monitored by an enemy trying to track down missile batteries. The thin-skinned missile is also sheathed in a canister so it survives bumpy off-road travel.

These improvements came together in the successful February launch of the Pukguksong-2 (Polaris-2) medium-range missile. Analysts across the world quickly realised the test's importance: the combination of solid fuel, a ruggedised transporter and a protected weapon meant a battery could potentially hide in forests, underneath cliff overhangs, under bridges - virtually anywhere - and launch within minutes from a cold start.


What is next for North Korea's missile programme?

Producing next-generation missiles that can reach the US will be key for North Korea.

This will not be an easy feat given the trickier aspects of long-range flight. Designs must be able to withstand the stresses and intense heat generated when a missile re-enters the atmosphere.

The challenges will come from improving the warhead and the delivery system, and from coupling the two. North Korean scientists will struggle to extend their missiles' ranges while shrinking their still rudimentary nuclear devices so they are light enough to be carried by the missile to the target.

Then there is the quest for accuracy, if the missiles are to have any military use.

North Korea has bragged that its latest batch of missile tests was extremely accurate. But it remains unclear how this accuracy is being assessed, given that North Korea has no network of satellites able to guide distant warheads to their targets and must rely instead on the projectiles' much less accurate inertial guidance systems.

This electronic system is used in older missiles such as the Scud. It works by dead reckoning: the missile integrates internally measured data on its own acceleration and direction to estimate roughly where it is, rather than being told its exact position by, say, the Global Positioning System (GPS).
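
A toy sketch of that dead-reckoning idea, purely illustrative (real inertial navigation units are far more sophisticated, with full 3-D sensor fusion):

```python
import math

def dead_reckon(samples, dt, pos=(0.0, 0.0), vel=(0.0, 0.0)):
    """Infer position purely from onboard measurements.

    `samples` is a sequence of (acceleration_m_s2, heading_rad) pairs,
    as an accelerometer/gyro package might report them. Position is
    never observed directly (no GPS): it is estimated by integrating
    acceleration twice, so every sensor error compounds in flight,
    which is why pure inertial guidance drifts off target.
    """
    x, y = pos
    vx, vy = vel
    for accel, heading in samples:
        vx += accel * math.cos(heading) * dt  # acceleration -> velocity
        vy += accel * math.sin(heading) * dt
        x += vx * dt                          # velocity -> position
        y += vy * dt
    return x, y

# Toy run: 10 seconds of constant 20 m/s^2 thrust on a fixed heading.
print(dead_reckon([(20.0, 0.0)] * 100, dt=0.1))
# -> roughly (1010.0, 0.0); the exact a*t**2/2 answer is 1000m, so even
# this noise-free toy drifts slightly from coarse numerical integration.
```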

If the sharp tempo of tests doesn't abate, North Korea is likely to see substantial improvements in its missile programme. Kim Jong-un seems determined to "frequently send big and small 'gift packages' to the Yankees", as he instructed scientists this week, according to the country's state media KCNA.


Stock Market Tsunami Siren Goes Off

by Wolf Richter • Jul 10, 2017

It will be ignored until it’s too late.

Everyone who’s watching the stock market has their own reasons for their endless optimism, their doom-and-gloom visions, their bouts of anxiety that come with trying to sit on the fence until the very last moment, or their blasé attitude that nothing can go wrong because the Fed has their back. But there are some factors that are like a tsunami siren that should send inhabitants scrambling to higher ground.

Since July 2012 – so over the past five years – the trailing 12-month earnings per share of all the companies in the S&P 500 index rose just 12% in total. Or just over 2% per year on average. Or barely at the rate of inflation – nothing more.
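
That per-year figure follows directly from the 12% total; a quick check in Python:

```python
# Annualizing the 12% total EPS growth cited above (July 2012 - July 2017):
total_growth = 0.12
years = 5
annualized = (1 + total_growth) ** (1 / years) - 1
print(f"{annualized:.2%}")  # -> 2.29%, the "just over 2% per year"
```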

These are not earnings under the Generally Accepted Accounting Principles (GAAP) but “adjusted earnings” as reported by companies to make their earnings look better. Not all companies report “adjusted earnings.” Some just stick to GAAP earnings and live with the consequences. But many others also report “adjusted earnings,” and that’s what Wall Street propagates. “Adjusted earnings” are earnings with the bad stuff adjusted out of them, at the will of management. They generally display earnings in the most favorable light – hence significantly higher earnings than under GAAP.

This is the most optimistic earnings number. It’s the number that data provider FactSet uses for its analyses, and these adjusted earnings seen in the most favorable light grew only a little over 2% per year on average for the S&P 500 companies over the past five years, or 12% in total.

Yet, over the same period, the S&P 500 Index itself soared 80%.

And these adjusted earnings are now back where they'd been in March 2014, with no growth whatsoever. Total stagnation, even for adjusted earnings. And yet, over the same three-plus years, the S&P 500 index has soared 33%.

[Chart: trailing 12-month adjusted earnings per share for all S&P 500 companies (black line) against the S&P 500 index (blue line), with July 2012 and March 2014 marked. Via FactSet.]

Given that there has been zero earnings growth over the past three years, even under the most optimistic “adjusted earnings” scenario, and only about 2% per year on average over the past five years, the S&P 500 companies are not high-growth companies. On average, they’re stagnating companies with stagnating earnings. And the price-earnings ratio for stagnating companies should be low. In 2012 it was around 15.5. Now, as of July 7, it is nearly 26.

In other words, earnings didn’t expand. The only thing that expanded was the multiple of those earnings to the share prices – the P/E ratio. Such periods of multiple expansion are common. They’re part of the stock market’s boom and bust cycle. And they’re invariably followed by periods of multiple contraction.
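
Using the article's round numbers, the split between earnings growth and multiple expansion is a one-line decomposition, since the price change equals the earnings change times the change in the multiple:

```python
# Decomposing the index's five-year gain: price = earnings x multiple.
price_gain = 1.80      # S&P 500 index up 80% since July 2012
earnings_gain = 1.12   # adjusted EPS up 12% over the same stretch
multiple_gain = price_gain / earnings_gain
print(f"{multiple_gain - 1:.0%}")  # -> 61%: nearly all of it re-rating
# Consistent with a P/E going from about 15.5 to about 25:
# 15.5 * 1.61 = 24.9 (the ~26 above is measured on slightly later dates).
```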

How long can this period of multiple expansion go on? That’s what everyone wants to know. Projections include “forever.” But “forever” doesn’t exist in the stock market. The next segment of the cycle is a multiple contraction.

The 10-year average P/E ratio, using once again the inflated “adjusted earnings,” not earnings under GAAP, is 16.7, according to FactSet. This includes two big stock market bubbles, the one leading up to the Financial Crisis, and the current one, but it includes only one crash. This imbalance skews the results. Two complete cycles would bring the average substantially below 16.7.

Nevertheless, even getting back to a P/E ratio of 16.7 for the S&P 500, when the current P/E ratio is 25.6, would signify either miraculously skyrocketing earnings or a sharp contraction of the market. The first option is a near impossibility. And the second option?
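
The scale of that second option falls straight out of the two multiples, holding earnings flat (an assumption the last three years support):

```python
# Reversion to the 10-year average multiple, with earnings held flat:
current_pe, average_pe = 25.6, 16.7
print(f"{1 - average_pe / current_pe:.0%}")  # -> 35% index decline
# The mirror image: with prices held flat, earnings would have to jump
# by 25.6 / 16.7 - 1, about 53%, to accomplish the same reversion.
```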

Markets overshoot, which is what reversion to the mean entails: the average isn't going to be the floor! And that's why this type of unsustainably high earnings multiple is like a tsunami siren for a wave whose arrival time remains unknown – and why it is ignored until it's too late.

Already, bankruptcies are surging as the “credit cycle” exacts its pound of flesh.


That climate hiatus: what’s really going on

Scientists Solve Climate 'Puzzle of the Century'

By Tim Radford

http://climatenewsnetwork.net/that-climate-hiatus-whats-really-going-on/

Two U.S. scientists have solved what has been called the climate puzzle of the century: how to explain the reported climate "hiatus" and reconcile two different ways of predicting the global temperature by 2100.

They say they now know why computer simulations and forecasts based on the historical temperature record don't seem to agree.

The good news is that the scholarly conflict may have been resolved. The bad news is that, if carbon dioxide levels in the atmosphere are permitted to double, average global temperatures could rise by 4.5°C by the century's end, or even by up to 6°C.

The debate may seem entirely academic, if only because 197 nations of the world undertook to contain global warming to "well below" 2°C by the end of the century by drastically reducing the consumption of fossil fuels.

Not enough

But collectively, the national plans so far proposed do not look likely to meet this target, and the U.S. has threatened to withdraw from the undertaking anyway. So there remains a "what-if" case to settle a long-standing conflict.

And the conflict is this: examine the earth's climate over millions of years, reconstruct the greenhouse gas levels, and you get a prediction that says if carbon dioxide in the atmosphere (around 280 parts per million for most of human history) doubles, then average global temperatures will rise by between 1.5°C and 4.5°C. Use computer simulations, and you get much the same result.

But when you examine the results of temperature measurements taken since the thermometer was invented, and extrapolate, the answer is a bit different: 1°C to 3°C.

A new study in the journal Science Advances proposes a simple solution: the predictions based on recent historical evidence do not take into account all the natural cycles of long-term warming and cooling. Factor those in, and the circle can be squared.

Apparent pause

Research like this offers a glimpse of science in action. Scientists are never happy when prediction and observation don't match. For years, they have worried away at what became known as the "hiatus", an apparent pause in the rate of global warming over the first dozen or so years of this century.

In fact the world continued to warm, but the rate of warming was significantly slower than that measured in the last two decades of the 20th century.

Some argued that the world had warmed, but all the heat had gone into the oceans. Others argued that any apparent slowdown could only be fleeting and global warming would accelerate again. Yet a third school maintained that the pause was entirely illusory, and that even if there was a pause it would have no effect on long-term predictions.

These competing explanations were in themselves evidence that the mismatch of data and prediction bothered the climate boffins.

Avoiding extremes

For much the same reason, researchers have tried to find what might be called the extreme hypothetical limits to climate change: for instance, could carbon dioxide levels fall so low the planet would entirely freeze? (The answer is, so far, no).

Could the greenhouse gas levels get so high that the oceans could boil dry? The answer is, in theory yes: the earth could become up to 60°C hotter than it is now, and uninhabitable, but mercifully, only in theory.

So the outcome of the latest study is an academic confirmation that different patterns and rates of warming play into the big picture. Land, for instance, warms faster than ocean. Most of the land surface of the planet is in the northern hemisphere. So there is a good reason why global warming is, or seems, uneven.

"The historical pattern of warming is that most of the warming has occurred over land, in particular over the northern hemisphere," said Cristian Proistosescu, who made the study at Harvard University.

"This pattern of warming is known as the fast mode—you put CO2 in the atmosphere and very quickly after that, the land in the northern hemisphere is going to warm."

But the warming of the Southern Ocean, swirling around Antarctica, and of the Eastern Equatorial Pacific proceeds at a different pace, and with changes in cloud cover that complicate the calculations. So Proistosescu and his co-author worked on the mathematics necessary to resolve their little local difficulty.

"The models simulate a warming pattern like today's, but indicate that strong feedbacks kick in when the Southern Ocean and Eastern Equatorial Pacific eventually warm, leading to higher overall temperatures than would simply be extrapolated from the warming seen to date," said Peter Huybers, an earth and planetary scientist at Harvard, and the other author.

The message is that the slow mode matters, but only in the long term. What can be measured now, and recently, does not necessarily indicate how things will end up eight decades on.
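
A toy two-timescale model makes the fast-mode/slow-mode point concrete. The sensitivities and timescales below are illustrative stand-ins, not the paper's fitted values; what matters is the shape of the result, with early warming understating the eventual equilibrium.

```python
import math

def two_mode_warming(t_years, forcing=3.7,
                     fast=(0.6, 4.0), slow=(0.5, 250.0)):
    """Warming t years after an abrupt CO2 doubling (forcing in W/m^2).

    Each mode is a (sensitivity in K per W/m^2, timescale in years)
    pair approaching its equilibrium share as 1 - exp(-t/tau): the fast
    mode stands in for land and northern-hemisphere warming, the slow
    mode for the Southern Ocean and Eastern Equatorial Pacific.
    """
    return sum(sens * forcing * (1.0 - math.exp(-t_years / tau))
               for sens, tau in (fast, slow))

for t in (20, 100, 1000):
    print(f"{t:>4} years: {two_mode_warming(t):.2f} K")
#   20 years: 2.35 K  (extrapolating this understates the sensitivity)
#  100 years: 2.83 K  (the slow mode is still only about a third expressed)
# 1000 years: 4.04 K  (the full equilibrium of both modes)
```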

"Historical observations give us a lot of insight into how climate changes and are an important test of our climate models," said Huybers, "but there is no perfect analogue for the changes that are coming."