Yes and Nein: Borrowing the best from German and American cultures (and not borrowing the worst)


After nine years living in Germany (as an American), I have distilled a difference in cultural instincts into a simple heuristic that balances the pros and cons of opposite tendencies. This is, of course, a broad generalization with many individual exceptions, but I think it has some merit as a statement of tendencies.

In America, a first instinctive response to new ideas or ways of doing things tends to be: “Yes, that sounds interesting. Let’s try that and see how it works!” [for the West Coast, interject “wow” or “cool”].

On the negative side, this enthusiasm for the new can sometimes be applied to terrible ideas, which then waste time and money or worse. On the positive side, this makes innovation at a fundamental level much easier than elsewhere. America is a global engine of innovations that transform or create entire industries. In modern times, think of Apple (which has made a recurring habit of this), Facebook, Airbnb, and Uber.

In a simple contrast, in Germany, a first instinctive response to new ideas or ways of doing things tends instead to be: “No, that is not how it is done, that is impossible, no one does it that way and therefore it can’t work.”[1]

On the positive side, this tends to weed out terrible ideas and concentrate time and attention on things that solidly do what they are supposed to, like Autobahns, BMWs, long-lasting buildings, and MRI machines. On the negative side, this makes big-concept innovation much more challenging, since “that is not how it is done” is the whole point of a big-concept innovation—just add “yet.” In contrast, incremental technical and quality improvement within a given track proceed well, particularly in mechanical domains. Things are built to work very well and to keep doing so for a very long time.

From German culture, I embrace a healthy respect for things that actually work and a healthy skepticism about things that are not yet known to work (and that might just fail spectacularly—like socialism and wind power, oops). From American culture, I choose to embrace a style of fundamental innovation and re-thinking that has the power to reshuffle the structure of entire industries and ways of life in a legitimately and lastingly positive direction (unlike the low-fat diet, oops).

So: enthusiasm for the new—when warranted.

 

[1] There are some stark exceptions. One is the historical enthusiasm for the horrific ideologies of socialism (national and otherwise). This might be partly understood as misapplying mechanistic thinking to the decidedly non-mechanistic domain of society and economy. Another is homeopathy, which is immensely popular in Germany, even though it appears to lack any scientific basis. Besides placebo effects, which could be significant, one of my theories is that it does work in an odd sense: it helps protect people from greater exposure to conventional medicine, drugs in particular. By doing this, it sometimes accidentally leaves people healthier than if they had been subjected to certain unnecessary net-negative conventional treatments instead. First, do no harm.

 

Ancient travel food meets modern travel: My ultimate paleo travel meal option

A larger batch than I made.


Modern travel can be especially unfriendly to ancestral eating strategies that emphasize fresh whole foods. Although airlines try hard, the logistics are tough, and airplane food in general has a poor reputation. Some low- and zero-carbers tell me they just take the opportunity to fast. Even when traveling by train or car, it can be tough to impossible to scavenge much if any "real" food from rest stops and kiosks.

Fortunately, it is possible to pack something to bring, but what? Weight and spoilage are concerns for many fresh foods, especially animal foods. So this trip, I am going to borrow from a very old approach to preservation and portability of high-powered food. Yes, this time, I'm going to fly with pemmican.

Pemmican is credited as an innovation of native North Americans. After reading and watching as much as I could about food preservation and pemmican's particulars online, I came to think of pemmican less as a specific recipe and more abstractly as a versatile food preservation approach. The strategy hinges on the fact that lean meat and animal fat have very different requirements for long-term preservation. Reflecting this, the lean and the fat are first divided up, after which quite different preservation methods are applied to each. Finally, the results are combined back into a single product.

Fully drying lean meat helps prevent bacterial activity, which depends on moisture. Rendering fat on low heat separates the pure fat out of the source tissues, which degrade quickly. The pure fat by itself, once rendered, can last a long time. It mainly needs to be protected from air and light, which promote rancidity. Traditional packaging methods do just this.

My second trial production run

I chose salmon this time for the lean and beef tallow for the fat, a combination I have not seen, but that sounded good. I also added a few blueberries for flavoring, though this is optional. I started with 625g (22 oz.) of wild-caught salmon, fresh frozen. I thawed it, removed some of the extra water with paper towels, cut it up, and placed it in a food dryer for about 15 hours, turning once early on. The key point here is that the lean must be completely dry and brittle, far drier than jerky, which should still have some bend to it.

Next, I placed the result in a blender. This time, I blended for longer than I did in my first trial batch a while back. Sure enough, I got the sought-after "powder" result. It took a solid minute or more of blending at different speeds to get there, though. A mortar and pestle is traditional for this step.

I followed the same process as with the salmon for about 200g of wild blueberries, which were also fresh frozen. I dried them on another rack right along with the fish and then blended the dried result down to a powder.

I didn't have to render the fat myself since I was finally able to find a source of beef tallow from a butcher (tallow, once commonplace, has proven hard to find after a half century of relentless, but scientifically baseless, slandering of animal fats). I weighed the powders and then chipped out about an equal weight from my refrigerated tallow supply. I warmed this up on very low heat. The only goal here is to melt it so it can be mixed with the powders.

After stirring the dry and liquid ingredients together in a bowl, I spooned the result into two small plastic containers with sealing lids. The point here is to help prevent any oil from escaping and getting tough stains on clothing and luggage (I will also place the containers inside ziplock bags for this reason, just in case). I lined the containers with butcher's paper, filled them up, closed the paper inside, and sealed the lids.

I will wait until the airplane meals come, and just pull out one of these as a supplement. Very low key. Few could imagine how much paleo nutritional power is packed into those innocent-looking containers. If they did know, though, it would look about like this:

Macronutrient analysis

The inputs were 625g of salmon, 150g of tallow, and about 200g of blueberries (respectively, 22 oz, 5.3 oz, 7 oz), which totaled about 975g of ingredients. These were reduced with drying to 290g, so about 30% of the original total weight. Great for travel!

I ended up with about 130g (4.6 oz) of content in each container, plus another 30g that I sampled right away. So what macronutrients are in those two containers?

The estimated macros on all the inputs together (based on the breakdowns on the frozen product boxes) were: 1,940 calories, 126g of protein, 159g of fat, and 12g of carbs (from the berries). So each 130g container includes 873 kcals, 57g of protein, 72g of fat, and about 5g of carbs (rounded). Calories are 228 from protein, 648 from fat, and 20 from carbs. That's 72% of calories from fat.
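The per-container scaling above can be sketched in a few lines of Python. The input figures are the label-based estimates from this batch; because the gram figures are rounded, the per-container numbers can land a gram or a few calories apart from the ones quoted, but the fat share comes out the same:

```python
# Per-container macro estimate, using the batch figures given above.
batch = {"protein_g": 126, "fat_g": 159, "carbs_g": 12}   # whole-batch totals
kcal_per_g = {"protein_g": 4, "fat_g": 9, "carbs_g": 4}   # standard Atwater factors

batch_dried_g = 290   # total dried weight of the finished pemmican
container_g = 130     # content of one container

share = container_g / batch_dried_g  # fraction of the batch in one container

kcal = {k: batch[k] * share * kcal_per_g[k] for k in batch}
fat_pct = 100 * kcal["fat_g"] / sum(kcal.values())
print(round(fat_pct))  # about 72 (% of calories from fat)
```

The same three lines at the end also make it easy to re-run the numbers for a different lean, fat, or container size.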

And so: the perfect travel power food, inspired by the old, old school, along for my next flight.

 

UPDATE: Pro tip. Watch out for the 100ml limit on "liquids and gels" at the security line. I would never have thought of this as a "liquid or gel," but the x-ray machine operator was curious. I said it was my lunch and they let me through. The safest way would be to create sub-100ml (g) packages and put them in the clear plastic bag for the security line. Also, recall that if you are flying internationally, this cannot be brought into a country with a quarantine on meats. Eat it before landing (no trouble there!).

SpaceX can get there, but biology a probable Mars residence limiter

SpaceX chief Elon Musk laid out a long-term vision for regular interplanetary transport and colonization in a 27 September presentation at the International Astronautical Congress. Details and vision alike were further steps along the path SpaceX has been pursuing for years, as it repeatedly counters naysayers by taking up the so-called impossible—and getting it done.

Yet while Musk concentrated on engineering, propulsion, efficiency, and finance, the toughest limiters on long-term Mars habitation may well turn out to be biological. Could life evolved on Earth, especially more complex organisms such as ourselves, thrive there indefinitely and across generations?

Musk’s aim is to make humanity a multiplanetary species. He envisions a city of a million people on Mars that could become “self-sustaining.” In other words, if Earth becomes uninhabitable, humanity would have a second home, and avoid extinction.

Most of the technical issues with Mars habitation can be addressed with technical means. Radiation can be shielded against. Water, air, and regulated temperatures can be produced, and chemical plants, such as for ship propellant, can be built. Psychological and other factors in long-term, small-scale habitat confinement are already under study, both in space and in remote desert simulations.

The gravity of the situation

However, the harshest sticking point for a colonization plan could be something that Musk mentioned, but characterized only as a source of fun—38% Earth gravity on Mars. He presented images of jumping high and lifting heavy things with ease.

The possible problems would only appear, as they so often do, over the longer term. Research on the health effects of low gravity has already begun to suggest a quite unfavorable pattern. Much of this research has been done in zero g, but long-term exposure to 38% Earth gravity—Mars g—could well produce many similar effects along the same spectrum, just more slowly.

Zero g has been found to produce not only the expected muscle atrophy in astronauts, but a host of other health issues, which isometrics and exercise bikes can only partially limit. Research on both astronauts and lab animals points to falling bone mineral density and circulatory issues, including impaired heart health.

Limited research to date thus already suggests negative effects on three major physical systems. Yet muscular, skeletal, and circulatory systems are hardly footnotes to transporting brains; they are most of what a complex organism consists of. Moreover, there is no reason to expect nervous and reproductive systems to get free passes either, especially over years and decades.

Studies of zero-g animal embryonic development raise even greater concerns for long-term Mars colonization. Reproduction among spacefaring rodents has gone quite badly. Experiments with mice on a Space Shuttle mission resulted in normal embryos for the earthside controls and no growing embryos in zero g. Rat groups sent into orbit produced some weightless pregnancies, but with no resulting births. The pregnancies spontaneously terminated—all of them.

Evolutionary and developmental processes could always assume 1g

Simple organisms such as bacteria are the least likely to be bothered by gravity changes. The more complex the developmental process, however, the more likely that aspects of this process will be fine-tuned to happen in 1g. That said, Mars g could well be better for development than zero g because it would at least supply developmental processes with some vertical orientation, an up and a down, albeit with a much weaker signal.

The plans encoded in DNA for growing an organism are completely unlike engineering plans. They are decentralized developmental instructions. Each cell responds to its immediate environment. It takes cues from the type of cell it has become, from the types of cells around it, and from the specific chemistry and hormones in its blood supply. The so-far unquestioned constant has been that all earthly life has evolved in 1g (with very tiny variations) and every embryonic developmental process has evolved to take place in this 1g.

What about adaptation? As powerful a force as evolution by natural selection is, it tends to require extremely long time scales, on the order of thousands and more generations, especially for larger-scale adaptations. Too great a change—or an entirely unprecedented type of change—and a species will simply not make it.

Adaptations to something so pervasive and otherwise constant as gravity would have to proceed in steps. If a hypothetical planet’s gravity were to (somehow) shift to 38% of its former level, but do so over several million years or more, then life there would have a decent chance of adapting because any given generation would only be subject to minute changes. However, by the time gravity reached 95% of its former level, organisms then would already tend to be optimally adapted to that new 95% level. Checking in again a thousand generations later, organisms would tend to be well adapted to the newly current 90% gravity, and so on as gravity crept down. In contrast, evolution copes far less well with sudden large jumps, which tend to be associated with mass extinctions.

Temperature variation is a variable to which earthly life is widely adapted, both across species and to a lesser degree within each organism. Temperature has changed remarkably and continuously throughout Earth’s 4.5 billion year history and it also varies starkly with season and geography. Temperature adaptation therefore has a vast range of evolutionary precedent. Atmospheric composition, pressure, and radiation levels have also changed back and forth over geologic history.

What earthly life has never had to do, not even once, is what a Mars relocation would ask of it. Low g is something that evolution has had no opportunity to tackle. One of the few rough constants throughout the 3 billion or more years of earthly life has been 1g.

This still does not make some degree of individual gravity adaptation impossible now, but it does suggest that this could be a very serious issue for colonization and a potential deal-breaker for both indefinite stays on Mars and natural reproduction of future generations there.

The probable need for artificial gravity and how to produce it

For long-term extra-terrestrial colonization, artificial structures capable of producing artificial gravity approximating 1g seem more promising. One concept involves large cylindrical spacecraft rotating about their long axis. A cylinder of sufficient radius, given the right rotation rate, can approximate 1g over a large habitable interior surface area. That would be another huge engineering challenge. Yet SpaceX’s work in interplanetary transport, along with advances in asteroid mining, could make this too more feasible in the future.
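To give a rough feel for the scale involved (the 250 m radius is my own illustrative number, not from any actual design): the apparent gravity at the rim of a rotating cylinder is a = ω²r, so the required spin rate for 1g follows directly from the radius.

```python
import math

G_EARTH = 9.81  # m/s^2, target apparent gravity at the rim

def rpm_for_1g(radius_m: float) -> float:
    """Revolutions per minute so that omega^2 * r equals 1g at the rim."""
    omega = math.sqrt(G_EARTH / radius_m)  # rad/s, from a = omega^2 * r
    return omega * 60 / (2 * math.pi)

# A 250 m radius cylinder needs roughly 1.9 rpm to approximate Earth gravity;
# larger radii allow slower spins, which are gentler on the inner ear.
print(round(rpm_for_1g(250), 1))
```

Doubling the radius cuts the required spin rate by a factor of √2, which is one reason habitat concepts of this kind tend to be very large.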

Given the grave potential health and reproductive risks of long-term exposure to zero g and/or Mars g for Earth-evolved organisms, those interested in space colonization ought to assign a high priority, alongside ongoing engineering work, to low- and zero-g health research. Critical for colonization are three research areas: effects of Mars g on the health of Earth-leavers, likely health of long-term Mars residents upon potential return to Earth, and effects of low and no g on embryonic and childhood development.

Getting people to Mars is an engineering challenge. Musk, SpaceX, and collaborators are up to the task and well on their way. But the length of time that hopeful new Martian arrivals can expect to live there, in what state of health, and with what likelihood of producing healthy offspring, are critical questions in need of serious research and consideration in relation to any developing colonization plans. Early animal and astronaut studies combined with an evolutionary perspective suggest that shorter-term Mars visits are likely to be far more feasible from a health perspective, that natural reproduction among colonists might well be out of the question, and that the development of spacecraft and stations with artificial gravity is likely to be a biological priority for any future long-term extra-terrestrial residents.

This provides a more realistic base scenario from which to refine the engineering details of an early Mars transport and habitation system. It may well be that 1g environments would have to be available at least part of the time to support health longer term. The most realistic approach to creating artificial gravity is a rotating habitat, but this could well prove easier to achieve in space than on a planet with gravitational and atmospheric resistance, albeit both much lower than Earth’s.

At minimum, it should be clear that lab mice and rats ought to be the first serious colonists on Mars—and this for quite some time. Their mission: to live where no earthly creature has lived before. Godspeed to those pioneering rodents; I suspect they’ll need it.

The curious case of the faster-healing knee and the larger steaks

Some loose ends needed addressing, but why was the recovery so fast?


My doctor took one look at my knee and his jaw dropped. He had hardly ever—or perhaps never—seen a knee that looked that good just four days after arthroscopic surgery.

This clinic specializes in these surgeries, so he sees patients, many of them young athletes, in post-surgical recovery checkups daily. He kept looking at my knee and then looking at me—a middle-aged guy. He checked the chart to make sure the surgery really had been just four days earlier. Still in disbelief, he asked me what I had done.

What came next was instructive. I told him I thought the surprisingly fast recovery might be due to my very low carb diet.

His response was surreal, because non-existent. He did not acknowledge what I had said. He just kept going on about how good the knee looked and how he had hardly ever seen such a fast recovery.

The rest of the conversation was clear and normal. We talked about how the stitches were coming out next time. We talked about how, given the fast recovery, I probably didn’t need that physical therapy after all.

I was still curious, so I mentioned just once more that maybe the notable recovery could be due to my low-carb diet, because that seems to reduce inflammation.

Once again, no reaction. It was as if I had spoken just that one line in Chinese. No, not even that. Switching to Mandarin would have elicited some noticeable reaction. Would the fact of my statement cease to exist if not acknowledged?

A tale of two otherwise identical surgeries

I have spent several decades in somewhat rough activities including martial arts earlier and amateur adult soccer later. With such activities, it can seem at times like rolling dice when an individual’s luck might run a bit low and an arthroscopic meniscus repair will be called for. Once this kind of tissue tears a little, it just does not heal by itself. Worse, the torn piece can obstruct the joint and lead to additional tearing and other problems. It’s a little like having a hand-knit sweater with a hanging thread that is just waiting to get caught and unravel some more (but with pain involved). The hanging thread just needs to be trimmed off. The invention of arthroscopic technology revolutionized the ease with which this could be done.

My first such surgery was in 2010. Of course, I thought I had learned my lesson and would not be back. But alas, in a single lapse of focus, the other knee over-extended on a bad landing on the futsal court in late 2015, entirely my own fault. So my second such surgery, on the other knee this time, followed in 2016.

My recovery six years ago was good, but more ordinary. It at least did not elicit any jaw-dropping from the specialist. I recovered nicely, above average, but not, I think, this well.

It was also striking to me after this 2016 surgery that I awakened from anesthesia crisply, with perfect clarity, as if from an unusually excellent night’s sleep. I do not remember anything like that feeling from my corresponding 2010 recovery-room awakening, which I recall as groggy and gradual, more as I would have expected. This may be coincidence, but a major effect of a low-carb ketogenic diet is a gradual shift in preferred cellular fuel sources, including for the brain, so an effect like this is plausible. I have noticed clear improvements in sleep patterns following dietary changes, and full anesthesia and sleep are related states.

There is certainly individual variation in recovery rates, but the interesting thing here is the rare opportunity to compare the same person recovering from two identical surgeries at two different times. Six years apart, these were the same surgeries, performed by the same surgeon, and conducted at the same clinic with the same anesthesiologist in the same room. They were for remarkably similar injuries. The surgeries were conducted a similar length of time after the initial incident (in both cases, after about eight months of “conservative” recovery and training efforts). I am the same person. Almost every factor was the same.

So what changed between the two events?

First, I am six years older. But this would predict a slower recovery, not a notably faster one.

Second, I have completely changed my diet, including adding fasting periods. Both low-carb eating and fasting are known to reduce systemic inflammation compared with more conventional modern diets, with their high eating frequency, high refined-carb load, and brutally high omega-6 content. With lower systemic inflammation (call it immune-system noise), specific inflammation as a healing response at the surgery site (immune-system signal) might proceed with more appropriate focus on the local site and without undue exaggeration.

It was only several months after the 2010 surgery that I discovered The Primal Blueprint and first ditched grains, started even more thoroughly avoiding refined sugars, and replaced industrial processed seed (“vegetable”) oils with natural fats. Then, starting around 2013, I moved toward a still lower-carb, higher-fat whole food ketogenic approach. More recently, just within 2016, and mainly in the past few months, I have been trying out a largely carnivorous approach and have introduced more fasting and intermittent fasting as well.

Analysis and implications

Unfortunately, I have no comparable record of the state of the other knee after exactly four days in 2010, only necessarily unreliable memories and impressions. I could be making this up from memory and confirmation bias. Or the difference between the surgeries could be random or due to some other unnoticed factor.

Still, even as anecdote, these recollections strike me as notable. And on reflection, it occurs to me that one of the sad symptoms of diabetes is poorer wound healing. If a low-carb (and natural fats) diet tends to lead toward the very opposite of a diabetes crisis metabolism, might it not likewise lead to the opposite of compromised wound healing? That is, improved and above-average wound healing. This seems plausible.

The hypothesis here is that conventional high-frequency, high-carb diets might keep most people’s post-operative and other wounds from healing as quickly as they might otherwise. The effects of the resulting unnecessary systemic inflammation would come to appear “normal” only because it would be what clinics would see from day to day within the particular afflicted populations. Anyone doing something quite atypical of that population, such as a very low carb diet, might produce seeming anomalies—relative to this afflicted population. If lower carb and fasting are superior to higher carb and frequent eating, as I have come to think they are—through research, countless biographical and ethnographic reports, and accumulating personal experience—those anomalies would show up as positive surprises.

Greater clarity here could support dietary practices that improve health outcomes while reducing, or at least not adding to, reliance on the medication industry. As I put it in the title of my book review essay on Jason Fung's The Obesity Code, “Only the faster profits.”

Relatively little research money floods in to verify or falsify these types of potential effects, perhaps because no potentially profitable pills would be entailed in producing them. If such benefits are real, people changing their own habits would be the primary beneficiaries, and direct action to make personal changes would be the primary method.

Block Size Political Economy Follow-Up 3: Differentiation from the 21-million Coin Production Schedule

Continues from Part 2: Market Intervention through Voluntary Community Rules

One popular argument compares the Bitcoin block size limit to the coin production schedule that sets a terminal maximum of 21 million bitcoins that can ever be created. Raising the block size limit, this argument continues, could set a precedent for changing the coin production schedule, and then what? Changing the block size limit opens up a slippery slope that could threaten to lead to the end of cryptocurrency standards and boundaries. Just as the coin limit is an essential value proposition of Bitcoin, so other types of limits must be conservatively protected as well.

How can this type of argument be considered?

First, note that this represents an approach opposite to the one I have taken. I have identified and discussed the block size limit as something uniquely and importantly different within Bitcoin from an economic standpoint. The above argument, in contrast, presents these different “limits” as quite similar to one another for this purpose and therefore ripe for analogizing.

Next, one might note that Bitcoin started with its production schedule already in place, whereas the block size limit was added about 20 months later, at a level just under 1,200 times the average block size of the time. The limit’s original proponents defended it from critics as a merely temporary measure and thus of no real concern.

A common retort to such observations is, in effect, “that was then, this is now.” The project is at a more advanced stage. The current developers have more experience and a more mature view than the early pioneers. The system now carries far more value and the stakes are higher. Today, we can no longer afford to be so cavalier as to just put a supposedly temporary limit right into the protocol code where it could prove difficult to change later…

That is…we can no longer be so cavalier as to just remove such a previously cavalierly added temporary limit…That is…it is time to move on from reciting old founder tales and look to present concerns.

And indeed, such matters of historical and technical interpretation are subject to many differing assessments. However, there is an altogether different and more enduring level on which to consider this matter. There are substantive economic distinctions between a block size limit and a coin production schedule that render the two remarkably different in kind and thus weaker objects for analogy than they could at first appear.

When “any number will do” and when it will not

This is because raising the total quantity of a monetary unit by changing its production schedule has completely different types of effects from changing the total quantity of a given service that can be provided. Producing an increased quantity of a given cryptocurrency is entirely unlike producing an increased quantity of transaction-inclusion services. This follows from a unique feature of monetary units as contrasted with all other economic goods and services. An arbitrary initial setting for the production of new coins (which operates to define an all-time maximum possible production quantity) works quite well for a cryptocurrency, but does so only for unique and distinctive reasons.

With money, barring certain divisibility issues of mainly historical interest, any given total quantity of money units across a society of users facilitates the same activities as any other such total quantity. This includes mediating indirect exchange (facilitating buying and selling), addressing uncertainty through keeping cash balances (saving; the yield from money held), and facilitating lending and legitimate commercial credit (not to be confused with “credit expansion”). The particular total number of money units across a society of money users is practically irrelevant to these functions. What is critical to a money unit’s value is users’ confidence that whatever this total number (or production schedule) is, money producers cannot arbitrarily alter it, especially upward, so as to rob money holders through devaluation.

Subject to constraints of mineral reality.


A hypothetical model of physical commodity money production on a free market differs in certain important respects from both cryptocurrency and fiat money and bank-credit models. We should therefore closely consider the meaning of arbitrary with regard to these distinct cases.

With precious metal coins produced by ordinary businesses on a free market, the number of units cannot be increased arbitrarily for reasons rooted directly in physical constraints. Each additional precious metal coin to be produced requires specific scarce materials and energy combined with various manufacturing and other business costs, from mining to minting. Each such coin is much like any other good produced and exchanged on the market in that it is a product to be used in the market as money as opposed to a product to be used in the kitchen as dinner. Material scarcity itself protects money users from rogue money producers by preventing arbitrary changes to the quantity of money units. Changes in quantity supplied reflect supply and demand for such coins, including marginal production costs, as with other products.

In sharp contrast to this, a state-run system of fiat money and bank credit supports “flexible” increases in the “money supply.” These are arbitrary in that, unlike hypothetical commercial precious metal coin makers, these legally privileged money producers can generate additional money units at little to no cost to themselves. Notes can be printed and differing numbers of zeroes can be designed into printing plates as the denomination at no difference in printing cost. Likewise, cartel-member bankers can issue “loans” of nothing, filling customer accounts with what has been aptly described as “fountain pen money,” limited to a degree by the current policies and practices of those managing the banking cartel (“regulators,” etc.). Legal frameworks provide some protection for users of such money, most of the time (except when they do not), but such protections are far weaker and less reliable than those from the harder constraints of mineral reality.

Against this backdrop, some cryptocurrencies, led by Bitcoin, feature a novel and innovative third way to protect money users from arbitrary increases in new add-on supply. A production schedule can be specified within the effective definition of what a given cryptocurrency is.

Now, to consider the exact number of possible units of a given cryptocurrency, imagine two almost identical parallel universes, A and B, which differ in only one respect. Assuming sufficient divisibility in both cases (plentiful unit sub-division is possible), 30 widgetcoins out of a 300-trillion widgetcoin supply across a given society in Universe A carry the same purchasing power as 60 halfwidgetcoins out of a 600-trillion halfwidgetcoin supply across a given society in Universe B.

In each universe, one can buy the same kilogram of roast beef, in one case with 30 units, in the other with 60. Since the 300-trillion versus 600-trillion total money supply is the only difference between these two universes, it makes no difference whether the roast beef is bought with 30 units in Universe A or with 60 units in Universe B. Since the people in the two universes are wholly accustomed to their own respective numerical pricing conditions, their psychological and felt interpretations of the value associated with “30” in the one case and “60” in the other are likewise indistinguishable.
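The arithmetic behind this equivalence can be sketched as follows. This is a toy illustration only; the function name and the universe figures are simply those from the example above:

```python
# Toy illustration of the Universe A / Universe B example: the number
# of nominal units is arbitrary given sufficient divisibility, because
# what a purchase costs is a share of society's total money stock.

def fraction_of_supply(units_spent, total_supply):
    """Share of the total money supply given up in a purchase."""
    return units_spent / total_supply

TRILLION = 10**12

# Universe A: 30 widgetcoins out of 300 trillion buy 1 kg of roast beef.
share_a = fraction_of_supply(30, 300 * TRILLION)

# Universe B: 60 halfwidgetcoins out of 600 trillion buy the same beef.
share_b = fraction_of_supply(60, 600 * TRILLION)

# The buyer parts with an identical share of the money stock either way.
assert share_a == share_b
print(share_a)  # 1e-13
```

Doubling every nominal quantity in the system leaves every such ratio, and hence every exchange, unchanged.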

Naturally, many individuals and organizations in any universe dream of having “more money.” For example, considering that 20 units of a good is worth more than 10, it is easy to equate having more units with having more wealth. Twenty good apples represent an amount of wealth (ordinally) greater than 10 such apples do. This is also the case with holding quantities of the same monetary unit. Twenty krone represents more wealth than 10.

But the crucial point now arrives: the foregoing “more is better” with regard to money applies to the number of units in a given party’s possession, but does not apply—as it does with ordinary non-money goods and services—to the wealth of the society of money users as a whole. Viewed across an entire society, intuitive associations from personal and business experience between larger numbers and greater wealth do not translate into a way to raise overall wealth. Political funny-money schemes with names such as “monetary policy” and “credit expansion” instead produce only sub-zero-sum transfers of wealth from some monetary system participants to others. Such transfers produce win/lose results in which some gain at the expense of others, not to mention the additional net losses from the transfer process itself (thus sub-zero-sum).

With Bitcoin, when the initial design was set—but not afterwards—42 million units, or other possible numbers, would have been as serviceable as 21 million. After the system launched, however, no general benefits could follow from increasing the quantity of possible bitcoins beyond their initially defined schedule. Such a later increase would instead tend to 1) reduce the purchasing power of each unit below what it would have otherwise been, 2) transfer wealth to recipients of new add-on units away from all other holders of existing units, 3) raise uncertainty about the coin’s reliability, likely depressing its market value with an uncertainty discount, 4) create demand for an analog of a “Fed watching industry” that speculates on what might happen next with the malleable production schedule, and 5) give rise to an industry of lobbyists, academics, and other experts dedicated to influencing such decisions.

While the block reward framework does indeed also “transfer wealth” in a sense to miners from existing bitcoin holders as in item (2) above, it crucially does so only in a predefined way, knowable to all participants in advance. The block reward schedule, defined before launch, provides a form of compensation for mining services in the system’s early days. This has enabled the system to evolve and succeed from its launch to the present. This follows not from any arbitrary change to the production schedule, but merely from the ongoing operation of the production schedule initially set.
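As an aside on how a predefined production schedule can cap total units, the following sketch reproduces the well-known arithmetic behind Bitcoin’s roughly 21-million cap: a 50-coin block subsidy that halves every 210,000 blocks, tracked in indivisible satoshis as the protocol itself does. This is an illustration of the published schedule, not an excerpt from any implementation:

```python
# Sketch of Bitcoin's predefined issuance schedule: the block subsidy
# starts at 50 BTC and halves every 210,000 blocks (with integer
# division in satoshis) until it rounds to zero, which caps the total
# at just under 21 million units.

SATOSHIS_PER_BTC = 100_000_000
HALVING_INTERVAL = 210_000  # blocks between subsidy halvings

def total_supply_btc():
    subsidy = 50 * SATOSHIS_PER_BTC  # initial block reward, in satoshis
    total = 0
    while subsidy > 0:
        total += subsidy * HALVING_INTERVAL
        subsidy //= 2  # integer halving, so the subsidy eventually hits zero
    return total / SATOSHIS_PER_BTC

print(total_supply_btc())  # just under 21,000,000
```

The point is that every participant can run this arithmetic in advance; nothing about the eventual total depends on anyone’s later discretion.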

One free pass only

In sum, a peculiar characteristic of money units when viewed across an entire society of money users provided a one-time and unique economic free pass for setting an arbitrary number of possible bitcoins at 21 million. This free pass could only be valid before initial launch (prior to 2009, or at the very latest, prior to the evolution of any tradable unit value). Changing the schedule later, especially in such a way as to increase unit creation, would have completely different and wholly negative effects from a systemic perspective.

Now returning to non-money goods and services, the case is quite different again. The foregoing unique monetary free pass is entirely absent, whether after launch or before it. When non-money goods and services are likewise viewed at the level of a given society as a whole, “almost any number will do” does not apply. An increased total quantity of a non-monetary good or service supplied can be in the general interest, not merely in some special interest. It can be win/win and not win/lose. If there are more apples or cattle to go around in a given society (as opposed to just more pesos), this does tend to lower the costs of acquiring those goods in a meaningful way. This does enhance wealth in society, not just transfer it around. It represents a real increase in production, not just a “flexible” money fraud as in the case of arbitrary inflation on the part of money producers.

Miners provide one such ordinary “non-money” service when including a given transaction in a candidate block. This is a scarce service provided (or not) to a specific end user by specific miners. It does not fall under the unique category of the total number of monetary units in a society of money users. The total possible number of bitcoins, however, does fall under this unique category. The two numbers differ in kind and for that reason make poor objects for analogy. Both may, indeed, be viewed as “limits,” but it is important to recognize the contrasting economic roles and natures of these two types of limits.

Block Size Political Economy Follow-Up 2: Market Intervention through Voluntary Community Rules

Continues from Part 1: Software Choice, Market Differentiation, and Term Selection

If a given block size limit is part of a given cryptocurrency at a given time, can economists legitimately say anything with regard to such a limit? Must this topic be left alone as a mere qualitative characteristic of a product that users have freely selected?

From one perspective, if user preferences are subjective matters of taste and opinion, nothing can be said other than that Ravi prefers this, Setsuko prefers that, and Heinrich prefers some other thing. If various users prefer a cryptocurrency with one block size limit or another, economists must remain silent and leave users to their purely subjective preferences, only taking note in abstract and neutral terms of the shape of these preferences. Personal preferences are “ultimate givens,” their specific content irreducible “black box” starting points for economists.

This appears to be a sounder critique. Block size limits are indeed characteristics of specific cryptocurrencies as products. Users may well differ in their subjective preferences on such matters for reasons not even fully understandable. Users differ in their values. Motivations can even include various grades of membership signaling. An economist speaking on such things, this criticism goes, merely “smuggles in” his own particular personal preferences or party affiliation “dressed up as” objective analysis.

Can any role for economic analysis here be rescued from this critique? It may help to take a step back and consider some other scenarios to gain perspective and then return to apply that perspective to the case under consideration.

First, consider two hypothetical cryptocurrencies, one with a block size limit that directly influences the ordinary structure of supply and demand in its transaction-inclusion market, and another that does not (this can equally be the same cryptocurrency, such as Bitcoin, at two different phases in its history). The first cryptocurrency’s code alters the operation of the market between transaction senders and miners, limiting the total quantity of services that can be supplied per time period. Certain economic and industry-structure effects follow. These effects apply to a coin with this characteristic, but not to one without it. What are those differences? Those differences were the central theme of the interview that this series follows up on.

Yet subjective individual preferences do not alter the distinctions analyzed. Thus, even though the content of the preferences themselves may be a black box for economists, the two differing transaction-inclusion markets still have objectively describable economic distinctions independent of any such preferences. Dropping a stone from the Tower of Pisa is a choice, one with all manner of possible motivations, but the resulting acceleration of gravity is not altered by any personal opinion as to the nature and effects of such gravity.

Three intentional communities and their altcoins

Next, consider several hypothetical intentional communities. It is possible to establish and run such communities under various rule sets. Although intentional communities have often been to some degree communistic (“commune”), it is possible to set up other idealistic havens, perhaps some real-life attempt at an Ayn-Rand-style Galt’s Gulch or a Neal-Stephenson-style Thousander retreat. Participation is governed by a kind of “social contract,” but in this context the contract is more likely to be one that actually exists, including specified conditions to which participants have assented by joining and staying, possibly even signing a written agreement with terms of residence.

Let us assume that in all cases, no matter what the other internal rules and cultures, participants are not forced to either join or stay. This freedom of entry and exit corresponds to cryptocurrency participation choices.

Now consider three such voluntary intentional communities. Bernieland features a $20 minimum wage. MagicCorner bans “wage relations” altogether. Finally, Murrayville has no numerical restrictions on wage agreements. Even though all three are voluntary communities, only Bernieland and MagicCorner include labor rules that restrict wage rates. The voluntarily agreed community rules specify certain wage-market restrictions. These types of restrictions are traditionally analyzed under the rubric of market intervention by state agencies, which are often subsumed under the term “government.” Whether one wants to also call a complex around intentional community rules and enforcement measures a type of “government” or not is beside the point. There may be valid reasons for either using or not using that word, provided suitable definitions and qualifications are set out.

In this case, it is analytically valuable to be able to note how Murrayville is free of rules that specify restrictions on the existence or range of wages in its labor market. Murrayville might therefore be described within this context as having a labor market free of intervention—unlike Bernieland and MagicCorner. Considering this difference alone, one would expect Murrayville to therefore have the best functioning labor market of the three, with more ample employment opportunities for those aiming to work on a wage basis.

The fact that all participants in all three communities voluntarily join and agree to the respective terms of each does not alter the economic distinctions between their differing labor market rules. Even though all three communities are voluntary, it remains that only one has a minimum wage, another bans wages, and a third does neither.

Arguing that the term “intervention” can only apply to state agency actions does not aid in the economic analysis of wage rate restrictions within these voluntary intentional communities. One might try to suggest a better term to use here instead of intervention. However, since the effects of wage restrictions have already been analyzed under the rubric of state-made laws described as “interventions,” using established terms—with suitable qualifications, as was done—easily accesses the appropriate implications.

Now in an effort to compete for residents, each community launches its own altcoin. Berniecoin does not allow any transaction with a fee above 1.5 Bernielashes/byte to be mined. This seeks to create a price ceiling for transaction inclusion. No one can pay more within the protocol. No one can use greater wealth to supersede other transaction senders. MCcoin’s protocol includes no way for transaction fees to be included at all; no one can bid for priority by including a fee. Finally, Murraycoin does neither. Transactions with any fee, or none, can be sent, and each miner is free to include or exclude any of these. Each node is likewise free to either relay any of them or not, or to try to figure out some ways to monetize such services.
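To make the Berniecoin rule concrete, here is a hypothetical sketch of how such a fee ceiling could be written directly into block validity rules, so that no individual miner could simply ignore it the way a steelmaker might ignore a mere “recommendation.” All names and numbers here are invented for this thought experiment:

```python
# Hypothetical sketch (names and units invented for the thought
# experiment): a fee ceiling encoded as a consensus rule. A block
# containing any over-ceiling fee is invalid, so no miner can profit
# by accepting a higher bid -- other nodes would reject the block.

MAX_FEE_RATE = 1.5  # Bernielashes per byte: the encoded price ceiling

def block_is_valid(transactions):
    """Reject any candidate block containing an over-ceiling fee bid."""
    for tx in transactions:
        if tx["fee"] / tx["size_bytes"] > MAX_FEE_RATE:
            return False  # the rest of the network refuses this block
    return True

# A sender offering more to jump the queue is simply shut out:
txs = [{"fee": 300, "size_bytes": 250},   # 1.2/byte: allowed
       {"fee": 500, "size_bytes": 250}]   # 2.0/byte: over the ceiling
print(block_is_valid(txs))  # False
```

The enforcement mechanism here is the validity rule itself rather than any external agency, which is precisely what makes protocol-encoded restrictions a novel case.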

Once again, based on this alone, Berniecoin and MCcoin demonstrate forms of what has heretofore been best characterized as “market intervention” within their respective communities. In this case, their protocols specify this directly. Murraycoin alone is free of any such effective intervention in its transaction-inclusion market. The others have policies that place a ceiling on the payment of transaction fees. The voluntary nature of participation in all three does not alter this distinction. One cryptocurrency has a maximum transaction fee, another bans fees, and the third does neither. These respective encoded policies are indeed part of what users implicitly choose when they use one rather than another. Nevertheless, distinct economic and social implications follow from those differences, and do so apart from any beliefs or wishes as to the nature of such implications.

This price-ceiling example demonstrates the general applicability of market intervention analysis within the context of voluntary arrangements. With the issue of a block size limit that restricts normal transaction volume, the relevant concept is not a price ceiling, but an output ceiling.

How to have a cartel without forming one

A subtler misconstrual of my interview assumes that I argued that since a particular situation or dynamic exists, someone must have acted to bring it about. However, I made no mention of any specific persons or groups, nor did I attribute any intentionality or motive. If there is thunder, it does not necessarily follow that Thor must have hammered it out.

Instead, I identified a market. I noted an effective limit to industrywide service provision as actual market volume begins to interact with a limit long in place, but formerly inert for this purpose. I described some of the general effects of any such limit to the extent it actually begins to limit ordinary volume. I argued that these effects are negative, but also easy for observers and participants of all kinds to miss or underestimate because they entail hidden costs and distort industry structure evolution from paths it could have taken instead, but did not, thus rendering those possibly better alternative paths “not seen” in Bastiat’s sense.

Certain economic effects follow from output ceilings and these have commonly been analyzed in terms of cartel situations. Yet this implies no necessary argument that anyone has set out to form a cartel or to create any of these situations or dynamics. That would be a completely different argument, more journalistic in nature and evidence requirements.

Being encoded in a protocol is a new way for an output ceiling to exist. Normally—but not in this case—any given industry actor, either current player or potential entrant, could just violate such a ceiling unless facing some overt or threatened form of legal or quasi-legal enforcement. Consider post-war Japanese steel production. An industrywide output ceiling was maintained for many years to limit competition. The Ministry of International Trade and Industry “recommended” this as a “voluntary” measure for domestic steelmakers. Of course, when some rebels sought to exceed the limit, MITI simply refused to approve their requests to purchase more iron ore and fuel, which it also oversaw. Only through MITI could such a limit be maintained.

This type of limit sets up an upside-down and sub-zero-sum dynamic in an industry. There are concentrated gains for the inefficient (who should otherwise probably quit and sell off assets), somewhat less concentrated losses for the more efficient (who are unable to expand as much), hidden losses for would-be entrants (who are never seen because they avoid entering a market with an arbitrary ceiling), and dispersed and nearly invisible losses for many anonymous end users (who mostly have little clue about any of this and how it is happening at their own expense). Once again, though, all this can be so regardless of anyone’s knowledge or intentions.

To say with regard to the block size limit that there exists an industry situation with effects like those of an enforced cartel does not necessarily also imply that 1) some people set out to create it, or that 2) all or even any such people actually benefit from it on balance, or that 3) any of them fully understands it. Each actor has his own intentionality and working models of causality, but all of this combines into social outcomes that result, but were not necessarily planned from the outset to take the forms taken. Describing such unplanned social effects, Adam Ferguson wrote in 1767 that, “nations stumble upon establishments, which are indeed the result of human action, but not the execution of any human design.”

That said, noting the social science concept of spontaneous emergence as one factor to consider does not also constitute a claim that certain effects have not been planned or that they do not actually produce special interest benefits for some at the expense of others. It only points out that any such intentions and plans as may or may not exist are not directly relevant to the comparative analysis of rule effects. The topics are distinct.

Block Size Political Economy Follow-Up 1: Software Choice, Market Differentiation, and Term Selection

An interview with me on the Bitcoin block size limit appeared on 4 May 2016 on Bitcoin.com. Below, I develop additional clarifications and examples partly inspired by a range of comments and reactions to it. This is meant to build on and develop ideas in the original interview. For ease of reference, here is a PDF version of that interview.

This is a three-part series. Part 1 below covers a range of issues including the need to differentiate the market that was discussed in the interview from other distinct markets and non-market choice phenomena such as free software selection. It also begins to discuss the use of the term market intervention in this context. Part 2 will then continue by arguing that neither the voluntary nature of cryptocurrency participation nor the subjective nature of user preferences nor any alleged motivations on the part of the various actors involved alters my analysis. Finally, Part 3 will focus on economic distinctions between the 21-million bitcoin production schedule and the block size limit, arguing that these are different in kind and thus poor objects for analogy.

Chicago Board of Trade: People buying and selling form a market. Prices are key artifacts that market processes leave behind.

Two markets and a non-market choice sphere

One idea that showed up in comments was that I had expressed some view as to which Bitcoin software one ought to run. However, I did not address this at all. I have only published one previous preliminary article on the block size limit, on 20 June 2015, and this also did not mention implementation choice. Various views on this topic do not alter my analysis of the topics that I did address.

A related idea is that the current dominant software implementation already reflects “the choice of the market.” Therefore, any discussion of differences between a cryptocurrency having or not having a given block size limit is moot: the “market” has already spoken and this is evident in implementation share statistics.

It should be cautioned, however, that software choice reflects many considerations. Interpreting it as a proxy for a single issue is imprecise. Such choices may well reflect a generalized confidence in perceived quality and reliability. A user could therefore make a particular software choice either: 1) because of one specific code issue, 2) despite that same particular issue, or 3) regardless of it.

Such imprecision and ambiguity are among the reasons I did not discuss this matter at all. A more fundamental reason, however, is that it has no bearing on my analysis. Whether some percentage of a given population prefers Pepsi or Earl Grey tea does not alter the composition of the respective beverages in the slightest way, nor their respective effects on metabolism. Such things can be studied and assessed independently of the current statistical shape of user preferences.

In addition, choice of which free software to run does not really constitute a market, except in a metaphorical sense. Developers offer software products and users select and run such products. In a free software context, nothing is bought or sold between these groups. No price signals exist directly between users and developers.

In contrast, the central topic I addressed—the market for the inclusion of transactions on the Bitcoin blockchain—is indeed a market, one that involves quite different roles and actions than producing or running one version or another of free software. This is a market in which bidders send transactions, which takers (miners) either include in or exclude from each respective candidate block. This market involves specific senders of specific transactions (not senders in general of transactions in general). At the other end, specific miners build each of their respective candidate blocks. In deciding whether to include any, all, or some transactions, the fee/byte bid is salient. Node operators act as key intermediaries, like referring brokers, currently uncompensated. On-chain and off-chain transacting options, both existing and potential, coexist in this context in a complex blend of competition and synergy.
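This bidding dynamic can be put in stylized form. The following toy model is a simplification (real miner policies also weigh factors such as transaction dependencies); it shows a miner filling a size-limited candidate block from the highest fee/byte bids downward:

```python
# Stylized model of the transaction-inclusion market: senders bid via
# fee per byte, and a miner fills a size-limited candidate block
# starting from the highest bids. A simplification of actual miner
# policy, shown only to make the bid/take roles concrete.

def build_candidate_block(mempool, max_block_bytes):
    """Greedily select transactions by fee rate until the block is full."""
    by_fee_rate = sorted(mempool,
                         key=lambda tx: tx["fee"] / tx["size_bytes"],
                         reverse=True)
    block, used = [], 0
    for tx in by_fee_rate:
        if used + tx["size_bytes"] <= max_block_bytes:
            block.append(tx)
            used += tx["size_bytes"]
    return block

mempool = [
    {"id": "a", "fee": 100, "size_bytes": 200},  # 0.50/byte
    {"id": "b", "fee": 900, "size_bytes": 300},  # 3.00/byte
    {"id": "c", "fee": 150, "size_bytes": 500},  # 0.30/byte
]
# With room for only 600 bytes, the lowest bid is left waiting:
chosen = build_candidate_block(mempool, 600)
print([tx["id"] for tx in chosen])  # ['b', 'a']
```

Once the size ceiling binds against ordinary volume, as in this toy run, some willing bidders are excluded per period no matter what they offer, which is the effect analyzed in the interview.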

There are therefore at least several phenomena to differentiate. First, the buying and selling of bitcoin forms textbook markets on the order of commodities and forex markets. Those effectively controlling given bitcoin units can sell such control in exchange for some other money unit, product, or service, or give them away as gifts. Second, bidding for on-chain transaction inclusion and miner decisions to include or not include transactions in candidate blocks forms a distinct open-bid market for on-chain inclusion priority. Third, developers offering free software and users making decisions on which implementations to run for their various purposes does not constitute a market in the sense of a complex of buying and selling behavior.

Whatever one may choose to call these three phenomena, each is meaningfully distinct from the others, describing different sets of actions and roles. To claim that “the market has spoken” in the context of software choice is therefore far less informative than it might at first appear to be. Making such a claim requires specifying what exactly has allegedly spoken (it isn’t a market) and the content of this purportedly speaking thing’s alleged message (ambiguously mixed with considerations such as general perceptions of code reliability).

The term “market intervention”

Several commenters took issue with my use of the term market intervention in this context. It is true that market intervention has a negative connotation for many readers, though not all. Indeed, a great many persons eagerly advocate some form of governmental intervention in economic affairs as part of their ordinary political opinions. Still, one interpretation would be that I had set out to create negative connotations and thus arrived at my word choice using rhetorical criteria.

A different interpretation would be that I set out to select the most accurate available technical term to describe the phenomenon under consideration. I then specified what I meant in using this term and excluded certain inapplicable historical and institutional associations. This is my own first-hand interpretation of what I did in selecting this language. That it still has negative connotations for some may be natural in that what it describes has negative effects. However, word choice one way or another does not alter such effects.

Another related but more substantive criticism that appeared in several variants argues that a block size limit is just a qualitative characteristic of a cryptocurrency as a good. A given limit is baked into what the good is. As such, it cannot be construed using the model of economic intervention. If a characteristic is already in the product, how could it possibly be construed as intervention (from outside)?

However, I had already stressed in the interview how novel and unprecedented this situation is. My argument was that even though the legal and practical contexts of traditional interventionism conducted by state agencies are completely different, the economic effects on this transaction-inclusion market are nevertheless like those a government-enforced industrywide output ceiling would produce. This will be addressed further in Part 2.

A commenter suggested that I was arguing from history that the current block size limit was not part of “consensus.” Consensus, in this debate, often seems to transcend a mere computer science fact to also encompass an allusion to a hard Bitcoin Realpolitik. Any other considerations, such as the documented history of the block size limit, are irrelevant to this current reality.

However, I did not reference or use any concept of consensus at all. Nor did I question the reality of any given state of consensus on the network at any given time. What I did was analyze differences between possible states of code and then describe economic and social implications of such differences.

A loosely related idea was that my analysis was tantamount to advocating that cryptocurrencies should not maintain any limits or standards. If calling into question one sort of limit, such as the current Bitcoin block size limit, why not just question all limits? Why not just also advocate raising the maximum coin count? That, after all, is also a “limit,” so why not call keeping that in place an “intervention” too? This will be addressed in greater detail in Part 3.

The interview itself concerned one such limit and not any others. Why? I could have branched off to discuss the sociology of decision-making or described a software preference. But I did no such things. I could have discussed any other protocol characteristic or issue. Why did I discuss only this one? The answer is that I think this limit has unique economic features that are both important and poorly understood. Explaining this was therefore the focus.

Continues with Part 2: Market Intervention through Voluntary Community Rules

Only the faster profits: A powerful health measure and why it is unadvertised

My journey in nutrition science studies and personal nutrition practice over about the past six years has been characterized by “punctuated equilibrium,” long periods of stability, with minor updates from my readings and small alterations to practice. But every couple of years, it seems, such equilibrium is slammed into a rather different shape over just a few days.

What follows is about a book that just did this. It has not overturned anything I was doing before, but has lifted my understanding and led me to try some important practice modifications.

Dr. Jason Fung has produced a new book that is vitally important, well written, argued from the highest quality available evidence, and not lacking in careful doses of wit and humor. This is not just another weight to further depress already strained diet-section bookshelves; it is a brilliant yet concise scientific integration delivered so that a general audience can also benefit directly.

The Obesity Code (March 2016; foreword by the legendary Professor Tim Noakes) states, and largely follows through on, a preference for rigorous controlled human trials over the kinds of associational, epidemiological, and often scientifically weak “studies” (sometimes of a few rats) that typically grab headlines with hyped and unwarranted inferences. The book’s central theory does what a good scientific theory should. It explains all the relevant high-quality evidence in a systematic, logical, and accessible way. It also addresses the oversupply of low-quality evidence and non-evidence that leads so many astray. For hardcore readers, the endnotes run 32 pages, no small proportion of which are research journal citations.

Context: Before I read this book

In October 2010, my long-term general interest in healthful nutrition jumped to the next level when I read The Primal Blueprint by Mark Sisson. This kicked off some major personal changes and a side quest to read in nutrition and exercise science to examine controversies with practical implications for what I decide to do in my daily life.

The intellectual side of this journey included Good Calories, Bad Calories (2007) and Why We Get Fat (2011) by Gary Taubes; numerous books and articles by Robb Wolf, Loren Cordain, and others; biochemical metabolism research; and evolutionary health reasoning and related paleo-archeological controversies.

The next major step came in 2013, when I shifted to a ketogenic approach based on the work of Jeff Volek and Stephen Phinney, two career researchers and pioneering experts on nutritional ketosis and exercise performance. Compared to the Primal Blueprint framework recommendations, this entails reducing daily carbs further to under 20g and increasing natural fats to replace that sugar energy while maintaining moderate protein. This is often labeled “low-carb, high-fat, and moderate protein,” or LCHF. This is not your cringeworthy ketogenic lab-chow from classical research and medical use. It is all quite real food.

To assemble my own thoughts from such widely varied sources of research, inspiration, and practice, I created a webpage called Evolutionary Health. There I summarize the current state of my views and link to standout resources. I update this from time to time with information new to me, and refinements of my working synthesis. That page includes material on food production and environment, particularly desertification. It now includes multiple references to Fung’s work.

Until now, if asked what to read for ways to improve health through nutrition, my top starter book recommendations have been The Primal Blueprint, mentioned above, and The Art and Science of Low Carbohydrate Living (2011) and The Art and Science of Low Carbohydrate Performance (2012) by Volek and Phinney. I then recommend The Big Fat Surprise (2014) by Nina Teicholz, another great contributor in the tradition of Taubes—exposing the modern nutrition emperors to be shockingly underdressed. This adds a larger scientific and historical context, including how modern conventional wisdom on nutrition has been formed: far more by politics, loose intuition, and charisma than by legitimate scientific evidence.

Now, however, I might start people right off with The Obesity Code.

Pinpointing the root of metabolic syndrome

What causes obesity? What are the best weight control practices? Everybody thinks they know the answer. Fung demonstrates that this “everybody,” such as it is, remains quite confused.

The book presents a single central theory of overweight. While this extends to diabetes and metabolic syndrome more generally, the book focuses on overweight as the epicenter of the modern long-term degenerative symptom cluster. It argues that the central underlying phenomenon in obesity is insulin resistance. Successful treatments, especially if they are to have lasting healthy effects, must lower insulin resistance.

Insulin resistance is analogous to drug tolerance. The more of a drug one has taken over a longer period, the higher the dose needed for a similar effect. Likewise, the more time the body must swim in evolutionarily novel quantities of insulin, the more likely it is to raise its resistance. Such resistance is also stubborn; it rises much more easily than it falls. A self-reinforcing pattern of elevated insulin and elevated resistance begins. When insulin-producing beta cells can no longer keep up in this death race and begin to fail, we call that “type 2 diabetes.” The conventional treatment? Just inject more insulin; the race must go on. But the patient keeps deteriorating.

Genetic differences and age both impact individual insulin resistance response. This helps explain wide variations among people eating similarly and for the same person at different ages. This insight rescues a too-simple carbohydrate-obesity theory from the obvious rebuttal: just point to some carb-eating thin people. The book also emphasizes the better-known distinction between the effects of carbs in natural forms versus those in modern processed and refined forms.

But first, how did we get here and why are we still here?

It would be relatively simple to explain some measures to lower insulin resistance, such as some of those practiced at Fung’s Intensive Dietary Management program. However, the complication he faces, and faces up to squarely in this book, is that entire industries, bodies of officialdom and authority, and entrenched conventional wisdom all combine to promote and sell methods that either do not reduce insulin resistance, or raise it still further. Treating advanced type 2 diabetes with insulin injections is partly comparable to treating an advanced alcoholic with a steady rotgut supply. It patches some symptoms, even as it gradually worsens the condition and leads to further deterioration.

Official bodies and industry interest groups have pushed failing methods and theories relentlessly for decades (whether intentionally or unknowingly does not change the outcomes). Massive failures to promote health never dissuade; more of the same is always their answer. Some “success,” however, is still visible. It shows up in untold billions on the income statements of 1) ag and food companies selling profitable processed products that gradually sicken people and 2) pharma and healthcare organizations producing products and services to treat the resulting chronic degenerative symptoms, mostly without addressing causes. With causes untreated and the sick getting sicker, the massive sums involved not only keep flowing, but keep expanding.

The book must therefore also take the time to expose and refute common, widely accepted, well-funded, officially promoted, and dead wrong claims and practices. In each case, it demonstrates how the highest quality available evidence, common experience, and logic show that conventional weight management methods fail—and that they fail is probably the best that can be said of them.

Don’t just do something, stop

The book’s most important practice implication is less about food and more about the need for its periodic absence. In health, politics, and some other fields, people tend to respond to serious problems with a somewhat desperate “just do something” attitude. But the most helpful measure might instead be to stop doing something. Rather than “solving” a problem, what may be required is to stop creating its causes. In this case, if there is too much eating too often, stop doing it. And there’s a word for that—fasting.

A fasting period is nothing more than the time between eating sessions. Longer pauses begin to take on names such as intermittent fasting (IF), and still longer pauses are simply called fasting. So in this sense, Fung reassures, everyone fasts already. The variation is in how long and how often. Fasting’s true opposite, it comes to appear, is frequent snacking.

Fung notes that fasting has been promoted and practiced through cultural traditions the world over for thousands of years. (That, I would add, might mostly just reflect the duration of available records.) Fasting has been promoted for health, clarity of mind, and spiritual refinement, often carried through religious practice traditions.

He also does not shy away from explaining that fasting and IF are unique in important ways from a politico-economic standpoint. The person who fasts benefits substantially, and the corresponding cost is less than zero: he saves both money and time. He gains freedom through reduced frequency of buying food, preparing it, eating it, and cleaning up, which can add up to large blocks of time and attention.

For example, I have moved mainly to a 23-hour daily fast framework for now (with occasionally longer stretches as well). This simply means eating one meal a day during an approximately one-hour period. Simple as can be. I may next try alternate-day fasting (eating normally one day and not at all the next day) to compare the effects. The latter pattern has been commonly employed in research trials.

The implication is that no one else besides the person fasting stands to profit from it. Only the faster profits. No pharma company sells more of its drug (some may sell less). No food company sells more of any boxed creation (some may sell less). No elaborate diets must be studied and followed, no calorie counting apps employed, no juicing machine bought and fed with plant carcasses, no special shopping list assembled, no exotic ingredients ordered online.

Of course, Fung, a practicing physician and kidney specialist, is careful to warn that at minimum those already on metabolic medications, foremost insulin, must work closely with a physician. This may entail careful adjustments, which should be done only under proper supervision. Significantly low blood sugar is a particularly dangerous condition that can follow from mis-coordination of drug dosages with current health state and eating patterns.

Fasting versus calorie reduction

This book clarifies that just “eating less,” as a method, does not deliver the positive effects of fasting; it has opposite effects on the all-important regulatory hormones. Under calorie reduction, metabolism drops to compensate for the stable lower-energy environment. Metabolic rate then stays lower long afterwards, which explains both stalling progress and later regain.

With true fasting, however, metabolism either stays level or increases. This seems congruent from an evolutionary standpoint. A few days of bad hunting (no food at all) means it is time to get out there and hunt, and do it more effectively than before. Sitting in the cave and getting cold, moody, and depressed is not going to help.

Likewise, the book recommends eating normally (though ideally also low carb) when one does eat. That means not being hungry after the meal, as can happen under conscious calorie-cutting methods. Readjusting the modern unnaturally feasting-heavy “feast and famine” balance away from too much feasting should not, in this view, entail skipping the feasting parts altogether, just extending the fasting phases.

The author emphasizes the distinction between lowering insulin and reducing insulin resistance. Just lowering insulin by changing food content might help, but might not always be enough to fully reverse an existing condition. Chronically high insulin is among the causes of elevated insulin resistance, but influencing insulin resistance itself must remain the real prize. A focus on insulin, per se, then is one way to get off track, a false summit.

The book discusses effects on lean mass. The trial research again shows that fasting has important effects that are opposite to those of calorie reduction within conventional meal timing patterns. It is calorie reduction that leads to lean wasting (“starvation mode”), while fasting does not. Fasting stimulates junk protein breakdown for recycling as well as human growth hormone release, a build-oriented combination. A steady calorie reduction program never gets around to these things. All the way down to actual severe starvation, it never generates the hormonal, metabolic, and cognitive benefits of fasting.

Some other nods to tradition

The book also mentions how certain traditional practices hold up well when judged against the insulin-resistance theory. Eating together at mealtimes, and not in between, automatically sets up longer fasting periods. This is just the opposite of the frequent eating and snacking practices that snack sellers push.

Likewise, widespread traditional uses of vinegar and fermented foods are given a nod based on experimental evidence that vinegar moderates insulin response. For example, the penchant of Japanese cuisine for combining rice with pickles and making sushi (vinegar-soaked rice) likely affords some protection from rice’s insulin-spiking characteristics.

Such factors may help further clarify the “Asian rice paradox.” A simple carbohydrate-obesity theory struggles to explain why East Asians eating large amounts of rice did not become obese in the 20th century. Traditional eating patterns, activity patterns, and food combinations may well all have contributed. Genetic influences on insulin resistance are also possible contributors.

More recently, however, these same populations have started gaining weight, and diabetes is on the rise. This coincides with increased consumption of sugar, flour, and other processed foods, greater fast food intake, more sedentary occupations, and a snacking culture that can spread with processed snack food marketing and distribution. Not only do snack foods (and with them snacking) tend to shorten traditional fasting periods, but most of these items are made almost exclusively from insulogenic processed derivatives of cheap (and often government subsidized) agricultural grain crops, foremost sugar, wheat, and corn.

Optimization, and the final defeat of the “thermodynamics” refrain

For established low-carbers still not entirely happy with their body compositions and looking for more optimization (like me), Fung argues that while LCHF is a powerful approach, it is not the most powerful one. Each food, except perhaps pure refined fat, generates some insulin response, though this varies depending on the food. Regardless, there is no way to beat fasting at getting insulin down to rock bottom and keeping it there for long stretches, providing an environment in which insulin resistance can also gradually sink.

It is insulin resistance, Fung argues, that directs the body’s fat storage “set point,” the fat composition level the body fights to keep and return toward. In any long-running war against a conscious, conventional “eat less; exercise more” strategy, the body’s homeostatic set point always wins. Cutting calories can appear to win a few battles, but this cannot last. Calorie cutting, depending on what is actually eaten in a given program, can also sooner or later lead to weakness and gradually advancing malnutrition. Worse, the stress of being regularly hungry, cold, and malnourished can backfire further by raising stress hormones—which also stimulate insulin.

The way forward is to address the set point itself, and that means modifying insulin resistance. With this, Fung establishes why and how attempts to reduce weight by merely lowering calories within existing meal patterns fail in the long run, ending in regain, often to a level above the starting weight.

And as for the ever-reliable “but, it’s all just thermodynamics” refrain, which insists that weight control is nothing more than regulating calories in and calories out as in a lab beaker, it is true that caloric balance does change with weight loss following from fasting. However, that change is an effect, not a method. Fung demonstrates how and why methods with long-term success must treat the chronic hormonal condition of insulin resistance. Doing so allows the body’s fat storage set point to fall back to a more natural level to which the body then happily self-regulates.

This means that sustainable changes to caloric balance follow from a set point change but do not necessarily cause it, contra standard advice. The body has far more tricks to fight back with than consciously calorie-cutting dieters can possibly overcome for long. The more they fight using the usual failing methods, the stronger the body’s countermeasures become. Thus, seemingly unassailable advice to “just eat less,” offered as a method for change, is worse than useless. And as Taubes had also argued in Why We Get Fat, naive misapplication of a simple physics concept to a complex homeostatic system serves only to support blaming obesity victims on the basis of scientifically untethered and even primitively moralizing causal theories.

Could be better combined with LCHF literature

Something emphasized in the LCHF literature, but less so in this book, is that being in nutritional ketosis is already a quasi-fasting state compared to the common contemporary glycolytic (“carb burning”) state. It is far easier for those already in nutritional ketosis to simply not bother eating at times. They can start and continue fasting while hardly noticing, especially when compared to typical carb burners in pursuit of their next glucose fix.

People in a dominant glycolytic state transitioning to either nutritional ketosis or to fasting (fasting ketosis) can report similar transitional symptoms and discomforts, such as headaches and low energy. People already in a dominant lipolytic (“fat burning”) state, however, have only to go from nutritional ketosis to fasting ketosis, a far milder transition. Mainly advising fasting for people coming right from a conventional diet could run them into challenges. Starting with nutritional ketosis makes fasting easier.

But beginning either practice still tends to require an initial transition. In favor of a fasting-first approach, fasting is much simpler to execute and monitor. It just involves not doing something. Changing the content of one’s habitual diet entails more ongoing decisions, leaving more room for errors and subtle program regressions.

On balance, both LCHF and fasting are important and mutually reinforcing. Either could come first or they could be adopted together. There are various pros and cons in emphasizing one or the other to newcomers, a question mainly of strategy and practical experience.

An integrative milestone

This book has enabled me to take what information and practices I had already filed away as solid and useful, and revise that totality into a better-integrated picture. This helps me better harmonize contributions from several schools of thought within the broadly defined evolutionary nutrition movement. Fung suggests that some sub-groups that tend to engage in in-fighting are probably just each right about their own particular puzzle piece. Now we get a clearer look at the picture on the box for that whole puzzle at a single glance.

Perhaps the most encouraging message from this book is that, unlike basically every “diet” strategy, there is good reason in existing high-grade research not to expect regain from a fasting approach. Fasting and LCHF to target insulin resistance are quite distinct from the many conscious caloric balance variants that have failed long-term so consistently and so epically for decades. In addition, evidence is also accumulating to indicate likely protective, and especially preventative, effects of fasting on other “diseases of civilization,” including neurodegenerative cognitive conditions, heart disease, and cancer.

We can try to fight the body’s fat composition set point without changing it—and many, many have—but only at great cost and effort and with a near guarantee of long-term failure. A few battles may be won, but the war’s outcome is already clear. The set point wins. Conventional calorie restriction does change the set point—it raises it! This makes apparent temporary successes from calorie-reduction programs Pyrrhic victories.

Armed with methods that can lower the set point instead, we can finally get our bodies and ourselves back on the same side. This is the central message of this brilliant, heroic, and accessible book in a field of crucial importance to human well-being.

Some misplaced explanations of bitcoins as tradable units

This is an excerpt from Chapter 8, “Some illusions of enlightened explanations,” in my book, Are Bitcoins Ownable? Property Rights, IP Wrongs, and Legal-Theory Implications.

As important as it is to gain at least a basic technical understanding of Bitcoin, attempts to describe what its tradable units “really” are, as elaborated from some allegedly more enlightened perch, can sometimes distract more than aid when applying economic and legal concepts. For example, pundits discussing whether bitcoin falls under what they each consider to be “money” or not sometimes explain that bitcoin is really just a “ledger entry” or a “protocol token,” a harmless technical artifact of a promising new “blockchain technology.”

Whatever the root of or strategy behind such discourse, however, a bitcoin buyer does not in fact seek a share in a distributed ledger or any other such tortured monstrosity. He wants to buy a bitcoin in the same sense that he might want to buy a grapefruit. He in no way sets out toward the market to buy a share of a global orchard cooperative that also happens to entitle him to one grapefruit that day.

Molecular diagram of grapefruit mercaptan. Tasty.

Nor is it relevant that a grapefruit is “really” organic molecules, water, and some other substances. For that matter, a physicist might go further and insist that a grapefruit is “really” nothing but some occasional quarks suspended in vast stretches of empty space. All such misused reductionism is irrelevant to understanding the buying and selling of grapefruit. It likewise has no bearing on whether grapefruits can be eaten without being paid for and how or if people ought to react if they are.

Really just quarks and empty space (Wikimedia Commons, Aleph)

Economic theory and legal theory are fields concerned with human acts, such as acquiring, holding, trading, and stealing. Action is marked by verbs. If one is interested in understanding the grapefruit market, one does not seek first to master grapefruit-tree cellular biology, let alone quantum mechanics. It is sufficient for economics to view those grapefruits actually being traded as the relevant goods, the production, pricing, and distribution of which are to be examined using economics methods.

This implies the importance of taking care in selecting which fields of knowledge, aspects of the phenomenon, and “layers” of reality are the most relevant to consider in understanding what bitcoin “really” is, including with regard to whether it is ownable.

One must also proceed with caution in applying analogies. For example, it is easy to view bitcoin as just like other digital blips buzzing around the internet. However, it should be emphasized that buying a bitcoin is not like buying other digital goods, such as a copy of a song file. One does not buy a copy of a bitcoin, but a bitcoin itself. A bitcoin seller no longer possesses the bitcoin in question after the sale (and contextually sufficient confirmations). When one buys (a copy of) a song file, in contrast, the possessor retains copies from which to make more copies.

Most digital goods, such as documents and song files, are nonrival. They can be copied. Multiple people can use multiple copies simultaneously. “Stealing a copy” leaves the original as it was. It is not gone after being “stolen.”

Likewise, not only can a whole blockchain be copied, but some key part of its value derives from its actually being so copied and distributed with redundancy to numerous independently operated locations. A signed bitcoin transaction is also a short digit string that can be copied, sent, and resent around the globe in fractions of seconds. These are nonrival goods, as are cryptographic signing keys. With nonrival goods, one person can have one copy and another person can have another copy and each person can control these respective copies independently and simultaneously.

However, this is not the case with bitcoins. A bitcoin cannot be copied in any such way. It is rival in the same sense as a physical object or spatial location. In addition, a bitcoin cannot be sufficiently described as “just a ledger entry” because a ledger entry records something. This formulation alone does not yet explain what it is that is recorded.

From a unit perspective, bitcoins function as a digital monetary commodity according to strict economic-theory definitions. From an integral perspective, the units are inseparable aspects of the Bitcoin blockchain. They cannot exist without it and it does not exist without them. There is a nondualistic relationship between bitcoin units and the Bitcoin blockchain; while they are distinguishable conceptually, they are not separable in reality.

Announcing new book on bitcoin and legal theory

The first of several concurrent research and writing projects has just hatched: Are Bitcoins Ownable? Property Rights, IP Wrongs, and Legal-Theory Implications.

This is a study in the foundations and implications of action-based jurisprudence, forged through applying it to bitcoin. This brings together for the first time the two major fields on which I have been writing over the past five years.

The context includes relationships among crypto-anarchist thought (such as contract assurance through software code), conventional legal administration (bureaucratic classificationism and rule through law), and ideal legal practice (actual promotion of justice), as well as related philosophical issues such as the combined use of multiple knowledge fields and the ethics of legal practice. Among the book’s central themes is whether and how the same principles that both support property rights in measurable objects and locations and argue against IP claims in copiable ideas and abstractions may apply to the unique new case of bitcoin.

Here is the back-cover description:

Bitcoin has fresh implications for economics and law at many levels. This book addresses whether bitcoins ought to be considered ownable under an action-based approach to property theory, which—like bitcoin itself—transcends the boundaries of existing positive law jurisdictions. Beyond instinctive answers is a rich opportunity to examine the many technical facts and legal-theory issues involved. Bitcoin has a unique new place among types of economic goods, between the physically and spatially defined goods of property theory and the copiable, abstract ideas, patterns, and methods associated with IP rights. It does not fall so easily into existing categories.

The author brings together here for the first time his work in an approach to legal philosophy grounded directly in the analysis of human action, which he has termed action-based jurisprudence, with his several years of writing about bitcoin from a monetary theory perspective and contributing through articles, presentations, and video productions to raising general public understanding of how Bitcoin works on a technical level.

This content (22,000 words) is licensed under Creative Commons. It is available in commercial paperback and Kindle versions on Amazon, in other ebook stores, and as a free PDF of the paper version to facilitate quick and full access to the text, previewing, sharing, text searching (beats an index), quoting, and citation by page number.

Ways to support this work and encourage future work like it include spreading the word and sharing, writing reviews on Amazon and elsewhere, posting quotations, and buying a commercial edition.

Most of all, enjoy. Hopefully, no reader’s views on the topics addressed will remain entirely unaffected. Mine were not.

Paperback edition at Amazon ($6.99)

Ebook stores ($2.99): Kindle edition (free under Kindle MatchBook program for buyers of paper version), iBooks, Kobo, Nook, Oyster, Page Foundry, and Scribd.

PDF of paperback edition (Free supplement to commercial editions or consider sending an optional bitcoin tip)

Watch the five-minute video introducing the book on my Amazon author page, which can also be followed for future releases.

The paperback version is available at least on the US, UK, and EU-area Amazon sites, though I am not sure about availability elsewhere. The Kindle version is available on most national Amazon sites worldwide.

Preview: “The market for bitcoin transaction inclusion and the temporal root of scarcity”

What do you see in those blocks? Source: Wikimedia Commons: “Crown Fountain” by Tony Webster.

I have been considering the Bitcoin block size debate for quite a few months (next to some other large projects), reading, learning, and applying principles. It is such an important and contentious issue that I have taken extra time before commenting at all to research and keep following the wide range of factors, opinions, and related issues.

In seeking to apply economic theory in new ways, and when addressing Bitcoin in particular with it, I try to take even more care than usual to first acquire a sufficient technical understanding so that I can usefully apply such theory to the case. The block size issue has set that bar still higher than it had been with other Bitcoin topics I have addressed.

I am convinced that the roots of much of the contention are primarily economic-theory differences and only secondarily technical or even social ones. Additional issues of governance and decision-making likewise come to the fore mainly when people are severely conflicted on what the right thing to do is and the issues then descend into “political” contests of influence and persuasion. There are also economic ways to understand the kinds of circumstances under which issues tend to become viewed as “political” in nature rather than not.

In short, if it were clear what ought to be done, that could be implemented with some work. Yet not only has widespread consensus on the right thing to do been slow to arrive, but the disagreements appear rooted more in differing opinions on economics, a specialized field entirely distinct from engineering, programming, and network design. Worse, too much of what passes for “economics” in the official mainstream today has been built upon a foundation of long-refuted nonsense. So using that is unlikely to help matters along either.

A 30-page written treatment is in the editing and review phase. For now—in response to numerous behind-the-scenes requests for comment—here is a summary preview of some of the essentials of my take on this as of now. The forthcoming paper contains citations, support, and step-by-step context building and also covers many more related topics than this summary can touch on.

Summary of some findings

The block size limit has for the most part not ever been, and should not now be, used to determine the actual size of average blocks under normal network operating conditions. Real average block size ought to emerge from factors of supply and demand for what I will term “transaction-inclusion services.”

Beginning to use the protocol block size limit to restrict the provision of transaction-inclusion services would be a radical change to Bitcoin. The burden of proof is therefore on persons advocating using the protocol limit in this novel way. This protocol block size limit was introduced in 2010 as an anti-spam measure. It was to be an expedient to be removed or raised at a later stage as normal (non-attack) transaction volumes climbed. It was not envisioned as having anything to do with manipulating transaction fees and transaction-inclusion decisions on a normal operating basis. The idea of using the limit in this new way—not the idea of raising it now by some degree to keep it from beginning to interfere with normal operations—is what constitutes an attempt to change something important about the Bitcoin protocol. And there rests the burden of proof.

If that burden is not met, the limit ought to be (have already been) raised—by some means and by some amount. Those latter details do veer more legitimately into technical-debate territory (2, 8, or 20 MB? New fixed limit or adaptive algorithm? Phased in how and when? Etc.), but all such discussions would be greatly facilitated by a shared context on the goal and purpose of any such limit having been placed into the code. A case for establishing some completely new reason to retain this same limit—other than as an anti-spam measure—would have to be made by its advocates if they were to overcome the default or “when in doubt” case. The context shows that this when-in-doubt default case is actually raising the limit, not keeping it unchanged.

Casual and/or rhetorical conflation of the block size limit with the actual average size of real blocks is rampant. This terminological laziness skips over the key questions: whether any natural operational economic constraints on block sizes exist (or could become even more relevant in the future), what those natural constraining factors might be, and what degree of influence they might have on practical mining business decisions. In strict terms, nothing can be done without some non-zero cost. For example, including a transaction in a candidate block carries some non-zero cost, and larger blocks propagate more slowly than smaller ones, other things being equal.

How can the real influences of such countervailing factors be discovered within a dynamic complex process? Markets and open competition excel at just this type of unending trial-and-error tinkering problem. However, setting a blanket restriction at an arbitrary numerical level on the output of transaction-inclusion services across the entire network distorts such processes, preventing accurate discovery and inviting both general economic waste and hidden zero-sum transfers.

Transaction-fee levels are not in any general need of being artificially pushed upward. A 130-year transition phase was planned into Bitcoin during which the full transition from block reward revenue to transaction-fee revenue was to take place. The point at which transaction-fee revenue overtakes block reward revenue should not have been expected to arrive any time soon—such as within only the first 5–10% of time that had been planned for a 100% transition. Transaction-fee revenue might naturally come to exceed block reward revenue in say, 20, or 30, or 50 years, or whatever it ends up being. Yet even that is still only a 50% milestone in the full transition process. Envisioning the long-term future of mining revenue should also factor in the clear reasons for anticipating steady secular growth in real bitcoin purchasing power.
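The arithmetic behind that planned transition can be sketched from two protocol facts: the block subsidy started at 50 BTC and is cut in half every 210,000 blocks (roughly every four years at the ten-minute block target). A minimal Python sketch, using the integer right-shift the reference client applies to the satoshi-denominated subsidy, shows how the roughly 130-year emission span falls out of those constants:

```python
# Sketch of Bitcoin's block-subsidy decay schedule, from two protocol
# constants: a 50 BTC initial subsidy and a halving every 210,000 blocks.
# The subsidy is tracked in satoshis and halved by integer right-shift,
# as in the reference client, so it eventually rounds down to zero.

SATOSHI = 100_000_000          # satoshis per bitcoin
HALVING_INTERVAL = 210_000     # blocks between halvings
MINUTES_PER_BLOCK = 10         # target average block time

def subsidy_after_halvings(n):
    """Block subsidy in satoshis after n halvings."""
    return (50 * SATOSHI) >> n

# Count halvings until the subsidy rounds down to zero satoshis.
halvings = 0
while subsidy_after_halvings(halvings) > 0:
    halvings += 1

years = halvings * HALVING_INTERVAL * MINUTES_PER_BLOCK / (60 * 24 * 365.25)
print(f"Subsidy reaches zero after halving #{halvings}, "
      f"about {years:.0f} years from launch")
```

Running this yields 33 halvings and roughly 132 years, which is why a launch in 2009 puts the end of block-reward revenue around 2140, and why even a mid-century crossover of fee revenue past reward revenue would mark only the halfway point of the planned transition.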

Most fundamentally, scarcity is being treated in this debate largely using an intuitive image of “space in blocks.” However, scarcity follows from the nature of action as inevitably occurring within the passage of time. Actors would like to accomplish their objectives sooner rather than later, other things being equal. Time is the ultimate root and template for scarcity, because goods are only definable in relation to action and any action taken precludes some possible alternative action (“cost”). Scarcity of transaction-inclusion should therefore be understood in terms of relative time to confirmation—which is already today statistically influenced by fee levels.

Finally, discussions of whether bitcoin should or should not be used for “buying coffee” sound embarrassingly like Politburo debates. Market discovery through real supply, demand, and pricing over time allows socially best-possible levels of [average fee multiplied by transaction volume relative to real bitcoin purchasing power] to be discovered dynamically at any given point in (in-motion) time. The same goes, at the same time, for the relative pros and cons for users of the entire possible existing and future spectrum of off-chain transaction options relative to on-chain ones. The protocol block size limit was added as a temporary anti-spam measure, not a technocratic market-manipulation measure. The balance of evidence still seems to indicate that it should remain restricted to its former role.

Bitcoin as a rival digital commodity good: A supplementary comment

Japanese commodity money before the eighth century. Source: Wikimedia Commons, PHGCOM.

One of the challenges of interpreting bitcoin has been whether it can be classified under certain existing conceptual rubrics such as “money” or “commodity” for purposes of economic analysis. Could it be some strange new kind of “commodity money”? Most people immediately and intuitively dismiss this as a possibility because it is not a physical “thing,” which they feel is a defining characteristic of commodity-ness.

Resort to a word such as “token” seems a convenient escape valve from this situation. However, this could also be misleading. A token in a “token money” context derives its value from having a fixed exchange rate against something else—100 pennies for a dollar, a plastic chip for a euro, etc. Bitcoin, in contrast, is traded directly as itself, with utterly no sign of any fixed exchange or substitution rates (see my “Bitcoin, price denomination and fixed-rate fiat conversions,” 22 July 2013).

My newest paper, “Commodity, scarcity, and monetary value theory in light of Bitcoin” in The Journal of Prices & Markets (Winter 2015) explores some of these issues in detail from a formal conceptual standpoint to check such immediate and intuitive responses. The paper takes the time to define and then apply core economic-theory concepts, including goods, scarcity, and rivalry, as well as classical lists of “commodity money” characteristics, to understanding bitcoin in terms of monetary theory.

True, commodities are usually tightly associated with materiality. However, an economic-theory sense of commodity ought to be differentiable from a physical-descriptive sense. Economics begins with the study of choice and action, as distinct from issues addressed in physical sciences. It may be that the presence of materiality in commodities has simply been assumed due to the nature of the available historical examples.

For a supplemental “reality check” beyond the obscure economics library, I thought to simply go and read the Wikipedia article on “Commodity.” This should be reasonably unlikely to represent any arcane or partisan definitions from one school of economics rather than another, and should first of all represent a general-purpose range of typical current understandings of the term.

I extracted some economic-theory elements from the entry, omitting illustrative examples. The examples are mostly material items, but this is to be expected due to the overwhelmingly pre-bitcoin scope of economic history so far. Indeed, part of my argument is that bitcoin may be the first rival digital commodity good (defined in the paper), which would mean precisely that it is unprecedented, a new type of example. Between the few excerpts below, I relate these presumably mainstream characterizations of commodity-ness to bitcoin.

Extracts from Wikipedia entry on “Commodity”

The exact definition of the term commodity is specifically used to describe a class of goods for which there is demand, but which is supplied without qualitative differentiation across a market. A commodity has full or partial fungibility; that is, the market treats its instances as equivalent or nearly so with no regard to who produced them. As the saying goes, “From the taste of wheat it is not possible to tell who produced it, a Russian serf, a French peasant or an English capitalist.”

No one generally considers which mining pool mined the block that a bitcoin originated in when deciding whether to accept payment. 50 Cent, for example, is unlikely to refuse bitcoin payments for his albums from anyone using coins mined by pools other than 50 BTC.

In the original and simplified sense, commodities were things of value, of uniform quality, that were produced in large quantities by many different producers; the items from each different producer were considered equivalent.

Multiple producers: All the various Bitcoin miners produce interchangeable new coins.

One of the characteristics of a commodity good is that its price is determined as a function of its market as a whole. Well-established physical commodities have actively traded spot and derivative markets.

There are numerous bitcoin spot markets and even some derivatives markets.

Commoditization occurs as a goods or services market loses differentiation across its supply base. As such, goods that formerly carried premium margins for market participants have become commodities, such as generic pharmaceuticals and DRAM chips. There is a spectrum of commoditization, rather than a binary distinction of “commodity versus differentiable product”. Few products have complete undifferentiability.

Coin tracking is sometimes cited as a risk to the completeness of bitcoin fungibility. While fungibility largely holds today, some differentiation could creep in under certain circumstances, moving bitcoin along the “spectrum of commoditization.”

Overall, I thought the entry was surprisingly clear in defining commodity in terms of economic rather than material concepts. While most of the examples of commodity were material, the economic meaning was conceptually independent of materiality. As should be expected, the discussion was about economic issues such as quality differentiation, pricing, market organization, and trading patterns—not chemistry. If we are using a term in economic analysis, a strictly economic definition should be most suitable.

 

Jeff Volek presents research indicating low-carb also best way to improve lipid profiles

For about a century, ultra-low-carb eating has appeared to be the most effective treatment for both overweight and diabetes (both linked to the more general metabolic syndrome), beating all drugs and other interventions by wide margins (on this, see Taubes’ groundbreaking Good Calories, Bad Calories). However, the establishment has resisted or ignored (if not memory-holed) this information for general application, primarily based on separate claims about lipid-profile risk and heart disease. The conventional view has been that these risks outweigh the benefits, so other treatments are likely to be better on balance.

Turning this view completely on its head, in this 1 July 2014 conference presentation, cholesterol researcher Jeff Volek explains how his and related carefully controlled research over the past two decades indicates that ultra-low-carb eating appears to also be the best known intervention for improving lipid profile markers, properly interpreted. He focuses particularly on issues with the measurement, context, and interpretation of LDL-C, and by the end appears to have left the strong conventional view concerning low-carb and lipid profiles in the dustbin of failed scientific claims.

[The talk gets going at about 1:05.]

See my Evolutionary Health page for more perspective and selected references.

Sidechained bitcoin substitutes: A monetary commentary

Abstract

A 22 October 2014 white paper on cryptocurrency sidechains formalizes and advances the innovative sidechain concept and examines pros and cons in terms of both technical and economic factors. The current reply focuses on likely general factors in market valuations of bitcoin-pegged units on sidechains. This is an important topic for clarification as people begin to imagine and work to develop practical uses for sidechains. Assuming that the two-way peg will necessarily assure a matching, or even consistently discounted, market price relative to bitcoin could prove unrealistic. A scenario of independent floating market prices among sidecoins could prevail, with implications for the scope and types of sidechain applications.

Download the seven-page PDF of “Sidechained bitcoin substitutes: A monetary commentary.”

 

Bitcoin: Magic, fraud, or 'sufficiently advanced technology'?

Arthur C. Clarke’s third law famously states: “Any sufficiently advanced technology is indistinguishable from magic.” What Bitcoin makes possible can at first seem almost magical, or just impossible (and therefore most likely fraudulent or otherwise doomed). The following describes the basic technical elements behind Bitcoin and how it brings them together in new ways to make seeming magic possible in the real world.

Clarke’s second law states: “The only way of discovering the limits of the possible is to venture a little way past them into the impossible.” And this, we can see in retrospect, is basically what Bitcoin creator Satoshi Nakamoto did. Few at the time, even among top experts in relevant fields, thought it could really ever work.

It works.

One reason many people have a hard time understanding Bitcoin is that it uses several major streams of technology and method, each of which is quite recent in historical perspective. The main raw ingredients include: an open-source free software model, peer-to-peer networking, digital signatures, and hashing algorithms. The very first pioneering developments in each of these areas occurred almost entirely within the 1970s through the 1990s. Effectively no such things existed prior to about 40 years ago, a microsecond in historical time, but a geological age in digital-revolution time.

Some representative milestone beginnings in each area were: for open-source software, the GNU project (1983) and the Linux project (1991); for peer-to-peer networking, ARPANET (1969) and Napster (1999); for digital signatures, Diffie–Hellman theory (1976) and the first RSA test concept (1978); and for hashing algorithms, the earliest ideas (around 1953) and key advances from Merkle–Damgård (1979). Bitcoin combines some of the best later developments in each of these areas to make new things possible.

Since few people in the general population understand much about any of these essential components, understanding Bitcoin as an innovation that combines them in new and surprising ways, surprising even to experts within each of those specialized fields, is naturally a challenge without at least a little study. Not only do most people not understand how the Bitcoin puzzle fits together technically, they do not even understand any of the puzzle pieces! The intent here is not to enter into much detail on the content of any of these technical fields, but rather to provide just enough detail to achieve a quick increase in the general level of public understanding.

What Bitcoin is about in one word: Verification

It may help to begin by focusing not on the details of each field, but on how each part contributes strategically to Bitcoin’s central function. This is to create and maintain a single unforgeable record that shows the assignment of every bitcoin unit to addresses. This record is structured in the form of a linked chain of blocks of transactions. The Bitcoin protocol, network, and all of its parts maintain and update this blockchain in a way that anyone can verify. Bitcoin revises the Russian proverb “doveryai, no proveryai” (“trust, but verify”) to just “verify.”

If a single word could describe what the Bitcoin network does, it would be verification. For a borderless global currency, relying on trust would be the ultimate bad idea. Previous monetary systems have all let users down just where they had little alternative but to rely on some trusted third party.

First, the core Bitcoin software is open source and free. Anyone can use it, examine it, propose changes, or start a new branch under a different name. Indeed, a large number of Bitcoin variations with minor differences have already existed for some time. The open source approach can be especially good for security, because more sets of eyes are more likely to find weaknesses and see improvement paths.

Open source also tends to promote a natural-order meritocracy. Contributors who tend to display the best judgment also tend to have more of their contributions reflected over time. Unending forum discussions and controversies are a feature rather than a bug. They focus attention on problems—both real and imagined—which helps better assure that whatever is implemented has been looked at and tested from diverse angles.

Many computers worldwide run software that implements the Bitcoin protocol. A protocol is something roughly like a spoken language. Participants must speak that language and not some other, and they must speak it well enough to get their messages across and understand others. New protocols can be made up, but just as with making up new languages, it is usually rather unproductive. Such things only take off and become useful if enough others see a sufficient advantage to actually participate.

Second, as a peer-to-peer network, there is no center. Anyone can download core Bitcoin software and start a new node. This node will discover and start communicating with other nodes or “peers.” No node has any special authority or position. Each connects with at least eight peers, but sometimes many more. Some faster and always-on nodes relay more information and have more connections, but this conveys no special status. Any node can connect or drop out any time and join again later. A user does not have to run a full node just to use bitcoin for ordinary purposes.

It is common to say that Bitcoin is “decentralized” or doesn’t have a center. But then, where is it? Thousands of active peering nodes are spread over most countries of the world, and each one carries an up-to-date full copy of the entire blockchain.

Some nodes not only relay valid transactions and blocks, but also join the process of discovering and adding new blocks to the chain. Such “mining” activities both secure the final verification of transactions and assign first possession of new bitcoin to participating nodes as a reward. Understanding basically how mining works requires a look at the distinct functions of several different types of cryptography.

Bitcoin cryptography dehomogenized

Bitcoin relies on two different types of cryptography, and few people understand either one. Both are counter-intuitive in what they make possible. When most people hear “cryptography,” however, they think of keeping data private and secure through encryption. File encryption can be used to help secure individual bitcoin wallet files, just as it can be used for the password protection of any other files. This is symmetric-key cryptography, meaning the same key is used to encrypt and decrypt (AES256 is common in this role). Encryption may also be used for secure communication among users about transactions, as with any other kind of secure traffic. This is asymmetric-key cryptography, meaning a public key encrypts a message and its matching private key decrypts it at the other end.

However, all of this is peripheral. Nothing inside the core Bitcoin protocol and network is encrypted. Instead, two quite different types of cryptography are used. They are not for keeping secrets, but for making sure the truth is being told. Bitcoin is a robust global system of truth verification. It is in this sense the opposite of the “memory hole” from George Orwell’s 1984; it is a remembering chain.

The first type of cryptography within Bitcoin is used to create a message digest, or informally a “hash.” Bitcoin uses hashing at many different levels (the most central one is a SHA256 hash run twice). The second type is used to create and verify digital signatures, using pairs of signing keys and verification keys (ECDSA over the secp256k1 curve for signatures).

The keys to the kingdom

Despite intuitive appearances to users, bitcoin wallets do not contain any bitcoin! They only contain pairs of keys and addresses that enable digital signatures and verifications. Wallet software searches the blockchain for references to the addresses it contains and uses all the related transaction history there to arrive at a live balance to show the user. Some of the seemingly magical things that one can do with bitcoin, such as store access to the same units in different places, result from the fact that the user only deals with keys while the actual bitcoin “exists,” so to speak, only in the context of the blockchain record, not in wallets. It is only multiple copies of the keys that can be stored in different places at the same time. Still, the effective possession of the coins, that is, the ability to make use of them, stays with whoever has the corresponding signing keys.
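The idea that a wallet derives a balance from the blockchain record rather than “containing” coins can be sketched as a scan over transaction history. This is a deliberately simplified model: the records below are hypothetical and flattened into a single list, whereas a real wallet tracks unspent transaction outputs found by scanning the blockchain. Amounts are in satoshis, the integer unit Bitcoin uses internally (100,000,000 satoshis = 1 BTC).

```python
# Hypothetical flattened transaction records, denominated in satoshis.
# A real wallet instead tracks unspent transaction outputs (UTXOs)
# discovered by scanning the blockchain for its addresses.
transactions = [
    {"to": "addr1", "amount": 50_000},
    {"to": "addr2", "amount": 125_000},
    {"from": "addr1", "amount": 20_000},
]

def balance(address: str, txs: list[dict]) -> int:
    """Sum what an address received minus what it spent."""
    received = sum(t["amount"] for t in txs if t.get("to") == address)
    spent = sum(t["amount"] for t in txs if t.get("from") == address)
    return received - spent

print(balance("addr1", transactions))  # 30000
```

The point of the sketch is that the “balance” is nothing but an arithmetic summary of the shared record; only the keys that authorize spending live in the wallet.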

While software designers are working hard to put complex strings of numbers in the background of user interfaces and replace or supplement them with more intuitive usernames and so forth, our purpose here is precisely to touch on some technical details of how the system works, so here is a real example of a set of bitcoin keys. This is a real signing key (do not use!):

5JWJASjTYCS9N2niU8X9W8DNVVSYdRvYywNsEzhHJozErBqMC3H

From this, a unique verification (public) key is cryptographically generated (compressed version):

03F33DECCF1FCDEE4007A0B8C71F18A8C916974D1BA2D81F1639D95B1314515BFC

This verification key is then hashed into a public address to which bitcoin can be sent. In this case:

12ctspmoULfwmeva9aZCmLFMkEssZ5CM3x

Because this particular signing key has been made public, it has been rendered permanently insecure—sacrificed for the cause of Bitcoin education.

Making a hash of it

Hashing plays a role quite different from digital signatures. It proves that a message has not been altered. Running a hash of the same message always produces the same result. If a hash does not match a previous one, it is a warning that the current version of the message does not match the original.

To illustrate, here is a message from Murray Rothbard. He wrote in Man, Economy, and State that:

“It must be reiterated here that value scales do not exist in a void apart from the concrete choices of action.” —Murray Rothbard, 1962

And here is the SHA256 digest of this message and attribution (the same algorithm that Bitcoin uses):

68ea16d5ddbbd5c9129710e4c816bebe83c8cf7d52647416302d590290ce2ba8

Any message of any size can go into a hash function. The algorithm breaks it down, mixes the parts, and otherwise “digests” it, until it produces a fixed-length result called “a digest,” which for SHA256 takes the above form, but is in each case different in content.

There are some critical properties of a good hash algorithm. First, the same message always produces the same digest. Second, it only works in one direction: nothing about the message that went in can be reconstructed from the digest that came out. Third, even the tiniest change produces a completely different digest, with no relationship between the change in input and the change in output. This is called “the avalanche effect.” Fourth, the chances of two different messages producing the same digest are minuscule. This is called “collision resistance.” In particular, it is computationally infeasible to craft an altered message that produces the same digest as the original unaltered message.

To demonstrate, here is the same quote without the two quotation marks.

It must be reiterated here that value scales do not exist in a void apart from the concrete choices of action. —Murray Rothbard, 1962

Which produces this digest:

0a7a163d989cf1987e1025d859ce797e060f939e2c9505b54b33fe25a9e860ff

Compare it with the previous digest:

68ea16d5ddbbd5c9129710e4c816bebe83c8cf7d52647416302d590290ce2ba8

The tiniest change in the message, removing the two quotation marks, produced a completely different digest that has no relationship whatsoever to the previous digest. In sum, a digest gives a quick yes or no answer to a single question: Is the message still exactly the same as it was before? If the message differs, the digest cannot indicate how or by how much, only that it either has changed at all or has not.
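The comparison above can be reproduced in a few lines of Python. Note that the exact digest values depend on the precise byte encoding of the quote (curly quotation marks, the dash in the attribution), so this sketch demonstrates the avalanche effect itself rather than reproducing the specific digests printed above:

```python
import hashlib

def sha256_hex(message: str) -> str:
    """Return the SHA-256 digest of a UTF-8 encoded message as hex."""
    return hashlib.sha256(message.encode("utf-8")).hexdigest()

with_quotes = ('"It must be reiterated here that value scales do not exist '
               'in a void apart from the concrete choices of action." '
               '-Murray Rothbard, 1962')
without_quotes = with_quotes.replace('"', "")  # remove only the two quotation marks

d1 = sha256_hex(with_quotes)
d2 = sha256_hex(without_quotes)

print(d1)
print(d2)
print("digests differ:", d1 != d2)
```

Running the same input through `sha256_hex` twice always gives the same digest; removing just the quotation marks gives a digest with no visible relationship to the first.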

How could such a seemingly blunt instrument be useful? Bitcoin is one application in which hashing has proven very useful indeed. In Bitcoin, hashing plays the linchpin role of making it infeasible to alter transactions and records once they have been recorded. Once the hashes are hashed together within the blockchain, record forgery anywhere becomes computationally infeasible.

Transactions and how miners compete to discover blocks

Wallet software is used to create transactions. These include the amount to be sent, sending and receiving addresses, and some other information, which is all hashed together. This hash is signed with any required signing keys to create a unique digital signature valid only for this transaction and no other. All of this is broadcast to the network as unencrypted, public information. What makes this possible is that the signature and the verification key do not reveal the signing key.
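The hashing step can be sketched with the double SHA-256 that Bitcoin applies to transaction data. The field values below are made up for illustration; a real transaction is serialized in a specific binary format, not as a text string:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Hash twice with SHA-256, as Bitcoin does for transaction and block hashes."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Hypothetical, simplified transaction fields for illustration only.
tx_data = b"from:12ctspmoULfwmeva9aZCmLFMkEssZ5CM3x;to:1ExampleAddr;amount:0.5"
txid = double_sha256(tx_data)

print(txid.hex())  # a 32-byte identifier tied to this exact transaction data
```

Any change to the amount, addresses, or any other field would change this hash, which is why a signature over the hash commits the signer to the whole transaction.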

To keep someone from trying to spend the same unit twice and commit a kind of fraud called double-spending, nodes check new transactions against the blockchain and against other new transactions to make sure the same units are not being referenced more than once.

Each miner collects valid new transactions and incorporates them into a candidate block in the competition to publish the next recognized block on the chain. Each miner hashes the new transactions together in pairs, building a tree of hashes until a single hash remains. This Merkle root (“mrkl_root”) makes the records of all the transactions in a block interdependent.
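The pairwise hashing can be sketched as a simple Merkle-root computation. This is a simplified sketch: real Bitcoin hashes transaction IDs in a specific byte order, but the core idea, duplicating the last hash when a level has an odd count and reducing pairwise until one hash remains, is shown here:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes: list[bytes]) -> bytes:
    """Reduce a list of transaction hashes to a single root by pairwise hashing."""
    assert tx_hashes, "a block always contains at least the coinbase transaction"
    level = tx_hashes
    while len(level) > 1:
        if len(level) % 2 == 1:          # odd count: duplicate the last hash
            level = level + [level[-1]]
        level = [double_sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical transaction hashes for illustration.
txs = [double_sha256(bytes([i])) for i in range(5)]
root = merkle_root(txs)
print(root.hex())  # changing any single transaction changes this root
```

Because every leaf feeds into the root, altering any one transaction in a block invalidates the root recorded in the block header.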

Each hash for any candidate block differs from every other candidate block, not least because the miner includes his own unique mining address so he can collect the rewards if his candidate block does happen to become recognized as next in the chain.

Whose candidate block becomes the winner?

For the competing miners to recognize a block as the next valid one, the winning miner has to generate a certain hash of his candidate block’s header that meets a stringent condition. All of the other miners can immediately check this answer and recognize it as being correct or not.

However, even though it is a correct solution, it works only for the miner who found it for his own block. No one else can just take another’s correct answer and use it to promote his own candidate block as the real winner instead. This is why the correct answer can be freely published without being misappropriated by others. This unique qualifying hash is called a “proof of work.”

The nature and uses of message digests are counter-intuitive at first, but they are indispensable elements in what makes Bitcoin possible.

An example of a mined block

Here is an example of some key data from an actual block.

“hash”:”0000000000000000163440df04bc24eccb48a9d46c64dce3be979e2e6a35aa13”,

“prev_block”:”00000000000000001b84f85fca41040c558f26f5c225b430eaad05b7cc72668d”,

“mrkl_root”:”83d3359adae0a0e7d211d983ab3805dd05883353a1d84957823389f0cbbba1ad”,

“nonce”:3013750715,

The top line (“hash”) was the actual successful block header hash for this block. It starts with a large number of zeros because a winning hash has to be below the value set in the current difficulty level. The only way to find a winner is to keep trying over and over again.

This process is often described in the popular press as “solving a complex math problem,” but this is somewhat misleading. It is rather an extremely simple and brutally stupid task, one only computers could tolerate. The hash function must simply be run over and over, millions and billions of times, until a qualifying answer happens to be found somewhere on the network. The chances of a given miner finding such a hash for his own candidate block on any given try are minuscule, but somewhere in the network, one is found at a target average of about every 10 minutes. The winner collects the block reward—currently 25 new bitcoins—and any fees for included transactions.
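A toy version of this search is easy to write. The header below is a made-up byte string and the difficulty target is set artificially easy (top 16 bits of the hash must be zero); real Bitcoin applies a vastly harder target to a structured 80-byte block header:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, target: int) -> tuple[int, bytes]:
    """Increment a nonce until the double-SHA-256 of header+nonce is below target."""
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(8, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
        nonce += 1

# Hypothetical header data; very easy target requiring 16 leading zero bits.
header = b"prev:00000000000000001b84...;mrkl:83d3...;"
target = 1 << (256 - 16)

nonce, digest = mine(header, target)
print(nonce, digest.hex())
```

On average this loop needs about 65,536 tries; verifying the answer afterward takes a single hash and a single comparison, which is exactly the asymmetry proof of work relies on.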

How is the reward collected?

The candidate blocks are already set up in advance so that rewards are controlled by the winning miner’s own unique mining address. This is possible because the miner already included this address in his own unique candidate block before it became a winner. The reward address was already incorporated in the block data to begin with. Altering the reward address in any way would invalidate the winning hash and with it that entire candidate block.

In addition, a miner can only spend rewards from blocks that actually become part of the main chain, because only those blocks can be referenced in future transactions. This design fully specifies the initial control of all first appropriations of new bitcoins. Exactly which miner wins each next block is random. To raise the probability of winning, a miner can only contribute a greater share of the current total network hashing capacity, in competition with all of the others trying to do the same.
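The proportionality of expected rewards to hashing share can be illustrated with a toy simulation. The miners and their shares below are hypothetical, and block discovery is modeled simply as a weighted random draw:

```python
import random

random.seed(42)  # fixed seed so the simulation is reproducible

# Hypothetical miners with shares of total network hash rate.
shares = {"miner_a": 0.5, "miner_b": 0.3, "miner_c": 0.2}
blocks = 10_000
wins = {name: 0 for name in shares}

for _ in range(blocks):
    winner = random.choices(list(shares), weights=list(shares.values()))[0]
    wins[winner] += 1

for name, share in shares.items():
    print(f"{name}: share={share:.0%}, won={wins[name] / blocks:.1%} of blocks")
```

Over many blocks each miner's win rate converges toward its share of hashing capacity, which is why pooling rewards merely smooths variance rather than raising expected income.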

As shown above with the Rothbard quote, even the slightest change to a message produces a completely different hash. This is why the protocol includes a field for a number that starts at zero and is incremented by one for each new hash attempt (the “nonce”). This tiny alteration alone, even with the rest of the candidate block data unchanged, generates a completely different hash on each try in the search for a winner. In the example above, it appears this miner found a winning hash for this block at some point after the three-billionth attempt (“nonce”:3013750715). And this was just one miner or mining pool, not counting the similar parallel but unsuccessful attempts of all the other miners, all in the competition for this one block.

The key point to understand is that finding a hash under the difficulty level is extremely competitive and difficult, but verifying afterwards that one has been found is trivial. The rest of the miners do so and move right along. They use the newly discovered hash of the previous block header (“prev_block”) as one of the inputs for their next crop of block candidates (which assures the vertical integrity of the single chain of blocks) and the race continues based on the remaining pool of unconfirmed transactions.
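This chaining of header hashes is what makes retroactive tampering evident. A minimal sketch, greatly simplified (no transactions, difficulty, or timestamps; each block's hash simply covers its payload plus the previous block's hash):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def build_chain(payloads: list[bytes]) -> list[dict]:
    """Link blocks by including each previous block's hash in the next hash."""
    chain, prev_hash = [], "0" * 64
    for payload in payloads:
        block_hash = sha256_hex(prev_hash.encode() + payload)
        chain.append({"prev": prev_hash, "data": payload, "hash": block_hash})
        prev_hash = block_hash
    return chain

def verify(chain: list[dict]) -> bool:
    """Recompute every link; any altered block breaks the chain from there on."""
    prev_hash = "0" * 64
    for block in chain:
        if block["prev"] != prev_hash:
            return False
        if sha256_hex(prev_hash.encode() + block["data"]) != block["hash"]:
            return False
        prev_hash = block["hash"]
    return True

chain = build_chain([b"tx set 1", b"tx set 2", b"tx set 3"])
print(verify(chain))            # True
chain[0]["data"] = b"forged!"   # alter an early block...
print(verify(chain))            # ...and verification now fails
```

Because every header hash depends on the previous one, rewriting an old block would require redoing the proof of work for it and every block after it, faster than the rest of the network extends the chain.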

A powerful, self-financing, verification network

The Bitcoin mining network is, as of late September 2014, running at about 250 petahashes per second and rising at an exponential pace that will soon make this figure look small (rate tracked here). This means that about 250 quadrillion hashes are being tried across the network every second, all the time. This is the world’s most powerful distributed computing network by far, and it has been steadily extending this lead for quite some time.

Block rewards and transaction fees help promote the production and maintenance of this entire network in a decentralized way. Since block generation is random and distributed on average in proportion to hashing power contribution, it helps incentivize all contributors all the time. Many miners participate in cooperative mining pools so that at least some rewards arrive on a fairly regular basis.

The network is designed to be entirely self-financed by participants from the beginning indefinitely into the future. Early on, new coin rewards are larger and transaction-fee revenue smaller. Finally, only transaction-fee revenue is to remain, with a long and gradual transition phase built in.

If Bitcoin does remain successful over the longer term, by the time transaction-fee revenue predominates, there would likely be many orders of magnitude more transactions per block by which to multiply the average competitive fee per transaction.

This has been a summary look at a few of the key technical elements of Bitcoin. Hashing algorithms and digital signatures are especially counter-intuitive and relatively new inventions, but knowing what they make possible is essential for understanding how Bitcoin works. Each of Bitcoin’s major elements contributes to the central functions of verification, unforgeable record-keeping, and fraud prevention. These technical underpinnings and the functions they support are about as far from the systematic deceptions of a fraud such as a Ponzi scheme as it is possible to get.

Adapted and revised from Bitcoin Decrypted Part II: Technical Aspects and cross-posted to actiontheory.liberty.me.