
Monday, September 30, 2019

Study champions inland fisheries as rural nutrition hero


A synthesis of new data and assessment methods is showing that freshwater fish are an invisible superhero in the global challenge to feed poor rural populations in many areas of the world.

But there's a problem: Invisibility is the wrong superpower.

 

Researchers from Michigan State University (MSU) and the Food and Agriculture Organization (FAO) of the United Nations have pulled together the most recent data and innovative approaches to measuring and communicating the impact inland fisheries have on food security, sustainability and economies. Fish harvests from the world's rivers, streams, floodplains, wetlands, lakes, inland seas, canals, reservoirs and even rice fields can seem like forgotten, poor relatives of the better documented global fisheries in the oceans.

 

Yet in "A fresh look at inland fisheries and their role in food security and livelihoods," in the latest edition of Fish and Fisheries, scientists Abigail Bennett and Simon Funge-Smith (FAO), both members of MSU's Center for Systems Integration and Sustainability (CSIS), point out the power the world's freshwater fisheries hold. New assessments -- for example, modeling fish catch from household food-consumption data, basin-level estimates and extrapolation from data-rich case studies -- all reinforce the conclusion that the inland fisheries catch is greater than current estimates suggest, perhaps by 21 to 51%.

 

Now enough data has been compiled to generate a global economic valuation of inland fisheries. The economic value of reported global inland catch (in terms of first-sale value) is estimated at $24 billion. That's approximately 24% of estimated first-sale value for marine fisheries even though total inland catch is only about 13% of marine catch. These new values suggest that the first landing prices of inland fish are higher than the average for marine fish, indicating their importance in local rural economies.
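The price difference implied by those two percentages can be checked with simple arithmetic (a back-of-the-envelope sketch using the article's rounded figures, not a calculation from the paper itself):

```python
# Sanity check: if inland catch is only ~13% of marine catch by weight,
# but earns ~24% of marine first-sale value, inland fish must fetch
# more per tonne than the marine average.
inland_value_share = 0.24  # inland first-sale value as a share of marine value
inland_catch_share = 0.13  # inland catch as a share of marine catch

# Relative price per tonne (inland / marine); the absolute totals cancel out.
price_ratio = inland_value_share / inland_catch_share
print(f"Inland fish sell for roughly {price_ratio:.1f}x the average marine price")
```

The ratio comes out to roughly 1.8, which is what the authors mean when they say first-sale prices of inland fish run higher than the marine average.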

 

"Inland fisheries are providing crucial micronutrients and animal-source protein to sustain some 159 million people -- sometimes the only protein accessible and affordable to them," said Bennett, an assistant professor of fisheries at CSIS. "At the same time, we're worried these food production systems are threatened by climate change, agricultural runoff, habitat alteration from hydropower development and other competing uses of freshwater. We need to be looking more closely at the efficiency of inland fisheries with respect to land, water and energy use."

 

The paper explores the scope of freshwater fisheries and the difficulty of accurately representing their impact. What is clear is that those who depend on these operations -- particularly women, who are underrepresented both in fishing and in post-harvest work such as fish trade and processing -- face risks both in access to work and in health and safety.
 

 

The paper outlines the known and unknown factors of inland fisheries, and the need to put numbers to what are currently only compelling anecdotes. It also points out the tantalizing aspects of this sector of fishing: freshwater fish is one of the only sources of animal food accessible to many of the poor, and it can be harvested productively at a small scale, often without motor-powered boats and with only low-tech processing. Ninety-five percent of inland fisheries catch comes from developing countries, and 43 percent comes from low-income, food-insecure countries.

 

To manage and sustain those benefits, the authors note, it is critical to keep improving methods for understanding how much inland fish is being caught and how catches are changing over time.

 

"Even though inland fisheries catch equals 12% of global fish production (not accounting for underreporting), in particular places these catches are absolutely crucial to survival," Funge-Smith said. "Improving assessment methods is key to ending the vicious cycle in which data gaps lead to inland fisheries being undervalued in policy discussions, and lack of policy support undermines important data tracking systems."
 

 

Story Source:

 

Materials provided by Michigan State University. Note: Content may be edited for style and length.

Thursday, August 29, 2019

New, healthier 'butter' spread almost entirely water


Cornell University food scientists have created a new low-calorie 'butter' spread that consists mostly of water. A tablespoon of this low-calorie spread has 2.8 grams of fat and 25.2 calories. Butter, on the other hand, which is 84% fat and about 16% water, has about 11 grams of fat and nearly 100 calories.


They figured out a new process to emulsify a large amount of water with minuscule drops of vegetable oil and milk fat to mimic butter, at approximately one-fourth the calories of real butter and without artificial stabilizers.
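The calorie figures in the opening paragraph are consistent with the standard 9-kcal-per-gram value for fat (a back-of-the-envelope check, not a calculation from the study; the tablespoon mass is an assumed approximation):

```python
KCAL_PER_G_FAT = 9.0  # standard Atwater factor for fat
TBSP_GRAMS = 14.2     # approximate mass of one tablespoon (assumption)

# HIPE spread: ~20% oil by weight; the water contributes no calories.
spread_fat = 0.20 * TBSP_GRAMS             # about 2.8 g fat
spread_kcal = spread_fat * KCAL_PER_G_FAT  # about 25.6 kcal

# Butter: ~84% fat by weight.
butter_fat = 0.84 * TBSP_GRAMS             # about 11.9 g fat
butter_kcal = butter_fat * KCAL_PER_G_FAT  # about 107 kcal

# The ratio reduces to 0.20 / 0.84, i.e. roughly one-fourth the calories.
print(f"spread / butter calories: {spread_kcal / butter_kcal:.2f}")
```

Note the ratio depends only on the fat fractions, which is why the "one-fourth the calories" claim holds regardless of serving size.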

"Imagine 80% water in 20% oil and we create something with the consistency of butter, with the mouth feel of butter and creaminess of butter," said food science professor and senior author Alireza Abbaspourrad.

Emulsifying water and oil is nothing new, said Abbaspourrad, but by using high-internal phase emulsions (HIPE), "we keep adding water to that oil until the final composition is 80% water and 20% oil."

The demand for low-fat, high-protein products has rapidly increased due to consumers' growing health awareness, said lead author Michelle C. Lee, a doctoral candidate in Abbaspourrad's research group.


"Since the HIPE technology features high water-to-oil ratios -- while simultaneously delivering unique texture and functionality -- it can play a role in providing healthier solutions for consumers," Lee said.

Abbaspourrad said food chemists can adjust for taste, preferences and health.

"We can add milk protein or plant-based protein, and since the water acts like a carrier, we can adjust for nutrition and load it with vitamins or add flavors," he said. "Essentially, we can create something that makes it feel like butter -- and instead of seeing a lot of saturated fat, this has minute amounts. It's a completely different formulation."

Story Source:

Materials provided by Cornell University. Note: Content may be edited for style and length.

Friday, February 8, 2019

Fatty Diets Were Good In The Past



Long before human ancestors began hunting large mammals for meat, a fatty diet provided them with the nutrition to develop bigger brains, posits a new paper in Current Anthropology.
The paper argues that our early ancestors acquired a taste for fat by eating marrow scavenged from the skeletal remains of large animals that had been killed and eaten by other predators. The argument challenges the widely held view among anthropologists that eating meat was the critical factor in setting the stage for the evolution of humans.
"Our ancestors likely began acquiring a taste for fat 4 million years ago, which explains why we crave it today," says Jessica Thompson, the paper's lead author and an anthropologist at Yale University. "The reservoirs of fat in the long bones of carcasses were a huge calorie package on a calorie-poor landscape. That could have been what gave an ancestral population the advantage it needed to set off the chain of human evolution."
Thompson, who recently joined Yale's faculty, completed the paper while on the faculty at Emory University.
While focusing on fat over meat may seem like a subtle distinction, the difference is significant, Thompson says. The nutrients of meat and fat are different, as are the technologies required to access them. Meat eating is traditionally paired with the manufacture of sharp, flaked-stone tools, while obtaining fat-rich marrow only required smashing bones with a rock, Thompson notes.
The authors review evidence that a craving for marrow could have fueled not just a growing brain size, but the quest to go beyond smashing bones with rocks to make more sophisticated tools and to hunt large animals.

"That's how all technology originated -- taking one thing and using it to alter something else," Thompson says. "That's the origin of the iPhone right there."
Co-authors of the paper include anthropologists Susana Carvalho of Oxford University, Curtis Marean of Arizona State University, and Zeresenay Alemseged of the University of Chicago.
The human brain consumes 20% of the body's energy at rest, or twice that of the brains of other primates, which are almost exclusively vegetarian. It's a mystery to scientists how our human ancestors met the calorie demands to develop and sustain our larger brains.
A meat-centered paradigm for human evolution hypothesizes that an ape population began more actively hunting and eating small game, which became an evolutionary stepping stone to the human behavior of hunting large animals.
The paper argues that this theory does not make nutritional sense. "The meat of wild animals is lean," Thompson says. "It actually takes more work to metabolize lean protein than you get back."
In fact, eating lean meat without a good source of fat can lead to protein poisoning and acute malnutrition. Early Arctic explorers, who attempted to survive on rabbit meat exclusively, described the condition as "rabbit starvation."
This protein problem, coupled with the energy required for an upright ape with small canines to capture and eat small animals, would seem to rule out eating meat as a pathway to fueling brain growth, Thompson says.
The new paper presents a new hypothesis, going back about 4 million years, to the Pliocene. As the human ancestor began walking primarily on two legs, heavily forested regions of Africa were breaking into mosaics, creating open grasslands.
"Our human ancestors were likely awkward creatures," Thompson says. "They weren't good in trees, like chimpanzees are, but they weren't necessarily all that good on the ground either. So, what did the first upright walking apes in our lineage do to make them so successful? At this stage, there was already a small increase in the size of the brains. How were they feeding that?"
Thompson and her co-authors propose that our early ancestors wielded rocks as they foraged on open grassland. After a predator had finished eating a large mammal, these upright apes explored the leftovers by smashing them and discovered the marrow hidden in the limb bones.
"The bones sealed up the marrow like a Tupperware container, preventing bacterial growth," Thompson says. And the only things that could crack open these containers, she adds, were the bone-cracking jaws of hyenas or a clever ape wielding a rock.
The hypothesis offers an explanation for how the human ancestor may have garnered the extra calories needed to foster a larger brain, long before there is evidence for controlled fire, which could have mitigated the problem of bacteria in rotting, scavenged meat. The fat hypothesis also predates by more than 1 million years most evidence for even basic toolmaking of simple stone flakes.
Scientists ought to begin looking for evidence of bone-smashing behavior in early human ancestors, Thompson said.

"Paleoanthropologists are looking for mostly complete bones, and then concentrating on identifying the animal that died," Thompson says. "But instead of just wondering about the bone's creature of origin, we should be asking, 'What broke this bone?' We need to start collecting tiny pieces of shattered bone to help piece together this kind of behavioral information."

Story Source:
Materials provided by Yale University. Note: Content may be edited for style and length.

Wednesday, January 23, 2019

You've Been Storing Peanut Butter Wrong All Along





Ever since I was physically able to, I have eaten peanut butter. I've eaten it on bread. I've eaten it in a creamy blend of fluff heaven. Hell, I've eaten it straight out of the jar with a spoon to my face for hours on end. We all have. And though I had until today considered myself to be a peanut butter eating expert, I have just received the most useful information that my peanut-devouring self could have ever asked for.
PureWow alerted us to the best way to store your peanut butter jar: upside down. Gone are the days of me (or you) standing there, hammering away with a spoon, churning and twisting your innocent jar of PB into a vortex of oil-induced slime. Gone are the wasted scoops of PB thrown away because they were infected by the slippery aftereffects of puddled oil on the top of the jar.
If you're as deeply invested in the life of peanut butter as I am, you know that when you have a jar of natural PB, it develops a little pool of oil on the top. Which is when the churning and twisting (in a very dramatic way) is necessary, in order to happily consume oil-free PB. But, when you store the good stuff upside down, the oil will evenly distribute throughout the entire jar, instead of in a stagnant liquid bath on top.


Another way to ensure that oil stops pooling at the top? Refrigerate the stuff upside down. Not only will you not have to deal with a liquid-y, nut butter-y mess when you fervently open your jar, but you'll also have a creamier peanut butter that'll hold its own against whatever it is you're pairing it with. 
Pro tips right here.



Tuesday, August 28, 2018

What’s the Difference Between Table Salt and Sea Salt?


Sea salt and other specialty salts (like Himalayan salt) are being touted as healthier than good old table salt. Many food companies have started adding sea salt in lieu of table salt, and many folks have switched over to sea salt for everyday use. However, the switch from table salt does come with consequences -- namely, iodine deficiency, which has started to reemerge in the United States. Here's a look at the differences between the salts, and how you can make the best decision for you.

Iodized Table Salt:
Iodine is an essential mineral that must be obtained through food, but not a lot of foods outside of sea vegetables (like nori, wakame and kombu kelp) and saltwater fish contain it. Dairy products also contain iodine, partly due to iodine supplements in cattle feed, though amounts can vary. You can also find the mineral in produce, though the amounts vary depending on the iodine content of the soil, fertilizer use and irrigation practices.
When the body lacks iodine, the thyroid does not produce enough hormones to help it grow and develop. Iodine deficiency can also result in a goiter, or enlarged thyroid gland. During pregnancy and early infancy, iodine deficiency can cause irreversible effects. So, in the 1920s, when iodine deficiencies were rampant in the U.S., many food manufacturers in the U.S. began iodizing table salt. As a result, about 90 percent of homes in this country use iodized salt.


Recent data, however, indicate that more Americans have low iodine levels. A 2015 commentary published by the American Association of Clinical Endocrinologists (AACE) discusses the 50 percent decrease in iodine levels since the 1970s and the reemergence of mild iodine deficiency. The same 2015 AACE publication included clinical case reports of four New Jersey women who were diagnosed with goiter likely related to iodine deficiency.

According to the 2015-2020 Dietary Guidelines for Americans, almost all Americans consume too much salt. So why is there an increase in iodine deficiency in the U.S.? There are several reasons. First, the fortification of iodine in salt is voluntary. As such, manufacturers of most sea salt, kosher salt and other types of salt do not iodize their products. In addition, salt consumption from the shaker has declined, and much of the salt consumed comes from commercially processed foods, which almost always contain non-iodized salt. Further, cow's milk consumption has been declining. Although the general health message is to decrease salt consumption, the type of salt is not specified. In order to prevent iodine deficiency, using iodized salt is a must.
Here's how much iodine you should get each day:
Children 1 to 8 years old: 90 micrograms
Children 9 to 13 years old: 120 micrograms
Children 14 years old and older: 150 micrograms
Pregnant and lactating women: 220 micrograms and 290 micrograms, respectively
Consuming a half-teaspoon of iodized salt provides 150 micrograms and can meet the needs of the general population. As such, iodized salt should be the primary salt used in your kitchen, in moderation.
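The recommendations above boil down to a simple lookup; a minimal sketch (the group labels are illustrative names, and the half-teaspoon figure is the article's approximation):

```python
# Daily iodine recommendations in micrograms, as listed above.
IODINE_RDA_UG = {
    "children 1-8": 90,
    "children 9-13": 120,
    "14 and older": 150,
    "pregnant": 220,
    "lactating": 290,
}

# A half-teaspoon of iodized salt provides roughly 150 micrograms (per the article).
IODIZED_SALT_UG_PER_HALF_TSP = 150

def half_tsp_meets_rda(group: str) -> bool:
    """True if a half-teaspoon of iodized salt covers the group's daily need."""
    return IODIZED_SALT_UG_PER_HALF_TSP >= IODINE_RDA_UG[group]

print(half_tsp_meets_rda("14 and older"))  # True
print(half_tsp_meets_rda("pregnant"))      # False
```

The lookup makes the caveat visible: a half-teaspoon covers the general population but falls short of the higher targets for pregnancy and lactation.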

Sea salt:
This option includes any salt that's been harvested from the sea, as opposed to the earth. Various regions have different harvesting methods, but making sea salt generally entails the evaporation of seawater. The different methods result in variations in mineral content, color, flavor and texture. Most sea salts have large, irregularly shaped crystals, tend to be more expensive and are used as finishing salts. Sea salt is commonly thought of as a healthier alternative to common table salt; however, the sodium content is comparable. One advantage of sea salt is that you can use less because it takes up more volume (teaspoon for teaspoon). If you're a sea salt lover, look for brands with iodine like Morton and Hain.

Kosher salt:
This type of salt has also gained popularity because of its coarse flake-like crystals that have a subtle flavor, are easy to pinch and quickly dissolve. Kosher salt is derived from either the sea or the earth. Similar to table salt, it is made of sodium chloride, but usually without any additives. It's versatile in the kitchen and is perfect for cooking, brining, topping popcorn and rimming margarita glasses. Kosher salt is not usually iodized.

Himalayan pink salt:
This pretty option is a coarse rock salt mined in the foothills of the Himalayas. It is found in various shades of pink due to its iron, calcium, potassium and magnesium content. Typically it is not iodized.



By Toby Amidor