
Memory Management: Algorithms and Implementation in C/C++

Posted on 2009-11-23 22:34:25


Author: Bill Blunden
Publisher: Other
ISBN: 1556223471
File format: CHM

“Pay no attention to the man behind the curtain.”
—The Wizard of Oz
There are a multitude of academic computer science texts that discuss memory management. They typically devote a chapter or less to the subject and then move on. Rarely are concrete, machine-level details provided, and actual source code is even scarcer. When the author is done with his whirlwind tour, the reader tends to have a very limited idea about what is happening behind the curtain. This is no surprise, given that the nature of the discussion is rampantly ambiguous. Imagine trying to appreciate Beethoven by having someone read the sheet music to you, or to experience the Mona Lisa by reading a description in a guidebook.
This book is different. Very different.
In this book, I am going to pull the curtain back and let you see the little man operating the switches and pulleys. You may be excited by what you see, or you may feel sorry that you decided to look. But as Enrico Fermi would agree, knowledge is always better than ignorance.
This book provides an in-depth look at memory subsystems and offers extensive source code examples. In cases where I do not have access to source code (i.e., Windows), I offer advice on how to gather forensic evidence, which will nurture insight. While some books only give readers a peek under the hood, this book will give readers a power drill and allow them to rip out the transmission. The idea behind this is to allow readers to step into the garage and get their hands dirty.
My own experience with memory managers began back in the late 1980s when Borland’s nifty Turbo C 1.0 compiler was released. This was my first taste of the C language. I can remember using a disassembler to reverse engineer library code in an attempt to see how the malloc() and free() standard library functions operated. I don’t know how many school nights I spent staring at an 80×25 monochrome screen, deciphering hex dumps. It was tough going and not horribly rewarding (but I was curious, and I couldn’t help myself). Fortunately, I have done most of the dirty work for you. You will conveniently be able to sidestep all of the hurdles and tedious manual labor that confronted me.
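If you want to repeat this sort of experiment without a disassembler, a minimal C sketch like the following (my own illustration, not code from the book) allocates a block with malloc(), fills it with a known pattern, and prints a hex dump of its contents:

    /* hexpeek.c: allocate a heap block and hex-dump its contents.
       Illustrative sketch only; the allocator's bookkeeping data
       around the block is implementation-specific and not portably
       accessible. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    static void hex_dump(const unsigned char *p, size_t len)
    {
        for (size_t i = 0; i < len; i++)
            printf("%02X%c", p[i], ((i + 1) % 16 == 0) ? '\n' : ' ');
    }

    int main(void)
    {
        unsigned char *buf = malloc(32);   /* request a 32-byte block */
        if (buf == NULL)
            return 1;

        memset(buf, 0xAB, 32);             /* fill with a known pattern */
        printf("block at %p:\n", (void *)buf);
        hex_dump(buf, 32);

        free(buf);                         /* hand the block back to the heap manager */
        return 0;
    }

Running this under a debugger and then inspecting the bytes just below the returned address is one way to start gathering the kind of forensic evidence mentioned earlier, although what you find there depends entirely on which allocator your compiler’s runtime uses.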
If you were like me and enjoyed taking your toys apart when you were a child to see how they worked, then this is the book for you. So lay your computer on a tarpaulin, break out your compilers, and grab an oil rag. We’re going to take apart memory management subsystems and put them back together. Let the dust fly where it may!
Historical Setting
In the late 1930s, a group of scholars arrived at Bletchley Park in an attempt to break the Nazis’ famous Enigma cipher. This group of codebreakers included a number of notable thinkers, like Tommy Flowers and Alan Turing. As a result of the effort to crack Enigma, the first electronic computer was constructed in 1943. It was named Colossus and used thermionic valves (known today as vacuum tubes) for storing data. Other vacuum tube computers followed. For example, ENIAC (Electronic Numerical Integrator and Computer) was built by the U.S. Army in 1945 to compute ballistic firing tables.
Note: Science fiction aficionados might enjoy a movie called Colossus: The Forbin Project. It was made in 1969 and centers around Colossus, a supercomputer designed by a scientist named Charles Forbin. Forbin convinces the military that they should give control of the U.S. nuclear arsenal to Colossus in order to eliminate the potential of human error accidentally starting World War III. The movie is similar in spirit to Stanley Kubrick’s 2001: A Space Odyssey, but without the happy ending: robot is built, robot becomes sentient, robot runs amok. I was told that everyone who has ever worked at Control Data has seen this movie.
The next earth-shaking development arrived in 1949 when ferrite (iron) core memory was invented. Each bit of memory was made of a small, circular iron magnet. The value of a bit was switched from “1” to “0” by using electrical wires to magnetize the circular loops in one of two possible directions. The first computer to utilize ferrite core memory was IBM’s 705, which was put into production in 1955. Back in those days, 8KB of memory was considered a huge piece of real estate.
Everything changed once transistors became the standard way to store bits. The transistor was presented to the world in 1948 when Bell Labs decided to go public with its new device. In 1954, Bell Labs constructed the first transistor-based computer. It was named TRADIC (TRAnsistorized DIgital Computer). TRADIC was much smaller and more efficient than vacuum tube computers. For example, ENIAC required 1,000 square feet and caused power outages in Philadelphia when it was turned on. TRADIC, on the other hand, was roughly three cubic feet in size and ran on 100 watts of electricity.
Note: Before electronic computers became a feasible alternative, heavy mathematical computation relied on human computers. Large groups of people would be assembled to carry out massive numerical algorithms. Each person would do a part of a computation and pass it on to someone else. This accounts for the prevalence of logarithm tables in mathematical references like the one published by the Chemical Rubber Company (CRC). Slide rules and math tables were standard fare before the rise of the digital calculator.
ASIDE
“After 45 minutes or so, we’ll see that the results are obvious.”
—David M. Lee
I have heard Nobel laureates in physics, like Dave Lee, complain that students who rely too heavily on calculators lose their mathematical intuition. To an extent, Dave is correct. Before the dawn of calculators, errors were more common, and developing a feel for numeric techniques was a useful way to help catch errors when they occurred.
During the Los Alamos project, a scientist named Dick Feynman ran a massive human computer. He once mentioned that the performance and accuracy of his group’s computations were often more a function of his ability to motivate people. He would sometimes assemble people into teams and have them compete against each other. Not only was this a good idea from the standpoint of making things more interesting, but it was also an effective technique for catching discrepancies.
In 1958, the first integrated circuit was invented. The inventor was a fellow named Jack Kilby, who was hanging out in the basement of Texas Instruments one summer while everyone else was on vacation. A little over a decade later, in 1969, Intel came out with a 1-kilobit memory chip. After that, things really took off. By 1999, I was working on a Windows NT 4.0 workstation (Service Pack 3) that had 2GB of SDRAM memory.
The general trend you should be able to glean from the previous discussion is that memory components have solved performance requirements by getting smaller, faster, and cheaper. The hardware people have been able to have their cake and eat it too. However, the laws of physics place a limit on how small and how fast we can actually make electronic components. Eventually, nature itself will stand in the way of advancement. Heisenberg’s Uncertainty Principle, shown below, is what prevents us from building infinitely small components.
Δx · Δp ≥ h/(4π)
For those who are math-phobic, I will use Heisenberg’s own words to describe what this equation means:
“The more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa.”
In other words, if you know exactly where a particle is, then you will not be able to contain it, because its momentum will be huge. Think of this like trying to catch a tomato seed. Every time you try to squeeze down and catch it, the seed shoots out of your hands and flies across the dinner table into Uncle Don’s face.
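To put a rough number on this (a back-of-the-envelope figure of mine, not one from the book): suppose we pin a particle down to within one nanometer, i.e., Δx = 10⁻⁹ m. With Planck’s constant h ≈ 6.626×10⁻³⁴ J·s, the inequality forces

    Δp ≥ h/(4π · Δx) = (6.626×10⁻³⁴ J·s)/(4π × 10⁻⁹ m) ≈ 5.3×10⁻²⁶ kg·m/s

For an electron (mass ≈ 9.1×10⁻³¹ kg), that corresponds to a velocity uncertainty on the order of 5.8×10⁴ meters per second, which is why ever-smaller components eventually run into this wall.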
Einstein’s General Theory of Relativity is what keeps us from building infinitely fast components. With the exception of black holes, the speed limit in this universe is 3×10⁸ meters per second. Eventually, these two physical limits are going to creep up on us.
When this happens, the hardware industry will have to either make larger chips (in an effort to fit more transistors in a given area) or use more efficient algorithms so that they can make better use of existing space. My guess is that relying on better algorithms will be the cheaper option. This is particularly true with regard to memory management. Memory manipulation is so frequent and crucial to performance that designing better memory management subsystems will take center stage in the future. This will make the time spent reading this book a good investment.

Posted on 2011-1-22 02:09:55
I was just looking for material on this subject. Thank you!


