填空题The millionaire's daughter left her parents' home because she didn't want to lead an ______ and meaningless life. (The millionaire's daughter left home because she did not want to lead an empty and meaningless life.)
填空题Martin's article on present-day economics ______ careful study. (Martin's article on present-day economics deserves careful study.)
填空题The survey found a wide seasonal var______ in attendance and it was difficult, if not impossible, to predict future attendance or profits.
填空题After the war, the German officer ______ the concentration camp was sentenced to death by hanging.
填空题The book is a ______ (humor) account of a young man's travels in Asia.
填空题Train No. 47
填空题In spite of the hard winter, the roses in our garden are still ______. (Although it is already deep winter, the roses in our garden have still not withered.)
填空题When going through the customs, {{U}}even really honest people are often made to feel guilty{{/U}}.
填空题______are produced by constricting or obstructing the vocal tract at some place to divert, impede, or completely shut off the flow of air in the oral cavity.(中山大学2006研)
填空题If there is ______ weather, there are strong winds and heavy rain. (If there is stormy weather, there will be strong winds and heavy rain.)
填空题Computer Languages

A computer must be given instructions in a language that it understands, that is, a particular pattern of binary digital information. On the earliest computers, programming was a difficult, laborious task, because vacuum tube ON/OFF switches had to be set by hand. Teams of programmers often took days to program simple tasks, such as sorting a list of names. Since that time a number of computer languages have been devised, some with particular kinds of functioning in mind and others aimed more at ease of use - the user-friendly approach.

Machine Language

Unfortunately, the computer's own binary-based language, or machine language, is difficult for humans to use. The programmer must input every command and all data in binary form, and a basic operation such as comparing the contents of a register to the data in a memory chip location might look like this: 11001010 00010111 11110101 00101011. Machine language programming is such a tedious, time-consuming task that the time saved in running the program rarely justifies the days or weeks needed to write the program.

Assembly Language

One method programmers devised to shorten and simplify the process is called assembly language programming. By assigning a short (usually three-letter) mnemonic code to each machine language command, assembly language programs could be written and debugged - cleaned of logic and data errors - in a fraction of the time needed by machine language programmers. In assembly language, each mnemonic command and its symbolic operands equals one machine instruction. An assembler program translates the mnemonic opcodes (operation codes) and symbolic operands into binary language and executes the program. Assembly language is a type of low level computer programming language in which each statement corresponds directly to a single machine instruction. Assembly languages are, thus, specific to a given processor.
After writing an assembly language program, the programmer must use an assembler to translate the assembly language into machine code. Assembly language provides precise control of the computer, but assembly language programs written for one type of computer must be rewritten to operate on another type. Assembly language might be used instead of a high level language for any of three major reasons: speed, control, and preference. Programs written in assembly language usually run faster than those generated by a compiler; use of assembly language lets a programmer interact directly with the hardware (processor, memory, display, and input/output ports). Assembly language, however, can be used only with one type of CPU chip or microprocessor. Programmers who expended much time and effort to learn how to program one computer had to learn a new programming style each time they worked on another machine. What was needed was a shorthand method by which one symbolic statement could represent a sequence of many machine language instructions, and a way that would allow the same program to run on several types of machines. These needs led to the development of so-called high level languages.

High Level Languages

High level languages often use English-like words - for example, LIST, PRINT, OPEN, and so on - as commands that might stand for a sequence of tens or hundreds of machine language instructions. The commands are entered from the keyboard or from a program in memory or in a storage device, and they are interpreted by a program that translates them into machine language instructions. Translator programs are of two kinds: interpreters and compilers. With an interpreter, programs that loop back to reexecute part of their instructions reinterpret the same instructions each time they appear, so interpreted programs run much more slowly than machine language programs.
Compilers, by contrast, translate an entire program into machine language prior to execution, so such programs run as rapidly as though they were written directly in machine language. American computer scientist Grace Hopper is credited with implementing the first commercially oriented computer language. After programming an experimental computer at Harvard University[1], she worked on the UNIVAC[2] I and II computers and developed a commercially usable high level programming language called FLOW-MATIC to facilitate computer use in business applications. IBM[3] then developed a language that would simplify work involving complicated mathematical formulas. Begun in 1954 and completed in 1957, FORTRAN (FORmula TRANslator)[4] was the first comprehensive high level programming language that was widely used.

In 1957, the Association for Computing Machinery[5] set out to develop a universal language that would correct some of FORTRAN's perceived faults. A year later, they released ALGOL[6] (ALGOrithmic Language), another scientifically oriented language; widely used in Europe in the 1960s and 1970s, it has since been superseded by newer languages, while FORTRAN continues to be used because of the huge investment in existing programs. COBOL[7] (COmmon Business Oriented Language), a commercial and business programming language, concentrates on data organization and file handling and is widely used today in business.

BASIC[8] (Beginner's All-purpose Symbolic Instruction Code) was developed at Dartmouth College in the early 1960s for use by nonprofessional computer users. The language came into almost universal use with the microcomputer explosion of the 1970s and 1980s. Condemned as slow, inefficient, and inelegant by its detractors, BASIC is nevertheless simple to learn and easy to use. Because many early microcomputers were sold with BASIC built into the hardware (in ROM memory), the language rapidly came into widespread use.
As a very simple example of a BASIC program, consider the addition of the numbers 1 and 2, and the display of the result. This is written as follows (the numerals 10-40 are line numbers):

10 A=1
20 B=2
30 C=A+B
40 PRINT C

Although hundreds of different computer languages and variants exist, several others deserve mention. PASCAL[9], originally designed as a teaching tool, is now one of the most popular microcomputer languages. LOGO was developed to introduce children to computers. C, a language Bell Laboratories designed in the 1970s, is widely used in developing systems programs, such as language translators. LISP[10] and PROLOG are widely used in artificial intelligence.

COBOL

COBOL, in computer science, is an acronym for COmmon Business-Oriented Language, a verbose, English-like programming language developed between 1959 and 1961. Its establishment as a required language by the U.S. Department of Defense, its emphasis on data structures, and its English-like syntax (compared to those of FORTRAN and ALGOL) led to its widespread acceptance and usage, especially in business applications. Programs written in COBOL, which is a compiled language, are split into four divisions: Identification, Environment, Data, and Procedure. The Identification division specifies the name of the program and contains any other documentation the programmer wants to add. The Environment division specifies the computer(s) being used and the files used in the program for input and output. The Data division describes the data used in the program. The Procedure division contains the procedures that dictate the actions of the program.

C and C++

A widely used programming language, C was developed by Dennis Ritchie at Bell Laboratories in 1972; it was so named because its immediate predecessor was the B programming language.
Although C is considered by many to be more a machine independent assembly language than a high level language, its close association with the UNIX[11] operating system, its enormous popularity, and its standardization by the American National Standards Institute (ANSI)[12] have made it perhaps the closest thing to a standard programming language in the microcomputer/workstation marketplace. C is a compiled language that contains a small set of built-in functions that are machine dependent. The rest of the C functions are machine independent and are contained in libraries that can be accessed from C programs. C programs are composed of one or more functions defined by the programmer; thus, C is a structured programming language.

C++, in computer science, is an object-oriented version of the C programming language, developed by Bjarne Stroustrup in the early 1980s at Bell Laboratories and adopted by a number of vendors, including Apple Computer, Sun Microsystems, Borland International, and Microsoft Corporation.

Notes

[1] Harvard University, in the United States.
[2] UNIVAC (Universal Automatic Computer).
[3] IBM (International Business Machines Corp.).
[4] FORTRAN (FORmula TRANslator), a formula-translation programming language.
[5] the Association for Computing Machinery (USA).
[6] ALGOL (ALGOrithmic Language), an algorithm-oriented language.
[7] COBOL (Common Business Oriented Language), a general-purpose business-oriented language.
[8] BASIC (Beginner's All-purpose Symbolic Instruction Code).
[9] PASCAL, glossed in the original notes as "Philips Automatic Sequence Calculator".
[10] LISP (List Processing), a list-processing language.
[11] UNIX (Uniplexed Information and Computer Systems), a multi-user, multi-tasking operating system developed at AT&T Bell Laboratories in 1969.
[12] ANSI (American National Standards Institute).
填空题A recession marked the early years of Reagan's presidency, but conditions started to improve in 1983 and the United States entered one of the longest periods of sustained economic growth since World War II. However, an alarming percentage of this growth was based on deficit spending. In 1988, former vice president George Bush became President. He continued many of Reagan's policies. Bush's efforts to gain control over the federal budget deficit, however, were problematic. The 1990s brought a new president, Bill Clinton, a cautious, moderate Democrat, whose liberal initiatives created a myth for the American economy. 41)__________. Still, although Clinton reduced the size of the federal work force, the government continued to play a crucial role in the nation's economy. Most of the major innovations of the New Deal, and a good many of the Great Society, remained in place. And the Federal Reserve system continued to regulate the overall pace of economic activity, with a watchful eye for any signs of renewed inflation. 42)__________. Technological developments brought a wide range of sophisticated new electronic products. Innovations in telecommunications and computer networking spawned a vast computer hardware and software industry and revolutionized the way many industries operate. 43)__________. No longer are Americans afraid that the Japanese will overwhelm them with superior technology or that they will saddle their children with government debt. America's labor force changed markedly during the 1990s. Continuing a long term trend, the number of farmers declined. A small portion of workers had jobs in industry, while a much greater share worked in the service sector, in jobs ranging from store clerks to financial planners. If steel and shoes were no longer American manufacturing mainstays, computers and the software that make them run were. 44)__________.
Economists, surprised at the combination of rapid growth and continued low inflation, debated whether the United States had a "new economy" capable of sustaining a faster growth rate than seemed possible based on the experiences of the previous 40 years. 45)__________. Asia, which had grown especially rapidly during the 1980s, joined Europe as a major supplier of finished goods and a market for American exports. Sophisticated worldwide telecommunications systems linked the world's financial markets in a way unimaginable even a few years earlier.

A. The economy, meanwhile, turned in an increasingly healthy performance as the 1990s progressed. With the fall of the Soviet Union and Eastern European communism in the late 1980s, trade opportunities expanded greatly.

B. Still, Americans ended the 1990s with a restored sense of confidence. By the end of 1999, the economy had grown continuously since March 1991, the longest peacetime economic expansion in history.

C. Clinton sounded some of the same themes as his predecessors. After unsuccessfully urging Congress to enact an ambitious proposal to expand health-insurance coverage, Clinton declared that the era of "big government" was over in America. He pushed to strengthen market forces in some sectors, working with Congress to open local telephone service to competition. He also joined Republicans to reduce welfare benefits.

D. Finally, the American economy was more closely intertwined with the global economy than it ever had been. Clinton, like his predecessors, had continued to push for elimination of trade barriers. A North American Free Trade Agreement (NAFTA) had further increased economic ties between the United States and its largest trading partners, Canada and Mexico.

E. While many Americans remained convinced that global economic integration benefited all nations, the growing interdependence created some dislocations as well. Workers in high-technology industries at which the United States excelled fared rather well, but competition from many foreign countries that generally had lower labor costs tended to dampen wages in traditional manufacturing industries.

F. The expansion that began in March 1991 has raised real gross domestic product by more than a third and minted 100,000 more people earning a million dollars a year. After peaking at $290,000 million in 1992, the federal budget deficit steadily shrank as economic growth increased tax revenues. In 1998, the government posted its first surplus in 30 years, although a huge debt, mainly in the form of promised future Social Security payments to the baby boomers, remained.

G. Best of all, the healthy economy has transformed the psyche of millions of Americans. The pervasive gloom at the beginning of the 1990s is gone.
填空题Don't make Helen's remarks too seriously. She is so upset that I don't think she really knows what she is saying.
A. make
B. seriously
C. upset
D. what she is saying
填空题Computer Systems Architecture[1]

1. Computer systems - the importance of networking

Since there is not yet a universal definition of Computer Systems Architecture (CSA), interpretations vary. Student confusion increases because commercial terminology can be even more creative! Sometimes CSA appears in the hardware orientation of digital electronics; at other times it takes on the guise of a unified software specification for a family of computers. Rarely is the central importance of network facilities,[2] both to the computer designer and to the end user, sufficiently acknowledged, even though we are all aware of its growing significance in society. Indeed, more and more computer science graduates become involved in the data communications industry, and would therefore benefit from grounding in this field. Thus an aim of this text is to place networks solidly within CSA.[3]

It is clear that computers and networks require both hardware and software in order to work. But the historical academic separation of the two poses a difficult balancing problem when presenting such a course.[4] Both are equally important, each strongly championed by its enthusiastic supporters. The distinction between hardware and software can be likened to the distant relationship between the formal team player, rigidly posed in front of the goalmouth, and the exciting unpredictability of the World Cup final.[5] The static photograph of the players only vaguely hints at the limitless possibilities of the dynamic game.

With the increasing sophistication of computer hardware, perhaps it is unfortunate that the taking-apart and exploration of old computers is no longer encouraged. The unexpectedly rising prices of electronic components, added to the need to have modern equipment to run the latest games, have resulted, for the moment, in a salesperson's dream.
Unexpectedly, this trend, although attracting many more people to use computers, has had an adverse effect on the fundamental level of knowledge among computing undergraduates on entry to university. Although we cannot turn the clock back to the self-build hobbyist days of home computing,[6] knowledge of the interaction of hardware and software is still useful, if not necessary, for anyone wanting to be fully involved in the professional use of computers. Curiosity about the computer systems which surround us, the Internet that frustrates us, and the mobile telephone networks that we increasingly rely on should drive us to investigate and question what is happening in the fields of software and electronics. The facilities that will become available to us in the next few years will depend largely on current developments in microelectronics and software design methodology. It is here that we can look for the future.

Throughout this passage we will treat CSA as a study of the interaction of hardware and software which determines the performance of networked computer systems. We will also try to show that computers can always be viewed as hierarchically ordered systems which can be broken down into simpler component parts in order to fully understand their operation.[7] Unlike other areas of study, such as physics or chemistry, complex ideas can always be split into simpler concepts which may then be understood more easily. This progressive decomposition approach not only is useful when studying computers, but can also be invaluable when designing and building new systems.[8]

2. Hardware and software - mutual dependence

Although it is widely recognized that computer systems involve both hardware and software, it is still rare for college computer courses to require you to have a comparable understanding of both fields. Perhaps the analogy of only eating half a boiled egg is appropriate - you risk missing out on the yolk.
This separation, or specialization, has a number of serious negative results. When teams of developers are separately recruited as hardware engineers or programmers, the danger of an opposing split progressively opening up between the two camps is always present. Professional rivalry can arise through simple misunderstandings due to the different approaches and vocabulary used by hardware and software engineers. Problems, when they inevitably occur, can be blamed on the other camp and then take longer to resolve. Programmers sometimes find that unsuitable equipment has already been specified without consultation, and hardware designers can sit helplessly by as unsuitable software fails to exploit the performance advantages offered by their revolutionary new circuits.

It has been claimed by some business analysts that hardware manufacturing will be of no great commercial consequence. The profit lies in programming: lead the world in the development of systems software! But it is now clear that in such a rapidly changing world, early access to new hardware designs gives the software industry an important marketing lead. The first software products to exploit some new hardware facility have a clear lead in the marketplace. The neglect of the hardware side of the computing industry has never delivered any long-term advantage. Understanding basic principles and appreciating their application by modern technology within a range of current products is a central aim of this text. Programmers neglect developments in hardware at their peril.

The opposite situation, where software is overlooked, can lead to similar failures. Consider the much greater commercial success the PC has enjoyed since it began running the Windows operating system, and the recent explosion in use of the Internet. Many excellent machines became commercial failures because of their sub-standard software.
These well-rehearsed public examples can be added to and confirmed by thousands of private disasters, which all underline the need to pursue hardware and software developments in concert. We now recognize that despite their technical superiority, computer systems can fail to win acceptance for many reasons, such as a poorly thought-out user interface, a lack of applications software, or an inappropriate choice of operating system.[9] Many recent developments have arisen from a simultaneous advance in hardware and software: windowing interfaces are only possible through sophisticated software and powerful graphics cards; network connections are supported by autonomous coprocessors working with complex driver routines; laser printers became universally popular when the xerography print engine was supplemented by the PostScript interpreter.[10] Many such examples demonstrate the value of keeping abreast of developments in both hardware and software.

An increasing difficulty with investigating the interaction of hardware and software is gaining access to the relevant facilities. With large, multi-user mainframe computers it was understandable that the ordinary programmer was denied[11] access to the hardware and critical software to protect other users. However, with the introduction of Windows NT such security constraints were introduced to single-user personal workstations, making it impossible to access the hardware directly. Only the operating system code has this privilege, while ordinary programs are forced to call trusted system routines to read or write to any part of the hardware.

3. Programming your way into hardware

A remarkable empirical law describing the rapid growth of silicon technology was proposed by Gordon Moore, one of the founders of Intel.[12] His well-known rule, Moore's Law, states that the amount of circuitry (number of transistors) which can be placed on a given chip area approximately doubles every two years.
A circuit designed 24 months ago can now be shrunk to fit into an area of half the size. Intel's original 4004 processor involved 2,300 transistors, while the Pentium 4 has somewhere of the order of 42 million. The chip area has not increased by a factor of 2,000! This capability to progressively shrink the size of electronic circuits could reduce the chip cost, because more circuits are processed on a single slice of silicon, but the technical advance has more often been exploited by enhancing the chip's functionality. Surprisingly, this law has held true since the early 1970s and is likely to stand well into the 2020s, before the size of circuit elements becomes so small that quantum physics intervenes through Heisenberg's uncertainty principle.[13] Already the on-chip circuit interconnections are only 0.25 µm long and the insulating layers can be as thin as a couple of dozen molecules. However, Moore's Law remains somewhat of a mystery, given that the underlying variables responsible for the trend are as diverse as the ability to maintain ultra-clean factory environments, reduction of international trade barriers, development of increasingly high-resolution technology and the marketing success of games consoles!

Although the split between those concerned with hardware and those concerned with software is now deeply rooted, there are developments which might reverse this trend. As manufacturing techniques allow components to shrink in size, hardware engineers find it increasingly difficult to wire up breadboard prototypes, because the circuits they are building have grown too complicated.[14] In any case, the performance of the large-sized components which they can handle easily in a traditional way is not identical to that of the miniature equivalents which will make up the final integrated circuit that is produced.
In the past there was a tendency for trained electronic engineers to migrate towards software, to pick up programming skills and to get involved in systems programming. Will this trend now be reversed? Programmers and software engineers, trained to deal with large systems and complex specifications, may take the opportunity of contributing to the hardware design. This is another example of how hardware and software can come together through the tools and skills demanded by systems developers.

Notes

[1] Computer Systems Architecture (CSA): the discipline that studies a computer system from the outside. In general, any attribute of a computer system visible to its users (both ordinary users and system programmers) falls within the scope of CSA.
[2] "Rarely is the central importance of network facilities..." : the central importance of network facilities to computer designers and end users is rarely acknowledged. When the negative adverb "rarely" opens a sentence for emphasis, the sentence is inverted, e.g. "Rarely can he finish his work in time."
[3] "Thus an aim of this text is to place networks solidly within CSA." : the text aims to give networks a firm footing within CSA.
[4] "But the historical academic separation of the two poses a difficult balancing problem when presenting such a course." : because the two subjects have historically been taught separately, offering such a course raises the question of how to balance them.
[5] The distinction between hardware and software is likened to the gap between a player stiffly posed in front of the goalmouth and the unpredictable excitement of the World Cup final, whose outcome is hard to foresee.
[6] "Although we cannot turn the clock back to the self-build hobbyist days of home computing..." : although we cannot return to the era when hobbyists assembled their own home computers.
[7] "We will also try to show that computers can always be viewed as hierarchically ordered systems which can be broken down into simpler component parts in order to fully understand their operation." : a computer is a hierarchically ordered system that can be decomposed into simpler parts so that its operation can be better understood.
[8] "This progressive decomposition approach..." : this step-by-step method of decomposition.
[9] "...such as a poorly thought-out user interface..." : for example, a user interface that has not been carefully thought through.
[10] The PostScript interpreter. PostScript is a page description language developed by Adobe, an interface language through which desktop systems drive output devices, designed specifically for describing text and images. It is the most widely used page description language internationally and a de facto industry standard; its great strength is its integrated handling of text, graphics and images. Almost all prepress output devices support the PS language, and its success helped open electronic publishing systems spread worldwide.
[11] "was denied access to the hardware..." : was refused permission to use the hardware.
[12] Gordon Moore: born in California in 1929, he took doctorates in physics and chemistry at the California Institute of Technology. In the mid-1950s he worked with Robert Noyce, co-inventor of the integrated circuit, at Shockley Semiconductor. Noyce, Moore and six colleagues later resigned together to found the famous Fairchild Semiconductor, the parent of today's Intel and AMD. In 1968 Moore and Noyce left Fairchild and founded Intel. Moore's Law states that the number of transistors that can fit on an IC doubles roughly every 18 months, with performance doubling as well.
[13] Heisenberg's uncertainty principle (also called the indeterminacy principle or the uncertainty relation): certain pairs of physical quantities of a microscopic particle, such as position and momentum, angular position and angular momentum, or time and energy, cannot both take definite values at the same time; the more precisely one is determined, the more uncertain the other becomes. The product of the measurement errors of a conjugate pair is necessarily greater than the constant h/2π (h being Planck's constant). First stated by Heisenberg in 1927, it reflects a fundamental law of the motion of microscopic particles and is an important principle of physics.
[14] "...wire up breadboard prototypes..." : to wire up prototype circuits on a breadboard.
填空题Sandy: How's the young man? Kazi: He's ______.
填空题breaking news
填空题In December of the fifth year of the Chongzhen reign (1632), I was living by West Lake. It snowed heavily for three days, and on the lake every sound of people and birds died away. That evening, after the watch had been set, I took a small boat and, wrapped in a fur robe beside a stove, went out alone to the Mid-Lake Pavilion to watch the snow. Ice-mist hung thick over everything; sky, clouds, hills and water were one unbroken white. The only shadows on the lake were the faint trace of the long causeway, the single dot of the Mid-Lake Pavilion, my boat like a mustard leaf, and the two or three grains of people in it. Reaching the pavilion, I found two men seated facing each other on a felt mat, while a serving boy tended a wine stove just coming to the boil. Seeing me, they cried out in delight: "How could there be such a person on the lake!" They pressed me to drink with them. I forced down three large cups and took my leave. When I asked their names, they said they were from Jinling, staying here as travelers. As I got back into the boat, the boatman murmured, "Don't call the master mad; there are others madder still!"
填空题In their book ______ written in 1923, C. K. Ogden and I. A. Richards presented a "representative list of the main definitions which reputable students of meaning have favoured." There are 16 major categories of them, with sub-categories altogether numbering 22.
