Explore the wonderful quotes under this tag
Items of interest will be located, identified, monitored, and remotely controlled through technologies such as radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters - all connected to the next-generation internet using abundant, low-cost, and high-power computing.
Sep 10, 2025
If you have a procedure with 10 parameters, you probably missed some.
Computer scientists are the historians of computing.
There is no reason for any individual to have a computer in their home.
A distributed system is one in which the failure of a computer you didn't even know existed can render your own computer unusable.
If you stay up late and you have another hour of work to do, you can just stay up another hour later without running into a wall and having to stop. Whereas it might take three or four hours if you start over, you might finish if you just work that extra hour. If you're a morning person, the day always intrudes a fixed amount of time in the future. So it's much less efficient. Which is why I think computer people tend to be night people - because a machine doesn't get sleepy.
The purpose of computing is insight, not numbers.
Computers are good at following instructions, but not at reading your mind.
What's needed now are software technologies that interconnect computing systems, people and data to produce more rapid answers to the questions of science, and to help researchers use computation in the most effective manner.
Such is modern computing: everything simple is made too complicated because it's easy to fiddle with; everything complicated stays complicated because it's hard to fix.
More computing sins are committed in the name of efficiency (without necessarily achieving it) than for any other single reason - including blind stupidity.
I have to admit it: I'm not a huge fan of the cloud computing concept.
I can't think of anything that isn't cloud computing with all of these announcements. The computer industry is the only industry that is more fashion-driven than women's fashion. Maybe I'm an idiot, but I have no idea what anyone is talking about. What is it? It's complete gibberish. It's insane. When is this idiocy going to stop?
Every kid coming out of Harvard, every kid coming out of school now thinks he can be the next Mark Zuckerberg, and with these new technologies like cloud computing, he actually has a shot.
The interesting thing about cloud computing is that we've redefined cloud computing to include everything that we already do.
The cloud services companies of all sizes... the cloud is for everyone. The cloud is a democracy.
Cloud is about how you do computing, not where you do computing.
Because of its vitality, the computing field is always in desperate need of new cliches: Banality soothes our nerves.
Computing is not about computers any more. It is about living.
I would like to emphasize strongly my belief that the era of computing chemists, when hundreds if not thousands of chemists will go to the computing machine instead of the laboratory for increasingly many facets of chemical information, is already at hand. There is only one obstacle, namely that someone must pay for the computing time.
Fairly cheap home computing was what changed my life.
Personal computing today is a rich ecosystem encompassing massive PC-based data centers, notebook and Tablet PCs, handheld devices, and smart cell phones. It has expanded from the desktop and the data center to wherever people need it - at their desks, in a meeting, on the road or even in the air.
Customers get vested in certain paradigms of computing, and those large vendors will try to keep those customers in those paradigms of computing for as long as possible. That's where you basically get the term cash cow.
The Eee Pad Transformer Prime is a category-defining product. Powered by Tegra 3, it launches us into a new era of mobile computing, in which quad-core performance and super energy-efficiency provide capabilities never available before. With Transformer Prime, ASUS has once again led the industry into the next generation.
I don't need a hard disk in my computer if I can get to the server faster... carrying around these non-connected computers is byzantine by comparison.
Cloud computing is actually a spectrum of things complementing one another and building on a foundation of sharing. Inherent dualities in the cloud computing phenomenon are spawning divergent strategies for cloud computing success. The public cloud, hybrid clouds, and private clouds now dot the landscape of IT based solutions. Because of that, the basic issues have moved from 'what is cloud' to 'how will cloud projects evolve'.
Computer games tend to be boys' games, warlike games with more violence. We have not spent enough time thinking through how to encourage more girls to be involved in computing before coming to college so they can see a possible career in information technology.
To create a new standard it takes something that's not just a little bit different. It takes something that's really new and really captures people's imagination. And the Macintosh, of all the machines I've ever seen, is the only one that meets that standard.
The power efficiency of computing has improved by a factor of a billion from the ENIAC computer of the 1950s to today's handheld devices. Fundamental physics indicates that it should be possible to compute even another billion times more efficiently. That would put the power of all of today's present computers in the palm of your hand. That says to me that the age of computing really hasn't even begun yet.
The Internet is becoming the town square for the global village of tomorrow.
Cloud computing seems to be following this evolutionary path: A - Internet backbone. B - Information Superhighway. C - The Net. D - The Web. E - The Cloud. F - "Ubiquity" G- ???
I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun.
The increasing presence of cloud computing and mobile smart phones is driving the digitization of everything across both consumer and enterprise domains. It is hard to imagine any area of human activity which is not being reengineered under this influence, either at present or in the very near future.
In 2000, when my partner Ben Horowitz was CEO of the first cloud computing company, Loudcloud, the cost of a customer running a basic Internet application was approximately $150,000 a month.
At every juncture, advanced tools have been the key to a new wave of applications, and each wave of applications has been key to driving computing to the next level.
The most important application of quantum computing in the future is likely to be a computer simulation of quantum systems, because that's an application where we know for sure that quantum systems in general cannot be efficiently simulated on a classical computer.
Steve Wozniak and Steve Jobs founded Apple Inc., which set the computing world on its ear with the Macintosh in 1984.
Meaning and value depend on human mind space and the commitment of time and energy by very smart people to a creative enterprise. And the time, energy, and brain power of smart, creative people are not abundant. These are the things that are scarce, and in some sense they become scarcer as the demand for these talents increases in proportion to the amount of abundant computing power available.
If everybody would agree that their current reality is A reality, and that what we essentially share is our capacity for constructing a reality, then perhaps we could all agree on a meta-agreement for computing a reality that would mean survival and dignity for everyone on the planet, rather than each group being sold on a particular way of doing things.
The best programs are written so that computing machines can perform them quickly and so that human beings can understand them clearly. A programmer is ideally an essayist who works with traditional aesthetic and literary forms as well as mathematical concepts, to communicate the way that an algorithm works and to convince a reader that the results will be correct.
Beauty is more important in computing than anywhere else in technology because software is so complicated. Beauty is the ultimate defense against complexity. ... The geniuses of the computer field, on the other hand, are the people with the keenest aesthetic senses, the ones who are capable of creating beauty. Beauty is decisive at every level: the most important interfaces, the most important programming languages, the winning algorithms are the beautiful ones.
Our civilization is experiencing unprecedented changes across many realms, largely due to the rapid advancement of information technology. The ability to code and understand the power of computing is crucial to success in today's hyper-connected world.
Computer literacy is a contact with the activity of computing deep enough to make the computational equivalent of reading and writing fluent and enjoyable. As in all the arts, a romance with the material must be well under way. If we value the lifelong learning of arts and letters as a springboard for personal and societal growth, should any less effort be spent to make computing a part of our lives?
[The] dynamics of computational artifacts extend beyond the interface narrowly defined, to relations of people with each other and to the place of computing in their ongoing activities. System design, it follows, must include not only the design of innovative technologies, but their artful integration with the rest of the social and material world.
What do you think will be the biggest problem in computing in the 90's? There are only 17,000 three-letter acronyms.
Access to supercomputers. The science is well ahead of our ability to implement it. It's quite clear that if we could run our models at a higher resolution we could do a much better job, tomorrow, in terms of our seasonal and decadal predictions. It's so frustrating. We keep saying we need four times the computing power. We're talking just 10 or 20 million a year, dollars or pounds, which is tiny compared to the damage done by disasters. Yet it's a difficult argument to win.
Right up till the 1980s, SF envisioned giant mainframe computers that ran everything remotely, that ingested huge amounts of information and regurgitated it in startling ways, and that behaved (or were programmed to behave) very much like human beings... Now we have 14-year-olds with more computing power on their desktops than existed in the entire world in 1960. But computers in fiction are still behaving in much the same way as they did in the Sixties. That's because in fiction [artificial intelligence] has to follow the laws of dramatic logic, just like human characters.