Best invention during your lifetime?

icerfan

Nikkala made me do it!
FreeOnes :nanner:

(I can't believe you guys left that one for me!) :rofl:
 
I agree to a certain extent. When I see 10-year-olds with cellphones, I see a problem. When I see drivers paying more attention to their conversation than to driving, I see a major problem.

I agree 100%. Kids have no business having cell phones just to be cool and call their friends whenever they feel like it. Here in NY they passed a law requiring hands-free sets for cell phones. Everyone obeyed it for a bit, but the same old assholes are still in the car gabbing away, paying no mind to the road. Pisses me off.
 
Tough call .. the MRI scanner is pretty amazing, but then so are the wind-up radio, Skype (internet telephony), MIDI (I'm a musician), autopilot, the Ziploc bag, digital recording (for fixing stuff up .. not so much for the original recording) .. the PC, the internet (by a Brit!!)
 

om3ga

It's good to be the king...
PC / Internet
 
Tough call .. the MRI scanner is pretty amazing, but then so are the wind-up radio, Skype (internet telephony), MIDI (I'm a musician), autopilot, the Ziploc bag, digital recording (for fixing stuff up .. not so much for the original recording) .. the PC, the internet (by a Brit!!)

MRIs? :1orglaugh An MRI couldn't see what was wrong with my foot. They said my limp was fake because the MRI showed nothing. Then I went to a specialist; he spotted what was wrong in a minute. MRIs are pointless.
 
Three, from just prior to my birthdate, that are the parents of all commodity technology

All modern information technology has 3 parents from roughly the same time period -- 1966 to 1970.

1) Packet switched networks

In 1966, AT&T -- the US telephone regime -- would not even consider moving away from circuit switching, because it considered packet switching a threat to their industry. The US Department of Defense (DoD) wanted a communication system of multiple nodes and routes between them that could fail over in the case of nuclear attack, so it went the academia route with its [Defense] Advanced Research Projects Agency ([D]ARPA).

The first four (4) nodes of the ARPAnet would follow within half a decade.

2) The C Programming Language and UNIX Operating System

In 1969, computer "time-share" processing systems were proprietary, tied to proprietary hardware, written in machine code (or a pseudo-language, assembly), and software was not directly portable between systems. Kernighan and Ritchie at Bell Laboratories of -- ironically enough -- the AT&T monopoly came up with the idea of a general-purpose, 3rd-generation programming language that would be portable across all platforms -- hardware and OS. Almost simultaneously, another group at Bell Labs started to explore developing a new, more commodity implementation of a time-share system.

Within five (5) years, UNIX became the first operating system written and self-hosted almost entirely in a 3rd-generation language, the C programming language.

Furthermore, because the US considered AT&T a monopoly in the telephony space, it would not allow them to enter the computing space. So AT&T's new UNIX System Laboratories (USL) freely redistributed the C programming language, and the UNIX operating system written in it, throughout the academic community, especially over #1 (the ARPAnet) and one of its primary nodes, the University of California at Berkeley (UCB).

Much of the code of the Berkeley Software Distribution (BSD) of UNIX is at the heart of every operating system today (especially after the 1993 settlement of the lawsuit between USL and UCB, the primary codebase being known as 4.4BSD-Lite). This includes major, core components of NT/Windows (and even some direct UNIX code rips, some licensed from AT&T, in DOS/Windows before that), which is not even a UNIX-like OS but makes heavy use of C/UNIX code and API approaches (especially the IEEE Portable Operating System Interface, POSIX, API of C). Apple's Darwin platform -- at the heart of MacOS X -- is a direct descendant of BSD. The GNU project (the "clean room UNIX replacement" started in 1984, well before the USL v. UCB lawsuit), which is the foundation of Linux (a UNIX-like, but not directly UNIX-descended, OS), used a lot of 4.4BSD-Lite code early on.

3) Microprocessing

The microprocessor, and the resulting very large scale integration (VLSI), began with the Intel 4004, released in 1971. Since then we've integrated more and more logic into processors, down toward the atomic level. Now we're finding new ways to push the envelope further than anyone thought possible.

The first mass-produced, commodity 8- and 16-bit microprocessors would follow within half a decade, and modern 32/64-bit computing within fifteen (15) years.

Honorable mentions would include ...

1964: The 3-button mouse and 5-key chorded keyset, the latter never taking hold but the former being ubiquitous today (yes, the original mouse had 3 buttons!)
1972: The Xerox PARC "W" environment (including mouse support), which Apple would model 10 years later into its Lisa and eventual MacOS environment
1984: MIT's "X Window System" environment, still the only major, true, distributed, network-capable graphical environment (including 3D, via OpenGL over X11, GLX, less than a decade later). Microsoft still doesn't offer an equivalent, which is a primary reason Linux has gained popularity over Windows, even as a desktop, in many organizations.
 
MRIs ...

MRIs? :1orglaugh An MRI couldn't see what was wrong with my foot. They said my limp was fake because the MRI showed nothing. Then I went to a specialist; he spotted what was wrong in a minute. MRIs are pointless.
MRIs typically require you to hold still, and won't show changes during motion (depending on the scanner). So they won't always show what is wrong when the problem only appears in motion. MRI is just another tool. It's not the only tool, but it's probably the greatest move forward in medical scanning technology, ever.

Chiropractors have found a rebirth and newfound respect in the last few decades, despite advances in technology. Why? Because technology -- especially static scanning equipment -- has proven it cannot explain things that a well-trained doctor can, especially a doctor who studies the motion of the human body, which you don't have to open up or scan to understand.
 

bigbadbrody

Banned
Internet, MP3 players, cheese in a spray can, and solar-powered flashlights
 
ARPAnet had no military use, and MILnet was extremely limited ...

Well, to be technical, the internet has existed since the early 1960s, though more for use by the military and such.
That is a common myth!

ARPAnet was funded by the military because the telcos wouldn't research packet switching; they saw it as a threat to their monopoly. In fact, when some executives from AT&T saw it in the early '70s and saw that it had serious reliability problems, they breathed a sigh of relief. The NSF later took over the majority of the original infrastructure.

BTW, more info on the origins of the Internet (and its complementary, required technologies) is in my post in this thread:
http://board.freeones.com/showthread.php?p=1298804

By the '80s, the telcos were knee-deep in adding the physical hardware for packet switching. They themselves were using it because it was far more effective and cheaper than circuit switching.

The ARPAnet was never used by the military for anything "operational" other than basic, unsecured communication. In 1984 they segmented off the separate MILnet, although it was still routed to/from the ARPAnet, and it was used for little more than a tertiary role (common client/server programs, like SMTP/e-mail). The 1982-84 era was when most of today's core protocols and approaches were established -- from the TCP/IP transport/network protocols to the Domain Name System (DNS).
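For what it's worth, the DNS and TCP/IP plumbing founded in that 1982-84 era is what every modern name lookup still rides on. A minimal sketch using the standard POSIX getaddrinfo() API ("localhost" is chosen so it resolves even without a network connection):

```c
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <netdb.h>
#include <arpa/inet.h>

int main(void)
{
    struct addrinfo hints, *res;
    memset(&hints, 0, sizeof hints);
    hints.ai_family = AF_INET;        /* IPv4           */
    hints.ai_socktype = SOCK_STREAM;  /* TCP            */

    /* Resolve a name to an address -- the modern face of DNS. */
    if (getaddrinfo("localhost", "80", &hints, &res) != 0) {
        fprintf(stderr, "lookup failed\n");
        return 1;
    }

    char buf[INET_ADDRSTRLEN];
    struct sockaddr_in *sa = (struct sockaddr_in *)res->ai_addr;
    inet_ntop(AF_INET, &sa->sin_addr, buf, sizeof buf);
    printf("%s\n", buf);              /* typically 127.0.0.1 */
    freeaddrinfo(res);
    return 0;
}
```

Before DNS, every host kept a flat HOSTS.TXT file mapping names to numbers; the hierarchical, distributed database replaced that as the network outgrew one file.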

Some '70s and '80s maps of the ARPAnet and other routed networks can be seen here:
http://www.cybergeography.org/atlas/historical.html

I supported part of the NSFnet in the late '80s at my local university and its Class B network. I was dealing with porn on UseNet (often to restrict the sheer bandwidth it was eating up, not out of any censorship attitude) almost 2 decades ago (especially once the '90s hit). I still remember the early days of Danni Ashe (mid-'90s), when her stuff used to flood UseNet (one of the reasons we cut off many of our select group feeds).

But in the 80s it expanded into domestic/civilian use.
The ARPAnet was always academic, with great portions of the original taken over by the NSF early on.

It wasn't until Al Gore "invented the Internet" by pushing the law that allowed commercial traffic on the NSFnet/backbone in the mid-'90s that the original ARPAnet became usable by everyone, and not just universities. The reality at that time was that most of the commercial backbones and networks were already overtaking it, so it was more about the NSFnet not being an "island" than about promoting commercial use.
 
Artificial invisibility is still really reduced visibility ...

The invisibility suit! Some may not believe me, but it has been invented. Using small cameras and projectors on the suit, it's still semi-crude functionality, but in the near future people (mainly military and the like) will be wearing skin-tight, full-body invisibility suits.
It's not James Bond type of invisibility, it's still reduced visibility.
The material blends in with the environment, but because it's still a 3D object in space, it's hardly "invisible" from the overwhelming majority of arcs.
But yes, it works a crapload better than traditional camo, as it is dynamic.
 
Digital compression ...

Mp3 Players
That would be digital compression -- namely lossy, perceptual compression in MP3's case.
That required advanced, integrated circuits as well for codecs.
A codec is a limited instance of the greater concept of the digital signal processor (DSP), pioneered commercially by Texas Instruments (TI).
With more and more powerful processors (some with basic DSP functionality), more and more common DSP functionality is being moved into software.
Solar Powered Flashlights
Solar power was invented long ago, and used to power lights almost a half-century earlier.
It's become more commodity more recently because of the advances in battery technology.
Battery technology has moved forward because of portable computers and, even more so, hybrid electric/fossil-fuel vehicle R&D.
 