Nanotechnology Crossroads

Originally published August 19, 2004

I would highly recommend that you read my first two articles on Nanotechnology. The first article, "Nanowarehousing: Nanotechnology and Data Warehousing," is an in-depth discussion of the form-versus-function debate at the macro level of computing. The second article, "The Nanotechnology Revolution," discusses how, if we reduce these concepts to the micro (nano) level, we arrive at the conclusion that form, function and data must be combined through electrical signals and concentrated structures such as nanotubes and quantum dots.

What does the timeline look like?

In the world of Nanotechnology, I have identified several phases that are emerging in the marketplace. Each phase leads to the next, and of course the timing is relative, shifting with setbacks and advancements in the Nanotech world. The timeline is of particular interest because it helps us visualize the evolution of Nanotechnology and its impact on the business world. One thing is for sure: progress is being made, and it is not reversible.

Figure 1-1: Nanotech Timeline

This diagram depicts the author's view of how the paradigm will shift over time. Phase 1 represents where we are today, Phase 2 could be 1 to 2 years out, Phase 2.5 is an interim phase between 2 and 4 years out, Phase 3 could be 3 to 6 years out, and Phase 4 is at the 5 to 8 year mark. The phases and their content will be discussed shortly.

Phase 1 - Today

Today we are at a crossroads with Nanotechnology and our ability to produce it and apply it to our common daily tasks. The following list summarizes where we are with Nanotech today:

  • Macro atomic devices (15-35 microns across) – for scale, an ant is roughly 8,000 microns (8 mm) long;
  • Experimental chemical elements;
  • High-cost fabrication (average: $0.15 per RFID tag);
  • Difficulty in mass-production of quality elements;
  • No control over repair, detection, shut-off or removal;
  • Ideas and thoughts on the application of Nanotech to information storage and retrieval (Nanohousing); and
  • Lab experiments with “growing” nano-clusters, self-assembly of crystalline structures. 

There have been recent reports of significant advances in self-assembling crystals of nano-particles.  This is truly phenomenal or phenomenally scary (depending on your point of view).  As you will see, I’ve predicted self-assembling molecular structures, outside the laboratory environment, within the next 5 years. 

Today's devices are quite small, and tomorrow's devices will be even smaller. As noted above, an ant is roughly 8,000 microns (8 millimeters) long, while an RFID (radio frequency identification) tag is 15 to 35 microns across; you could line up more than 200 of the 35-micron tags along the length of a single ant. There is also a newer tracking mechanism in use today called RTLS (real-time locating systems). RTLS tags are much larger; they can be reprogrammed and recharged, and are currently in use on shipping crates in the auto industry.
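
As a quick back-of-the-envelope check on these scales (using the corrected, approximate figures above), a few lines of Python make the ratios explicit:

```python
# Back-of-the-envelope scale check (all figures are approximate).
ant_length_um = 8_000   # an ant is roughly 8 mm, i.e. about 8,000 microns, long
rfid_width_um = 35      # upper end of the 15-35 micron RFID range

tags_along_ant = ant_length_um // rfid_width_um
print(f"35-micron RFID tags laid end-to-end along one ant: {tags_along_ant}")  # ~228

# A true nano-scale (molecular) device is measured in nanometers instead:
rfid_width_nm = rfid_width_um * 1_000
print(f"a 35-micron tag is {rfid_width_nm:,} nm across - still thousands of "
      "times larger than a molecular-scale device")
```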

Also in the labs today are experimental chemical elements: atomic structures and compounds being built and assembled, including man-made additions to the periodic table. Some of these elements are difficult to classify, let alone to find uses for. There are elements that look and feel like fiber (cloth) but are stronger than steel; elements thinner than a thread but more elastic than rubber; and, of course, elements that look and feel like gold but are lighter in weight.

Today it is extremely difficult to mass-produce these elements. The current machinery (cooling mechanisms, light sources, etc.) is hidden away in the back rooms of science labs. With the amount of investment pouring into the Nanotech world, however, there will soon be machines with the capacity to mass-produce these elements, and the dynamics will change accordingly.

Of course, today's elements aren't "smart" (except those which are highly classified). The elements of today don't repair themselves, detect situations, shut themselves off, or remove themselves from the item to which they are attached. For the commercial world this has not yet been achieved, but it is drawing nearer every day.

So what does this all mean for Nanohousing™?

In our case, Nanohousing requires smart elements that can self-assemble, or be assembled rapidly (mass-produced), to meet electronic requirements. It also requires elements that are capable of being programmed and of housing payloads. Nanohousing is defined as follows:

Nanohousing: The ability to perform information storage, retrieval, and eventually content awareness on a nano-scale (molecular) device or set of devices, all operating in parallel. 

The concept of Nanohousing is still a growing and shifting idea; new methods and mechanisms for data manipulation at the atomic level are being explored and discovered. The purpose of a Nanohouse is to provide basic information storage that is capable of being inter-linked, altered, destroyed, copied, and moved. The Nanohouse also comprises neural net algorithms that have a basic understanding of the data/information they contain, of its security, and of how valid or recent it is. When these different components of the Nanohouse come together (much like DNA replication), they can "decide" on their own the relevance of connection, information sharing, re-programming, and structure alteration. We will explore the concepts of Nanohousing in detail in a future article.
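
To make the definition more concrete, here is a small, purely illustrative Python sketch of what a single Nanohouse storage cell might look like as a data structure. The names (NanoCell, payload, security_level, last_updated, links) are hypothetical and exist only to show the kinds of storage, linkage and metadata operations described above:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class NanoCell:
    """Illustrative (hypothetical) model of one Nanohouse storage cell."""
    payload: bytes                                   # the stored information
    security_level: int = 0                          # how protected the data is
    last_updated: datetime = field(default_factory=datetime.utcnow)
    links: List["NanoCell"] = field(default_factory=list)

    def link(self, other: "NanoCell") -> None:
        """Inter-link two cells so information can later be shared between them."""
        self.links.append(other)

    def alter(self, new_payload: bytes) -> None:
        """Alter the stored information and refresh its recency metadata."""
        self.payload = new_payload
        self.last_updated = datetime.utcnow()

    def copy(self) -> "NanoCell":
        """Copy the cell's content (links are deliberately not carried over)."""
        return NanoCell(self.payload, self.security_level)

    def is_stale(self, max_age_seconds: float) -> bool:
        """A crude stand-in for the cell 'knowing' how recent its data is."""
        return (datetime.utcnow() - self.last_updated).total_seconds() > max_age_seconds
```

In the article's vision these operations would be carried out by neural net algorithms at the molecular level, in parallel across many cells; the sketch only shows the kind of metadata (security, recency, linkage) such a cell would need to carry.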

Phases 2-4 - Future Nanotechnology Timeline

Below is a proposed timeline for Nanotechnology and Nanohousing. The phases correspond to those shown in Figure 1-1 above. For each phase, the expected Nanotechnology advances are listed first, followed by the corresponding Nanohousing advances.

Phase 2: 1-2 years

Nanotechnology:

  • Better manufacturing / mass production;
  • Use of photo-electronic devices;
  • Visualization control;
  • Write-back capability;
  • Wave generation capabilities;
  • Self-error detection;
  • Micro/atomic "debuggers";
  • Form / function / payload integration;
  • Integrated, re-programmable data stores with simplistic functionality; and
  • Nanoviruses and attacks on Nanotech components are introduced.

Nanohousing:

  • More discrete definition of nanites and their data-handling capability;
  • Definition of data rates of change and size estimations for data storage;
  • A new "3+ state" defined for bit operations: 1 = on, 0 = off, and 1/0 = comp (see the sketch following this list);
  • Factoring problems solved through wave dynamics;
  • Attachment of crude functions to query/store/manage data within nano-structures; and
  • Study of the applicability of storing historical information, possibly within the 1/0 comp states – all possible values represented in a single bit range.
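
The "1/0 = comp" state above is left undefined in this timeline, so the following is only a minimal, illustrative Python sketch (the TriBit name and its values are hypothetical). It shows why adding even one state beyond 0 and 1 expands what a fixed number of bit positions can represent:

```python
from enum import Enum

class TriBit(Enum):
    """Hypothetical three-state 'bit': off, on, or the combined 'comp' state."""
    OFF = 0
    ON = 1
    COMP = 2   # stand-in for the timeline's "1/0 = comp" state

def distinct_words(positions: int, states_per_position: int) -> int:
    """How many distinct values a register of this width can represent."""
    return states_per_position ** positions

if __name__ == "__main__":
    width = 8
    print("binary, 8 positions:     ", distinct_words(width, 2))            # 256
    print("three-state, 8 positions:", distinct_words(width, len(TriBit)))  # 6561
```

Whatever physical form a "comp" state eventually takes, this combinatorial gain is the reason a "3+ state" bit is interesting for dense data storage.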

Interim Phase 2.5: 2-4 years

Nanotechnology:

  • Mass production of light-emitting devices (wave generation);
  • Cheap optical components;
  • Expensive atomic fabrics;
  • Electrical and wave stimulation – altering structure on demand;
  • Increase in medical and biological delivery mechanisms; and
  • Breakthrough to go beyond Moore's Law of reduction (using quantum physics).

Nanohousing:

  • Reprogramming of nano-devices through photon wave generators;
  • Introduction of security check algorithms and self-correcting nanites (a simple redundancy sketch follows this list);
  • Initial stages of self-discovery and linkage applicability, driven through outside programmatic forces;
  • Use of the physical "disk" begins to be questioned;
  • Movement towards all-atomic computing; and
  • What we know as OLTP, ODS, DW and Active DW merge to become known as Dynamic Nanohousing (DNH).
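
Self-correcting nanites are only named here, not described. As one hedged illustration of what "self-correcting" could mean at the data level, the Python sketch below uses simple triple redundancy with majority voting; the function names store_redundant and self_correct are hypothetical, and the example makes no claim about how real nano-devices would implement this:

```python
from collections import Counter
from typing import List

def store_redundant(bits: List[int], copies: int = 3) -> List[List[int]]:
    """Keep several copies of the payload so later corruption can be voted out."""
    return [list(bits) for _ in range(copies)]

def self_correct(replicas: List[List[int]]) -> List[int]:
    """Majority-vote each bit position across the replicas and repair them in place."""
    corrected = []
    for position in range(len(replicas[0])):
        winner = Counter(r[position] for r in replicas).most_common(1)[0][0]
        corrected.append(winner)
        for r in replicas:           # repair any copy that disagreed with the vote
            r[position] = winner
    return corrected

if __name__ == "__main__":
    replicas = store_redundant([1, 0, 1, 1])
    replicas[1][2] = 0               # simulate one corrupted bit in one copy
    print(self_correct(replicas))    # -> [1, 0, 1, 1], and all copies repaired
```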

Phase 3: 3-6 years

Nanotechnology:

  • Visual assembly – "CAD" for atomic devices;
  • End-user-visible control of structures;
  • Cheaper photo-resistant electronics;
  • Cheaper optical devices;
  • Smart atoms/fabrics capable of end-user interfacing;
  • Auto-on/off transmission;
  • Atomic tagging (identifying every atom uniquely);
  • Terabyte computing in massive (true) parallelism through electronic signaling – invisible wave generation around the world, on the head of a pin;
  • Rise of crude, "self-aware" Nano-Tech: it knows the payload it contains, what it can bind to and what it must repel, what signals it can interface with, and its own boundaries and configurations; and
  • Lab experiments with self-configuring "nano-clusters."

Nanohousing:

  • Experimentation turns to application of highly optimized neural algorithms, programmed directly into nano-structures;
  • Data Vault (or a similar architecture) is utilized to house data in a highly attributive state (a minimal sketch of the idea follows this list);
  • New security algorithms are developed to encrypt the information within;
  • Neural net algorithms read and utilize encrypted data without "decrypting" it; and
  • Crime involving nano-devices rises significantly, particularly with the advent of particle wave technology – nano-devices are not safe anywhere in the open and must be protected from stray wave generators.
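
For readers unfamiliar with the Data Vault modeling approach referenced above, it holds data in a highly attributive state by separating business keys (hubs), the relationships between them (links), and the descriptive, history-bearing attributes (satellites). The Python sketch below is a simplified illustration of that separation for this article only, not a full specification of the model:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, Tuple

@dataclass
class Hub:
    """A business key, e.g. a customer number."""
    business_key: str
    load_date: datetime = field(default_factory=datetime.utcnow)
    record_source: str = "unknown"

@dataclass
class Link:
    """A relationship between two or more hubs."""
    hub_keys: Tuple[str, ...]
    load_date: datetime = field(default_factory=datetime.utcnow)
    record_source: str = "unknown"

@dataclass
class Satellite:
    """Descriptive attributes for a hub or link; each change is a new row, so history is kept."""
    parent_key: str
    attributes: Dict[str, str]
    load_date: datetime = field(default_factory=datetime.utcnow)
    record_source: str = "unknown"

# Illustrative usage: a customer, an order, their relationship, and descriptive detail.
customer = Hub("CUST-042", record_source="crm")
order = Hub("ORD-9001", record_source="orders")
placed = Link((customer.business_key, order.business_key), record_source="orders")
profile = Satellite("CUST-042", {"name": "Acme Corp", "segment": "manufacturing"},
                    record_source="crm")
```

Because every satellite row carries its own load date and record source, history accumulates as new rows rather than overwriting old ones – the "highly attributive state" the timeline refers to.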

Phase 4: End Game, 5-8 years

Nanotechnology:

  • True self-assembly;
  • Smart nano-clusters;
  • Self-interfacing nano-clusters (nanons);
  • Self-reconfiguring nanons;
  • Smart fabrics/materials; and
  • Inward journeys into the mind through Nano-Tech.

Nanohousing:

  • Nanites meet and re-assemble based on information content, security and boundaries;
  • Nanohousing is in full swing; information can now be traded without wires, halfway around the world, through particle wave physics (convergence);
  • The notion of the "Data Warehouse" is obsolete, but it remains a foundational cornerstone for the Nanohouse, which filters current incoming information and applies it dynamically to the history – utilizing compression and white-noise neural net technology; and
  • Reports and analytics are all done in near real time; the Nanohouse has to be "plugged in" to projection devices, or other nano-scale devices, to imprint images on walls or other mediums.

Summary

Although I cannot say exactly how soon Nanohousing will become a reality, what I can do is consider the research implications of coupling form with function, design and models, all at an atomic level. We can be assured of two things: Nanohousing will become a very large industry in the future, and the basic building blocks for this technology are already emerging in the commercial marketplace.

Other items to think about are the bold statements made here about the timeline of Nanohousing and of Nanotechnology in general. While bold, some of these statements may not come true. I encourage you to "think outside the box" for just a minute. While Nanotech itself raises many questions about security, privacy and ethics, there are also many benefits to be gained from it.

Finally, Nanohousing is the combination of form (data structure or information modeling/representation) within the atomic layer, function (neural nets, security algorithms, read/write/check functions), and physical structure (the atomic layer and our control of it). New methods, new tools and new ways of thinking must be built in order to begin the arduous task of handling and understanding Nanotech and its application in the information sciences.

Whether you are just curious, a researcher, or simply wish to get in touch with me, I'd love to hear your thoughts, comments and feedback on this issue – both critical and thoughtful perspectives.

Dan Linstedt

Cofounder of Genesee Academy, RapidACE, and BetterDataModel.com, Daniel Linstedt is an internationally known expert in data warehousing, business intelligence, analytics, very large data warehousing (VLDW), OLTP, and performance and tuning. He has been the lead technical architect on enterprise-wide data warehouse projects and refinements for many Fortune 500 companies. Linstedt is an instructor of The Data Warehousing Institute and a featured speaker at industry events. He is a Certified DW2.0 Architect. He has worked with companies including IBM, Informatica, Ipedo, X-Aware, Netezza, Microsoft, Oracle, Silver Creek Systems, and Teradata. He is trained in SEI/CMMI Level 5, and is the inventor of the Matrix Methodology and the Data Vault data modeling architecture. He has built expert training courses, has trained hundreds of industry professionals, and is the voice of Bill Inmon's blog at http://www.b-eye-network.com/blogs/linstedt/.
