
Definitions & Terms







QoS - Quality of Service:

  • See Quality of Service.

Quad FastEthernet:

  • Quad FastEthernet (QFE) is a network interface card (NIC) manufactured by Sun Microsystems that is designed to enhance the bandwidth of a Peripheral Component Interconnect (PCI)-based server running Sun Microsystems' Solaris 8 or later operating environment. Speeds of up to 100 megabits per second (Mbps) are provided by converting PCI data streams into Fast Ethernet traffic. QFE cards are hot-swappable, minimizing downtime, and comply with the IEEE 802.3u Fast Ethernet standard. A single card can work with up to four network interfaces at a time and provides support for multihoming.

Quality of Service - QoS:

  • On the Internet and in other networks, QoS (Quality of Service) is the idea that transmission rates, error rates, and other characteristics can be measured, improved, and, to some extent, guaranteed in advance. QoS is of particular concern for the continuous transmission of high-bandwidth video and multimedia information. Transmitting this kind of content dependably is difficult in public networks using ordinary "best effort" protocols.
  • Using the Internet's Resource Reservation Protocol (RSVP), packets passing through a gateway host can be expedited based on policy and reservation criteria arranged in advance. Using ATM, which also lets a company or user preselect a level of quality in terms of service, QoS can be measured and guaranteed in terms of the average delay at a gateway, the variation in delay in a group of cells (cells are 53-byte transmission units), cell losses, and the transmission error rate.
  • The Common Open Policy Service (COPS) is a relatively new protocol that allows routers and layer 3 switches to get QoS policy information from a network policy server (a small host-level illustration of marking traffic for preferential treatment follows this entry).
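  • A minimal sketch, assuming a Linux host and Python's standard socket module, of how an application can ask for preferential treatment by marking its own packets with a DSCP value; the specific value (46, "Expedited Forwarding") and the destination address are illustrative assumptions, not something prescribed by this entry:

        import socket

        # Request expedited treatment for a UDP socket's packets by setting the
        # IP TOS byte. DSCP 46 (Expedited Forwarding) is shifted left by 2 bits
        # to occupy the DSCP field of the TOS byte.
        EF_DSCP = 46
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_DSCP << 2)

        # Datagrams sent now carry the marking; routers configured for QoS can
        # use it to prioritize this traffic over best-effort flows.
        sock.sendto(b"voice-sample", ("203.0.113.10", 5004))
        sock.close()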

Quantum Computer:

  • A quantum computer is an as-yet hypothetical machine that performs calculations based on the behavior of particles at the sub-atomic level. Such a computer will be, if it is ever developed, capable of executing far more millions of instructions per second (MIPS) than any previous computer. Such an exponential advance in processing capability would be due to the fact that the data units in a quantum computer, unlike those in a binary computer, can exist in more than one state at a time. In a sense, the machine "thinks" several "thoughts" simultaneously, each "thought" being independent of the others even though they all arise from the same set of particles.
  • Engineers have coined the term qubit (pronounced KYEW-bit) to denote the fundamental data unit in a quantum computer. A qubit is essentially a bit (binary digit) that can take on several, or many, values simultaneously. The theory behind this is as bizarre as the theory of quantum mechanics, in which individual particles appear to exist in multiple locations. One way to think of how a qubit can exist in multiple states is to imagine it as having two or more aspects or dimensions, each of which can be high (logic 1) or low (logic 0). Thus if a qubit has two aspects, it can have four simultaneous, independent states (00, 01, 10, and 11); if it has three aspects, there are eight possible states, binary 000 through 111, and so on (the short sketch after this entry enumerates these state counts).
  • Quantum computers might prove especially useful in the following applications:
  • Breaking ciphers
  • Statistical analysis
  • Factoring large numbers
  • Solving problems in theoretical physics
  • Solving optimization problems in many variables
  • The main difficulty that research-and-development engineers have encountered is that it is extremely difficult to get particles to behave in the proper way for a significant length of time. The slightest disturbance will cause the machine to cease working in quantum fashion and revert to "single-thought" mode like a conventional computer. Stray electromagnetic fields, physical movement, or a tiny electrical discharge can disrupt the process.
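  • As a concrete illustration of the state counts above, a short Python sketch (the helper name is just for illustration) that enumerates the classical basis states available to n two-state units:

        from itertools import product

        def basis_states(n):
            """Enumerate the 2**n basis states of n two-state units."""
            return ["".join(bits) for bits in product("01", repeat=n)]

        print(basis_states(2))        # ['00', '01', '10', '11'] - four states
        print(basis_states(3))        # eight states, '000' through '111'
        print(len(basis_states(10)))  # 1024 states for ten units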

Quantum Computing:

  • Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Development of a quantum computer, if practical, would mark a leap forward in computing capability far greater than that from the abacus to a modern day supercomputer, with performance gains in the billion-fold realm and beyond. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.
  • The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Labs, in 1981. He theorized a classical computer operating with some quantum mechanical principles. But it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum computing research. In 1984, he was at a computation theory conference and began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, then published his breakthrough paper a few months later. With this, the race began to exploit his ideas. However, before we delve into what he started, it is beneficial to have a look at the background of the quantum world.
  • Quantum Theory
  • Quantum theory's development began in 1900 with a presentation by Max Planck to the German Physical Society, in which he introduced the idea that energy exists in individual units (which he called "quanta"), as does matter. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.
  • The Essential Elements of Quantum Theory:
  • Energy, like matter, consists of discrete units rather than existing solely as a continuous wave.
  • Elementary particles of both energy and matter, depending on the conditions, may behave like either particles or waves.
  • The movement of elementary particles is inherently random, and, thus, unpredictable.
  • The simultaneous measurement of two complementary values, such as the position and momentum of an elementary particle, is inescapably flawed; the more precisely one value is measured, the more flawed will be the measurement of the other value.
  • Further Developments of Quantum Theory
  • Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is whatever it is measured to be (for example, a wave or a particle) but that it cannot be assumed to have specific properties, or even to exist, until it is measured. In short, Bohr was saying that objective reality does not exist. This translates to a principle called superposition that claims that while we do not know what the state of any object is, it is actually in all possible states simultaneously, as long as we don't look to check.
  • To illustrate this theory, we can use the famous and somewhat cruel analogy of Schrödinger's Cat. First, we have a living cat and place it in a thick lead box. At this stage, there is no question that the cat is alive. We then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide vial and died. Since we do not know, the cat is both dead and alive, according to quantum law - in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.
  • The second interpretation of quantum theory is the multiverse or many-worlds theory. It holds that as soon as a potential exists for any object to be in any state, the universe of that object transmutes into a series of parallel universes equal to the number of possible states in which the object can exist, with each universe containing a unique single possible state of that object. Furthermore, there is a mechanism for interaction between these universes that somehow permits all states to be accessible in some way and for all possible states to be affected in some manner. Stephen Hawking and the late Richard Feynman are among the scientists who have expressed a preference for the many-worlds theory.
  • Whichever argument one chooses, the principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.
  • A Comparison of Classical and Quantum Computing
  • Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra, operating with a (usually) 7-mode logic gate set, though it is possible to make do with only three modes (AND, NOT, and COPY). Data must be processed in an exclusive binary state at any point in time - that is, either 0 (off / false) or 1 (on / true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time that each transistor or capacitor needs to remain in either the 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply. Beyond this, the quantum world takes over, which opens a potential as great as the challenges that are presented.
  • The quantum computer, by contrast, can work with a two-mode logic gate set: XOR and a mode we'll call QO1 (the ability to change 0 into a superposition of 0 and 1, a logic gate which cannot exist in classical computing). In a quantum computer, a number of elemental particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.
  • Superposition
  • Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser - let's say that we use 1 unit of laser energy. But what if we only use half a unit of laser energy and completely isolate the particle from all external influences? According to quantum law, the particle then enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit utilized could take a superposition of both 0 and 1. Thus, the number of computations that a quantum computer could undertake is 2^n, where n is the number of qubits used (see the short sketch at the end of this entry). A quantum computer comprising 500 qubits would have the potential to do 2^500 calculations in a single step. This is an awesome number - 2^500 is vastly more than the number of atoms in the known universe (this is true parallel processing - classical computers today, even so-called parallel processors, still only truly do one thing at a time; there are just two or more of them doing it). But how will these particles interact with each other? They would do so via quantum entanglement.
  • Entanglement
  • Particles (such as photons, electrons, or qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle - up or down - allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state. The spin state of the particle being measured is decided at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction to that of the measured particle. This is a real phenomenon (Einstein called it "spooky action at a distance"), the mechanism of which cannot, as yet, be explained by any theory - it simply must be taken as given. Quantum entanglement allows qubits that are separated by incredible distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.
  • Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. If more qubits are added, the capacity expands exponentially.
  • Quantum Programming
  • Perhaps even more intriguing than the sheer power of quantum computing is the ability that it offers to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of "take all the superpositions of all the prior computations" - something which is meaningless with a classical computer - which would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers, one example of which we discuss below.
  • There have been two notable successes thus far with quantum programming. The first came in 1994 from Peter Shor (now at AT&T Labs), who developed a quantum algorithm that could efficiently factor large numbers. It centers on a system that uses number theory to estimate the periodicity of a large number sequence. The other major breakthrough came from Lov Grover of Bell Labs in 1996, with a very fast algorithm that is proven to be the fastest possible for searching through unstructured databases. The algorithm is so efficient that it requires, on average, only roughly √N searches (where N is the total number of elements) to find the desired result, as opposed to a search in classical computing, which on average needs N/2 searches; for a database of one million entries, that is roughly 1,000 quantum lookups versus 500,000 classical ones.
  • The Problems - And Some Solutions
  • The above sounds promising, but there are tremendous obstacles still to be overcome. Some of the problems with quantum computing are as follows:
  • Interference - During the computation phase of a quantum calculation, the slightest disturbance in a quantum system (say a stray photon or wave of EM radiation) causes the quantum computation to collapse, a process known as decoherence. A quantum computer must be totally isolated from all external interference during the computation phase. Some success has been achieved by using ions as qubits held in intense magnetic fields.
  • Error correction - Because truly isolating a quantum system has proven so difficult, error correction systems for quantum computations have been developed. Qubits are not digital bits of data, so they cannot use conventional (and very effective) error correction, such as the triple-redundancy method. Given the nature of quantum computing, error correction is ultra-critical - even a single error in a calculation can cause the validity of the entire computation to collapse. There has been considerable progress in this area, with an error correction algorithm developed that utilizes 9 qubits (1 computational and 8 correctional). More recently, there was a breakthrough by IBM that makes do with a total of 5 qubits (1 computational and 4 correctional).
  • Output observance - Closely related to the above two, retrieving output data after a quantum calculation is complete risks corrupting the data. In an example of a quantum computer with 500 qubits, we have a 1 in 2^500 chance of observing the right output if we simply measure the output. Thus, what is needed is a method to ensure that, as soon as all calculations are made and the act of observation takes place, the observed value will correspond to the correct answer. How can this be done? It has been achieved by Grover with his database search algorithm, which relies on the special "wave" shape of the probability curve inherent in quantum computers; this shape ensures that, once all calculations are done, the act of measurement will see the quantum state decohere into the correct answer.
  • Even though there are many problems to overcome, the breakthroughs in the last 15 years, and especially in the last 3, have made some form of practical quantum computing appear feasible, but there is much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both the government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex material science analysis. It is this potential that is rapidly breaking down the barriers to this technology, but whether all barriers can be broken, and when, is very much an open question.
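  • A minimal sketch of the two ideas above (superposition and entanglement), using Python with NumPy to hold a register's amplitude vector; the Hadamard and CNOT gate matrices and the Bell-state construction are standard textbook material rather than anything specified in this entry:

        import numpy as np

        # One qubit is a length-2 amplitude vector: |0> = [1, 0], |1> = [0, 1].
        zero = np.array([1.0, 0.0])

        # A Hadamard gate puts a qubit into an equal superposition of 0 and 1
        # (the "half a unit of laser energy" picture above).
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        superposed = H @ zero                      # amplitudes [0.707, 0.707]

        # n qubits need 2**n amplitudes: a 2-qubit register holds weights for
        # all four basis states 00, 01, 10 and 11 at once.
        register = np.kron(superposed, superposed)

        # Entanglement: a CNOT applied to a superposed control and a |0> target
        # yields a Bell state in which the two qubits are perfectly correlated.
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])
        bell = CNOT @ np.kron(superposed, zero)

        # Only the correlated outcomes 00 and 11 carry any probability, so
        # measuring one qubit immediately fixes the other.
        print(np.round(bell ** 2, 3))              # [0.5, 0.0, 0.0, 0.5]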

Query:

  • In general, a query (noun) is a question, often required to be expressed in a formal way. The word derives from the Latin quaere (the imperative form of quaerere, meaning to ask or seek). In computers, what a user of a search engine or database enters is sometimes called the query. To query (verb) means to submit a query (noun).
  • A database query can be either a select query or an action query. A select query is simply a data retrieval query. An action query can ask for additional operations on the data, such as insertion, updating, or deletion (both kinds are shown in the short sketch after this entry).
  • Languages used to interact with databases are called query languages, of which the Structured Query Language (SQL) is the well-known standard.
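  • A minimal sketch of the distinction, using Python's standard sqlite3 module with an illustrative "customers" table (the table and data are assumptions for the example):

        import sqlite3

        # An in-memory database with a small illustrative table.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
        conn.execute("INSERT INTO customers (name, city) VALUES ('Ada', 'London')")

        # A select query only retrieves data.
        rows = conn.execute("SELECT name FROM customers WHERE city = 'London'").fetchall()
        print(rows)  # [('Ada',)]

        # Action queries modify data: insertion, updating, or deletion.
        conn.execute("INSERT INTO customers (name, city) VALUES ('Grace', 'Arlington')")
        conn.execute("UPDATE customers SET city = 'Cambridge' WHERE name = 'Ada'")
        conn.execute("DELETE FROM customers WHERE name = 'Grace'")
        conn.commit()
        conn.close()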

Query by Example:

  • Query by Example (QBE) is a method of query creation that allows the user to search for documents based on an example in the form of a selected text string or in the form of a document name or a list of documents. Because the QBE system formulates the actual query, QBE is easier to learn than formal query languages, such as the standard Structured Query Language (SQL), while still enabling powerful searches.
  • To conduct a search for similar documents based on matching text, the user enters or copies selected text into the form search field. This is then passed to the QBE parser for processing. A query is created using the relevant words (common words such as "and," "is" and "the" are ignored by default) and a search is carried out for documents containing them. Because the meaning of the selected text is less precise than a formal query, results may be more variable than those in a formal query entry.
  • To conduct a search for similar documents based on full document text, the user submits documents or lists of documents to the QBE results template. The QBE parser performs an analysis of these and formulates a query to submit to the search engine, which in turn conducts a search for similar material.
  • In terms of database management systems, QBE can be thought of as a "fill-in-the-blanks" method of query creation. The Microsoft Access Query Design Grid is an example. To conduct a search for field data matching particular conditions, the user enters criteria into the form, creating search conditions for as many fields as desired. A query is automatically generated to search the database for matching data; a toy example of such query generation follows this entry.
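  • A toy sketch, in Python, of how criteria entered into a blank form can be turned into a generated query; the form fields, table name, and helper function are hypothetical, and real QBE engines such as the Access Query Design Grid are far more capable:

        # Hypothetical "fill-in-the-blanks" form: the user supplies criteria for
        # whichever fields they care about and leaves the rest empty.
        criteria = {"city": "Boston", "status": "active", "last_name": ""}

        def build_query(table, criteria):
            """Generate a parameterized SELECT from the non-empty form fields."""
            filled = {field: value for field, value in criteria.items() if value}
            where = " AND ".join(f"{field} = ?" for field in filled)
            sql = f"SELECT * FROM {table}" + (f" WHERE {where}" if where else "")
            return sql, list(filled.values())

        sql, params = build_query("customers", criteria)
        print(sql)     # SELECT * FROM customers WHERE city = ? AND status = ?
        print(params)  # ['Boston', 'active']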

Quick Fix Engineering:

  • Quick Fix Engineering (QFE) is a Microsoft term for the delivery of individual service updates to its operating systems and application programs such as Word. Formerly called a hotfix, "QFE" can be used to describe both the method of delivering and applying a patch or fix, and also to refer to any individual fix. Because of the complexity and sheer number of lines of code in most application programs and operating systems, the delivery of temporary fixes to users has long been provided by major software manufacturers. Typically, not all fixes are necessarily applied by an enterprise since they can occasionally introduce new problems. All of the fixes in any given system are usually incorporated (so they don't have to be reapplied) whenever a new version of a program or operating system comes out.
  • Periodically, all current QFEs (or hotfixes) are delivered together as a service pack, which can be applied more efficiently than applying fixes one at a time.

QWERTY Keyboard:

  • The QWERTY (pronounced KWEHR-tee) keyboard is the standard typewriter and computer keyboard in countries that use a Latin-based alphabet. QWERTY refers to the first six letters on the upper row of the keyboard. The key arrangement was devised by Christopher Latham Sholes, whose "Type-Writer," as it was then called, was first mass-produced in 1874. Since that time, it has become what may be the most ubiquitous machine-user interface of all time.
  • The QWERTY arrangement was intended to reduce the jamming of typebars as they moved to strike ink on paper. Separating certain letters from each other on the keyboard reduced the amount of jamming. In 1932, August Dvorak developed what was intended to be a faster keyboard, putting the vowels and the five most common consonants in the middle row, with the idea that an alternating rhythm would be established between left and right hands. Although the Dvorak keyboard has many adherents, it has never overcome the culture of learning to type on a QWERTY.




