Archive for 2014

PARALLEL COMPUTATION



Parallel computation is a style of computer programming that allows commands to be executed simultaneously and concurrently, on a single processor or on multiple processors inside a CPU. Parallel computation is useful for improving computer performance: the more processes that can be carried out at the same time, the faster the work gets done.

Parallel Concept
The parallel concept is a processor's ability to perform a task or multiple tasks simultaneously or concurrently; in other words, the processor is able to carry out one or many tasks at one time.

Distributed Processing 
Distributed processing is parallel processing carried out across multiple machines. In other words, it is the ability of several computers, running simultaneously, to work together and solve a problem quickly.
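As a rough illustration of the parallel idea above, the sketch below uses Python's standard multiprocessing module to spread the same task across several worker processes at once; the function and inputs are made up for the example.

from multiprocessing import Pool

def square(n):
    # A stand-in for any independent unit of work.
    return n * n

if __name__ == "__main__":
    numbers = list(range(10))
    # Four worker processes share the work instead of one processor
    # doing everything serially.
    with Pool(processes=4) as pool:
        results = pool.map(square, numbers)
    print(results)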

Parallel Computer Architecture
According to Flynn's taxonomy, proposed by processor designer Michael Flynn, computer architectures are divided into four categories.

1.     SISD (Single Instruction Single Data Stream) 
A computer of this type has only one processor, and one instruction stream is executed serially.

2.     SIMD (Single Instruction Multiple Data Stream) 
A computer of this type has more than one processor, but it executes only one instruction stream, applied in parallel to different data in lock step (a short illustration appears after this list).

3.     MISD (Multiple Instruction Single Data Stream) 
A computer of this type would execute multiple instruction streams in parallel on a single data stream. In practice no computer has been built with this architecture, because the system is not easily understood; to this day there are no computers that use this type of architecture.

4.     MIMD (Multiple Instruction Multiple Data Stream) 
A computer of this type has more than one processor and executes more than one instruction stream in parallel. This is the type most widely used to build parallel computers; many supercomputers implement this architecture, because the model and concepts are not too complicated to understand.
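A rough way to feel the difference between the SISD and SIMD styles from inside a program, as mentioned in the list above, is to compare an element-by-element loop with a vectorized NumPy operation; this is only an analogy on an ordinary PC, and it assumes NumPy is installed.

import numpy as np

a = [1, 2, 3, 4]
b = [10, 20, 30, 40]

# SISD-like: one instruction applied to one pair of data items at a time.
serial_sum = []
for x, y in zip(a, b):
    serial_sum.append(x + y)

# SIMD-like: a single "add" applied to whole arrays of data at once;
# NumPy dispatches this to vectorized machine code.
vector_sum = np.array(a) + np.array(b)

print(serial_sum)
print(vector_sum)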




Introduction to Thread Programming
In computer programming, a thread holds the information needed to serve one use of a program, so that a single program can handle multiple users simultaneously. Threads allow the program to keep track of which user has entered the program and to take turns serving one user and then another. Multiple threads can run concurrently within one process and share its resources, such as memory, whereas separate processes do not share them.
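A minimal sketch of the idea, using Python's standard threading module: several threads run concurrently inside one process and share the same memory (here, a single counter), so access to it is protected with a lock. The names are illustrative only.

import threading

counter = 0
lock = threading.Lock()

def worker(name, iterations):
    global counter
    for _ in range(iterations):
        # The threads share this variable, so updates are locked.
        with lock:
            counter += 1
    print(name, "finished")

threads = [threading.Thread(target=worker, args=("thread-%d" % i, 1000))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()   # wait for all threads to finish

print("final counter:", counter)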

Introduction to CUDA GPU Programming
GPU refers to a specialized processor built to accelerate graphics work and manage its own memory in order to speed up image processing. The GPU itself is usually located on the graphics card of a desktop or laptop computer.
CUDA (Compute Unified Device Architecture) is a scheme created by NVIDIA that makes the GPU (Graphics Processing Unit) capable of computing not only graphics but also general-purpose workloads. With CUDA we can take advantage of the many cores on an NVIDIA GPU to carry out large amounts of calculation and computing.
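A hedged sketch of what a CUDA computation can look like when written from Python with the Numba library; it assumes an NVIDIA GPU, the CUDA toolkit, and the numba and numpy packages are available, and the kernel and array names are made up for the example.

import numpy as np
from numba import cuda

@cuda.jit
def add_arrays(a, b, out):
    i = cuda.grid(1)            # global index of this GPU thread
    if i < out.shape[0]:        # guard against extra threads
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.ones(n, dtype=np.float32)
b = np.ones(n, dtype=np.float32)
out = np.zeros(n, dtype=np.float32)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_arrays[blocks, threads_per_block](a, b, out)   # launch on the GPU
print(out[:5])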


REFERENCE
>>   http://uchaaii.blogspot.com/2013/07/parallel-computation.html


Friday, 20 June 2014

QUANTUM COMPUTING

Definition

A quantum computer is a computer design which uses the principles of quantum physics to increase the computational power beyond what is attainable by a traditional computer. Quantum computers have been built on the small scale and work continues to upgrade them to more practical models.



History of Quantum Computing
Quantum computing tends to trace its roots back to a 1959 speech by Richard P. Feynman in which he spoke about the effects of miniaturization, including the idea of exploiting quantum effects to create more powerful computers. (This speech is also generally considered the starting point of nanotechnology.)
Of course, before the quantum effects of computing could be realized, scientists and engineers had to more fully develop the technology of traditional computers. This is why, for many years, there was little direct progress, or even interest, in the idea of making Feynman's suggestions into reality.

In 1985, the idea of "quantum logic gates" was put forth by University of Oxford's David Deutsch, as a means of harnessing the quantum realm inside a computer. In fact, Deutsch's paper on the subject showed that any physical process could be modeled by a quantum computer.

Nearly a decade later, in 1994, AT&T's Peter Shor devised an algorithm that could use only 6 qubits to perform some basic factorizations; more qubits are needed, of course, as the numbers requiring factorization become more complex.

A handful of quantum computers have been built. The first, a 2-qubit quantum computer in 1998, could perform trivial calculations before losing coherence after a few nanoseconds. In 2000, teams successfully built both a 4-qubit and a 7-qubit quantum computer. Research on the subject is still very active, although some physicists and engineers express concerns over the difficulties involved in scaling these experiments up to full-scale computing systems. Still, the success of these initial steps does show that the fundamental theory is sound.

How a Quantum Computer Would Work
A traditional computer stores information as bits, each of which must be either a 1 or a 0. A quantum computer, on the other hand, would store information as a 1, a 0, or a quantum superposition of the two states. Such a "quantum bit," called a qubit, allows for far greater flexibility than the binary system.
Specifically, a quantum computer would be able to perform calculations on a far greater order of magnitude than traditional computers, a capability with serious implications and applications in the realm of cryptography and encryption. Some fear that a successful and practical quantum computer would devastate the world's financial system by ripping through its computer security encryption, which is based on the factoring of large numbers that literally cannot be cracked by traditional computers within the life span of the universe. A quantum computer, on the other hand, could factor those numbers in a reasonable period of time.

Entanglement
Entanglement is a term used in quantum theory to describe the way that particles of energy/matter can become correlated to predictably interact with each other regardless of how far apart they are.

Particles, such as photons, electrons, or qubits that have interacted with each other retain a type of connection and can be entangled with each other in pairs, in the process known as correlation. Knowing the spin state of one entangled particle - whether the direction of the spin is up or down - allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state. The spin state of the particle being measured is decided at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction to that of the measured particle. Quantum entanglement allows qubits that are separated by incredible distances to interact with each other immediately, in a communication that is not limited to the speed of light. No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.

Entanglement is a real phenomenon (Einstein called it "spooky action at a distance"), which has been demonstrated repeatedly through experimentation. The mechanism behind it cannot, as yet, be fully explained by any theory. One proposed theory suggests that all particles on earth were once compacted tightly together and, as a consequence, maintain a connectedness. Much current research is focusing on how to harness the potential of entanglement in developing systems for quantum cryptography and quantum computing.

QUBIT
A qubit, or quantum bit, is a unit of quantum information, the quantum analogue of the classical bit. A qubit is a two-state quantum-mechanical system, such as the polarization of a single photon: here the two states are vertical polarization and horizontal polarization. In a classical system, a bit would have to be in one state or the other, but quantum mechanics allows the qubit to be in a superposition of both states at the same time, a property which is fundamental to quantum computing.
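As a small, purely classical illustration of the superposition idea above, a qubit state can be written as a two-component complex vector; the sketch below (assuming NumPy is installed) builds an equal superposition and checks that the squared amplitudes sum to one.

import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

alpha = 1 / np.sqrt(2)
beta = 1 / np.sqrt(2)
psi = alpha * ket0 + beta * ket1          # equal superposition of |0> and |1>

# Squared amplitudes give the measurement probabilities; they must sum to 1.
print("P(0) =", abs(alpha) ** 2)
print("P(1) =", abs(beta) ** 2)
print("normalized:", np.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1.0))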



Operations on pure qubit states

There are various kinds of physical operations that can be performed on pure qubit states:
  1. A quantum logic gate can operate on a qubit: mathematically speaking, the qubit undergoes a unitary transformation. Unitary transformations correspond to rotations of the qubit vector in the Bloch sphere.
  2. Standard basis measurement is an operation in which information is gained about the state of the qubit. Writing the qubit state as |ψ⟩ = α|0⟩ + β|1⟩, the result of the measurement will be either |0⟩, with probability |α|², or |1⟩, with probability |β|². Measurement of the state of the qubit alters the values of α and β. For instance, if the result of the measurement is |0⟩, α is changed to 1 (up to phase) and β is changed to 0. Note that a measurement of a qubit state entangled with another quantum system transforms a pure state into a mixed state.
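The measurement rule in item 2 can be mimicked in an ordinary classical simulation; the sketch below (assuming NumPy, with made-up function names) draws an outcome with probabilities |α|² and |β|² and then collapses the state vector accordingly.

import numpy as np

def measure(psi, rng=None):
    # Standard-basis measurement of a single-qubit state vector:
    # returns the outcome (0 or 1) and the collapsed post-measurement state.
    rng = rng or np.random.default_rng()
    p0 = abs(psi[0]) ** 2
    outcome = 0 if rng.random() < p0 else 1
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0   # the surviving amplitude becomes 1 (up to phase)
    return outcome, collapsed

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition
print(measure(psi))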


Quantum Gate
In quantum computing, and specifically the quantum circuit model of computation, a quantum gate (or quantum logic gate) is a basic quantum circuit operating on a small number of qubits. Quantum gates are the building blocks of quantum circuits, just as classical logic gates are for conventional digital circuits.
Unlike many classical logic gates, quantum logic gates are reversible. However, classical computing can also be performed using only reversible gates. For example, the reversible Toffoli gate can implement all Boolean functions. This gate has a direct quantum equivalent, showing that quantum circuits can perform all operations performed by classical circuits.
Quantum logic gates are represented by unitary matrices. The most common quantum gates operate on spaces of one or two qubits, just like the common classical logic gates operate on one or two bits. This means that as matrices, quantum gates can be described by 2 × 2 or 4 × 4 unitary matrices.
Commonly used gates:
  1.      Hadamard gate 
  2.      Pauli-X gate 
  3.      Pauli-Y gate 
  4.      Pauli-Z gate 
  5.      Phase shift gates 
  6.      Swap gate 
  7.      Controlled gates 
  8.       Toffoli gate 
  9.       Fredkin gate
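As a small numerical illustration of the statement above that quantum gates are unitary matrices, the sketch below (assuming NumPy) writes the Hadamard gate from the list as a 2 × 2 matrix, applies it to |0⟩, and checks unitarity.

import numpy as np

# Hadamard gate as a 2x2 unitary matrix.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)
psi = H @ ket0     # applying the gate = multiplying the state by the matrix
print(psi)         # equal superposition of |0> and |1>

# Unitarity check: H-dagger times H should be the identity.
print(np.allclose(H.conj().T @ H, np.eye(2)))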


Shor's algorithm
Shor's algorithm, named after mathematician Peter Shor, is a quantum algorithm (an algorithm that runs on a quantum computer) for integer factorization, formulated in 1994. Informally, it solves the following problem: given an integer N, find its prime factors.
On a quantum computer, to factor an integer N, Shor's algorithm runs in polynomial time (the time taken is polynomial in log N, which is the size of the input). Specifically, it takes time O((log N)^3), demonstrating that the integer factorization problem can be efficiently solved on a quantum computer and is thus in the complexity class BQP. This is substantially faster than the most efficient known classical factoring algorithm, the general number field sieve, which works in sub-exponential time, about O(e^(1.9 (log N)^(1/3) (log log N)^(2/3))). The efficiency of Shor's algorithm is due to the efficiency of the quantum Fourier transform and of modular exponentiation by repeated squaring.

Shor's algorithm consists of two parts:
1.   A reduction, which can be done on a classical computer, of the factoring problem to the problem of order-finding.
2.   A quantum algorithm to solve the order-finding problem.

Classical part

1.   Pick a random number a < N.
2.   Compute gcd(a, N). This may be done using the Euclidean algorithm.
3.   If gcd(a, N) ≠ 1, then there is a nontrivial factor of N, so we are done.
4.   Otherwise, use the period-finding subroutine (below) to find r, the period of the function f(x) = a^x mod N, i.e. the order r of a modulo N, which is the smallest positive integer r for which a^r ≡ 1 (mod N).
5.   If r is odd, go back to step 1.
6.   If a^(r/2) ≡ −1 (mod N), go back to step 1.
7.   gcd(a^(r/2) ± 1, N) is a nontrivial factor of N. We are done. (A classical sketch of these steps appears after this list.)
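The sketch below follows the classical steps listed above, with a slow classical loop standing in for the quantum period-finding subroutine, so it only works for small N and assumes N is an odd composite that is not a prime power; the function names are illustrative.

from math import gcd
from random import randrange

def order(a, N):
    # Classical, exponential-time stand-in for the quantum period-finding
    # subroutine: the smallest r > 0 with a**r ≡ 1 (mod N).
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(N):
    while True:
        a = randrange(2, N)              # step 1: pick a random a < N
        if gcd(a, N) != 1:               # steps 2-3: a lucky nontrivial factor
            return gcd(a, N)
        r = order(a, N)                  # step 4: period finding (quantum in Shor)
        if r % 2 == 1:                   # step 5: r is odd, try again
            continue
        if pow(a, r // 2, N) == N - 1:   # step 6: a^(r/2) ≡ -1 (mod N), try again
            continue
        return gcd(pow(a, r // 2, N) - 1, N)   # step 7: a nontrivial factor

print(shor_classical_part(15))           # prints 3 or 5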

Explanation of the algorithm
  •    Obtaining factors from period
  •    Finding the period
  •    The bottleneck


Sunday, 11 May 2014

CLOUD COMPUTING


Cloud computing is a type of computing that relies on sharing computing resources rather than having local servers or personal devices to handle applications.
In cloud computing, the word cloud (also phrased as "the cloud") is used as a metaphor for "the Internet," so the phrase cloud computing means "a type of Internet-based computing," where different services  such as servers, storage and applications are delivered to an organization's computers and devices through the Internet.
Cloud computing is comparable to grid computing, a type of computing where unused processing cycles of all computers in a network are harnessed to solve problems too intensive for any stand-alone machine.

The world of the cloud has lots of participants:
·         The end user who doesn’t have to know anything about the underlying technology.
·         Business management who needs to take responsibility for the governance of data or services living in a cloud. Cloud service providers must provide a predictable and guaranteed service level and security to all their constituents.

·         The cloud service provider who is responsible for IT assets and maintenance.



Advantages of cloud computing

1.  Worldwide Access. Cloud computing increases mobility, as you can access your documents from any device in any part of the world. For businesses, this means that employees can work from home or on business trips, without having to carry around documents. This increases productivity and allows faster exchange of information. Employees can also work on the same document without having to be in the same place.
2.      More Storage. In the past, memory was limited by the particular device in question. If you ran out of memory, you would need a USB drive to back up your current device. Cloud computing provides increased storage, so you won’t have to worry about running out of space on your hard drive.
3.      Easy Set-Up. You can set up a cloud computing service in a matter of minutes. Adjusting your individual settings, such as choosing a password or selecting which devices you want to connect to the network, is similarly simple. After that, you can immediately start using the resources, software, or information in question.
4.      Automatic Updates. The cloud computing provider is responsible for making sure that updates are available – you just have to download them. This saves you time, and furthermore, you don’t need to be an expert to update your device; the cloud computing provider will automatically notify you and provide you with instructions.
5.      Reduced Cost. Cloud computing is often inexpensive. The software is already installed online, so you won’t need to install it yourself. There are numerous cloud computing applications available for free, such as Dropbox, and increasing storage size and memory is affordable. If you need to pay for a cloud computing service, it is paid for incrementally on a monthly or yearly basis. By choosing a plan that has no contract, you can terminate your use of the services at any time; therefore, you only pay for the services when you need them.


The working principle of cloud computing

The working principle of cloud computing is almost the same as that of an ordinary computer; the difference is that in cloud computing our computer is coupled to other computers. On a regular computer, the files of the software we use are stored on the hard disk or other storage media. On a cloud computer, from the user's point of view, the files of the software we use live on another computer.
In other words, we are connected to multiple computers on a server network, but the data we store resides in a data center, so that not only can we open the files we save, but other users and computers can open them as well, and vice versa (when shared publicly). There is also a large pool of server infrastructure that we can use, and we only pay for what we need.

Characteristics of cloud computing

1. On-demand self-service. This means provisioning or de-provisioning computing resources as needed in an automated fashion without human intervention. An analogy to this is electricity as a utility where a consumer can turn on or off a switch on-demand to use as much electricity as required.
2. Ubiquitous network access. This means that computing facilities can be accessed from anywhere over the network using any sort of thin or thick clients (for example smartphones, tablets, laptops, personal computers and so on).
3. Resource pooling. This means that computing resources are pooled to meet the demand of the consumers so that resources (physical or virtual) can be dynamically assigned, reassigned or de-allocated as per the requirement. Generally the consumers are not aware of the exact location of computing resources. However, they may be able to specify location (country, city, region and the like) for their need. For example, I as a consumer might want to host my services with a cloud provider that has cloud data centers within the boundaries of Australia.
4. Rapid elasticity. Cloud computing provides an illusion of infinite computing resources to the users. In cloud models, resources can be elastically provisioned or released according to demand. For example, my cloud-based online services should be able to handle a sudden peak in traffic demand by expanding the resources elastically. When the peak subsides, unnecessary resources can be released automatically.
5. Measured service. This means that consumers only pay for the computing resources they have used. This concept is similar to utilities like water or electricity.

SECURITY

Security. When using a cloud computing service, you are essentially handing over your data to a third party. The fact that the entity, as well as users from all over the world, are accessing the same server can cause a security issue. Companies handling confidential information might be particularly concerned about using cloud computing, as data could possibly be harmed by viruses and other malware. That said, some servers like Google Cloud Connect come with customizable spam filtering, email encryption, and SSL enforcement for secure HTTPS access, among other security measures.

The biggest question most people have about cloud computing is: will it be safe? The answer is "no." The reason is that everything cloud computing is based on is mechanical, even though it seems virtual. The safety of the data (information) is only as strong as the will and determination of the individual who wants to get at it.

THE CONCEPT OF CLOUD COMPUTING


The first building block is the infrastructure on which the cloud will be implemented. Some people assume that the environment should be virtualized, but since cloud is a way to request resources on demand, if you have solutions to provision them on bare metal, then why not? The infrastructure will support the different types of cloud (IaaS, PaaS, SaaS, BPaaS).
To be able to provide these services you will need Operational Support Services (OSS), which are in charge of deploying the requested service, and Business Support Services (BSS), mainly used to validate the request and create the invoice for the requested services. Any metrics could be used to create the invoice (for example, number of users, number of CPUs, memory, or usage hours per month). It is very flexible and depends on the service provider.
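As a toy illustration of the metered-invoice idea above, the BSS side could total a monthly bill as in the sketch below; the metric names and unit prices are assumptions, not any provider's real price list.

usage = {"cpu_hours": 720, "memory_gb_hours": 2880, "users": 25}
price_per_unit = {"cpu_hours": 0.04, "memory_gb_hours": 0.005, "users": 1.50}

# Each metered metric is billed independently, then summed into one invoice.
invoice = {metric: round(amount * price_per_unit[metric], 2)
           for metric, amount in usage.items()}
total = round(sum(invoice.values()), 2)

print(invoice)
print("monthly total:", total)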
A cloud computing environment will also need to provide interfaces and tools for the service creators and users. This is the role of the Cloud Service Creator and Cloud Service Consumer components.
Now, let’s see how it works in reality.
Generally, you log in to a portal (enterprise or public) and you order your services through the Cloud Service Consumer. This service has been created by the cloud service provider and can be a simple virtual machine (VM) based on an image, some network components, an application service such as a WebApp environment, or a service such as MongoDB. It depends on the provider and the type of resources and services.
The cloud provider will validate, through the BSS, your request and if the validation is okay (credit card, contract), it will provision the request through the OSS.
You will receive, in one way or another, the credentials to access your requested services and you will usually receive a monthly invoice for your consumption.


Reference :


http://12285-if-unsika.blogspot.com/2012/10/prinsip-kerja-cloud-computing-atau.html





Monday, 28 April 2014

Problems of New Distributed Systems

    Many things often become problems when building a distributed system, such as security, the applications used, complexity, and so on.

    A distributed system consists of autonomous computers that work together to present the appearance of a single, unified system. One advantage of such a system is that it makes it easy to integrate different applications running on different computers into one system. Another advantage is that, when designed well, a distributed system scales with the size of the underlying network. However, this kind of system has drawbacks, such as the cost of more complex software, degraded performance, and security that is often weak. Even so, there is considerable worldwide interest in building and installing distributed systems.

    Distributed systems often aim to hide many of the intricacies related to the distribution of processes, data, and control. However, this distribution transparency not only comes at the price of performance; in practical situations it can never be fully achieved. The fact that trade-offs must be made between achieving the various forms of distribution transparency is inherent in the design of distributed systems, and it can easily complicate users' understanding.


REFERENCE:

>> Tanenbaum, Distributed Systems: Principles and Paradigms, 2nd edition, 2006


Tuesday, 01 April 2014

The Conclusion of Mobile Computing's Effects on Education and Computing Progress



         In the digital era, mobile computing plays many roles in improving the quality of education, because it can help and facilitate day-to-day learning. The capabilities and characteristics of mobile computing also allow the distance-learning process to be more effective and efficient and to produce better results. According to M. Mukhopadhay (1992), "Globalization has triggered a shift in education from conventional face-to-face education to more open education."
           Many developed countries are already implementing mobile computing technology in the teaching and learning process. For example, in developed countries learning together, or collaborative learning, has been proven to improve test scores and reduce dropouts by 22%. Mobile technology has found a way to support collaborative learning, in which students can discuss in a web forum and build a shared database about anything, based on one another's locations. In France, the "Flexible Learning" project has been applied to the education system. It is reminiscent of Ivan Illich's forecast from the early '70s about "education without school" (Deschooling Society). Meanwhile, in developing countries such as Malaysia, "Problem Based Learning" with mobile learning technology, or M-Learning, is said to be still new in terms of its implementation. For a Harvard Medical School project, ArcStream Solutions was hired to develop solutions based on the Palm OS mobile platform that facilitate communication between students and faculty and provide detailed program information. Florida State University College of Medicine used ArcStream to develop a Clinical Data Collection System (CDC) solution, which allows students to record and edit patient reports. Development must continue, however, in order to obtain good results for the quality of education in Indonesia.

The advantages of mobile computing:
 
>>  The use of e-books makes the learning process more efficient.
>>  It is less expensive, because less is spent on buildings, school supplies, and transportation.
>>  Students' academic activity can be monitored by their parents.

Disadvantages of mobile computing:

>>   The storage capacity of mobile computing technology can be a problem.
>>   It depends on the availability of the Internet and the sophistication of mobile devices.
>>   In psychological terms, socialization and interaction with the people around us will be reduced, which can result in people tending to become apathetic.
Wednesday, 26 March 2014
