Sharing a processor over a LAN?

Just wanted to know if it's possible to share CPU power over a LAN. Like, if I'm running a program on one PC (which is pretty slow), can it use the processing power of another PC I have (which is idle at the time)?

Is something of that sort possible?

Thanks

Nope!

You could access a more powerful PC remotely and run your programs there, with the output displayed on your own PC.

It's called distributed processing.

A good example of this is Folding@home:

http://folding.stanford.edu/English/Main

[quote]
It's called distributed processing.

A good example of this is Folding@home:

http://folding.stanford.edu/English/Main
[/quote]

Is something like that available for home PCs?

[quote]
Is something like that available for home PCs?
[/quote]

What application do you want to run on multiple PCs? Perhaps the application has support for distributed computing; you should ask in the application vendor's forums. MySQL, for example, has support for multiple servers.
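
As a rough sketch of what "support for multiple servers" can look like from the application side, here is how a program might send writes to a primary database server and reads to a replica. This is only an illustration: the hostnames, credentials, and table are made-up placeholders, and it assumes the mysql-connector-python package and an already-configured replication setup.

```python
import mysql.connector  # assumes the mysql-connector-python package

# Placeholder hosts: one primary for writes, one replica for reads.
primary = mysql.connector.connect(host="db-primary.lan", user="app",
                                  password="secret", database="shop")
replica = mysql.connector.connect(host="db-replica.lan", user="app",
                                  password="secret", database="shop")

# Writes go to the primary...
cur = primary.cursor()
cur.execute("INSERT INTO orders (item) VALUES (%s)", ("widget",))
primary.commit()

# ...while read-only queries can be spread across replicas.
cur = replica.cursor()
cur.execute("SELECT COUNT(*) FROM orders")
print(cur.fetchone()[0])
```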

[quote]
What application do you want to run on multiple PCs? Perhaps the application has support for distributed computing; you should ask in the application vendor's forums. MySQL, for example, has support for multiple servers.
[/quote]

It's not just one application. Actually, the thing is, I have one pretty outdated PC which is used by the rest of the family, while my own PC has been upgraded frequently and is pretty fast. So what I was thinking was: is there some program available through which I can allocate some of my PC's resources to the other PC, in order to make it run faster? I'm not sure if that's possible, though. I heard about Folding@home for the PS3, so I thought maybe something similar was also available for PCs.

[quote]
It's not just one application. Actually, the thing is, I have one pretty outdated PC which is used by the rest of the family, while my own PC has been upgraded frequently and is pretty fast. So what I was thinking was: is there some program available through which I can allocate some of my PC's resources to the other PC, in order to make it run faster? I'm not sure if that's possible, though. I heard about Folding@home for the PS3, so I thought maybe something similar was also available for PCs.
[/quote]

It isn't as simple as that. Programming for distributed systems is a complicated affair, so only apps designed for that kind of scalability will work for you.
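
To give a feel for why the application itself has to be written for distribution, here is a minimal sketch using Python's standard multiprocessing.connection module. The IP address, port, auth key, and the toy workload are all made-up placeholders; a real setup would also need error handling.

```python
# worker.py -- run this on the idle, faster PC
from multiprocessing.connection import Listener

# Wait for tasks from the slow PC; port and key are placeholders.
with Listener(("0.0.0.0", 6000), authkey=b"change-me") as listener:
    with listener.accept() as conn:
        while True:
            task = conn.recv()                 # a number to crunch
            if task is None:                   # sentinel: no more work
                break
            conn.send(sum(i * i for i in range(task)))  # the heavy part
```

```python
# client.py -- run this on the slow PC
from multiprocessing.connection import Client

# 192.168.1.10 is a placeholder for the worker's LAN address.
with Client(("192.168.1.10", 6000), authkey=b"change-me") as conn:
    for task in (10_000_000, 20_000_000):
        conn.send(task)
        print(conn.recv())                     # computed on the other machine
    conn.send(None)                            # tell the worker to stop
```

The point is that the work has to be split into explicit "send task, receive result" steps; an ordinary application that was never written this way cannot magically borrow another PC's CPU.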

What you can do is install the TightVNC server on your PC and the TightVNC viewer on the family PC. That way you can control your PC from the family PC, and run programs on your own (faster) machine while sitting at the other one.

http://www.tightvnc.com/download.php

The only other option I can think of is virtualization, what is called cloud computing. But that is too complicated for a home system.

The only solutions I am aware of that do what you want are Linux-based, which means you would need Linux running on both machines to distribute system resources. But in your case, I doubt your family will be running any processor-intensive application that could benefit from offloading threads to your computer. You should rather look into RDP (Remote Desktop Protocol) or VNC, as rokra suggested.

If you are interested in clustering them anyway, look into LinuxPMI, Kerrighed, Beowulf clusters on Linux, and clustering on Ubuntu.
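
For a taste of what programming for a cluster looks like, here is a minimal sketch using MPI through the mpi4py package (my choice of example, not something tied to the projects above). Each copy of the script runs on a different node and handles its own slice of the work; the workload is a made-up placeholder.

```python
# sum_squares.py -- hypothetical example; run with: mpiexec -n 4 python sum_squares.py
from mpi4py import MPI  # assumes mpi4py and an MPI runtime are installed

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's ID within the cluster
size = comm.Get_size()   # total number of processes

# Each process sums the squares of its own slice of the range.
partial = sum(i * i for i in range(rank, 10_000_000, size))

# Combine the partial results on process 0.
total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print(total)
```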

@rokra:

Virtualization is not cloud computing, btw.

[quote]
The only solutions I am aware of that do what you want are Linux-based, which means you would need Linux running on both machines to distribute system resources. But in your case, I doubt your family will be running any processor-intensive application that could benefit from offloading threads to your computer. You should rather look into RDP (Remote Desktop Protocol) or VNC, as rokra suggested.

If you are interested in clustering them anyway, look into LinuxPMI, Kerrighed, Beowulf clusters on Linux, and clustering on Ubuntu.

@rokra:

Virtualization is not cloud computing, btw.
[/quote]

Cloud computing does use virtualization to create virtual servers with greater redundancy. These virtual servers use the power of multiple physical servers: if one physical server goes down, the others pick up the load. New hardware can be added without interrupting the virtual servers (no downtime), and virtual servers can be migrated without downtime as well.

http://www.google.com/search?q=virtualization+and+the+cloud

I stated that in response to:

[quote]
The only other option I can think of is virtualization, what is called cloud computing.
[/quote]

which implied that virtualization is cloud computing. They are different technologies, and in some cases, depending on the application, clusters might use virtualization. But definitely not always, nor is it mandatory.

[quote]
I stated that in response to:

[quote]
The only other option I can think of is virtualization, what is called cloud computing.
[/quote]

which implied that virtualization is cloud computing. They are different technologies, and in some cases, depending on the application, clusters might use virtualization. But definitely not always, nor is it mandatory.
[/quote]

Yes, virtualization is different from cloud computing: virtualization is the underlying technology that enables it. Cloud computing always uses virtualization.

Cloud computing is different from clusters. Clusters (a) use the same hardware, (b) work on one job most of the time, (c) are not scalable, and (d) are usually limited to one building or one organisation.

Cloud computing, by contrast, is designed to be scalable and to support multiple users with different jobs to run; users can demand more resources as and when they need them. Also, the ‘cloud’ in ‘cloud computing’ refers to the Internet, so cloud resources are always provided to users over the Internet.

[quote]
Cloud computing always uses virtualization.
[/quote]

Once again, my friend, the “always” part is wrong. Virtualization is indeed used in a lot of cloud environments to support applications and services that are specific to an OS or platform not used on the cloud hardware, but it is not a necessary technology for the cloud.

A quote from Difference between virtualization and Cloud Computing:

[quote]
Virtualization is not always necessary in cloud computing; however, you can use it as the basis. Cloud computing is an approach for the delivery of services while virtualization is one possible service that could be delivered.
[/quote]

and from Virtualization is Not Cloud…But Does Make It Shine:

[quote]
And so we can conclude that the cloud business model does not necessarily need virtualization.
[/quote]

So my point from the start stands: cloud computing != virtualization.