You read my mind, Xanayoshi.
Xanayoshi wrote:(sorry, I reallllly reallly realllllly want to build a cluster)
Sometimes when I wake up, that's all I want to do. You can try Rocks Cluster; it's based on CentOS, which is rebuilt from RHEL.
http://www.rocksclusters.org/wordpress/?page_id=80
It has a no-nonsense setup process.
I've tried Beowulf, but unless you know how to program in C, you'll end up broke.
My idea of a local cluster/cloud/grid is basically networking all devices wirelessly at the hardware level. Of course, the networking equipment, protocols and the rest of the stack would need layered security; otherwise people would be able to inject code directly into the CPU...
With that in place, all your ARM, AMD, Intel and MIPS (router) CPUs, memory and graphics would automatically form a local grid, virtualizing the combined CPU and GPU power. Server clusters already do this using MPI (Message Passing Interface). No, MPI won't let you run GTA V at 2000 fps; it's just a fictional state imagined: what if you could?
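To give a feel for what MPI code actually looks like, here's a minimal C sketch (assuming an MPI implementation such as Open MPI or MPICH is installed; compile with mpicc, run with mpirun -np 4 ./a.out):

/* minimal MPI hello: each process reports its rank */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);               /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's id */
    MPI_Comm_size(MPI_COMM_WORLD, &size); /* total process count */

    printf("hello from rank %d of %d\n", rank, size);

    MPI_Finalize();                       /* shut the runtime down */
    return 0;
}

Every process runs the same binary; the rank is what you branch on to split the work across cores or machines.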
A.I. is almost an inevitable concept; it's just a matter of time, and it sits right there in the future. To me, AI is highly subjective and not the product of an active intention to build advanced networks and systems. It'll happen when it needs to.
Apple is an a** of a company, but they have opened up even more since the passing of Jobs.
They created OpenCL (GPGPU compute) to counter Nvidia's proprietary CUDA.
Then they handed it over to the Khronos Group as an open, royalty-free standard.
AMD's entire business now leans on OpenCL and HSAIL; HSA uses OpenCL to a broad extent. ARM's (i.e. Android's) biggest strength is OpenGL and OpenCL. What Windows and desktop Linux could never achieve, Android has done in less than 5 years on underpowered ARM chips.
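If you want to see what OpenCL looks like from the host side, here's a minimal C sketch that just enumerates the platforms and devices on a machine (assuming an OpenCL SDK providing CL/cl.h is installed; link with -lOpenCL):

/* list every OpenCL platform and device on this machine */
#include <CL/cl.h>
#include <stdio.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint nplat = 0;

    clGetPlatformIDs(8, platforms, &nplat);  /* query available platforms */

    for (cl_uint p = 0; p < nplat; p++) {
        char name[256];
        cl_device_id devs[8];
        cl_uint ndev = 0;

        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof(name), name, NULL);
        printf("Platform: %s\n", name);

        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev);
        for (cl_uint d = 0; d < ndev; d++) {
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            printf("  Device: %s\n", name); /* CPU, GPU, accelerator... */
        }
    }
    return 0;
}

The same host code runs against AMD, Intel or Nvidia drivers, which is the whole point of an open standard.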
Apple uses LLVM (Clang) to build its programs, kernels, etc. Richard Stallman has gone up against Clang because it uses a permissive license that, in his view, misinterprets the meaning of freedom... but that's another story. Many developers now want to compile the Linux kernel with Clang instead of GCC. Gallium is built with LLVM and is not GPL-licensed...
AMD has released the HSA SDK for Linux (GCC).
http://www.hsafoundation.com/hsa-developer-tools/
So any exclusivity claimed for AI, cognitive computing, etc. can, for now, be passed off as compute porn.
Xanayoshi wrote:A natural language machine would only serve to make those that do not understand technology more subservient to it. It is ridiculous that they do not teach standard computing technology in schools here as it is this technology that their lives revolve around in every conceivable way.
That is an argument, and an internal conflict, that I first encountered in high school, reading up on quantum well transistors, quantum chromodynamics and string theory at the library. Back then I was deep into anything quantum and mugged up Q is for Quantum by John Gribbin...
http://www.amazon.com/IS-FOR-QUANTUM-En ... 4863154....
Today, languages like Python/Haskell/C# and their vast IDEs (Geany, nano and vim are the best!) have changed the way programmers program.
Python can arguably be called the most advanced language, and it can even be used to write MPI programs (through bindings such as mpi4py). No, it doesn't let you control the CPU directly, but you can write almost anything else, limited only by imagination.
The problem is not about teaching languages. The teaching staff would have to be much more adept at coding, on top of a multitude of other factors (real problems). Professors who teach high-level Unix material work for many companies and, unless paid well, won't care, unless someone is willing to work for less (as in Asian countries)... Programming languages today are like 386-generation technology; in about 40 years' time you might program in English. May the best Oxford graduate write the best programs.
...again, Google has done amazing things in speech recognition. One of AMD's major goals for APUs is speech recognition.
More than a decade ago, when I was playing with C, assembly language was the only way to control a CPU directly, even though C was the dominant choice.
I played with mnemonic code on Windows and Linux, and it was fun, but I soon realised that writing a kernel in such a language would take nearly 10-15 years. Of course, unless C and C++ were hardcoded into the chip itself, you could still build a higher-level language for programming a CPU... and then multicores came. ARM started adding encoding circuits directly into the chip, unlike Intel, which for years sold chips by just upping the clock speed. Today's chips have encoding hardware for things like H.264 and advanced instruction-set extensions built in: 256-bit AVX, AES-NI, the execute-disable bit for security, and so on.
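If you want a taste of mixing mnemonics with C today, here's a minimal sketch using GCC extended inline assembly on x86-64 (GCC/Clang-specific syntax; purely an illustration):

/* add two numbers with raw x86-64 mnemonics from inside C */
#include <stdio.h>

int main(void)
{
    long a = 40, b = 2, sum;

    __asm__ ("movq %1, %0\n\t"      /* sum = a  */
             "addq %2, %0"          /* sum += b */
             : "=&r" (sum)          /* output: a scratch register */
             : "r" (a), "r" (b));   /* inputs kept in registers */

    printf("%ld\n", sum);           /* prints 42 */
    return 0;
}

Get one constraint wrong and the register allocator will happily hand you garbage, which is exactly why nobody wants to write a whole kernel this way.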
The best way to see how a computer "understands" language is to take raw C code in a .c file and pass it through a compiler. You get an a.out executable, along with machine-code intermediates, filled with what looks like gibberish: that is exactly what goes on inside the CPU. Older compilers left those intermediate files lying around by default; newer ones make you ask for them...
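For instance (assuming GCC and binutils on Linux; the file name hello.c is just for illustration):

/* hello.c -- run it through the toolchain and inspect each stage:
 *   gcc -S hello.c     -> hello.s  (human-readable assembly)
 *   gcc -c hello.c     -> hello.o  (raw machine code, the "gibberish")
 *   gcc hello.c        -> a.out    (the final executable)
 *   objdump -d a.out              (disassembles the machine code back) */
#include <stdio.h>

int main(void)
{
    printf("hello, machine\n");
    return 0;
}

Open hello.o in a hex viewer and you're looking at the exact bytes the CPU fetches and executes.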