The notion of virtual machines dates back to the 1960s on mainframes, though for most of us the first emulators appeared on personal computers in the mid 90's and the first PC virtual machines in the late 90's.
The first interpreter (for LISP) was written in the late 50's, though the first compiler had already been written in the early 50's.
A compiler translates a script written in a symbolic language into machine language, which then runs directly on the hardware.
An interpreter, by contrast, allows a script written in a symbolic language to determine the behaviour of a running program, and that program could run on a range of hardware. This move separated the outcome (achieved by the behaviour of the running script) from the resource that ran the program.
An emulator is a specific kind of interpreter: it simulates the behaviour of some underlying hardware and, in effect, fakes being that hardware.
A virtual machine is also a kind of emulator, except that it does not necessarily emulate a known piece of hardware. A VM can simply provide a range of capabilities that are invoked through scripts. The language of the script is parsed by an interpreter, which in turn invokes and orchestrates the capabilities offered by the virtual machine.
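To make this concrete, here is a minimal sketch of the arrangement just described: a tiny interpreter parses a script line by line and dispatches each instruction to a capability offered by a toy VM. All the names (ToyVM, interpret, the push/add/mul operations) are my own illustrative inventions, not any real system.

```python
class ToyVM:
    """A toy 'virtual machine': it emulates no real hardware, it simply
    offers a small set of capabilities (a stack plus arithmetic)."""

    def __init__(self):
        self.stack = []

    # Capabilities the VM offers; the interpreter invokes these.
    def push(self, value):
        self.stack.append(int(value))

    def add(self):
        b, a = self.stack.pop(), self.stack.pop()
        self.stack.append(a + b)

    def mul(self):
        b, a = self.stack.pop(), self.stack.pop()
        self.stack.append(a * b)


def interpret(script, vm):
    """Parse each line of the script and invoke the named VM capability."""
    for line in script.strip().splitlines():
        op, *args = line.split()
        getattr(vm, op)(*args)  # dispatch to the VM's capability
    return vm.stack[-1]


vm = ToyVM()
result = interpret("push 2\npush 3\nadd\npush 4\nmul", vm)
print(result)  # (2 + 3) * 4 = 20
```

Notice that the script never touches hardware concepts at all; it only names capabilities, and anything offering those capabilities could host it.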
Now we have even more impressive abstractions in the form of containers.
The number of computing languages is quite staggering, but the language actually used to achieve a particular computational outcome is now largely irrelevant, and even the virtual machine you run it on matters little. The choice of language to write in is no longer determined by the hardware you use. This is possible because a specification can be imitated: anything that can satisfy all the demands of the specification can fulfil the relevant function.
Consider a script SC intended to govern a program PR running on a virtual machine VM hosted on a real machine RM.
These layers of abstraction require that each lower layer support the range of capabilities expected of it. So if layer L2 rests on layer L1, then the instructions on L2 must be executable on L1, as though they were simply instructions run on L1.
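A toy illustration of that layering, with made-up operation names: layer L2 offers a richer instruction than L1 understands, and makes it executable by expanding it into L1's native instructions.

```python
def l1_execute(op, state):
    """Layer L1: the 'real machine' understands only inc and dec."""
    if op == "inc":
        state["x"] += 1
    elif op == "dec":
        state["x"] -= 1
    else:
        raise ValueError(f"L1 cannot execute {op!r}")


def l2_execute(op, state):
    """Layer L2: offers a richer op (add3) by expanding it into L1 ops,
    so that an L2 instruction runs as though it were native to L1."""
    expansion = {"add3": ["inc", "inc", "inc"], "inc": ["inc"], "dec": ["dec"]}
    for low_op in expansion[op]:
        l1_execute(low_op, state)


state = {"x": 0}
l2_execute("add3", state)
print(state["x"])  # 3
```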
So the phenomenon of emulation means that a previously relevant ontological distinction is no longer relevant. In its place is a more formal, entirely functional distinction: that of being a platform (a host) versus being the thing hosted.
Being a host or being a guest is not an intrinsic but a relational property; it points instead to a protocol or specification that the former complies with and the latter presupposes. So the key ontological feature of an object is not what it IS, but how it appears, how it presents itself. This is phenomenology. This is functionalism. The phenomenon supports multiple realisation, and the real cognitive asset is the script, not the platform.
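The multiple-realisation point can be sketched in a few lines: the same script runs unchanged on any platform that satisfies the specification, so host-hood is purely relational. The Speaker protocol and both platforms below are hypothetical names of my own, just to make the point visible.

```python
from typing import Protocol


class Speaker(Protocol):
    """The 'specification' a host must comply with."""

    def emit(self, text: str) -> str: ...


class LoudPlatform:
    def emit(self, text: str) -> str:
        return text.upper()


class QuietPlatform:
    def emit(self, text: str) -> str:
        return text.lower()


def run_script(platform: Speaker, words):
    # The same 'script' (a list of words) behaves on any compliant host;
    # the behaviour travels with the script, not with the platform.
    return [platform.emit(w) for w in words]


print(run_script(LoudPlatform(), ["Hello"]))   # ['HELLO']
print(run_script(QuietPlatform(), ["Hello"]))  # ['hello']
```

Neither class IS a host intrinsically; each becomes one only in relation to a script that presupposes the specification it satisfies.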
Now, considering all this, could we regard the brain as functioning as a sort of platform? It is a machine that supports... what? Other machines, or the scripts that run on those machines? If the mind is the running machine, then it is like a type 2 hypervisor that runs scripts. If the mind IS the script, then the brain is perhaps either a type 1 hypervisor, or simply a machine capable of running a script written in some symbolic language. If there is no such language, then it runs a non-symbolic script, like machine code or assembly.
In my opinion, the range of computational solutions does map onto theories of the mind.
Just floating an idea that has been at the back of my mind for a while.
Seeing the second season of Westworld brought it to mind, and I would be interested in the opinions of others on this.
I am working towards presenting something on this in a more formal context, e.g. a research paper for a journal... but I am not sure I can really be bothered.
Anyway, dialogue partners on this topic would be much appreciated.