Some define a language as its notation and syntax. And they're usually wrong.
Niklaus Wirth himself, an expert in computer science and the inventor of several languages, claims that syntax, sentences, and notation are what define a language. Use the English language with French symbols and notation and it is no longer English. But one can add a new meaning (semantics) to the word 'ass' (it can mean donkey, but later on it could also mean a bad person) without changing English... since semantics can change while the language is still the language.
Niklaus Wirth himself, an expert in computer science and inventor of several languages, also described every single one of those languages with a semantics. Therefore, either he must have been referring to 'semantics' when he referred to 'notation' and 'sentences', or his actions speak much louder than his words.
And when you start giving 'ass' new semantics, you end up with a divergent language. Now, that language might merge back into the conceptual 'common tongue' English, or it might not. English, after all, is a living language. At the moment there are a great many variations of English, including American English, British English (they'd deny it's a variant - perhaps it's Irish English, Oxford English, etc.), Japlish, and so on. In many ways these are simply distinct languages that share something in common... much like the various curly-brace imperative languages share a great deal in common (similar ways of expressing similar ideas).
Meanings of words (semantics) change continually in the English language. Compare an old dictionary from 1950 to one from 2008: a lot of slang has been defined and added (for example, 'ass' is slang, and even the word 'tool' can later be overloaded to mean someone who is used), and words that once had only one meaning have since gained more. It is just like the C language, where the boolean type was added in the C99 standard: we can say that C99 is a slightly different language than the Cee of K&R, or we can say that Cee is Cee, just that C99 has some differences. It's vague in that sense. But if you look at C from a general communication perspective, the syntax, notation, and symbols are the same; boolean simply has semantics in C99 that it didn't have in old C, where a boolean had no meaning. Therefore, as I say, language semantics change through implementation... one could even change VOID in Cee to mean something else in a later standard, just as the semantics of BOOLEAN are not so constant.
Actually C has gained some new symbols and notation over time. And English is a living language, so of course things are changing in it continuously - that's what "living language" means - still in use, still changing.
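To make that concrete, here is a minimal sketch (mine, not from the thread) of two of those C99 additions - the boolean type and the '//' comment notation. It assumes any C99 compiler, e.g. gcc -std=c99:

 /* Minimal sketch: two things C99 added on top of C89/K&R C.
    Compile with, e.g., gcc -std=c99 sketch.c */
 #include <stdio.h>
 #include <stdbool.h>  /* C99 header: defines bool, true, false over the new _Bool type */

 int main(void)
 {
     bool flag = true;  // '//' line comments are themselves new C99 notation;
                        // C89 only had /* ... */ block comments.
     if (flag)
         printf("'boolean' went from meaningless to meaningful in C99\n");
     return 0;
 }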
For example, XML defines a basic syntax and basic notation, and is called a markup language. It does not have any meaning until someone actually does something with the XML. Similarly, one can parse HTML or SGML and do anything they want with it.
Technically, XML is not a language. It is a format for creating a language. In fact, an XML document is not considered valid unless it conforms to a set of semantic rules described by a DTD or Schema, even if its syntax is well-formed.
Technically, according to a dictionary, the XML people have a case.
Then you should know that the XML people (or at least all those who have taught classes I've attended) will tell you that XML is a substrate for building a language, not really a language all by itself. The actual languages are things like MathML and Scalable Vector Graphics.
I'm no fan of XML, but according to the Webster dictionary, a language is just a set of symbols on a computer (hopefully with some rules, i.e. syntax). One compiler may compile an integer to mean a 16-bit int, while another compiler may compile an integer to mean a 64-bit int... so the meaning of integer can change, but the language stays common (the notation, syntax, symbols).
In the C language, the semantics of 'int' was exactly that: whatever the 'most natural' integer width is for the computer, so long as it is no shorter than a 'short', no longer than a 'long', and at least 16 bits. Therefore, it is correct for it to vary based on architecture. That is the semantics of 'int'. The meaning of 'int' did not change - it was in accordance with the semantics.
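A minimal sketch of that point (my own, assuming a hosted C implementation): the very same source prints different numbers on implementations with 16-, 32-, or 64-bit ints, and every one of them is conforming:

 /* Minimal sketch: the semantics of 'int' let its width vary by platform. */
 #include <stdio.h>
 #include <limits.h>

 int main(void)
 {
     /* The language definition only guarantees at least 16 bits and
        range(short) <= range(int) <= range(long); the exact width is
        the implementation's choice. */
     printf("int is %d bits here, INT_MAX = %d\n",
            (int)(sizeof(int) * CHAR_BIT), INT_MAX);
     return 0;
 }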
No, different compilers that do not support 64-bit will still keep an Integer at 32 bits on a 64-bit processor, while other compilers have completely different meanings of an Integer with the same language. For example, Delphi's Integer is different than FreePascal's Integer, since Delphi does not support 64-bit. Both languages have different meanings, yet the languages are exactly the same when it comes to the communication on paper, which is what a language is. The dictionary defines language as a form of communication.
It is true that some compilers fail to meet the language definition. These compilers are considered non-standard, and if you were communicating in the standard language, these non-standard compilers do not do what you told them to do. Now, being just a little non-standard is bad for communication, and essentially means one party is hearing a different language than the other is speaking. It becomes more obvious when one goes a lot non-standard, such as interpreting what any standard compiler would read as procedure definitions as commands to assassinate Presidential cabinet members. Consider your own wording: "both languages have different meanings, yet the languages are exactly the same [...] on paper." See: the same on paper, yet two (plural) languages - even you subconsciously know that syntax is not language. (And going back to change your wording just to ruin the context so my words don't make sense is evil. If you continue to do it, I'll play back.)
(reworded) [...] Both compilers have different meanings, yet the language is exactly the same when it comes to the communication on paper, which is what a language is. [...]
Compilers don't have meaning. Language expressions have meaning.
 '''procedure''' hello; // same language, but different compilers have different semantics of integer on the same CPU
 '''var'''
   i: integer;
 '''begin'''
   '''for''' i := low(i) '''to''' high(i) '''do'''
     writeln(i);
 '''end''';
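(With a 16-bit integer that loop prints 65,536 lines; with a 32-bit one, about 4.3 billion - the same source text, observably different behavior.)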
Others argue about whether XML really is a language, even though it is called a Markup Language.
Some argue that a "language" is a whole bunch of things. Are "languages" as we know them not clearly defined?
Different standards or variants of languages exist... for example, there is Old English, C99, Common Lisp, Some Other Lisp, Ada 95, Ada 8x, etc.
See original on c2.com