Language Barriers In Animal Caretaking

We all know that animals can’t really speak English, or any other human language. That doesn’t stop us from communicating with them, though, which is why animals play a key role in certain studies of linguistics. Interestingly, studies have shown that humans make similar sounds to animals to get them to do certain things, such as stay or come, regardless of the language the handler is speaking.


When learning Spanish or another language, it’s fun to keep these ideas in mind. Think of learning a new language as an exercise in speaking to another species: many of the same cues apply, such as gestures, tone of voice, and inflection.

The paradox of language acquisition

Children learn their native language by hearing grammatical sentences from their parents or others. From this ‘environmental input’, children construct an internal representation of the underlying grammar. Children are not told the grammatical rules. Neither children nor adults are ever aware of the grammatical rules that specify their own language.

Chomsky pointed out that the environmental input available to the child does not uniquely specify the grammatical rules [35]. This phenomenon is known as ‘poverty of stimulus’ [36]. ‘The paradox of language acquisition’ is that children of the same speech community reliably grow up to speak the same language [37]. The proposed solution is that children learn the correct grammar by choosing from a restricted set of candidate grammars. The ‘theory’ of this restricted set is ‘universal grammar’ (UG). Formally, UG is not a grammar, but a theory of a collection of grammars.

The concept of an innate, genetically determined UG was controversial when introduced some 40 years ago and has remained so. The mathematical approach of learning theory, however, can explain in what sense UG is a logical necessity.

Learnability

Imagine a speaker-hearer pair. The speaker uses grammar G to construct sentences of language L. The hearer receives sentences and should, after some time, be able to use grammar G to construct other sentences of L. Mathematically speaking, the hearer is described by an algorithm (or, more generally, a function) A, which takes a list of sentences as input and generates a language as output.

Let us introduce the notion of a ‘text’ as a list of sentences. Specifically, a text T of language L is an infinite list of sentences of L, with each sentence of L occurring at least once. The text T_N contains the first N sentences of T. We say that language L is learnable by algorithm A if for each text T of L there exists a number M such that for all N > M we have A(T_N) = L. This means that, given enough sentences as input, the algorithm will provide the correct language as output.

Furthermore, a set of languages is learnable by an algorithm if each language of this set is learnable. We are interested in which set of languages, L = {L_1, L_2, ...}, can be learned by a given algorithm.

A key result of learning theory, Gold’s theorem [23], implies there exists no algorithm that can learn the set of regular languages. As a consequence, no algorithm can learn a set of languages that contains the set of regular languages, such as the set of context-free languages, context-sensitive languages or computable languages.

Gold’s theorem formally states there exists no algorithm that can learn a set of ‘super-finite’ languages. Such a set includes all finite languages and at least one infinite language. Intuitively, if the learner infers that the target language is an infinite language, whereas the actual target language is a finite language that is contained in the infinite language, then the learner will not encounter any contradicting evidence, and will never converge onto the correct language. This result holds in greatest possible generality: ‘algorithm’ here includes any function from text to language.

Nowak, Martin A., et al. “Computational and evolutionary aspects of language.” Nature, vol. 417, no. 6889, 2002, p. 611+
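To make this notion of learnability concrete, here is a toy sketch in Python. The language family L_k (strings of “a” up to length k) and the smallest-consistent-guess learner are my own illustrative assumptions, not anything from Nowak et al.; the sketch just shows what converging on the correct language from a text looks like, and why adding an infinite language to the family breaks the strategy.

```python
# Toy illustration of Gold-style "identification in the limit".
# Hypothetical language family over the alphabet {"a"}:
#   L_k = {"a" * i for i in range(1, k + 1)}   (finite languages)

def learner(sentences_seen):
    """Guess the smallest L_k consistent with everything seen so far."""
    return max(len(s) for s in sentences_seen)  # the k of our guess L_k

# A (prefix of a) text for L_3 = {"a", "aa", "aaa"}: every sentence
# of the language occurs at least once.
text = ["a", "aaa", "aa", "a", "aaa", "aa"]
for n in range(1, len(text) + 1):
    print(n, "->", f"L_{learner(text[:n])}")
# Once "aaa" has been seen, the guess locks on L_3 and never changes:
# the learner has identified L_3 in the limit.

# Gold's point: add the infinite language L_inf = {"a" * i for all i}
# to the family and every strategy fails on some text, because a text
# for L_inf that begins "a", "aa", "aaa", ... is indistinguishable, at
# every finite stage, from a text for some finite L_k.
```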

Clearly, the acquisition of language is a difficult thing to put into a formula. It involves certain techniques, and not everyone’s learning style is the same. When writing up my review of the Rocket Spanish program, I talked about how some people would much prefer the style of that program over the “immersion” style of the Rosetta Stone series.

Are all humans alike when it comes to language acquisition? It’s a question that will have to be studied further before we have a detailed answer.


Passwords And Encryption

Stored data used to sit unencrypted by default (and still does on unencrypted drives), which created a sensitive situation for government computers. This is where encryption really got its boost: as more and more connectivity was established between computer systems, more security was needed to prevent hacking.

It’s a very interesting thing to read about, and as you’ll be able to tell, we’ve certainly come a long way in terms of computer security. Check out this excerpt from the 1980s, when encryption was just starting to be used on a larger scale.


“Password Weaknesses

The principal difficulty with that type of system control is that once access to a user’s password combination has been gained, all the information stored under that identification is available. There are no intermediate firewalls of protection for storing particularly valuable information or for storing each valuable piece separately.

Present means for securing communications between systems are either non-existent or involve point-to-point link-level encryption. While the Data Encryption Standard (DES) can encrypt and decrypt data on a point-to-point basis, the native state of stored data is plain text, not cipher text.

The integrity of the system’s administration must be trusted to present plain text to the link-level communication wire where encryption can be performed.

That communication has the drawback of requiring absolute identification of the transmitter.

At certain times in a point-to-point communication, it is possible for an intruder to masquerade as the transmitting authority, implicitly validating the transmitted data while preserving the opportunity to copy or modify what is to be sent.

Encrypted Storage

The only practical solution is to encrypt data as it is initially stored. Using different encryption keys allows a user to subdivide information at a finer granularity, so that access to one piece of data does not mean access to all pieces.

Data is transmitted in cipher text in all of its local forms and when sent to remote locations. Only through the use of a decryption key under the explicit control of the original encryptor of the data does clear text become available to anyone.

Public key encryption offers additional benefits that make end-to-end encryption attractive. In public key systems, two keys are created, one of which is kept secret and the other made publicly available. Either key may encrypt or decrypt data.

If one key is used to encrypt data, only the corresponding opposite key may be used to decrypt the data. By encrypting data with the private key in the first place, the data is digitally “signed.” Because only the corresponding public key can decrypt the data, the original encryption operator can be precisely identified.

Service industries such as financial, legal and medical institutions where confidentiality of data is a top priority have found the public key encryption process advantageous. Because different service elements can encrypt different data for different users by their public keys, only the private keys available to each user can successfully produce plain text.

No systems in general use provide that kind of integrated data encryption and decryption tooling. Public key systems suffer from a large computation requirement to encrypt or decrypt data. A tool with large computation capabilities needs to be made available on a variety of different vendors’ equipment.

It is the system designer’s responsibility to provide the tools to secure data within a system. One way is with end-to-end encryption. The designer must face its processing overheads, particularly with public key methods.”

Holmgren, Steven F. “Combined effort is needed to combat data tampering.” Government Computer News 17 July 1987: 76+.
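The excerpt’s two big ideas, encrypting each piece of stored data under its own key and identifying the transmitter through public key signatures, map directly onto modern tooling. Below is a minimal sketch using Python’s third-party cryptography package; the record names and messages are made up for illustration, and note that modern libraries expose signing as a dedicated operation rather than literally “encrypting with the private key” as the 1987 article describes it.

```python
# pip install cryptography
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# --- Encrypted storage with per-record keys (finer granularity) ---
# Each record gets its own symmetric key, so handing out one key
# exposes one record, not the whole store. Records are hypothetical.
records = {"medical": b"blood type O+", "financial": b"balance $12"}
keys = {name: Fernet.generate_key() for name in records}
store = {name: Fernet(keys[name]).encrypt(data)  # cipher text at rest
         for name, data in records.items()}

# Possessing only the "medical" key decrypts only the medical record.
print(Fernet(keys["medical"]).decrypt(store["medical"]))

# --- Digital signatures (identifying the transmitter) ---
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"transfer approved"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())

# verify() raises InvalidSignature if the message or signature was
# tampered with -- only the private key holder could have signed it.
public_key.verify(signature, message, pss, hashes.SHA256())
print("signature verified: transmitter identified")
```

The per-record keys give exactly the granularity the excerpt asks for: leaking one key exposes one piece of data, not the whole store.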

Bill Gordon of We Hate Malware has written at length about spyware, malware, and other computer threats.  He believes that quantum-based data storage is the next step for security, especially since the quantum computers of the future may be able to break present-day encryption standards in a matter of seconds.

It’s a very interesting field and one where the security arms race is just heating up!


Combat Animals Of The Future?

If you’re into the tech sphere, you may know about the different mechs the US Army is putting together for the battlefield of the future.


Zoo Security Has Come A Long Way

In terms of zoo security, things are only just now reaching new highs.  The systems that contained even the most vicious animals were antiquated at best.  It’s interesting to look back at how computer security evolved to ensure that nobody was able to reach into the minds of these computers and extract the data that governments were so carefully trying to protect.

“Tempest is one response to what is slowly being recognized as a technological time bomb: loss of security in computer processing and information transmission systems.

The federal government, especially the military services, recognized more than two decades ago the danger of signal emanations from electronic equipment and the subsequent compromise of potentially sensitive data. Only in the last year or two, however, has private industry begun to look carefully at this explosive topic.

Consider that for less than $300 a reasonably clever technician can put together components bought from a popular electronics chain and be able to read computer screens within a mile or so, from almost any vantage point.

The intrusion device could be placed in a van, trailer or some other unobtrusive conveyance. It could monitor screen messages at random by successively scanning its targets. There would be no need for the device to decode anything, because it would be viewing, with no threat of detection, the same English-like messages the legitimate operator sees. This operator would not know screen images were being intercepted because the entire procedure relies on a simple fact: computers and their related peripherals indiscriminately disseminate information over radio frequencies.

There is no way to change the natural emission of radiation by computing components, but something can be done to stifle the emanations.

That something is Tempest technology. Tempest products constitute one of the hotter niche enterprises today. They include documentation, procedures, monitoring and systems that block potentially revealing emanations. Due to the sensitive nature of the Tempest program, many implementation details cannot be divulged. But there is enough general information and unclassified data to reveal its growing importance both in and outside the federal government.

Reducing and Shielding Signals

Control of compromising electromagnetic or acoustic emanations can be accomplished either by reducing signal levels or by shielding the radiation they produce.

In earlier years, the teletype was one of the prime emanators. Eventually a low-level signal teletype was developed. Today, power lines and signal lines carrying clear, unencrypted text are protected by means of shielding. Lines entering and exiting a “red” (unencrypted, but still classified) area are filtered or isolated to prevent clear text from being transmitted to the outside. Clear text is encrypted before external transmission. Fiber optics often are used to isolate signal lines between adjacent cabinets.

In data processing, Tempest devices traditionally have included peripherals, communications equipment and, more recently, personal computers. Large-scale computers are protected by shielded enclosures for economic reasons. It is simply too expensive to Tempest-certify large mainframes.

Equipment must be installed properly to ensure Tempest integrity. NACSIM 5203, Guidelines for Facility Design and Red/Black Installation, addresses this requirement for the Defense Department.

After installation, a facility is “swept.” Tempest teams visit a facility to measure emanation levels. Only after levels are sufficiently low is the facility approved for operation.

Ergonomics, or human engineering, also affects Tempest equipment. Source-suppression techniques are preferable to shielded enclosures so that switches, buttons, keys and other controls can remain unencumbered by boxes.”

Cashin, Jerry. “Focus on Tempest products.” Government Computer News 16 Jan. 1987: 21+.


In other words, shielding signals was one of the biggest ways that computer security advanced in the 1980s.  Included in that were zoos and other containment facilities.  Imagine a Jurassic Park-esque breakout caused by a hacker like Dennis Nedry.