Very religious people have greater wellbeing
superant
This Gallup survey announcement is fascinating, and I was very interested in reading the results on the Gallup website. I imagine it is unlikely they will advertise the survey's other finding: that nonreligious people report higher wellbeing than moderately religious respondents.
Gallup clearly explains that the connection between religion and wellbeing in this survey is a correlation, not an indication of a causal relation between the two. They also point out that one strong factor in the results is a higher incidence of smoking among people with lower wellbeing.
As for features of religion that could add to a sense of overall wellbeing, the Gallup website suggests several possibilities. I think the best suggestion they give is that increased time spent socially and engagement in social networks might contribute to improved feelings of wellbeing. Many religions feature community activities prominently as part of their practices, so it is likely that very religious people have supportive social ties and interactions that improve their feelings of wellbeing; and if they smoke less, these individuals should be physically healthier as well.
Steven Gibson

In response to the March 11, 2012 Valley Sun In Theory question.

Conflict between religion and secular advocates
superant
There will likely always be tension between advocates for religious beliefs and advocates for secular or atheist beliefs. Some members of each camp believe that political leaders and systems should take advice from them on how the government should operate. There is a very long history of religious leaders believing the government should support them and encourage religious faith in a nation. Even in the United States, where separation of church and state is the law of the land, I cannot handle a piece of money without seeing "In God We Trust," and the Pledge of Allegiance contains the words "one nation under God." Most religious advocates do not even notice these strong signs of religious influence in our social and political life. Britain has an even deeper history of strong religious influence. Secular views are not protected there by law, a condition that exists in most countries in the world, including Iran, Pakistan, and Saudi Arabia. The Anglican Church is the politically recognized official religious authority in England by Act of Parliament.
Twenty-six bishops of the Church of England have a legal right to sit in the House of Lords. Just last week the Queen made one of the most outspoken speeches of her 60-year reign about the need to honor the Church, saying: "We should remind ourselves of the significant position of the Church of England in our nation's life." So while advocates for religion in England are decrying recent court decisions, advocates for secularism and atheism feel the time is long past for religion to play such an exalted role in British life.
For different reasons, advocates for atheism and secularism in the United States are engaging, and will continue to engage, in friction-filled dialogue with advocates for religion in this country. Each side believes the other has too much influence over political life and decisions, and as long as each side believes the other has too much power and influence, the friction will continue.
Steven Gibson

In answer to the Valley Sun In Theory question, Feb 26, 2012.

Interaction Matrix Model for Language Production
superant
Steven Gibson
California State University - Northridge

Abstract.

The Congruent Interaction Matrix (CIM) model is being formulated to represent knowledge updating and language production. The line of research taken in this paper is limited to a set of questions restricted to concepts and actions that can be used in modeling human language behavior. The Congruent Interaction Matrix model introduced here proposes virtual structures represented as matrices. The theoretical and practical value of developing this framework and set of algorithms is discussed, in order to create tools useful for modeling human communication interactions. Possible future research studies and applications are suggested. The development of these tools could have future implications for human and machine communication analysis and production.
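Since the full paper is linked below, only a toy flavor of the matrix idea is sketched here in Python. Every name and number is hypothetical and is not taken from the CIM algorithms: a knowledge state, held as a vector, is updated by multiplying it against an interaction matrix.

def update(state, matrix):
    """One knowledge-updating step: state row-vector times matrix."""
    return [sum(s * matrix[i][j] for i, s in enumerate(state))
            for j in range(len(matrix[0]))]

# Hypothetical 3-concept knowledge state and interaction matrix.
state = [1.0, 0.0, 0.5]
interaction = [
    [0.9, 0.1, 0.0],   # concept 1 mostly reinforces itself
    [0.0, 1.0, 0.0],   # concept 2 is stable
    [0.2, 0.0, 0.8],   # concept 3 partly feeds concept 1
]
print(update(state, interaction))   # [1.0, 0.1, 0.4]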



link to paper: http://stevengibson.org/LasVegas2011/ICA3690.pdf

link to slides: http://stevengibson.org/LasVegas2011/LasVegasSlideShow2011.pdf

link to talk outline: http://stevengibson.org/LasVegas2011/LasVegasTalk2011.pdf

link to conference: http://www.world-academy-of-science.org/worldcomp11/

Presented at: 2011 World Congress in Computer Science - Las Vegas July 2011

Modeling building blocks for language production and processing
superant
Abstract: This paper describes the design and implementation of software that models an aspect of language use. The software postulates that human linguistic activities can be treated as modules that can be assembled and joined together. The model represents the activity of modules in the production of language behaviors, asserting that these modules interpret sensory data and respond with language behaviors. This paper seeks to offer a unified model to explain language production and processing. The software demonstrates elements that are added to fulfill language needs, and the model also expresses the coexistence of multiple modules that interact with and support each other. While this model does not offer a complete solution to all language use, it presents indications that language processing can be modeled computationally. Future work to build on the model is suggested.
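As a rough illustration of the module idea only, here is a minimal Python sketch with entirely hypothetical module names and behaviors, not the software described in the paper: modules are functions that interpret input and are assembled into a pipeline from sensory data to language behavior.

def hearing_module(signal):
    """Interprets sensory data into a coarse meaning."""
    return "greeting" if signal == "hello" else "unknown"

def speech_module(meaning):
    """Responds to a meaning with a language behavior."""
    return {"greeting": "hi there", "unknown": "pardon?"}[meaning]

# Modules assembled and joined together into one processing chain.
pipeline = [hearing_module, speech_module]
out = "hello"
for module in pipeline:
    out = module(out)
print(out)   # hi there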

Poster presented at: ASIC 2010, Oregon, Summer 2010
http://www.cogs.indiana.edu/asic/2010/abstracts.asp#g

draft PDF at: http://stevengibson.org/Science/GibsonASIC201005.pdf

Mathematical proofs
superant
http://rjlipton.wordpress.com/

An interesting article that starts with a modern approach to the P ≠ NP problem but expands into: "Today I want to talk about methods mathematicians use to attack hard problems. Some of these methods are used often by theorists too, while others do not seem to be used as much as they could be."



Framework for Describing English Micro‐Dialects
superant
Framework for Describing English Micro‐Dialects
Steven Gibson 2010

This paper proposes steps for building categories and procedures to record dialects that do not fit the traditional paradigm of a heritage language. It also proposes assessment tools and categorizing methods for predicting or explaining post-secondary success.

Presentation at the First International Conference on Heritage/Community Languages - UCLA Feb 2010

Slide presentation at:  http://stevengibson.org/PCC/SAGslide03.pdf

http://www.international.ucla.edu/languages/nhlrc/conference

Process algebra modeling of human communication
superant
Process algebra modeling of human communication
Steven Gibson
Superant Computing

Abstract. This paper proposes the use of process algebras and calculi to model human communication behaviors. It offers the beginning of formal methods to study the interaction of human beings. We detail a limited process algebra to provide a tool for the high-level description of interactions, communications, and synchronizations between a collection of independent events. This limited process algebra is called event algebra. The expected outcomes are evaluation, verification of model identity, and deadlock detection.
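As a rough illustration of the kind of analysis intended, here is a minimal Python sketch. It is my own toy construction, not the paper's event algebra: two processes are modeled as labeled transition systems, composed by synchronizing on shared actions, and a deadlock is reported as any reachable composite state with no outgoing transition.

from collections import deque

def deadlocks(p, q, start):
    """Explore the synchronized product of p and q; collect dead states."""
    seen, dead = set(), []
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if state in seen:
            continue
        seen.add(state)
        sp, sq = state
        shared = set(p[sp]) & set(q[sq])   # actions both can take now
        if not shared:
            dead.append(state)             # no joint action is possible
        for action in shared:
            queue.append((p[sp][action], q[sq][action]))
    return dead

# Hypothetical example: each speaker waits for the other's move first,
# a circular wait, so the composition deadlocks in its initial state.
alice = {"s0": {"greet": "s1"}, "s1": {"reply": "s2"}, "s2": {}}
bob   = {"s0": {"reply": "s1"}, "s1": {"greet": "s2"}, "s2": {}}
print(deadlocks(alice, bob, ("s0", "s0")))   # [('s0', 's0')]

A real event algebra would also distinguish successful termination from deadlock; this sketch only shows the reachability flavor of the analysis.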

link to paper: docs.google.com/Doc

link to slides: docs.google.com/Doc

link to conference: http://www.world-academy-of-science.org/worldcomp09/ws/program/ica14

Presented at: 2009 World Congress in Computer Science - Las Vegas July 2009

Rule Tables for Animal Response
superant
Rule Tables for Animal Response

Steven Gibson

SuperAnt Computing, Los Angeles, USA

 

Introduction

This presentation addresses uncertainty in data sets gathered about animal responses, using multi-step observations and model selection. Uncertainty is ubiquitous in studies of animal response and behavior. The presentation proposes calculating animal choices using rule tables. It is assumed that animal behavior is responsive: stimulus or sensory input is followed by behavior choices or triggered results. The challenges include: 1. incomplete, imprecise, and uncertain data; 2. building rule sets; and 3. determining trigger points. It is postulated that triggered behaviors result in classifiable posterior results.

“Is there any point to which you would wish to draw my attention?”
“To the curious incident of the dog in the night-time.”
“The dog did nothing in the night-time.”
“That was the curious incident,” remarked Sherlock Holmes.
from “Silver Blaze,” by Sir Arthur Conan Doyle

There are real results to be found in observing animal behavior. We need to find their significance in measurable ways, even when the data is sparse or uncertain.


Uncertainty & Methods


Incompleteness and uncertainty are a regular part of data collection, and uncertainty is a fundamental component of model selection. Data uncertainty has several origins: imprecision due to a lack of valid observations, distortions in the observational process, or errors in transformations or calculations applied to the data after collection. Data uncertainty can reduce our ability to measure the quality of our predictions. We are sometimes faced with limits on statistical inference with incomplete data and need to take care with our assumptions. Fortunately, incompleteness and uncertainty can sometimes represent knowledge or help build priors. As models are built for selection, the data and models need to be matched.


The data embodied in the priors must be applicable to the data currently being measured.


Decision rules are used to serve predictive functions.


The goal is to select models based on how they can give the best predictions of future observations generated by similar data.


 

Animal Behavior


We are using a thought experiment of dog behavior based upon a hypothetical dog choosing to bark. Animal behaviors often seem computationally and practically unpredictable. There is a combinatorial property of animal behavior that requires repeated model selection. Behaviors involve cycles of needs, desires, responses, and triggers. They are highly non-linear and subject to variation and uncertainty in most priors. But we believe a large enough set of observations will exhibit stochastic characteristics that point to repeated patterns. The behaviors can be built into classifiers of multi-level models for model selection. These are represented as behavior models that demonstrate patterns that can only be predicted using multi-level selections. The difficulty in selecting models for animal behavior may partially be due to the combinatorial nature of observed behavior.


Our hypothetical example uses Canis lupus familiaris, the common dog. We postulate that some behaviors are instinctual while others are modified by learning processes. Our example simply addresses the prediction of a dog barking based on observed data limited to sleep, relaxation, alertness, and degrees of barking. Using survey data, we postulate that the trigger for barking is high alertness or previous barking. We use a two-level model selection to give us the prediction result, as sketched below.
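The following minimal Python sketch illustrates the idea with hypothetical states and thresholds; the poster's actual tables and survey data are not reproduced here. Level one classifies a raw alertness observation into a coarse state, and level two applies trigger rules keyed on that state and the previous behavior.

# Level 1: map a raw observation to a coarse behavioral state.
def classify_state(alertness):          # alertness observed on a 0-10 scale
    if alertness >= 7:
        return "high_alert"
    return "relaxed" if alertness >= 3 else "asleep"

# Level 2: trigger rules keyed on (state, previous behavior).
RULES = {
    ("high_alert", "quiet"):   "bark",
    ("high_alert", "barking"): "bark",  # previous barking sustains barking
    ("relaxed",    "barking"): "bark",
    ("relaxed",    "quiet"):   "quiet",
    ("asleep",     "quiet"):   "quiet",
    ("asleep",     "barking"): "quiet",
}

def predict(alertness, previous):
    return RULES[(classify_state(alertness), previous)]

print(predict(8, "quiet"))    # bark  (high alertness triggers barking)
print(predict(4, "barking"))  # bark  (previous barking as trigger)
print(predict(1, "quiet"))    # quiet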


Behavior predictions result from models chosen based upon and mediated by trigger rules.


This is NOT an attempt to model cognitive processes. The model selection is limited to prior data and posterior data.


Priors To Decisions


In building and selecting models, priors are used instead of likelihood. These models are meant to predict outcomes. The prior knowledge is used to develop classifiers and derive predictions: given data belongs to either class 1 or class 2. Priors are collected on behaviors of the animal such as alertness, barking, and sleep.


Inference


Rule Tables


We base the built classifiers on observed data. Of course we have selected a subset of all possible data to use, and not all data is observable. The important assumption is that through repeated runs we will obtain accurate predictive ability even without filling in the missing data. So the key is the empirical nature of the cycle. Concerned only with the posterior results, we should be able to move closer and closer to accuracy with repetition. Our predictive inference will be made with a sub-sample of the data. The steps involve building a set of utility models to be selected; then, based on outcomes, the models are refined. A minimal selection step is sketched below.
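As a minimal Python illustration of one selection cycle, with hypothetical numbers and a standard Bayesian scoring rather than necessarily the poster's exact scheme, two candidate Bernoulli models for "the dog barks in the next interval" are scored by posterior probability given observed outcomes.

from math import prod

def model_posteriors(rates, priors, outcomes):
    """P(model | data) for Bernoulli bark-rate models and 0/1 outcomes."""
    likes = [prod(r if o else 1 - r for o in outcomes) for r in rates]
    joint = [l * p for l, p in zip(likes, priors)]
    total = sum(joint)
    return [j / total for j in joint]

rates = [0.8, 0.2]          # class 1: high bark rate; class 2: low bark rate
priors = [0.5, 0.5]         # built from earlier observation cycles
outcomes = [1, 1, 0, 1, 1]  # five observed intervals, 1 = barked
print(model_posteriors(rates, priors, outcomes))
# class 1 is selected; its posterior becomes the prior for the next cycle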


This hypothetical data is meant to represent repeated collection of animal behavior observations that help choose the models. Either class 1 or class 2 will fit the data best.

Discussion & Future Work

The problem of incomplete data is pervasive in statistical practice. Our ability to validate our conclusions can be limited by incomplete and uncertain priors. The process of using levels of model selection recreates the scientific process: we go through repeating cycles of evidence collection, context-dependent interpretation, evidence weighting, and production of coherent results. Uncertainty is inherent in this approach, so misleading predictions are often produced. Special care should be taken to document the assumptions underlying models built with missing data.
Other animal behavior models should be built and tested.

References

J. Scott and J. Berger. "An Exploration of Aspects of Bayesian Multiple Testing." Technical Report, Duke University, 2003.

G. de Cooman and M. Zaffalon. "Updating Beliefs with Incomplete Observations." Artificial Intelligence 159, 2004.

G. Socher, G. Sagerer, and P. Perona. "Bayesian Reasoning on Qualitative Descriptions from Images and Speech." Image and Vision Computing, 2000.

More info: http://superant.livejournal.com

Presented at: 9th World Conference of the International Society for Bayesian Analysis (ISBA), held on Hamilton Island, Australia, in July 2008.


Why Numbers Matter
superant
Why Numbers Matter: The construction of the number-line
by Steven Gibson January 2006

Abstract
This paper proposes a model to explain the usefulness of the number system. The number line and basic number theorems are used daily by people to make predictions and explanations of real-world events. People commonly believe arithmetic works, and this paper offers one systematic model to explain why. The proposal is that a one-dimensional cellular automaton can be used to model the correspondence between the real world and features of the arithmetic number line. I postulate that any other universal computing tool could also serve for this model, so reversible Turing machines and the lambda calculus could also be used.

Numbers are posited to be human artifacts that model processes like measurable objects in space-time. The paper details the differences between real-world space-time energy-matter and the human artifact of the number line, and then offers a model that bridges the two. It is postulated that number features are expressed with symbols and a finite sequence of instructions.

While the domain covered by this model is limited to the positive integers of the natural number sequence and basic arithmetic, some unexpected implications for number theory will be stated for future exploration.

Beginning Assumptions
The use of numbers is a tool that has raised humankind to the heights of technology it has achieved.

The proposal is that a one-dimensional cellular automaton can be used to model the correspondence between the real world and features of the arithmetic number line.

Our postulate about space-time is that the common-sense notion of numbers and counting is only valid because of created structures. Selecting a number corresponds to slicing the system over an arbitrary area.
Other assumptions are that discrete space-time and matter do not exist in the real world, but humans approximate the appearance of discrete units, and it works.
Time is a human representation of the changing states of space-time.
Here it is postulated that numbers do not exist as such, but we can work with abstract objects that work: we can create units that are usable in the real world. We are able to create practical uses even though there is tension between the discrete and continuous points of view.
Numbers are a creation of our minds, while space and time have an independent reality that must be mapped out experimentally. I assume here that arithmetic is the work of mind and that constructive methods can produce experimental solutions. The ontological framework used here is that all entities are parts or facets of one unified continuum.

Energy/Matter
The postulate in this paper is that events and objects in the world represent states of matter and energy. All matter and energy is a view of one whole. Energy is a human abstraction of the states of objects or events that exist in the world. Matter and mass are assumed to be other states of the same entities, measurable as states of energy.

Space-time is the arena in which all physical events take place. We observe that space-time displays both discrete and continuous forms.
The real world at large and small scales does not agree with intuitive human ideas. Our model for real-world events is not grounded in our human senses but must be developed and tested by instruments of observation and measurement. These instruments are constructs based on existing or proposed theory. We postulate that spatial state change produces observable objects in space. By observing and measuring events and objects, we extract usable order out of the complexity of space-time.

Energy/matter seems to be distinct because:
1. our visible observation equipment detects differentiated objects, and
2. one unit of energy/matter can interpenetrate other energy/matter.
Number-line/Arithmetic

Numbers are postulated to be human constructs. Simple rules lead to the number system. Computation proceeds by making changes in a coherent way. Physical systems are countable based on measurable information. We create an individual number by selecting a portion of energy/matter that we see as standing out from the energy/matter continuum. So a number is an abstract symbolic object we create based on real-world events and objects.
The use of numbers is a result of human cognitive requirements and language/symbol tools. We postulate each number to be a discrete repeat of any other one; the labeling of a second "2" is an arbitrary human artifact. A one implies its own absence, meaning not-one or zero: for us to identify one, we must be able to separate it from the not-one.
Some postulates about numbers:
they are constructs of human thought;
the number line is a model for representing and manipulating the symbolic number;
some of the qualities expressed in number theory result from how we conceptualize numbers other than 1;
the elementary number theorems seem to be based in our experiences with the real world around us.
Numbers work because we can use language constructs to describe the world. We can record these constructs, take action based on them, and cause a predicted effect in the real world. Objects and events exist in the real world, can be described with language, and can be used. Our interpretations of the world and actions in the world depend upon the sense tools we use to view it and the action tools we use to act in it. So these representations entail concepts determined by real actions. Mathematical terms, for instance the concept of number, are explicitly defined and are abstract insofar as they are determined within the logical structure of a language; they also refer to concrete objects and actions. So numbers have an abstract structure on one hand and, on the other, a connection to the arithmetical activities that work in the real world.
The ontology of this approach is that the assertions of number theory have no objective truth values independent of the conventions, languages, and minds of mathematicians.

Cellular Automaton/Bridge Model
The model postulated here is that we can use a cellular automaton to model the correspondence between the real world and features of the arithmetic number line.
Cellular automata are discretely defined but exhibit continuous dynamics, so they are useful for addressing both the discrete and the continuous.
A cellular automaton can model usable calculation algorithms. These models are idealizations that capture some features and ignore others. Mathematical tools are embedded in structures in natural language. The elements for this bridging model include rules, procedures, transformation, and rewriting. We need to model an arithmetic system with algorithmic methods. The algorithm contains numbers and follows a set of steps. Prediction adheres to a program's run: the program plus its initial condition moves to the end task.
The reason a CA works as a bridge model is that:
A. it gives the correct results for basic arithmetic, and
B. it models actual processes in the energy/matter continuum:
i) algorithmic steps forming a sequence,
ii) individual units that change through the action of the steps.
A step in a change rule is a string of simple elements. Change rules are applied repeatedly and map to energy, space, and time.
This model uses a one-dimensional, first-order cellular automaton defined as a tuple (Z, S, N, f, B), where:
Z can be finite or infinite,
S = {0, 1} is the set of two cell values,
N = {-1, 0, 1} is the neighborhood of size k = 3 with symmetric radius k0 = 1,
f is the transition function rule, and
B = {b-1, b1} is the boundary.
The eight possible three-cell neighborhoods are enumerated as:

1 -> 000
2 -> 001
3 -> 010
4 -> 011
5 -> 100
6 -> 101
7 -> 110
8 -> 111

This CA is reversible: its transition function mapping is bijective. And from K. Morita we know that invertible Turing machines can be simulated by invertible cellular automata.
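As a concrete illustration of the bridge idea, here is a minimal Python sketch. It is my own example, not the paper's exact construction: a one-dimensional binary CA whose transition rule is a table over the eight three-cell neighborhoods listed above. The sketch uses the well-known rule 184, which conserves the number of 1 cells (on a periodic lattice, used here for simplicity in place of the boundary B), so a run of 1s can stand in for a unary integer that persists as the automaton evolves.

def step(cells, rule):
    """One synchronous update of a 1D binary CA with periodic boundaries."""
    n = len(cells)
    return [rule[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

# Rule 184, the "traffic rule": each 1 moves right when the cell ahead is 0.
RULE_184 = {
    (1, 1, 1): 1, (1, 1, 0): 0, (1, 0, 1): 1, (1, 0, 0): 1,
    (0, 1, 1): 1, (0, 1, 0): 0, (0, 0, 1): 0, (0, 0, 0): 0,
}

row = [0, 1, 1, 1, 0, 0, 0, 0]    # a run of three 1s: the unary number 3
for _ in range(10):
    row = step(row, RULE_184)
    assert sum(row) == 3          # the count of 1 cells is conserved

The conserved count is the sense in which a discrete unit persists under the automaton's dynamics, which is the trait the bridge model needs.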

Conclusion
The conclusion is that numbers, the number line, and basic arithmetic are not Platonic objects but human-created tools.
The usefulness of numbers can be explained by models of human language and by models that explain the usefulness of the space-time postulate.
The number line is stated to be a tool created by humans for use in algorithms. Humans posit real-world events that behave like numbers. Individual objects are posited to be useful within present space-time theories. A model that produces the traits of the real world and of basic numbers and arithmetic is a simple universal calculating system like a cellular automaton. Cellular automata have traits that model the real world and can emulate and reproduce arithmetic methods.
The number line models energy in real space with two restrictions:
1. the numbers are restricted to two directions, and
2. the cases are limited to full units (integers).
The number line requires two aspects:
A. a symbol set, and
B. a program that uses the symbol set.
Programs require at least one function definition and can have a transformation rule. A program is defined to be a finite sequence of instructions.

A computable number is a number for which there is some program to compute it. Turing machines or cellular automata can model such computation.
Two-dimensional arithmetic can also be modeled with a similar CA. Basic forms of behavior are modeled by cellular automata, Turing machines, or the lambda calculus. Arithmetic laws are validated in the same ways that other mathematical postulates are verified.

Future Steps
The linear number system may be a special case of a number system model that could be built in three dimensions. This could lead to models that demonstrate constants like pi and functions like cosine, and perhaps explain irrationals like the square root of 2.
Numbers arise because processes have the trait of being sliceable through a portion of time, where change can be observed or likeness can be seen.
Numbers can be thought of as an abstract model that intrudes into a small part of the complete space-time of a process with measurable abstract objects.
There is perhaps work to be done using order theory to move from individual objects to the number line. We need a systematic approach to object identification; such a system would be foundational to arithmetic. To have usable objects, we must be able to individuate entities out of the continuity and to identify these individual entities over time.

References

1. Rudy Rucker. Mind Tools: The Five Levels of Mathematical Reality. 1987.
2. S. Wolfram. A New Kind of Science. Wolfram Media, 2002.
3. George Lakoff. Where Mathematics Comes From: How the Embodied Mind Brings Mathematics into Being. 2001.
4. Jürgen Habermas. The Theory of Communicative Action: Reason and the Rationalization of Society. 1985.
5. Wilfried Sieg (following Turing and Gandy). Calculations by Man and Machine, for an abstract setting for computation.
6. Konrad Zuse. Calculating Space. 1969.
7. J. H. Conway. On Numbers and Games, second edition. A. K. Peters, 2001.
8. Edward Fredkin. "Did the Universe Just Happen?" Atlantic Monthly, April 1988.
9. Stewart Shapiro. "Modality and Ontology" (philosophy of mathematics). Mind, July 1993.
10. Peter Lynds. "Time and Classical and Quantum Mechanics: Indeterminacy vs. Discontinuity." 2003.

Presented at Wolfram Science Conference June 2006 - Washington DC
http://www.wolframscience.com/conference/2006/presentations/gibson.html

Model for Measurements without Space or Points
superant
Model for Measurements without Space or Points
by Steven Gibson July 2005

This paper postulates that points with zero dimensions are never accurate models for measurement and calculation. Any point under observation is a measure of an area; on a fine enough level, absolute points do not exist. Of course, at human scale we can work with objects that behave like classical points, but this is because we are then measuring complex objects composed of sets of measurable objects.

This method proposes changing our conception from discrete core objects to continuous (state-based) measurement objects.

Viewing points as having zero dimensions leads to several unnecessary paradoxes. For example, an infinite number of zero-dimensional points never gets you anywhere: you never arrive at a physical object or a distance. It's turtles all the way down.

This model is supported by models based on information. For us to measure anything in this world requires information to be received. Information is a selection out of uncertainty: we are immersed in information and select a specific bit of it for our measurement. Since measurement selects some bit of information that has physical traits, any object we can work with will also have physical traits.

So this model suggests viewing points as measures of areas. Likewise, space has no existence separate from measurements; we should define space as an object defined by the measurement of a set of objects.

Selections of an actual object from the potential objects depend on evidence obtained from a measurement. This model will formally describe how the number system is built from measurement. The mathematical formalism of numbers mapping to points on a line should be changed to view numbers as probabilistic objects in a cloud, found by measurements of information.

This model leads to the idea that the real numbers do not represent the full set of useful numbers. The real numbers are viewed as existing along a linear space; I would suggest numbers can instead be seen as a cloud of representational objects filling a volume of space.

This proposed model may change our view of imaginary numbers. I have suggested that the set of numbers should not be viewed as a linear set; instead it should be seen more like a cloud of representational objects filling a volume of space. A subset of the number set could be abstracted out that represents the old number line, from -n to +n, but I would suggest numbers could also exist like 1-1+n and 1+1-n. With that model we would find a number that, when raised to the power of 2, results in -1:
x^2 = -1

This model may unify our modeling of numerical systems, from numbers to probability to logic. It suggests the set of numbers is a cloud of information extracted from uncertainty. Perhaps probability can be viewed as a specialized model of this information pulled from uncertainty. Probability statements are about the likelihoods of outcomes: one exact event either occurs or does not, and you can bet on it. This measurement model does not select exact events out, but instead models the representations of events.

The model supports the idea that two-valued logic is just a subset of more complex views of logic. Probability involves methods for measuring uncertain events or knowledge, or for defining the selection of future events. Logic can be viewed as a specialized model related to probability theory. I suggest viewing logic as a model for the representation of data-range selection. We can use this data-range interpretation for the management of real systems: it helps model decisions by selecting ranges of choices instead of allowing only one choice. Data ranges allow us to handle complex systems. Boolean logic, which most people are familiar with, is the subset of this more generalized logic in which the data ranges include only two values, as sketched below.
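Here is a minimal Python sketch of this data-range reading. It is my own illustration, not a formalism from the paper: a proposition is the set of values it admits, conjunction is intersection, and Boolean logic is the special case where every range is a subset of {0, 1}.

def AND(r1, r2):
    return r1 & r2    # the values both propositions admit

def OR(r1, r2):
    return r1 | r2    # the values either proposition admits

# A range-valued "decision": which speeds satisfy both constraints?
speed_ok   = set(range(30, 61))   # admissible speeds: 30 through 60
speed_seen = set(range(50, 81))   # speeds consistent with a sensor reading
print(AND(speed_ok, speed_seen))  # the surviving range: 50 through 60

# The Boolean special case: ranges restricted to subsets of {0, 1}.
TRUE, FALSE = {1}, {0}
print(AND(TRUE, TRUE), AND(TRUE, FALSE))   # {1} and the empty set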


What seem like discrete points in everyday life, like our body, a car, or a tree, are measures of complex sets of objects.

This model may even address the Many Worlds vs. Copenhagen view of quantum change. Since this model postulates there is no such thing as a discrete point, perhaps each world variation created by quantum change is a probabilistic possibility that exists and is not solidified until a measurement is taken. So I would postulate that a temporary many-worlds existence is created constantly; then, when information is selected out of the probabilities, the possible worlds collapse back to one.

Does Schrödinger's cat split?
Consider Schrödinger's cat. A cat is placed in a sealed box with a device that releases a pellet of catnip if a certain radioactive decay is detected. For simplicity we will imagine that the box, whilst closed, completely isolates the cat from its environment. After a while an investigator opens the box to see if the cat has catnip or not. According to the Copenhagen interpretation, the cat neither has catnip nor does not have it until the box is opened, whereupon the wave function of the cat collapses into one of the two alternatives (catnipped or non-catnipped). The paradox, according to Schrödinger, is that the cat presumably knew if it had catnip "before" the box was opened. According to many-worlds, the device was split into two states (catnip released or not) by the radioactive decay, which is a thermodynamically irreversible process. As the catnip or its absence interacts with the cat, the cat is split into two states (catnip-happy or not happy). From the catnip-happy cat's point of view, it occupies a different world from its non-catnip-receiving version. The onlooker is split into two copies only when the box is opened and they are altered by the states of the cat. The cat splits when the device is triggered, irreversibly; the investigator splits when they open the box. The catnip-happy cat has no idea that the investigator has split, any more than it is aware that there is an unhappy cat in the neighboring split-off world. The investigator can deduce, after the event, by examining the catnip mechanism or the cat's memory, that the cat split prior to the opening of the box.
According to this proposed measurement model, I would say that the potentialities of the cat getting catnip or not are temporarily split during the passage of time, and the potentialities resolve to one at the event of the box being opened.
