Applied Mathematics Seminar

Fall 2014 Schedule


November 3rd:

Jayant Singh, Department of Mathematics, NDSU

Stability Analysis of Discrete-Time Recurrent Neural Networks

Recurrent neural networks (RNNs) have shown promise in diverse applications, including pattern recognition and the modeling of dynamical systems. We consider the problem of stability of RNNs. One well-known approach is based on the theory of absolute stability, which provides necessary and sufficient conditions for the existence of a quadratic Lyapunov function. However, there exist stable systems to which the theory of absolute stability cannot be applied. We have proposed a new approach based on the reduction of the dissipativity domain. Some new results in this area will be presented.
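As a minimal illustration of the quadratic Lyapunov machinery mentioned in the abstract (not the talk's reduction-of-dissipativity-domain method), one can verify stability of a linear discrete-time system x_{k+1} = A x_k by solving the discrete Lyapunov equation; all numbers below are made up for the sketch:

```python
import numpy as np

# Toy stable linear system x_{k+1} = A x_k (illustrative numbers only).
A = np.array([[0.5, 0.2],
              [0.1, 0.3]])
Q = np.eye(2)  # any symmetric positive definite right-hand side

# Solve the discrete Lyapunov equation  A^T P A - P = -Q  by vectorization:
# (I - kron(A^T, A^T)) vec(P) = vec(Q).
n = A.shape[0]
P = np.linalg.solve(np.eye(n * n) - np.kron(A.T, A.T),
                    Q.flatten()).reshape(n, n)

# V(x) = x^T P x is then a quadratic Lyapunov function: P is positive
# definite, and V strictly decreases along trajectories of the system.
x = np.array([1.0, -2.0])
V_now, V_next = x @ P @ x, (A @ x) @ P @ (A @ x)
print(V_next < V_now)  # True
```

The decrease is exact here: V(Ax) − V(x) = xᵀ(AᵀPA − P)x = −xᵀQx < 0 for every nonzero x.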


October 27th:

Artem Novozhilov, Department of Mathematics, NDSU

Reaction-diffusion replicator equations and their stability

I plan to discuss how reaction-diffusion systems naturally appear in mathematical biology, and what kinds of problems arise if we try to generalize the replicator equation, defined on a simplex, to the case of explicit spatial structure. If time permits, I will formulate several recent results about the permanence of reaction-diffusion replicator equations.
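For reference, the classical (non-spatial) replicator equation on the simplex, which the talk proposes to equip with explicit spatial structure, has the standard form

```latex
\dot{u}_i = u_i\Bigl( (\mathbf{A}\mathbf{u})_i - \mathbf{u}^{\top}\mathbf{A}\mathbf{u} \Bigr),
\qquad i = 1,\dots,n,
\qquad \mathbf{u} \in S_n = \Bigl\{ u_i \ge 0,\ \textstyle\sum_{i=1}^{n} u_i = 1 \Bigr\},
```

where A is the payoff (fitness) matrix and the term uᵀAu is the mean fitness, which keeps the dynamics on the simplex. A reaction-diffusion version adds spatial diffusion to these reaction terms; the exact formulation used in the talk may differ.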


October 20th:

Artem Novozhilov, Department of Mathematics, NDSU

Stability of biological communities

The notion of stability permeates the field of dynamical systems. For different purposes different stabilities can be defined, arguably the most frequent being stability in the sense of Lyapunov. From a biological point of view, however, Lyapunov stability and its close relative, asymptotic stability, are too restrictive, because for many biological communities we are interested only in the possibility of survival or extinction, without needing to know the exact asymptotic regimes of the system. This consideration led to the notion of so-called ecological stability (a term coined in Russia by Svirezhev), which in the modern literature is called permanence (if you are from Europe) or uniform persistence (if you are from North America). I plan to discuss the exact definitions of this kind of stability and to give several illustrative examples, including the famous replicator equation. The talk will be non-technical, and all graduate students are invited.
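As a point of reference (this is the standard textbook definition, not necessarily the exact one used in the talk), a system on the positive orthant is called permanent, or uniformly persistent, if there exist constants 0 < δ ≤ M such that every solution with strictly positive initial conditions satisfies

```latex
\delta \;\le\; \liminf_{t \to \infty} x_i(t)
\;\le\; \limsup_{t \to \infty} x_i(t) \;\le\; M,
\qquad i = 1, \dots, n .
```

In words: no species dies out and no species blows up, regardless of the detailed asymptotic regime inside these bounds.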


September 29th:

Indranil SenGupta, Department of Mathematics, NDSU

Exotic option pricing in finance using Lévy models (Part 3)


September 22nd:

Indranil SenGupta, Department of Mathematics, NDSU

Exotic option pricing in finance using Lévy models (Part 2)


September 15th:

Indranil SenGupta, Department of Mathematics, NDSU

Exotic option pricing in finance using Lévy models (Part 1)

In this talk I plan to go over the basic properties of Lévy processes and their relation to financial models. In our recent work we have successfully used some special Lévy processes to price various exotic options. In this presentation I plan to go over some new results in this direction. I will assume no background in "exotic options" or "Lévy processes". Graduate students are very much welcome!
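To give a flavor of the topic, here is a minimal Monte Carlo sketch pricing a path-dependent ("exotic") option under a Merton jump-diffusion, one of the simplest Lévy-driven asset models. The model and all parameter values are generic illustrations, not the special Lévy processes of the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative market/model parameters (not from the talk).
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
lam, mu_j, sig_j = 0.3, -0.1, 0.15     # jump intensity; log-jump mean, std
n_steps, n_paths = 252, 20_000
dt = T / n_steps

# Risk-neutral drift, compensated for the jump part.
kappa = np.exp(mu_j + 0.5 * sig_j**2) - 1.0
drift = (r - 0.5 * sigma**2 - lam * kappa) * dt

# Log-price increments: Brownian part plus compound Poisson jumps
# (the sum of n i.i.d. normal jumps is N(n*mu_j, n*sig_j^2)).
dW = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
n_jumps = rng.poisson(lam * dt, (n_paths, n_steps))
jumps = rng.normal(mu_j * n_jumps,
                   sig_j * np.sqrt(np.maximum(n_jumps, 1)) * (n_jumps > 0))

S = np.exp(np.log(S0) + np.cumsum(drift + sigma * dW + jumps, axis=1))

# Arithmetic-average Asian call: the payoff depends on the whole path,
# which is what makes the option "exotic".
payoff = np.maximum(S.mean(axis=1) - K, 0.0)
price = np.exp(-r * T) * payoff.mean()
print(price)
```

For genuinely Lévy-specific pricing one would typically work with the characteristic function of the process rather than brute-force simulation, but the sketch shows where the jump structure enters the model.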


September 8th:

Davis Cope, Department of Mathematics, NDSU

Properties of difference-of-Gaussian filters predict parameter regions for center-surround neural receptive fields

Difference-of-Gaussians (DOG) functions provide the standard model for the receptive fields of center-surround neurons. Such fields have circular symmetry and consist of an excitatory center and inhibitory surround (ON-center case) or vice versa (OFF-center case). Retinal ganglion neurons and lateral geniculate nucleus neurons are center-surround neurons, which thus play a major role in the initial levels of the visual system. DOG functions were introduced as 3-parameter models for center-surround fields in 1966 for empirical reasons and became the standard due to their empirical success. A natural question arises: Is there a functional (theoretical) reason for this success beyond the purely descriptive (empirical) reason? We approach this question as follows: Modeling the receptive field as a DOG function is equivalent to using a DOG filter for the neuron's linear response regime. We analyze DOG filters to determine possible behaviors, then determine the regions of parameter space characterizing each behavior. These regions can be compared with observed parameter values. A concentration of values in one region would then identify that region with a function of center-surround neurons. We provide such a plot using reported values from multiple sources. The plot indicates that a primary function for center-surround neurons is to serve as regular band-pass linear spatial filters (that is, information-preserving filters with a single optimum frequency magnitude). The plot also shows a clustering of values that strongly suggests the existence of some further underlying relation. This talk reports joint work with Barbara Blakeslee and Mark McCourt of NDSU's Center for Visual and Cognitive Neuroscience and extends results reported in Cope, Blakeslee, McCourt [2013].
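The band-pass behavior described in the abstract is easy to see from the amplitude spectrum of a DOG filter. The sketch below uses made-up parameters (a narrow strong center, a broad weak surround), not values reported in the talk:

```python
import numpy as np

# Illustrative DOG parameters: excitatory center, inhibitory surround.
A_c, s_c = 1.0, 1.0   # center amplitude and width
A_s, s_s = 0.2, 2.0   # surround amplitude and width (broader, weaker)

# Radial amplitude spectrum of the 2-D isotropic DOG filter
# f(r) = A_c exp(-r^2 / (2 s_c^2)) - A_s exp(-r^2 / (2 s_s^2)).
def dog_spectrum(w):
    return (2 * np.pi * A_c * s_c**2 * np.exp(-(s_c * w)**2 / 2)
            - 2 * np.pi * A_s * s_s**2 * np.exp(-(s_s * w)**2 / 2))

w = np.linspace(0.0, 5.0, 2001)
F = dog_spectrum(w)
w_peak = w[np.argmax(F)]

# "Regular band-pass" in the sense of the abstract: a single optimum at a
# nonzero frequency, with nonzero response at w = 0 (information-preserving).
print(w_peak > 0 and dog_spectrum(0.0) > 0)  # True
```

Varying the three free parameters moves the filter between this regime and others (e.g. a zero-DC, non-information-preserving band-pass when the center and surround volumes balance), which is exactly the parameter-region structure the talk maps out.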

