DigitalCommons@UTEP
Copyright (c) 2016 University of Texas at El Paso. All rights reserved.
http://digitalcommons.utep.edu
Recent documents in DigitalCommons@UTEP
en-us
Tue, 31 May 2016 01:35:59 PDT
3600
Development of rigorous electromagnetic simulation tools for anisotropic structures
http://digitalcommons.utep.edu/dissertations/AAI1591948
Fri, 27 May 2016 12:53:33 PDT
Many times, mankind's creative nature has pushed the envelope beyond what was deemed physically possible at the time. People have engineered ways to fly, to communicate across the globe in seconds, to land on the moon, and, more recently, to make objects invisible. Unfortunately, this creativity is sometimes hindered by lack of funding, lack of access to the required tools, or both. While this work cannot provide funding, it does provide a tool that one can use to model and simulate a variety of media, ranging from devices already being implemented to more complex materials such as anisotropic metamaterials. Some of these anisotropic metamaterials are currently being investigated for use in the areas of near-field spectroscopy, cloaking, Dyakonov surface waves, and much more. These topics require exotic materials with properties not found in nature, and the simulation tools currently available are quite costly for the average consumer. This work provides an algorithm for analyzing complex anisotropic structures that can be implemented in any coding environment, opening the door for more minds to explore and invent new applications using these fascinating electromagnetic anisotropic materials.
]]>
Jose Luis Enriquez
Electrical engineering|Electromagnetics
Combining semiparametric regression and kriging for prediction of PM2.5 pollutant levels at unmonitored locations with meteorological and traffic data
http://digitalcommons.utep.edu/dissertations/AAI10103276
Fri, 27 May 2016 12:53:15 PDT
Particulate matter (PM) is defined by the Texas Commission on Environmental Quality (TCEQ) as "a mixture of solid particles and liquid droplets found in the air". These particles vary widely in size. Particles less than 2.5 micrometers in aerodynamic diameter are known as Particulate Matter 2.5, or PM2.5. These particles are inhaled, and their health effects are still largely being studied. Past studies have assessed the PM2.5 exposure of a population, yet individual exposure is more difficult to assess and may vary widely within a population. Recent studies have combined semiparametric models with kriging (Li et al. [2012]) to assess nitrogen dioxide exposure in California. These methods may prove valuable for predicting PM2.5 at unmonitored locations in El Paso and subsequently for assessing personal exposure to PM2.5 within our population. Garcia (2010) provides us with a unique opportunity to estimate the spatial covariance of PM2.5 in the El Paso region. Past studies have established that PM2.5 varies spatially within a region based on local traffic variables (Smith et al. [2006]). Other studies have found that meteorological variables, such as wind speed, play an important role in PM levels (Staniswalis et al. [2005]). First, we use meteorological variables to build a semiparametric model to estimate the mean PM2.5 at two monitored locations. Then, in conjunction with traffic data and the spatial covariance structure of PM2.5, we use kriging of the residuals of the semiparametric models to predict PM2.5 at unmonitored locations.
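The two-step prediction described above (fit a mean model, then krige its residuals) can be sketched as follows. This is a minimal illustration, not the study's method: the exponential covariance model, its sill and range parameters, and the site coordinates and residual values are all hypothetical assumptions.

```python
import numpy as np

def exp_cov(d, sill=1.0, rng=5.0):
    # Assumed exponential spatial covariance model (sill and range illustrative).
    return sill * np.exp(-d / rng)

def krige_residual(sites, resid, target, sill=1.0, rng=5.0):
    # Ordinary kriging of mean-model residuals at an unmonitored target location.
    n = len(sites)
    # Kriging system with a Lagrange multiplier enforcing unbiasedness.
    K = np.ones((n + 1, n + 1))
    K[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            K[i, j] = exp_cov(np.linalg.norm(sites[i] - sites[j]), sill, rng)
    k = np.ones(n + 1)
    k[:n] = [exp_cov(np.linalg.norm(s - target), sill, rng) for s in sites]
    w = np.linalg.solve(K, k)[:n]     # kriging weights
    return float(w @ resid)

# Hypothetical monitored sites, semiparametric-mean residuals, target location.
sites = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
resid = np.array([1.2, -0.4, 0.6])
pred = krige_residual(sites, resid, np.array([2.0, 2.0]))
```

The kriged residual would then be added back to the semiparametric mean estimate at the target location. Note that ordinary kriging interpolates exactly at the monitored sites.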
]]>
Justin Jonathan Strate
Statistics
Trends in Occupational Employment Size, Growth and Average Hourly Wages in El Paso and Doña Ana Counties 1999-2017
http://digitalcommons.utep.edu/hunt_techrep/9
Fri, 27 May 2016 07:00:08 PDT
Esmeralda Orozco et al.
MS033 Trost and Trost Architects Collection
http://digitalcommons.utep.edu/finding_aid/119
Mon, 09 May 2016 13:27:19 PDT
Born in Toledo, Ohio, Henry Charles Trost (1860-1933) was an architect who designed hundreds of buildings and residences in El Paso, Albuquerque, Tucson, and other southwestern cities. In 1905, Henry Trost established the Trost and Trost architectural firm in El Paso with his brother, Gustavus Adolphus Trost, also an architect. In 1908 Gustavus’s twin brother, Adolphus Gustavus Trost, joined the firm and worked as a structural engineer. The Trost and Trost Architects collection dates 1911 – 2013, bulk 1926 – 1985. The collection is arranged chronologically. Types of records include correspondence, floor plans, clippings, articles, photographs, notes, historic preservation project plans, and other printed material. These records help document the history and renovation of the Hotel Cortez in El Paso. They also provide information about Henry Trost and the Trost and Trost architectural firm.
]]>
Abbie Weiser
The Hub of Human Innovation: Economic Impacts of the UTEP Paso del Norte Clean Energy Incubator Program in El Paso, Texas over October 2014 - December 2015
http://digitalcommons.utep.edu/hunt_techrep/8
Thu, 05 May 2016 15:08:57 PDT
Manuel L. Reyes Loya et al.
"I was nothing but a lender of what I was ordered to supply..." Francisco Amangual, trustee of the presidio and las companias volantes in the Spanish borderlands, 1701-1812
http://digitalcommons.utep.edu/dissertations/AAI3724941
Fri, 29 Apr 2016 16:21:05 PDT
Francisco Amangual served as an agent of the Spanish colonial empire throughout his career as a presidio paymaster and in his ultimate role as the captain of a specialized unit of the borderlands military, the so-called compañías volantes, or flying squadrons. This study reorients colonial borderlands scholarship by making clear the significance of the empire's lesser-known intermediaries charged with documenting the seemingly mundane activities of life in the garrison. Further, assigning a cogent place to the presence of the volantes allows for a more thoroughgoing understanding of their history by disentangling their place from among the various presidial units of the Spanish colonial frontier.
]]>
Roland Rodriguez
Latin American history|History|Hispanic American studies
Recovery of nutrient nitrogen from municipal wastewater residuals by gas membrane separation
http://digitalcommons.utep.edu/dissertations/AAI1593365
Fri, 29 Apr 2016 16:20:54 PDT
Relatively high amounts of ammonia, extracted during the dewatering process, are returned to the headworks of a municipal wastewater plant. This ammonia load contributes a significant oxygen demand in secondary treatment, requiring greater air-blower energy costs. The goal of this research was to evaluate the feasibility of recovering an ammonium sulfate product from belt press filtrate. The objectives of this research were to: (1) analyze average monthly wastewater treatment data flows and constituents with respect to ammonia nitrification, (2) evaluate the technical performance of a two-stage gas-separation membrane for ammonia recovery, and (3) evaluate the economic feasibility of ammonium sulfate production at operating wastewater plants. Results showed that, for the years 2012-2013, the Bustamante Wastewater Treatment Plant had a monthly average of 1197 lb/day of ammonia (as N), which required an average energy cost of $60,868 per year. The ammonia separation process was observed to operate optimally at a pH of 9.8 in the belt press filtrate influent, which maximizes the ammonia speciation without causing calcium carbonate precipitation. At a filtrate feed flowrate of 2.0 gpm in the pilot system, 68% of the ammonia was removed, producing approximately 11 lb/day of ammonium sulfate. Scale-up to treat the average 0.18 million gallons per day (mgd) of belt press filtrate would yield an estimated annual energy cost savings of $23,900 and ammonium sulfate revenue of $15,135 per month, for a total net financial benefit of $205,500 per year. With an estimated total capital cost of $1,680,280, the simple payback period was estimated to be 8 years. Future work should evaluate more efficient particle pretreatment to minimize fouling of the gas-separation membranes.
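The reported economics can be cross-checked with a short calculation; all dollar figures below are taken from the abstract, and the payback is the simple ratio of capital cost to annual net benefit.

```python
# Cross-check of the abstract's reported figures.
energy_savings = 23_900              # annual air-blower energy savings, $/year
sulfate_revenue = 15_135 * 12        # $15,135/month of ammonium sulfate -> $/year
net_benefit = energy_savings + sulfate_revenue   # = 205,520, abstract rounds to $205,500
capital_cost = 1_680_280             # estimated total capital cost, $
payback_years = capital_cost / net_benefit       # simple payback, about 8.2 years
```

The result is consistent with the abstract's stated payback period of roughly 8 years.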
]]>
Evelyn Rios
Civil engineering|Environmental engineering
Reckless anxiety
http://digitalcommons.utep.edu/dissertations/AAI10061480
Fri, 29 Apr 2016 16:20:36 PDT
Reckless Anxiety focuses on fourteen short stories whose characters all suffer from some form of mental illness, compulsion, or disorder. The title is itself a paradox, because anxiety in its general sense could make someone more cautious than normal. I wanted to take it to an extreme level and approach writing with the recklessness of an artist afraid of what could happen should he ever stop moving or stop dancing. PROJECT BACKGROUND: Before I started developing my thesis, as a writer I felt as if I were on a path with only one direction. In the periphery, I could see that other Latin@ authors and poets had been charting out the map and making incredible inroads into the literary world. I felt disconnected from that, as if those paths were meant for someone else. I had, however, made inroads myself into different venues of writing. I had been a journalist in high school and in college and had adapted those lessons into my professional career. But my writing in my spare time started changing when I enrolled in the University of Texas at El Paso's Online Master of Fine Arts in Creative Writing. Before that, my biggest hangup as a writer was being genre-stricken, that is, sticking to either sci-fi or fantasy. Once I started at UTEP, I began a process that made me appreciate, learn, and breathe literary fiction. Writing short stories is a process that, for me, begins first and foremost with writing poetry. In my poetry, I use both image and line to the best of my ability to describe things around me. As a poet, I find something specific that calls to me, whether it is a building off of Houston's Southwest Freeway, a particularly strong emotion at the closing of a Waldenbooks in Brownsville, or seeing two complete strangers dance the night away in a country western club. Then I scribble a page or two about that particular moment on a notepad that tends to be on my person most of the time.
I then wait a week before transferring the physical notes into digital ink, where they sit in a designated "crockpot" folder. Here, my ideas stew for one more week while I initiate other projects. Once the week passes, I re-open my project and determine whether or not the poem is ready. I have found that waiting several weeks to view a poem with new clarity and truly determine its worth, rather than trying to perfect it in one sitting, has allowed me to greatly improve my poetic talents. If it is ready, then it is saved and kept in a folder for submissions. If it is not, then I set the poem aside in another folder and cannibalize the imagery for something else, or perhaps for future edits. Within this graveyard of incomplete poetry I can see if there is a connection, or enough substance and imagery, to go deeper. This makes up the first component of my fiction process. The other component involves having all that occur inside my own mind, everything from the initial draft to determining the worth or potential of a creative project. The union of these two processes is what allowed the birth of Reckless Anxiety as my thesis.
]]>
Hugo Esteban Rodriguez Castaneda
Creative writing
El Paso Smelting Works: Sewer System Dep: General Plan of Sewer System
http://digitalcommons.utep.edu/maps/5
Thu, 28 Apr 2016 06:40:09 PDT
El Paso Smelting Works
On Geometry of Finsler Causality: For Convex Cones, There Is No Affine-Invariant Linear Order (Similar to Comparing Volumes)
http://digitalcommons.utep.edu/cs_techrep/1008
Fri, 15 Apr 2016 09:29:51 PDT
Some physicists suggest that to more adequately describe the causal structure of space-time, it is necessary to go beyond the usual pseudo-Riemannian causality, to a more general Finsler causality. In this general case, the set of all the events which can be influenced by a given event is, locally, a generic convex cone, and not necessarily a pseudo-Riemannian-style quadratic cone. Since all current observations support pseudo-Riemannian causality, Finsler causality cones should be close to quadratic ones. It is therefore desirable to approximate a general convex cone by a quadratic one. This can be done if we select a hyperplane and approximate the intersection of the cone with this hyperplane. In the hyperplane, we need to approximate a convex body by an ellipsoid. This can be done in an affine-invariant way, e.g., by selecting, among all ellipsoids containing the body, the one with the smallest volume; since volume is affine-covariant, this selection is affine-invariant. However, this selection may depend on the choice of the hyperplane. It is therefore desirable to directly approximate the convex cone describing Finsler causality by a quadratic cone, ideally in an affine-invariant way. We prove, however, that on the set of convex cones, there is no affine-covariant characteristic like volume. So, any such approximation is necessarily not affine-invariant.
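The affine covariance of volume that underlies the hyperplane construction can be spelled out explicitly; the notation here is ours, not the paper's.

```latex
% For an affine map T(x) = Ax + b and any measurable body B:
\[
  \operatorname{vol}(T(B)) = |\det A| \cdot \operatorname{vol}(B).
\]
% Hence, if E is the smallest-volume ellipsoid containing a convex body B,
% then for every ellipsoid E' containing T(B), the ellipsoid T^{-1}(E')
% contains B, and
\[
  \operatorname{vol}(E') = |\det A| \cdot \operatorname{vol}(T^{-1}(E'))
    \;\ge\; |\det A| \cdot \operatorname{vol}(E) = \operatorname{vol}(T(E)),
\]
% so T(E) is the smallest-volume ellipsoid containing T(B): the selection
% commutes with affine maps, i.e., it is affine-invariant.
```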
]]>
Olga Kosheleva et al.
Why Locating Local Optima Is Sometimes More Complicated Than Locating Global Ones
http://digitalcommons.utep.edu/cs_techrep/1007
Fri, 15 Apr 2016 09:29:34 PDT
In most applications, practitioners are interested in locating global optima. In such applications, local optima that result from some optimization algorithms are an unnecessary side effect. In other words, in such applications, locating global optima is a much more computationally complex problem than locating local optima. In several practical applications, however, local optima themselves are of interest. Somewhat surprisingly, it turned out that in many such applications, locating all local optima is a much more computationally complex problem than locating all global optima. In this paper, we provide a theoretical explanation for this surprising empirical phenomenon.
]]>
Olga Kosheleva et al.
Do we have compatible concepts of epistemic uncertainty?
http://digitalcommons.utep.edu/cs_techrep/1006
Fri, 15 Apr 2016 09:29:14 PDT
Epistemic uncertainties appear widely in civil engineering practice. There is a clear consensus that these epistemic uncertainties need to be taken into account for a realistic assessment of the performance and reliability of our structures and systems. However, there is no clearly defined procedure to meet this challenge. In this paper we discuss the phenomena that involve epistemic uncertainties in relation to modeling options. Particular attention is paid to set-theoretical approaches and imprecise probabilities. The respective concepts are categorized, and relationships are highlighted.
]]>
Michael Beer et al.
Bell-Shaped Curve for Productivity Growth: An Explanation
http://digitalcommons.utep.edu/cs_techrep/1005
Fri, 15 Apr 2016 09:28:55 PDT
A recent analysis of productivity growth data shows, somewhat surprisingly, that the dependence of 20th-century productivity growth on time can be reasonably well described by a Gaussian formula. In this paper, we provide a possible theoretical explanation for this observation.
]]>
Olga Kosheleva et al.
Adjoint Fuzzy Partition and Generalized Sampling Theorem
http://digitalcommons.utep.edu/cs_techrep/1004
Fri, 15 Apr 2016 09:28:36 PDT
A new notion of adjoint fuzzy partition is introduced and the reconstruction of a function from its F-transform components is analyzed. An analogy with the Nyquist-Shannon-Kotelnikov sampling theorem is discussed.
]]>
Irina Perfilieva et al.
Why Dependence of Productivity on Group Size Is Log-Normal
http://digitalcommons.utep.edu/cs_techrep/1003
Fri, 15 Apr 2016 09:28:12 PDT
Empirical analysis shows that, on average, the productivity of a group depends log-normally on its size. The current explanations for this empirical fact are based on reasonably complex assumptions about human behavior. In this paper, we show that the same conclusion can be made, in effect, from first principles, without making these complex assumptions.
]]>
Francisco Zapata et al.
Robustness as a Criterion for Selecting a Probability Distribution Under Uncertainty
http://digitalcommons.utep.edu/cs_techrep/1002
Fri, 15 Apr 2016 09:27:51 PDT
Often, we only have partial knowledge about a probability distribution, and we would like to select a single probability distribution $\rho(x)$ out of all probability distributions which are consistent with the available knowledge. One way to make this selection is to take into account that usually, the values $x$ of the corresponding quantity are also known only with some accuracy. It is therefore desirable to select a distribution which is the most robust -- in the sense that the $x$-inaccuracy leads to the smallest possible inaccuracy in the resulting probabilities. In this paper, we describe the corresponding most robust probability distributions, and we show that the use of the resulting probability distributions has an additional advantage: it makes the related computations easier and faster.
]]>
Songsak Sriboonchitta et al.
Why Superellipsoids: A Probability-Based Explanation
http://digitalcommons.utep.edu/cs_techrep/1001
Fri, 15 Apr 2016 09:27:32 PDT
In many practical situations, it turns out that the set of possible values of the deviation vector is (approximately) a super-ellipsoid. In this paper, we provide a theoretical explanation for this empirical fact -- an explanation based on the natural notion of scale-invariance.
]]>
Pedro Barragan Olague et al.
Comparison of Formulations of Applied Tasks With Interval, Fuzzy Set and Probability Approaches
http://digitalcommons.utep.edu/cs_techrep/1000
Fri, 15 Apr 2016 09:27:15 PDT
The focus of this paper is to clarify the concepts of solutions of linear equations in interval, probabilistic, and fuzzy-set settings for real-world tasks. There is a fundamental difference between formal definitions of the solutions and the physically meaningful concept of a solution in applied tasks when equations have uncertain components. For instance, a formal definition of the solution in terms of Moore interval analysis can be completely irrelevant for solving a real-world task. We show that formal definitions must follow the meaningful concept of the solution in the real world. The paper proposes several formalized definitions of the concept of solution for linear equations with uncertain components in interval, probability, and fuzzy-set terms.
]]>
Boris Kovalerchuk et al.
Voting Aggregation Leads to (Interval) Median
http://digitalcommons.utep.edu/cs_techrep/999
Fri, 15 Apr 2016 09:26:58 PDT
When we have several results of measuring or estimating the same quantities, it is desirable to aggregate them into a single estimate for the desired quantities. A natural requirement is that if the majority of estimates has some property, then the aggregate estimate should have the same property. It turns out that it is not possible to require this for all possible properties -- but we can require it for bounds, i.e., for properties stating that the value of the quantity is between given bounds a and b. In this paper, we prove that if we restrict the above "voting" approach to such properties, then the resulting aggregate is an (interval) median. This result provides an additional justification for the use of the median -- in addition to the usual justification that the median is the most robust aggregation operation.
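The resulting aggregate is easy to compute: the interval median takes the median of the lower bounds and the median of the upper bounds separately. A minimal sketch, with hypothetical measurement intervals:

```python
import statistics

def interval_median(estimates):
    # estimates: list of (a, b) interval bounds from different sources for the
    # same quantity. Majority voting on properties of the form "the value is
    # in [a, b]" yields the interval whose endpoints are the medians of the
    # lower and upper bounds, respectively.
    lowers = [a for a, b in estimates]
    uppers = [b for a, b in estimates]
    return statistics.median(lowers), statistics.median(uppers)

# Three hypothetical interval estimates of the same quantity.
print(interval_median([(1.0, 4.0), (2.0, 5.0), (0.5, 3.5)]))  # (1.0, 4.0)
```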
]]>
Olga Kosheleva et al.
Why Sparse? Fuzzy Techniques Explain Empirical Efficiency of Sparsity-Based Data- and Image-Processing Algorithms
http://digitalcommons.utep.edu/cs_techrep/998
Fri, 15 Apr 2016 09:26:36 PDT
In many practical applications, it has turned out to be efficient to assume that a signal or an image is sparse, i.e., that when we decompose it into appropriate basis functions (e.g., sinusoids or wavelets), most of the coefficients in this decomposition will be zeros. At present, the empirical efficiency of sparsity-based techniques remains somewhat of a mystery. In this paper, we show that fuzzy-related techniques can explain this empirical efficiency. A similar explanation can be obtained by using probabilistic techniques; this fact increases our confidence that our explanation is correct.
]]>
Fernando Cervantes et al.