
• #### Characterizing Large-Scale Resting State Effective Connectivity Patterns with Functionally Constrained Priors in Individuals with a History of Major Depressive Disorder

Major depressive disorder (MDD) is a common mental health condition (Kessler & Bromet, 2013) and the third leading cause of disability worldwide (James et al., 2018). A history of MDD is a significant risk factor for relapse and recurrence of depression (Buckman et al., 2018; Burcusa & Iacono, 2007). The current study investigated resting state effective connectivity among 13 brain regions from three resting state networks (i.e., default, salience, and central executive) that have been implicated in the pathophysiology of MDD in previous studies (Kaiser et al., 2015; Mulders et al., 2015). Using spectral dynamic causal modeling (Friston, Kahan, Biswal, et al., 2014), Bayesian model reduction (Friston et al., 2016), and parametric empirical Bayes (Zeidman, Jafarian, Seghier, et al., 2019) analyses, both within- and between-network effective connectivity were found to differ in those with an MDD history (N=29) compared to healthy controls (N=28). Of particular interest is the finding of more negative effective connectivity from the right anterior insula to the left dorsolateral prefrontal cortex and left inferior parietal lobe in those with an MDD history. Previous studies have found less causal influence from the anterior insula to the prefrontal cortex in currently depressed individuals (Hyett et al., 2015; Iwabuchi et al., 2014; Kandilarova et al., 2018). Given the importance of the anterior insula in interoception and subjective feelings (Craig & Craig, 2009), the current study provides preliminary evidence that altered effective connectivity between the anterior insula and prefrontal cortex may be related to MDD history as well.
• #### Trapping Sets of Iterative Decoders for Quantum and Classical Low-Density Parity-Check Codes

Protecting logical information in the form of a classical bit or a quantum bit (qubit) is an essential step in ensuring fault-tolerant classical or quantum computation. Error correction codes and their decoders perform this step by adding redundant information that aids the decoder to recover or protect the logical information even in the presence of noise. Low-density parity-check (LDPC) codes have been among the most popular error correction candidates in modern communication and data storage systems. Similarly, their quantum analogues, quantum LDPC codes, are being actively pursued as excellent prospects for error correction in future fault-tolerant quantum systems due to their asymptotically non-zero rates, sparse parity check matrices, and efficient iterative decoding algorithms. This dissertation deals with failure configurations, known as \emph{trapping sets}, of classical and quantum LDPC codes when decoded with iterative message passing decoding algorithms, and with the \emph{error floor phenomenon}: the degradation of logical error-rate performance in the low physical-noise regime. The study of quantum trapping sets will enable the construction of better quantum LDPC codes and also help in modifying iterative quantum decoders to achieve higher fault-tolerant thresholds and lower error floors. Towards this goal, the dissertation also presents iterative decoders for classical and quantum LDPC codes using the \emph{deep neural network framework}, novel iterative decoding algorithms, and a decoder-aware \emph{expansion-contraction method} for error floor estimation. In this dissertation, we first establish a systematic methodology by which one can identify and classify \emph{quantum trapping sets} (QTSs) according to their topological structure and the decoder used. For this purpose, we leverage the known harmful configurations in the Tanner graph, called \emph{trapping sets} (TSs), from the classical error correction world.
The conventional definition of a trapping set of classical LDPC codes is generalized to address the syndrome decoding scenario for quantum LDPC codes. Furthermore, we show that the knowledge of QTSs can be used to design better quantum LDPC codes and decoders. In the context of the development of novel decoders, we extend stochastic resonance-based decoders to quantum LDPC codes, propose iteration-varying message passing decoders with their message update rules learned by neural networks tuned for low logical error rate, and present a syndrome-based generalized belief propagation algorithm for tackling convergence failure of iterative decoders due to the presence of short cycles. Our analysis of TSs of a layered decoding architecture clearly reveals the dependence of the harmfulness of TSs (classical or quantum) on the iterative decoder, and thus on the error floor estimates. We present a computationally efficient method for estimating error floors of LDPC codes over the binary symmetric channel without any prior knowledge of their trapping sets. The sub-graph expansion-contraction method is a general procedure for TS characterization, which lists all harmful error patterns up to a given weight for the LDPC code and decoder. Based on this decoder-aware trapping set characterization for LDPC codes, we propose a model-driven deep neural network (DNN) framework that unfolds the decoding iterations, to design the \emph{decoder diversity of finite alphabet iterative decoders (FAIDs)}. Our decoder diversity DNN-FAID delivers excellent waterfall performance along with a low error floor.
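The dissertation's decoders are far more sophisticated, but the failure mode it studies can be illustrated with a toy example that is not from the source: a Gallager-style bit-flipping decoder for the small (7,4) Hamming code. Iterative decoding corrects every single-bit error, yet a two-bit error drives the decoder to a different codeword, leaving the syndrome zero while the logical information is lost. This is the kind of decoder-dependent failure configuration that trapping-set analysis formalizes for LDPC codes.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (column j is j+1 in binary).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def bit_flip_decode(H, received, max_iters=20):
    """Serial bit-flipping: each iteration flips the single bit that
    participates in the largest number of unsatisfied parity checks."""
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2
        if not syndrome.any():
            return word, True          # all checks satisfied
        unsat = syndrome @ H           # per-bit count of unsatisfied checks
        word[np.argmax(unsat)] ^= 1    # flip the most-suspect bit
    return word, False                 # failed to converge

codeword = np.zeros(7, dtype=int)      # all-zero codeword transmitted

# Every single-bit error is corrected.
for e in range(7):
    noisy = codeword.copy(); noisy[e] ^= 1
    decoded, ok = bit_flip_decode(H, noisy)
    assert ok and not decoded.any()

# A two-bit error converges to a *different* codeword: the syndrome is
# zero, yet the logical information is lost (a miscorrection).
noisy = codeword.copy(); noisy[[0, 1]] ^= 1
decoded, ok = bit_flip_decode(H, noisy)
print(ok, decoded)   # converged, but to the weight-3 codeword 1110000
```

For quantum LDPC codes the decoder sees only the syndrome, not the received word, which is why the abstract generalizes the trapping-set definition to the syndrome decoding scenario; the miscorrection above is the classical analogue of a logical error that the syndrome cannot detect.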

• #### They See Me Different…Like an Immigrant Cause of How I Sound: Perceived Difference, Limitations, & Co-Naturalizations of Race and Language

Latinx English language learners (ELLs) have long been the intended targets of U.S. language planning and policy efforts that seek to manage both the use of Spanish and its speakers. Since 2000, Arizona has adopted some of the most restrictive educational policies shaping the schooling of its ELLs (e.g., Proposition 203 and House Bill 2064). Like other bilingual education policies, Arizona’s frame Latinx ELLs as needing linguistic remediation in order to develop proficiency in academic English and succeed in the modern, global economy (Flores, 2016). Yet academic/home language distinctions have been shown to position multilinguals’ language practices as deficient compared to an unmarked norm, even when ELLs ostensibly model language practices that are validated when produced by non-racialized individuals (Flores & Rosa, 2015; Rosa, 2016). What is not well known is whether, and how, multilinguals reproduce raciolinguistic ideologies. This descriptive qualitative study is guided by the research question: In what ways do Latinx multilingual students reproduce raciolinguistic ideologies? To better understand the pervasiveness of raciolinguistic ideologies, I interviewed ten Latinx multilinguals from two high schools in southern Arizona and thematically analyzed the data (Braun & Clarke, 2006). The findings showed Latinx multilingual students reproducing raciolinguistic ideologies, particularly in relation to co-naturalizations of race and language, perceived linguistic limitations, and raciolinguistic difference. These findings suggest that multilinguals sometimes adopt the stances of white perceiving subjects that re/construct multilingual language practices as inferior (Flores & Rosa, 2015; Inoue, 2003; Rosa & Flores, 2017) and as deviating from an idealized monolingual norm (Flores, 2013).
I conclude that there is a need for practitioners to advance efforts to dismantle raciolinguistic ideologies, and that the interventions most needed by multilinguals are ones that challenge the ubiquity of raciolinguistic ideologies and contribute to their denaturalization.
• #### Coping with Complexity: Essays on Evolution and Institutions

Despite their disparate subjects, the following essays share a number of common themes. Chief among these are complexity, evolution, and institutions. The first two essays examine Hayek's social theory, an examination that brings to light two basic points about complex societies. First, they are difficult to predict and control. Second, they adapt to internal and external changes. These features lay the groundwork for part II, which examines the proper form of governance structures for a complex, adaptive society. The first essay of part II applies multilevel selection theory to the problem of governing complexity. It concludes that polycentric political organization, supplemented by a few additional design principles, facilitates a socially beneficial process of competition and evolution. The second essay of part II uncovers a related benefit of polycentric governance. Due to its decentralized and competitive nature, and due to the vast amount of relevant and constantly changing information generated in a complex society, polycentric governance institutions utilize information more effectively than centralized modes of governance. There is a substantial welfare benefit to utilizing this information by implementing reforms that seek to address the concerns and satisfy the preferences of millions, or perhaps billions, of citizens. Moreover, centralized governance becomes increasingly difficult as increasing numbers of increasingly interdependent variables become relevant to any given problem. Polycentricity is an adaptation of the state in response to the problem of social complexity. Like biological adaptations, this adaptation may be blind and unintentional. It may even precede the phenomenon to which it proves adaptive. Yet it is adaptive nonetheless, since it provides an effective response to the problems posed by its environment. Finally, part III begins to examine some of the normative, philosophical consequences of these social scientific investigations.
If society is in a constant state of flux, if it is evolving in response to fluctuating variables, then the traditional task of political philosophy may stand in need of amendment. Philosophers from Plato to Rawls have attempted to characterize a conception of justice, a political summum bonum, that transcends the institutional variations of time and place. If justice is, at least in part, a project of reconciliation, and if the values held by citizens continue to evolve, then there may not be a stable conception of justice that transcends societal dynamism. We may, instead, need to content ourselves with identifying certain general desiderata that better enable society to coordinate on a shared conception of justice, however ephemeral this conception might prove to be.
• #### Moving Beyond Inclusive Excellence: Operationalizing Diversity, Equity, & Inclusion Through Organizational Alignment in Higher Education

When it comes to implementing diversity, equity, and inclusion (DEI) behaviors, institutions of higher education are misaligned in their understanding and operationalization of what the work entails. In an effort to emphasize DEI in the larger organizational development (OD) process, many institutions of higher education wind up focusing most, if not all, of their energy on areas of the work that do not allow for sustainable action. The ensuing research explores the relationship between strategic plans for diversity and inclusion (SPFDIs) and the ways in which they are impacted by organizational design, specifically the Inclusive Excellence (IE) model. In addition to the SPFDI, I explore two action items, Diversity Focused Programming (DFP) and Equal Employment Opportunity Compliance (EEOC), as cooperating elements used to implement DEI. Both action items are examined through the lens of IE as they pertain to sustaining DEI behaviors in institutions of higher education. Because information pertaining to this particular study is extremely scarce in both research and practice in higher education, I use a multidimensional approach to compile the data necessary to support my study. A multidimensional approach is a research method that examines multiple fields of study in order to analyze and make a case for another. In this case, I review organizational behavior and organizational design as subsets of organizational development, as well as hybrid Inclusive Excellence and strategic planning models, corporate diversity programming models, and federal/state equal employment requirements, in order to answer how organizational design affects DEI behaviors in institutions of higher education.
This multidimensional study was supported by a mixed-methods approach to analyzing the data gathered. I used a quantitative approach to show the number of institutional strategic plans impacted by the Inclusive Excellence model, and a qualitative approach to explain and highlight the challenges and successes the model has had at various institutions of higher education. Together, these analyses examine the ways in which colleges and universities that have adopted the Inclusive Excellence model interpret diversity, equity, and inclusion based on their understandings of the definitions. The chief aim of this study was to discover how the Inclusive Excellence model, as an organizational design, defines, implements, and sustains behaviors of diversity, equity, and inclusion in institutions of higher education. This study applies organizational design and behavior as subsets of the larger organizational development process in order to illustrate their relationships with the Inclusive Excellence (IE) model and strategic plans for diversity and inclusion (SPFDIs).
• #### Cultural Sensitivity for Healthcare Providers on the Tohono O’odham Nation: A Quality Improvement Project

Background. In rural areas of the United States (US), many barriers hinder and complicate access to quality, culturally sensitive healthcare. In rural and remote settings, the Emergency Department (ED) often becomes the sole source of accessible medical care for a broad range of acute and chronic healthcare needs. Although the ED is critical for ensuring emergency care for rural populations, it can be a fast-paced and intimidating clinical environment, making it difficult for patients to advocate effectively for their healthcare needs. The ED is focused on delivering acute, critical, and lifesaving interventions; of equal importance, however, is the delivery of culturally sensitive healthcare. Healthcare staff and providers employed in Native American (NA) healthcare settings must possess cultural sensitivity and interpersonal etiquette and be aware of the historical and present-day intergenerational impact of historical trauma experienced by Native communities across the US. Purpose. The purpose of this quality improvement (QI) project was to collaborate with Tohono O’odham Nation (TON) cultural experts to develop, implement, and evaluate the impact of an accessible, culturally sensitive educational intervention, a brief PowerPoint presentation designed for healthcare providers and staff of the Sells Hospital ED who provide care to members of the TON. Methods. This project utilized a descriptive quality improvement design. Results. The educational presentation was co-created in consultation with five (N=5) Tohono O’odham cultural experts. Post-survey results demonstrated that the educational material had a positive and influential impact on the six (N=6) healthcare providers who work in the Sells ED. Conclusions.
Participatory co-creation of culturally aligned educational material was a valuable aspect of this project. The outcomes of this quality improvement project offer an exemplar of a co-created educational video designed for healthcare providers working on the TON. This project has the capacity to improve cultural sensitivity and enhance quality of patient care and patient outcomes in the Tohono O’odham Healthcare system.
• #### White Curricula Effect to White Replacement Anxiety, Status Quo Politics: Teacher Experience and Understanding in Culturally Responsive Professional Development

While the student demographic continues to shift in public education across the country, reflecting more diverse classrooms comprised of minoritized students, teacher preparation programs continue to espouse White middle-class values. As such, the education process continues to dehumanize minoritized students through socially acceptable discriminatory practices and policies. This approach to teacher preparation leaves teachers ill-prepared to adequately teach minoritized students because it does not recognize the resources those students bring to the classroom. To the contrary, minoritized students are expected to leave their culture and identity outside the classroom. As the achievement gap is maintained, this study purposefully examines a process implemented to interrupt that disparity through in-service teacher professional development in culturally responsive teaching in a large urban school district. This grounded theory study examines in-service teachers' experience and understanding of culturally responsive professional development that attends to teachers' biased thinking and critical awareness development. Teacher critical awareness development focuses on four areas: ongoing effort to instructionally integrate students’ cultural knowledge; attention to the effects of explicit and implicit bias; ongoing effort to affirm students’ academic and ethnic identities; and heightened awareness of issues of social justice, teachers’ asset-based beliefs, teacher critical awareness, and student identity. The findings provide insight into the ways in-service teachers experience and understand culturally responsive professional development, encompassing six themes: (a) Status Quo-White Curricula Effect, (b) Altruistic Reconciliation, (c) Pensive Practitioners, (d) Colorblind Liberal, (e) People of Color Apologetic Syndrome and White Replacement Anxiety, and (f) Practical Complacent Practitioners.
The six emergent themes were further analyzed and categorized into three overarching categories: the Conventional Practitioners, the Dysconscious Racists, and the Equity Saboteurs. The results of this study serve to inform approaches to implement culturally responsive professional development to interrogate educational inequities and provide humanizing spaces of authentic learning for minoritized students.
• #### Synthesis, Comprehensive Characterization, and Development of Therapeutic Peptides and Glycopeptides for Targeted Respiratory Drug Delivery as Inhalation Aerosols

Central nervous system (CNS) disorders, including neurodegeneration and chronic pain, as well as many respiratory diseases, would greatly benefit from specific and potent peptide pharmaceuticals, which have low inherent toxicity. Delivering peptides to the brain is challenging, principally because of peptides' low metabolic stability, which shortens their duration of action, their poor penetration of the blood-brain barrier (BBB), and their incompatibility with oral administration, typically necessitating parenteral administration. These challenges limit the clinical application of peptides and explain the interest in alternative routes of peptide administration, particularly delivery to the upper and lower respiratory tract. The upper respiratory tract can be used to target the brain through the olfactory route, bypassing the BBB: needle-free nose-to-brain (N-to-B) delivery, which offers protein and peptide drugs the possibility of reaching the brain noninvasively. N-to-B delivery can be a convenient method for rapidly targeting the CNS, bypassing the BBB, and minimizing systemic exposure. In addition, delivery to the lower respiratory tract as an inhalation aerosol offers attractive advantages: the drug can be delivered locally to treat lung diseases, or to the CNS to treat its disorders at a low dose while minimizing systemic adverse effects. The lung is a low-metabolism organ compared to the gastrointestinal (GI) tract, and it allows rapid and high drug absorption due to its large surface area, high blood flow, and absence of first-pass metabolism. In this study, several peptides and glycopeptides with different pharmacological mechanisms were developed. Some of these compounds were synthesized using the solid-phase peptide synthesis (SPPS) strategy and formulated as dry powders with characteristics tailored to target the upper and/or lower respiratory tract to treat various CNS and lung diseases.
An advanced closed-mode organic spray-drying technique was used to produce microparticulate/nanoparticulate formulations utilizing sugar-based excipients. The solubility and lipophilicity of all included compounds were determined computationally using Molecular Operating Environment (MOE) software and experimentally using the shake-flask method (SFM). The raw and formulated compounds were comprehensively characterized in the solid state. The safety of all peptides and glycopeptides covered in this dissertation was evaluated in vitro using human nasal, brain, and pulmonary cell lines. The in vitro aerosol dispersion of the raw and spray-dried compounds was tested using an FDA-approved human inhaler device, and the influence of spray-drying process conditions on aerosol dispersion was evaluated.
• #### Design, Synthesis, and Evaluation of Brain-Penetrant PACAP-Derived Glycopeptides for the Treatment of Neurodegeneration and Neuroinflammation

Neurodegenerative disorders negatively impact the health of millions of people worldwide each year, and current therapeutic strategies only alleviate symptoms and exhibit little to no curative potential. Peptides comprise an important class of biological regulatory molecules that may be able to address these concerns. Many endogenous peptides act as hormones, neuromodulators, secretagogues, and regulators of the inflammatory response. Furthermore, peptides are highly selective for their target receptors, leading to reduced side effect profiles, and they are regarded as non-toxic because their metabolism yields innocuous amino acids. However, progress in developing peptide drugs is hampered by their poor in vivo pharmacokinetic profiles, limited membrane permeability, and low oral bioavailability. Several chemical strategies, including cyclization, N-methylation, lipidation, PEGylation, and incorporation of unnatural amino acids, have been largely successful in improving the stability of peptides, but they generally do not confer membrane penetration. One chemical modification that can address the membrane permeability problem is glycosylation. Glycosylation has been demonstrated to improve the water solubility and in vivo stability of peptides and to dramatically enhance penetration across biological membranes, most notably the blood-brain barrier (BBB). We have applied our glycosylation methodology to a variety of endogenous peptides, and this work summarizes the glycosylation of PACAP, a potential candidate for treating neurodegenerative disorders. Overall, we found that our PACAP glycopeptides exhibited superior stability in vitro and in vivo compared to their non-glycosylated counterparts while maintaining the intrinsic efficacy and potency of native PACAP.
Most importantly, we found that our PACAP glycopeptides were able to penetrate the BBB at physiologically relevant concentrations and elicit neuroprotective and anti-inflammatory activities in animal models of Parkinson’s disease, stroke, and traumatic brain injury.