
Análise cinemática da movimentação dos membros superiores e inferiores, tronco e cabeça durante a marcha de hemiparéticos; Kinematic analysis of upper and lower limbs, trunk and head motions during hemiparetic gait after stroke

Aline Araujo do Carmo
Source: Biblioteca Digital da Unicamp Publisher: Biblioteca Digital da Unicamp
Type: Master's thesis Format: application/pdf
Published 16/12/2009 Portuguese
Search relevance
25.67%
The aim of this study was to perform an integrated analysis of the kinematics of upper- and lower-limb motion, based on angular kinematics, spatio-temporal variables, the whole-body center of mass, and the partial contributions of the body segments to the trajectory of the whole-body center of mass, in order to identify and analyze the gait pattern alterations developed by hemiparetic subjects after stroke. To that end, 14 male hemiparetic stroke subjects were analyzed, aged between 40 and 60 years, at least 3 years post-lesion, and not using assistive devices. To represent normal gait, 7 male subjects aged between 40 and 60 years with no gait alterations were selected. The data were obtained by videogrammetry using the DVideo system. The body-segment orientation model consisted of 71 surface markers defining 15 articulated body segments. Data processing was carried out in the Matlab environment. The statistical analysis was based on the following comparisons: 1) between the right and left sides of the control group subjects and between the affected and unaffected sides of the hemiparetic group; 2) between the control and hemiparetic groups; 3) between the continuous angular variables of the affected and unaffected joints of the hemiparetic group and those of the control group; 4) of the trajectory of the whole-body center of mass and the percentage partial contributions of the body segments between the control and hemiparetic subjects (P<0.05). The results showed that the motion of the affected upper limb is significantly altered...
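
The abstract above combines angular kinematics with the whole-body center of mass and the partial contributions of individual segments to its trajectory. The Python sketch below is not the author's DVideo/Matlab pipeline; it only illustrates, under assumed segment names and mass fractions, that the total center of mass is a mass-weighted sum of segment centers of mass, and one simple way to express a segment's partial contribution as its weighted share of the total trajectory's excursion.

```python
# Minimal sketch (not the author's Matlab/DVideo pipeline): whole-body center
# of mass as a mass-weighted sum of segment centers of mass, plus an
# illustrative percentage contribution of one segment to its trajectory.
# Segment names and mass fractions below are placeholders.
import numpy as np

def total_com(segment_com, mass_fraction):
    """segment_com: dict name -> (T, 3) trajectory; mass_fraction: dict name -> float."""
    total = sum(mass_fraction.values())
    return sum(mass_fraction[s] / total * segment_com[s] for s in segment_com)

def partial_contribution(segment_com, mass_fraction, segment):
    """Share of the total-CoM excursion attributable to one weighted segment (percent)."""
    com = total_com(segment_com, mass_fraction)
    part = mass_fraction[segment] / sum(mass_fraction.values()) * segment_com[segment]
    return 100.0 * np.linalg.norm(np.ptp(part, axis=0)) / np.linalg.norm(np.ptp(com, axis=0))

# Example with two fake segments over 100 frames
rng = np.random.default_rng(0)
segs = {"trunk": rng.normal(size=(100, 3)), "thigh_r": rng.normal(size=(100, 3))}
fracs = {"trunk": 0.43, "thigh_r": 0.10}
print(total_com(segs, fracs).shape)              # (100, 3)
print(partial_contribution(segs, fracs, "trunk"))
```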

Processamento, nivelamento e integração de levantamentos aerogeofísicos magnetométricos no Estado de Minas Gerais e sua contribuição à geologia da porção sul do Cráton São Francisco; Processing, leveling and integration of airborne magnetometric surveys in the State of Minas Gerais and their contribution to the geology of the southern portion of the São Francisco Craton

Santos, Marcelo Henrique Leão
Source: Universidade de Brasília Publisher: Universidade de Brasília
Type: Dissertation
Portuguese
Search relevance
25.65%
Master's dissertation, Universidade de Brasília, Instituto de Geociências, 2006. The main scope of this master's dissertation is the processing, leveling, integration and interpretation of several airborne magnetometric surveys carried out in the State of Minas Gerais. These integrated surveys were correlated with the available geological data to define the magnetic-structural framework of the southern portion of the São Francisco Craton. The processing consisted of quality control, evaluation of the data distribution and analysis of data consistency using tests such as the fourth difference. Efficiency tests were used to identify the bidirectional method with trend angle as the best interpolator. The decorrugation technique, combined with microleveling, was used to remove residual leveling errors from the flight-line grid. In leveling and integrating the data of the Convênio Geofísico Brasil-Alemanha (CGBA), which covers almost the entire State of Minas Gerais and part of Espírito Santo, several merging techniques were applied in order to reach an effective result. The task proved quite laborious because of the lack of continuity among the 59 data blocks acquired at different altitudes and terrain clearances...
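
Among the quality-control steps mentioned, the fourth-difference test is simple enough to illustrate. The Python sketch below is a generic airborne-magnetics QC check with an invented tolerance, not the exact procedure used in the dissertation: it flags fiducials along a flight line whose centered fourth difference exceeds a threshold, which highlights spikes and high-frequency noise.

```python
# Minimal sketch of a fourth-difference consistency check along one flight
# line, as commonly used in airborne magnetic QC (not the dissertation's exact
# procedure). Values exceeding a tolerance flag suspect fiducials.
import numpy as np

def fourth_difference(profile):
    """Centered fourth difference of a 1-D total-field profile (nT)."""
    p = np.asarray(profile, dtype=float)
    return p[:-4] - 4 * p[1:-3] + 6 * p[2:-2] - 4 * p[3:-1] + p[4:]

def flag_spikes(profile, tol_nT=1.0):
    d4 = fourth_difference(profile)
    # index of the central sample of each 5-point stencil
    return np.nonzero(np.abs(d4) > tol_nT)[0] + 2

line = np.cumsum(np.random.default_rng(1).normal(0, 0.05, 500)) + 23500.0
line[200] += 3.0                          # inject an artificial spike
print(flag_spikes(line, tol_nT=1.0))      # indices clustered around 200
```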

SHORT STATIC GPS/GLONASS OBSERVATION PROCESSING IN THE CONTEXT OF ANTENNA PHASE CENTER VARIATION PROBLEM

Dawidowicz, Karol; Kazmierczak, Rafal; Swiatek, Krzysztof
Source: Universidade Federal do Paraná Publisher: Universidade Federal do Paraná
Type: Journal article Format: text/html
Published 01/03/2015 Portuguese
Search relevance
25.66%
So far, three methods have been developed to determine GNSS antenna phase center variations (PCV). For this reason, and because of some problems in introducing absolute models, there are presently three PCV models for receiver antennas (relative, absolute converted and absolute) and two for satellite antennas (standard and absolute). Additionally, when simultaneously processing observations from different positioning systems (e.g. GPS and GLONASS), we can expect a further complication resulting from the different structure of the signals and differences in the satellite constellations. This paper aims at studying the height differences in short static GPS/GLONASS observation processing when different calibration models are used. The analysis was done using 3 days of GNSS data, collected with three different receivers and antennas and divided into half-hour observation sessions. The results show that switching between relative and absolute PCV models may have a visible effect on height determination, particularly in high-accuracy applications. The problem is especially important when mixed GPS/GLONASS observations are processed. Updating the receiver antenna calibration model from relative to absolute in our study (using LEIAT504GG, JAV_GRANT-G3T and TPSHIPER_PLUS antennas) induces a jump (depending on the measurement session) in the vertical component of up to 1.3 cm (GPS-only solutions) or up to 1.9 cm (GPS/GLONASS solutions).
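
As a rough illustration of where such calibration models enter the processing, the Python sketch below interpolates an elevation-dependent receiver-antenna PCV and applies it as a range correction. The 5-degree grid and the PCV curve are fabricated; real processing reads relative or absolute calibrations from ANTEX files, includes the phase-center offset and azimuth dependence per frequency, and the sign convention varies between software packages.

```python
# Minimal sketch of applying an elevation-dependent receiver-antenna PCV
# correction to a measured range. The grid and values below are fabricated for
# illustration only; real values come from an ANTEX calibration (relative or
# absolute) for a specific antenna and frequency.
import numpy as np

elev_grid = np.arange(0.0, 91.0, 5.0)               # elevation angles, degrees
pcv_mm = 6.0 * np.cos(np.radians(elev_grid)) ** 2   # made-up PCV curve (mm)

def pcv_correction_m(elevation_deg):
    """Interpolate the PCV (in metres) at a given satellite elevation."""
    return np.interp(elevation_deg, elev_grid, pcv_mm) / 1000.0

def corrected_range(measured_range_m, elevation_deg):
    # Sign convention differs between packages; treated here simply as
    # 'add the modelled phase-center variation to the observed range'.
    return measured_range_m + pcv_correction_m(elevation_deg)

print(corrected_range(22_345_678.123, 35.0))
```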

Distributed Processing with a Mainframe-Based Hospital Information System: A Generalized Solution

Kirby, J. David; Pickett, Michael P.; Boyarsky, M. William; Stead, William W.
Source: PubMed Publisher: PubMed
Type: Journal article
Published 04/11/1987 Portuguese
Search relevance
25.66%
Over the last two years, the Medical Center Information Systems Department at Duke University Medical Center has been developing a systematic approach to distributing the processing and data involved in computerized applications at DUMC. The resulting system has been named MAPS, the Micro-ADS Processing System. A key characteristic of MAPS is that it makes it easy to execute any existing mainframe ADS application with a request from a PC. This extends the functionality of the mainframe application set to the PC without compromising the maintainability of the PC or mainframe systems.

Detection of Blood Culture Bacterial Contamination using Natural Language Processing

Matheny, Michael E.; FitzHenry, Fern; Speroff, Theodore; Hathaway, Jacob; Murff, Harvey J.; Brown, Steven H.; Fielstein, Elliot M.; Dittus, Robert S.; Elkin, Peter L.
Source: American Medical Informatics Association Publisher: American Medical Informatics Association
Type: Journal article
Portuguese
Search relevance
25.66%
Microbiology results are reported in semi-structured formats and have a high content of useful patient information. We developed and validated a hybrid regular expression and natural language processing solution for processing blood culture microbiology reports. Multi-center Veterans Affairs training and testing data sets were randomly extracted and manually reviewed to determine the culture and sensitivity as well as contamination results. The tool was iteratively developed for both outcomes using a training dataset, and then evaluated on the test dataset to determine antibiotic susceptibility data extraction and contamination detection performance. Our algorithm had a sensitivity of 84.8% and a positive predictive value of 96.0% for mapping the antibiotics and bacteria with appropriate sensitivity findings in the test data. The bacterial contamination detection algorithm had a sensitivity of 83.3% and a positive predictive value of 81.8%.
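
The hybrid approach described above pairs hand-written regular expressions with NLP components and is evaluated by sensitivity and positive predictive value. The Python sketch below shows only those two ingredients on an invented report format; the patterns, report wording and confusion-matrix counts are placeholders, not the validated VA rule set or its results.

```python
# Minimal sketch (invented report format, not the validated VA rule set): a
# regular-expression pass over a semi-structured culture report plus the
# evaluation metrics named in the abstract.
import re

ORGANISM = re.compile(r"(?:Blood culture|Organism)\s*[:\-]\s*(?P<organism>[A-Z][a-z]+(?: [a-z]+)?)")
SUSCEPT = re.compile(r"(?P<antibiotic>[A-Za-z/\-]+)\s*[:=]\s*(?P<result>[SIR])\b")

def extract(report_text):
    """Return the organism (if found) and a list of (antibiotic, result) pairs."""
    org = ORGANISM.search(report_text)
    pairs = [(m["antibiotic"], m["result"]) for m in SUSCEPT.finditer(report_text)]
    return (org["organism"] if org else None), pairs

def sensitivity_ppv(true_positive, false_negative, false_positive):
    sensitivity = true_positive / (true_positive + false_negative)
    ppv = true_positive / (true_positive + false_positive)
    return sensitivity, ppv

report = "Blood culture: Staphylococcus epidermidis. Oxacillin: R  Vancomycin: S"
print(extract(report))
# Made-up counts, only to show how the reported metrics are computed:
print(sensitivity_ppv(true_positive=84, false_negative=15, false_positive=4))
```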

Uma nova forma de calcular os centros dos clusters em algoritmos de agrupamento tipo fuzzy c-means; A new way of computing cluster centers in fuzzy c-means-type clustering algorithms

Vargas, Rogerio Rodrigues de
Source: Universidade Federal do Rio Grande do Norte; BR; UFRN; Programa de Pós-Graduação em Sistemas e Computação; Ciência da Computação Publisher: Universidade Federal do Rio Grande do Norte; BR; UFRN; Programa de Pós-Graduação em Sistemas e Computação; Ciência da Computação
Type: Doctoral thesis Format: application/pdf
Portuguese
Search relevance
25.65%
Clustering data is a very important task in data mining, image processing and pattern recognition problems. One of the most popular clustering algorithms is Fuzzy C-Means (FCM). This thesis proposes a new way of calculating the cluster centers in the FCM algorithm, called ckMeans, and applies it to some variants of FCM, in particular those that use other distances. The goal of this change is to reduce the number of iterations and the processing time of these algorithms without affecting the quality of the partition, or even to improve the number of correct classifications in some cases. We also developed an algorithm based on ckMeans to manipulate interval data, considering interval membership degrees. This algorithm allows the representation of data without converting interval data into punctual data, as happens in other extensions of FCM that deal with interval data. In order to validate the proposed methodologies, a comparison was made between clusterings produced by the ckMeans, K-Means and FCM algorithms (since the proposed way of calculating the centers is similar to that of K-Means), considering three different distances. We used several well-known databases. In this case...
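
The step the thesis modifies is the cluster-center computation inside Fuzzy C-Means. The Python sketch below implements only the textbook FCM updates (centers as membership-weighted means, memberships from Euclidean distances); the ckMeans variant replaces the center computation with a different rule, which is not reproduced here.

```python
# Minimal sketch of standard Fuzzy C-Means: c_j = sum_i u_ij^m x_i / sum_i u_ij^m
# for the centers, with memberships recomputed from Euclidean distances. This is
# the textbook algorithm, not the ckMeans modification proposed in the thesis.
import numpy as np

def fcm_centers(X, U, m=2.0):
    """X: (n, d) data; U: (n, c) fuzzy memberships; returns (c, d) centers."""
    W = U ** m
    return (W.T @ X) / W.sum(axis=0)[:, None]

def fcm_memberships(X, centers, m=2.0, eps=1e-12):
    """Recompute memberships from distances to the current centers."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
U = rng.dirichlet(np.ones(3), size=200)        # random initial memberships
for _ in range(20):                            # alternate the two updates
    C = fcm_centers(X, U)
    U = fcm_memberships(X, C)
print(C)
```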

Comparing Remote Data Transfer Rates of Compact Muon Solenoid Jobs with Xrootd and Lustre

Kaganas, Gary H
Source: FIU Digital Commons Publisher: FIU Digital Commons
Type: Journal article Format: application/pdf
Portuguese
Search relevance
25.68%
To explore the feasibility of processing Compact Muon Solenoid (CMS) analysis jobs across the wide area network, the FIU CMS Tier-3 center and the Florida CMS Tier-2 center designed a remote data access strategy. A Kerberized Lustre test bed was installed at the Tier-2, designed to provide storage resources to private-facing worker nodes at the Tier-3. However, the Kerberos security layer is not capable of authenticating resources behind a private network. As a remedy, an xrootd server was installed on a public-facing node at the Tier-3 to export the file system to the private-facing worker nodes. We report the performance of CMS analysis jobs processed by the Tier-3 worker nodes while accessing data from the Kerberized Lustre file system. The processing performance of this configuration is benchmarked against a direct connection to the Lustre file system and, separately, against a configuration in which the xrootd server is located near the Lustre file system.

Designing and implementing sample and data collection for an international genetics study: the Type 1 Diabetes Genetics Consortium

Hilner, Joan E.; Perdue, Letitia H.; Sides, Elizabeth G.; Pierce, June J.; Wagner, Ana M.; Aldrich, Alan; Loth, Amanda; Albret, Lotte; Wagenknecht, Lynne E.; Nierras, Concepcion; Akolkar, Beena; Type 1 Diabetes Genetics Consortium
Source: Sage Publications Ltd. Publisher: Sage Publications Ltd.
Type: Journal article
Published 2010 Portuguese
Search relevance
25.67%
Background and Purpose: The Type 1 Diabetes Genetics Consortium (T1DGC) is an international project whose primary aims are to: (a) discover genes that modify type 1 diabetes risk; and (b) expand upon the existing genetic resources for type 1 diabetes research. The initial goal was to collect 2500 affected sibling pair (ASP) families worldwide. Methods: T1DGC was organized into four regional networks (Asia-Pacific, Europe, North America, and the United Kingdom) and a Coordinating Center. A Steering Committee, with representatives from each network, the Coordinating Center, and the funding organizations, was responsible for T1DGC operations. The Coordinating Center, with regional network representatives, developed study documents and data systems. Each network established laboratories for: DNA extraction and cell line production; human leukocyte antigen genotyping; and autoantibody measurement. Samples were tracked from the point of collection, processed at network laboratories and stored for deposit at National Institute for Diabetes and Digestive and Kidney Diseases (NIDDK) Central Repositories. Phenotypic data were collected and entered into the study database maintained by the Coordinating Center. Results: T1DGC achieved its original ASP recruitment goal. In response to research design changes...

Parallel Algorithms and Architectures for near-far resistant CDMA acquisition

Kota, Kishore
Source: Universidade Rice Publisher: Universidade Rice
Type: Doctoral thesis
Portuguese
Search relevance
25.66%
PhD Thesis; Subspace-based algorithms are a class of algorithms for estimation problems in array signal processing and, more recently, for near-far resistant Code Division Multiple Access (CDMA) acquisition problems. Subspace-based algorithms estimate signal or noise subspaces from the received data vectors and then perform some form of optimization to estimate the desired parameters. This thesis presents two new parallel algorithms applicable to the estimation of signal or noise subspaces from the received data vectors. The first algorithm is a pipelined SVD algorithm that allows pipelining of multiple independent singular value decomposition (SVD) problems on a single processor array. The resulting algorithm uses the flexibility provided by the Jacobi algorithm, defining a new parallel ordering that yields a simple, uniform array in which all communication, including the initial load and final unload operations, is pipelined. The second algorithm described in this thesis is a sliding-window SVD updating algorithm in which the signal or noise subspace is updated whenever a new observation vector is received, by applying a fixed-length window over the data. An important result shown in this thesis is that a key property of downdating problems...
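
For context, the Python sketch below shows a plain serial one-sided Jacobi SVD, the kernel on which such processor arrays are built: each plane rotation orthogonalizes one pair of columns, and the singular values emerge as column norms. The thesis's contributions, a parallel ordering of the rotation pairs that lets several independent SVDs be pipelined through one array and a sliding-window updating scheme, are not reproduced here.

```python
# Minimal serial one-sided Jacobi SVD; the rotation pairs are visited
# cyclically rather than with the parallel ordering proposed in the thesis.
import numpy as np

def one_sided_jacobi_svd(A, sweeps=30, tol=1e-12):
    U = A.astype(float).copy()
    n = U.shape[1]
    V = np.eye(n)
    for _ in range(sweeps):
        off = 0.0
        for i in range(n - 1):
            for j in range(i + 1, n):
                a = U[:, i] @ U[:, i]
                b = U[:, j] @ U[:, j]
                g = U[:, i] @ U[:, j]
                off = max(off, abs(g))
                if abs(g) <= tol * np.sqrt(a * b):
                    continue
                zeta = (b - a) / (2.0 * g)
                t = np.sign(zeta) / (abs(zeta) + np.sqrt(1.0 + zeta * zeta))
                c = 1.0 / np.sqrt(1.0 + t * t)
                s = c * t
                # rotate columns i and j of U (and of V) to zero <u_i, u_j>
                R = np.array([[c, s], [-s, c]])
                U[:, [i, j]] = U[:, [i, j]] @ R
                V[:, [i, j]] = V[:, [i, j]] @ R
        if off < tol:
            break
    sigma = np.linalg.norm(U, axis=0)
    return U / sigma, sigma, V            # A ~= (U/sigma) @ diag(sigma) @ V.T

A = np.random.default_rng(0).normal(size=(6, 4))
Uh, s, V = one_sided_jacobi_svd(A)
print(np.allclose(Uh * s @ V.T, A))       # True
```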

Ground Based Systems

Source: Escola de Pós-Graduação Naval Publisher: Escola de Pós-Graduação Naval
Portuguese
Search relevance
25.66%
The Naval Postgraduate School and the Navy’s Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS), in collaboration with ProSensing Inc., have modified an X-Band tactical radar system to add a weather observation mode. The new system was named MWR-05XP (Mobile Weather Radar, 2005 X-Band, Phased Array) and is the first mobile, electronically scanned phased-array radar developed for weather sensing applications. Key system parameters of the MWR-05XP rapid-scanning radar system are summarized. As part of the modification, ProSensing developed a state-of-the-art PC-based weather processor (WRP), which provides radar control, data acquisition, signal processing, and real-time data display. Processing algorithms provide estimates of reflectivity, average radial velocity and velocity spread for distributed targets.
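
The moments listed at the end (reflectivity, average radial velocity, velocity spread) are conventionally estimated with a pulse-pair algorithm. The Python sketch below shows that generic estimator on simulated I/Q samples; the wavelength, pulse-repetition time and spectrum-width formulation are common textbook choices, not values taken from the MWR-05XP's actual signal processor.

```python
# Minimal pulse-pair sketch: mean radial velocity and spectrum width from the
# lag-0 and lag-1 autocorrelation of I/Q samples in one range gate. Radar
# parameters below are placeholders, not MWR-05XP values.
import numpy as np

def pulse_pair(iq, wavelength_m, prt_s):
    """iq: complex I/Q samples from one range gate (one dwell)."""
    r0 = np.mean(np.abs(iq) ** 2)                # lag-0 power (reflectivity proxy)
    r1 = np.mean(iq[1:] * np.conj(iq[:-1]))      # lag-1 autocorrelation
    v_nyq = wavelength_m / (4.0 * prt_s)         # unambiguous velocity
    velocity = -v_nyq / np.pi * np.angle(r1)
    # spectrum width from the R0/|R1| ratio (one common formulation)
    width = wavelength_m / (2.0 * np.pi * np.sqrt(2.0) * prt_s) * np.sqrt(
        np.abs(np.log(r0 / max(np.abs(r1), 1e-30))))
    return r0, velocity, width

rng = np.random.default_rng(2)
n, prt, lam, v_true = 64, 1e-3, 0.032, 5.0       # X-band-like wavelength (m)
phase = 4.0 * np.pi * v_true * prt / lam * np.arange(n)
iq = np.exp(-1j * phase) + 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))
print(pulse_pair(iq, lam, prt))                  # velocity estimate near 5 m/s
```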

Computationally Aware Sum-Rate Optimal Scheduling for Centralized Radio Access Networks

Rost, Peter; Maeder, Andreas; Valenti, Matthew C.; Talarico, Salvatore
Source: Universidade Cornell Publisher: Universidade Cornell
Type: Journal article
Portuguese
Search relevance
25.67%
In a centralized or cloud radio access network, certain portions of the digital baseband processing of a group of several radio access points are executed at a central data center. Centralizing the processing improves the flexibility, scalability, and utilization of computational assets. However, the performance depends critically on how the limited data processing resources are allocated to serve the needs of the different wireless devices. As the processing load imposed by each device depends on its allocated transmission rate and channel quality, the rate-allocation aspect of the scheduling should take into account the available computing. In this paper, two computationally aware schedulers are proposed that have the objective of maximizing the sum-rate of the system while satisfying a constraint on the offered computational load. The first scheduler optimally allocates resources and is implemented according to a water-filling algorithm. The second scheduler is suboptimal, but uses a simpler and intuitive complexity-cut-off approach. The performance of both schedulers is evaluated using an LTE-compliant system level simulator. It is found that both schedulers avoid outages that are caused by an overflow of computational load (i.e....
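
Of the two schedulers described, the complexity-cut-off idea is easy to sketch: rank users by how much rate they deliver per unit of compute and admit them until the data center's processing budget is exhausted. The Python sketch below uses invented per-user rates and loads and is only a caricature of that second scheduler; the paper's optimal water-filling formulation is not reproduced.

```python
# Caricature of a complexity-cut-off scheduler: greedily admit users with the
# best rate-per-compute ratio until the computational budget is spent. All
# numbers are invented; this is not the paper's algorithm.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    rate_bps: float      # achievable rate if scheduled
    load_gops: float     # baseband processing load that rate would impose

def cutoff_schedule(users, budget_gops):
    scheduled, load = [], 0.0
    for u in sorted(users, key=lambda u: u.rate_bps / u.load_gops, reverse=True):
        if load + u.load_gops <= budget_gops:
            scheduled.append(u)
            load += u.load_gops
    return scheduled, load

users = [User("ue1", 12e6, 3.0), User("ue2", 30e6, 9.0), User("ue3", 8e6, 1.5)]
print([u.name for u in cutoff_schedule(users, budget_gops=6.0)[0]])  # ['ue3', 'ue1']
```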

Adventures in Radio Astronomy Instrumentation and Signal Processing

McMahon, Peter L.
Source: Universidade Cornell Publisher: Universidade Cornell
Type: Journal article
Published 02/09/2011 Portuguese
Search relevance
25.66%
This thesis describes the design and implementation of several instruments for digitizing and processing analogue astronomical signals collected using radio telescopes. Modern radio telescopes have significant digital signal processing demands that are typically best met using custom processing engines implemented in Field Programmable Gate Arrays. These demands essentially stem from the ever-larger analogue bandwidths that astronomers wish to observe, resulting in large data volumes that need to be processed in real time. We focused on the development of spectrometers for enabling improved pulsar science on the Allen Telescope Array, the Hartebeesthoek Radio Observatory telescope, the Nançay Radio Telescope, and the Parkes Radio Telescope. We also present work that we conducted on the development of real-time pulsar timing instrumentation. All the work described in this thesis was carried out using generic astronomy processing tools and hardware developed by the Center for Astronomy Signal Processing and Electronics Research (CASPER) at the University of California, Berkeley. We successfully deployed instruments built solely with CASPER technology to several telescopes, which has helped to validate the approach to developing radio astronomy instruments that CASPER advocates.; Comment: 135 pages. Master's thesis for M.Sc. degree in Electrical Engineering at the University of Cape Town (2008)
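
At their core, the spectrometers described above transform blocks of digitized voltage and accumulate power per frequency channel. The numpy sketch below shows only that conceptual operation; real CASPER designs implement polyphase filterbanks and accumulators in FPGA fabric, and the channel count and test signal here are arbitrary.

```python
# Conceptual FFT spectrometer: chunk the voltage stream, Fourier transform,
# and average the power per channel. Not an FPGA/CASPER design.
import numpy as np

def accumulate_spectrum(samples, n_chan=1024):
    """Average power spectrum of a real-valued voltage stream."""
    n_spectra = len(samples) // (2 * n_chan)
    frames = samples[: n_spectra * 2 * n_chan].reshape(n_spectra, 2 * n_chan)
    return np.mean(np.abs(np.fft.rfft(frames, axis=1)[:, :n_chan]) ** 2, axis=0)

rng = np.random.default_rng(3)
t = np.arange(2**18)
voltage = np.sin(2 * np.pi * 0.1 * t) + rng.normal(size=t.size)   # tone + noise
spec = accumulate_spectrum(voltage)
print(int(np.argmax(spec)), spec.shape)   # tone lands near channel 0.1 * 2048 ~ 205
```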

Dynamic Nested Clustering for Parallel PHY-Layer Processing in Cloud-RANs

Fan, Congmin; Zhang, Ying Jun; Yuan, Xiaojun
Source: Universidade Cornell Publisher: Universidade Cornell
Type: Journal article
Portuguese
Search relevance
25.66%
Featuring centralized processing and a cloud-based infrastructure, the Cloud Radio Access Network (C-RAN) is a promising solution for achieving unprecedented system capacity in future wireless cellular networks. The huge capacity gain mainly comes from the centralized and coordinated signal processing at the cloud server. However, full-scale coordination in a large-scale C-RAN requires the processing of very large channel matrices, leading to high computational complexity and channel estimation overhead. To resolve this challenge, we exploit the near-sparsity of large C-RAN channel matrices and derive a unified theoretical framework for clustering and parallel processing. Based on the framework, we propose a dynamic nested clustering (DNC) algorithm that not only greatly improves the system scalability in terms of baseband-processing and channel-estimation complexity, but is also amenable to various parallel processing strategies for different data center architectures. With the proposed algorithm, we show that the computation time for the optimal linear detector is greatly reduced from $O(N^3)$ to no higher than $O(N^{\frac{42}{23}})$, where $N$ is the number of RRHs in the C-RAN.
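
To make the clustering idea concrete, the numpy sketch below solves a per-cluster linear (zero-forcing style) detection problem on a synthetic near-sparse channel matrix instead of inverting the full N x N system. The fixed even split into clusters and the synthetic channel are illustrative only; the paper's dynamic nested clustering and its complexity analysis are not reproduced here.

```python
# Illustrative clustered linear detection on a near-sparse channel: solve small
# intra-cluster systems and ignore weak inter-cluster links. Not the DNC
# algorithm from the paper.
import numpy as np

def clustered_zf(H, y, clusters):
    """H: (N, N) channel, y: (N,) received vector, clusters: list of index arrays."""
    x_hat = np.zeros(H.shape[1], dtype=complex)
    for idx in clusters:
        Hc = H[np.ix_(idx, idx)]                  # intra-cluster channel block
        x_hat[idx] = np.linalg.solve(Hc, y[idx])  # inter-cluster coupling ignored
    return x_hat

rng = np.random.default_rng(4)
N, n_clusters = 64, 8
H = np.diag(rng.normal(1.5, 0.1, N).astype(complex))                  # strong local links
H += 0.01 * (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))  # weak coupling
x = rng.choice([-1, 1], N) + 0j
y = H @ x
clusters = np.array_split(np.arange(N), n_clusters)
print(np.mean(np.abs(clustered_zf(H, y, clusters) - x)))              # small residual error
```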

An Approximately Optimal Algorithm for Scheduling Phasor Data Transmissions in Smart Grid Networks

Nagananda, K. G.; Khargonekar, P. P.
Source: Universidade Cornell Publisher: Universidade Cornell
Type: Journal article
Portuguese
Search relevance
25.66%
In this paper, we devise a scheduling algorithm for ordering the transmission of synchrophasor data from the substation to the control center in as short a time frame as possible, within the real-time hierarchical communications infrastructure of the electric grid. The problem is cast in the framework of classic job scheduling with precedence constraints. The optimization setup comprises the number of phasor measurement units (PMUs) to be installed on the grid, a weight associated with each PMU, processing times at the control center for the PMUs, and precedence constraints between the PMUs. The solution to the PMU placement problem yields the optimum number of PMUs to be installed on the grid, while the processing times are picked uniformly at random from a predefined set. The weight associated with each PMU and the precedence constraints are both assumed known. The scheduling problem is provably NP-hard, so we resort to approximation algorithms, which provide solutions that are suboptimal yet have polynomial time complexity. A lower bound on the optimal schedule is derived using branch-and-bound techniques, and its performance is evaluated using standard IEEE test bus systems. The scheduling policy is power-grid-centric, since it takes into account the electrical properties of the network under consideration.; Comment: 8 pages...
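
To illustrate the kind of problem being solved, the Python sketch below runs a simple greedy list schedule: each PMU is a job with a processing time and a weight, precedence arcs say which PMUs must go first, and among the released jobs the one with the largest weight-to-processing-time ratio is processed next (Smith's rule). This is only a generic heuristic for the setup described, not the paper's approximation algorithm or its branch-and-bound lower bound.

```python
# Greedy list scheduling with precedence constraints on a single machine,
# minimizing (heuristically) the weighted sum of completion times. Job names,
# times and weights are invented.
def list_schedule(proc, weight, precedes):
    """proc/weight: dicts job -> value; precedes: list of (a, b) meaning a before b."""
    remaining_preds = {j: 0 for j in proc}
    for a, b in precedes:
        remaining_preds[b] += 1
    done, order, time, objective = set(), [], 0.0, 0.0
    while len(done) < len(proc):
        ready = [j for j in proc if j not in done and remaining_preds[j] == 0]
        job = max(ready, key=lambda j: weight[j] / proc[j])   # Smith's rule
        time += proc[job]
        objective += weight[job] * time                       # weighted completion time
        done.add(job)
        order.append(job)
        for a, b in precedes:
            if a == job:
                remaining_preds[b] -= 1
    return order, objective

proc = {"pmu1": 2.0, "pmu2": 1.0, "pmu3": 3.0}
weight = {"pmu1": 1.0, "pmu2": 5.0, "pmu3": 2.0}
print(list_schedule(proc, weight, [("pmu1", "pmu3")]))   # (['pmu2', 'pmu1', 'pmu3'], 20.0)
```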

Processing of 24 Micron Image Data at the Spitzer Science Center

Masci, Frank J.; Laher, Russ; Fang, Fan; Fowler, John; Lee, Wen; Stolovy, Susan; Padgett, Deborah; Moshir, Mehrdad
Source: Universidade Cornell Publisher: Universidade Cornell
Type: Journal article
Published 12/11/2004 Portuguese
Search relevance
25.66%
The 24 micron array on board the Spitzer Space Telescope is one of three arrays in the Multi-band Imaging Photometer for Spitzer (MIPS) instrument. It provides 5.3 × 5.3 arcmin images at a scale of ~2.5 arcsec per pixel corresponding to sampling of the point spread function which is slightly better than critical (~0.4 λ/D). A scan-mirror allows dithering of images on the array without the overhead of moving and stabilizing the spacecraft. It also enables efficient mapping of large areas of sky without significant compromise in sensitivity. We present an overview of the pipeline flow and reduction steps involved in the processing of image data acquired with the 24 micron array. Residual instrumental signatures not yet removed in automated processing and strategies for hands-on mitigation thereof are also given.; Comment: 5 pages, 1 figure. To appear in Proceedings of Astronomical Data Analysis Software and Systems (ADASS) XIV, Pasadena, 2004
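
The sampling figure quoted above can be checked with a line of arithmetic, assuming Spitzer's 0.85 m primary mirror (the aperture is not stated in the abstract, so it is treated here as an assumption).

```python
# Quick arithmetic check of "~2.5 arcsec per pixel ~ 0.4 lambda/D", assuming a
# 0.85 m aperture (assumption, not stated in the abstract).
import math

wavelength_m = 24e-6
aperture_m = 0.85            # assumed telescope aperture
pixel_arcsec = 2.5

lambda_over_d_arcsec = math.degrees(wavelength_m / aperture_m) * 3600.0
print(round(lambda_over_d_arcsec, 2))                 # ~5.82 arcsec
print(round(pixel_arcsec / lambda_over_d_arcsec, 2))  # ~0.43 lambda/D per pixel
```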

Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

Buyya, Rajkumar; Beloglazov, Anton; Abawajy, Jemal
Source: Universidade Cornell Publisher: Universidade Cornell
Type: Journal article
Published 02/06/2010 Portuguese
Search relevance
25.65%
Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and to the environmental carbon footprint. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational costs. This paper presents the vision, challenges, and architectural elements for energy-efficient management of Cloud computing environments. We focus on the development of dynamic resource provisioning and allocation algorithms that consider the synergy between the various data center infrastructures (i.e., the hardware, power units, cooling and software) and work holistically to boost data center energy efficiency and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of Clouds; (b) energy-efficient resource allocation policies and scheduling algorithms that consider quality-of-service expectations and device power-usage characteristics; and (c) a novel software technology for energy-efficient management of Clouds. We have validated our approach by conducting a set of rigorous performance evaluation studies using the CloudSim toolkit. The results demonstrate that the Cloud computing model has immense potential, as it offers significant performance gains in response time and cost savings under dynamic workload scenarios.; Comment: 12 pages...

Overview of the Kepler Science Processing Pipeline

Jenkins, Jon M.; Caldwell, Douglas A.; Chandrasekaran, Hema; Twicken, Joseph D.; Bryson, Stephen T.; Quintana, Elisa V.; Clarke, Bruce D.; Li, Jie; Allen, Christopher; Tenenbaum, Peter; Wu, Hayley; Klaus, Todd C.; Middour, Christopher K.; Cote, Miles T.;
Source: Universidade Cornell Publisher: Universidade Cornell
Type: Journal article
Published 01/01/2010 Portuguese
Search relevance
25.65%
The Kepler Mission Science Operations Center (SOC) performs several critical functions including managing the ~156,000 target stars, associated target tables, science data compression tables and parameters, as well as processing the raw photometric data downlinked from the spacecraft each month. The raw data are first calibrated at the pixel level to correct for bias, smear induced by a shutterless readout, and other detector and electronic effects. A background sky flux is estimated from ~4500 pixels on each of the 84 CCD readout channels, and simple aperture photometry is performed on an optimal aperture for each star. Ancillary engineering data and diagnostic information extracted from the science data are used to remove systematic errors in the flux time series that are correlated with these data prior to searching for signatures of transiting planets with a wavelet-based, adaptive matched filter. Stars with signatures exceeding 7.1 sigma are subjected to a suite of statistical tests including an examination of each star's centroid motion to reject false positives caused by background eclipsing binaries. Physical parameters for each planetary candidate are fitted to the transit signature, and signatures of additional transiting planets are sought in the residual light curve. The pipeline is operational...
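
Two of the steps named above, background-subtracted simple aperture photometry and a sigma threshold on a detection statistic, are illustrated in the Python sketch below. The stamp size, aperture radius and pixel values are invented, and the 7.1 sigma value is taken from the abstract only as a default; none of this is SOC pipeline code.

```python
# Illustrative simple aperture photometry on a fake pixel stamp, plus a sigma
# threshold check; stamp, aperture and values are invented.
import numpy as np

def aperture_flux(stamp, center, radius, background):
    """Sum calibrated pixels within a circular aperture, minus background."""
    yy, xx = np.indices(stamp.shape)
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    return np.sum(stamp[mask] - background)

def exceeds_threshold(detection_statistic, sigma=7.1):
    return detection_statistic > sigma

rng = np.random.default_rng(5)
stamp = rng.normal(100.0, 1.0, size=(11, 11))     # fake calibrated pixel stamp
stamp[5, 5] += 500.0                              # fake star at the center
print(aperture_flux(stamp, center=(5, 5), radius=3, background=100.0))  # ~500
print(exceeds_threshold(8.2))                     # True
```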

Is the 130 GeV Line Real? A Search for Systematics in the Fermi-LAT Data

Finkbeiner, Douglas P.; Su, Meng; Weniger, Christoph
Source: Universidade Cornell Publisher: Universidade Cornell
Type: Journal article
Published 20/09/2012 Portuguese
Search relevance
25.67%
Our recent claims of a Galactic center feature in Fermi-LAT data at approximately 130 GeV have prompted an avalanche of papers proposing explanations ranging from dark matter annihilation to exotic pulsar winds. Because of the importance of such interpretations for physics and astrophysics, a discovery will require not only additional data, but a thorough investigation of possible LAT systematics. While we do not have access to the details of each event reconstruction, we do have information about each event from the public event lists and spacecraft parameter files. These data allow us to search for suspicious trends that could indicate a spurious signal. We consider several hypotheses that might make an instrumental artifact more apparent at the Galactic center, and find them implausible. We also search for an instrumental signature in the Earth limb photons, which provide a smooth reference spectrum for null tests. We find no significant 130 GeV feature in the Earth limb sample. However, we do find a marginally significant 130 GeV feature in Earth limb photons with a limited range of detector incidence angles. This raises concerns about the 130 GeV Galactic center feature, even though we can think of no plausible model of instrumental behavior that connects the two. A modest amount of additional limb data would tell us if the limb feature is a statistical fluke. If the limb feature persists...

An Overview of the Palomar Transient Factory Pipeline and Archive at the Infrared Processing and Analysis Center

Grillmair, C. J.; Laher, R.; Surace, J.; Mattingly, S.; Hacopians, E.; Jackson, E.; van Eyken, J.; McCollum, B.; Groom, S.; Mi, W.; Teplitz, H.
Source: Astronomical Society of the Pacific Publisher: Astronomical Society of the Pacific
Type: Book section; peer-reviewed Format: application/pdf
Published 2010 Portuguese
Search relevance
25.66%
The Palomar Transient Factory is conducting a wide-field, variable-cadence optical survey of the northern sky to detect transient, variable, and moving objects. As a member of the PTF collaboration, the Infrared Processing and Analysis Center has developed an image archive, a high-quality photometry pipeline, and a searchable database of detected astronomical sources. The system is capable of processing and storing 300 Gbytes of data per night over the course of the 5-year survey. With an expected total of ~ 20 billion rows, the table containing sources extracted from PTF images will be among the largest astronomical databases ever created. The survey is efficiently discovering transient sources from asteroids to supernovae, and will inform the development of future sky surveys like that of the Large Synoptic Survey Telescope.
