Page 2 of results for 700 digital items found in 0.033 seconds

A Technology Assessment of Remote Data Entry in Clinical Trials

Herson, Jay
Source: PubMed Publisher: PubMed
Type: Scientific Journal Article
Published on 07/11/1984 Portuguese
Search Relevance
35.64%
Remote data entry is a new technology that may have a profound impact on the conduct of clinical trials. Under remote data entry, clinic staff enter patient data into a personal computer at the clinic and the data are promptly transmitted to a mainframe at the coordinating center. This paper contrasts this new technology with conventional data entry, whereby case report forms are mailed to the coordinating center, where the data processing staff enter the data directly into the mainframe. Remote data entry is found to be preferable to conventional data entry when the remote software offers the possibility of driving clinic processes rather than merely transmitting outcome data to a mainframe. Some preliminary cost-benefit results are presented.

Meta-Analysis of Repository Data: Impact of Data Regularization on NIMH Schizophrenia Linkage Results

Walters, Kimberly A.; Huang, Yungui; Azaro, Marco; Tobin, Kathleen; Lehner, Thomas; Brzustowicz, Linda M.; Vieland, Veronica J.
Source: Public Library of Science Publisher: Public Library of Science
Type: Scientific Journal Article
Published on 14/01/2014 Portuguese
Search Relevance
35.72%
Human geneticists are increasingly turning to study designs based on very large sample sizes to overcome difficulties in studying complex disorders. This in turn almost always requires multi-site data collection and processing of data through centralized repositories. While such repositories offer many advantages, including the ability to return to previously collected data to apply new analytic techniques, they also have some limitations. To illustrate, we reviewed data from seven older schizophrenia studies available from the NIMH-funded Center for Collaborative Genomic Studies on Mental Disorders, also known as the Human Genetics Initiative (HGI), and assessed the impact of data cleaning and regularization on linkage analyses. Extensive data regularization protocols were developed and applied to both genotypic and phenotypic data. Genome-wide nonparametric linkage (NPL) statistics were computed for each study, over various stages of data processing. To assess the impact of data processing on aggregate results, Genome-Scan Meta-Analysis (GSMA) was performed. Examples of increased, reduced and shifted linkage peaks were found when comparing linkage results based on original HGI data to results using post-processed data within the same set of pedigrees. Interestingly...
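The abstract names Genome-Scan Meta-Analysis (GSMA) without giving its mechanics; as a rough illustration only, the Python sketch below implements the conventional rank-based, bin-summation flavor of GSMA on synthetic data. The bin count, study count, and permutation scheme are assumptions for the example, not details of the HGI analysis.

```python
import numpy as np

def gsma_summed_ranks(bin_stats, n_perm=10000, seed=0):
    """Rank-based Genome-Scan Meta-Analysis (GSMA) sketch.

    bin_stats: array of shape (n_studies, n_bins) holding, for each study,
    the maximum linkage statistic observed in each genome bin.
    Returns the summed rank per bin and a permutation p-value per bin
    (a high summed rank means consistently strong linkage across studies).
    """
    bin_stats = np.asarray(bin_stats, dtype=float)
    n_studies, n_bins = bin_stats.shape

    # Within each study, rank bins by their linkage statistic (1 = weakest).
    ranks = bin_stats.argsort(axis=1).argsort(axis=1) + 1
    summed = ranks.sum(axis=0)

    # Permutation null: shuffle bin labels independently within each study.
    rng = np.random.default_rng(seed)
    exceed = np.zeros(n_bins)
    for _ in range(n_perm):
        perm = np.array([rng.permutation(r) for r in ranks]).sum(axis=0)
        exceed += perm >= summed
    return summed, (exceed + 1) / (n_perm + 1)

# Toy example: 7 studies, 120 genome bins, one bin (index 42) enriched.
rng = np.random.default_rng(1)
stats = rng.exponential(1.0, size=(7, 120))
stats[:, 42] += 2.0
summed, pvals = gsma_summed_ranks(stats, n_perm=2000)
print("top bin:", summed.argmax(), "p =", pvals[summed.argmax()])
```

Running such a procedure before and after a data-cleaning step would show how regularization can shift which bins rank highest, which is the kind of comparison the paper performs on the real HGI data.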

Cancer systems biology: signal processing for cancer research

Yli-Harja, Olli; Ylipää, Antti; Nykter, Matti; Zhang, Wei
Source: Sun Yat-sen University Cancer Center Publisher: Sun Yat-sen University Cancer Center
Type: Scientific Journal Article
Published on /04/2011 Portuguese
Search Relevance
35.64%
In this editorial we introduce the research paradigms of signal processing in the era of systems biology. Signal processing is a field of science traditionally focused on modeling electronic and communications systems, but recently it has turned to biological applications with astounding results. The essence of signal processing is to describe the natural world by mathematical models and then, based on these models, develop efficient computational tools for solving engineering problems. Here, we underline, with examples, the endless possibilities which arise when the battle-hardened tools of engineering are applied to solve the problems that have tormented cancer researchers. Based on this approach, a new field has emerged, called cancer systems biology. Despite its short history, cancer systems biology has already produced several success stories tackling previously impracticable problems. Perhaps most importantly, it has been accepted as an integral part of the major endeavors of cancer research, such as analyzing the genomic and epigenomic data produced by The Cancer Genome Atlas (TCGA) project. Finally, we show that signal processing and cancer research, two fields that are seemingly distant from each other, have merged into a field that is indeed more than the sum of its parts.

Cross-Layer Design for Energy Efficiency on Data Center Network

Cheocherngngarn, Tosmate
Source: FIU Digital Commons Publisher: FIU Digital Commons
Type: Scientific Journal Article Format: application/pdf
Portuguese
Search Relevance
35.71%
Energy efficient infrastructures, or green IT (Information Technology), have recently become a hot-button issue for most corporations as they strive to eliminate every inefficiency from their enterprise IT systems and save capital and operational costs. Vendors of IT equipment now compete on the power efficiency of their devices, and as a result, many of the new equipment models are indeed more energy efficient. Various studies have estimated the annual electricity consumed by networking devices in the U.S. to be in the range of 6 - 20 terawatt-hours. Our research has the potential to provide promising solutions to this excessive electricity use. An energy-efficient data center network architecture which can lower the energy consumption is highly desirable. First of all, we propose a fair bandwidth allocation algorithm which adopts the max-min fairness principle to decrease power consumption on packet switch fabric interconnects. Specifically, we account for power-aware computing because high power dissipation in switches is fast turning into a key problem, owing to increasing line speeds and decreasing chip sizes. This efficient algorithm could not only reduce the convergence iterations but also lower processing power utilization on switch fabric interconnects. Secondly...
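The dissertation abstract names a max-min fair bandwidth allocation algorithm but gives no pseudocode; the sketch below is a generic progressive-filling implementation of max-min fairness in Python, with the fabric capacity and per-flow demands invented for illustration. It is not the author's actual switch-fabric algorithm.

```python
def max_min_fair(capacity, demands, eps=1e-9):
    """Progressive-filling sketch of max-min fair allocation.

    Shares a single link/fabric capacity among flows: repeatedly grant every
    unsatisfied flow an equal share of the remaining capacity; flows whose
    demand is met drop out and release their surplus to the others.
    """
    alloc = [0.0] * len(demands)
    active = set(range(len(demands)))
    remaining = float(capacity)
    while active and remaining > eps:
        share = remaining / len(active)
        done = set()
        for i in list(active):
            grant = min(share, demands[i] - alloc[i])
            alloc[i] += grant
            remaining -= grant
            if demands[i] - alloc[i] <= eps:
                done.add(i)
        if not done:          # every active flow absorbed its full share
            break
        active -= done
    return alloc

# Illustrative example: 10 units of fabric capacity, four flows.
print(max_min_fair(10, [1, 3, 5, 8]))   # -> [1.0, 3.0, 3.0, 3.0]
```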

Random Projections of Smooth Manifolds

Baraniuk, Richard G.; Wakin, Michael
Source: Rice University Publisher: Rice University
Type: Scientific Journal Article
Portuguese
Search Relevance
35.66%
Journal Paper; Many types of data and information can be described by concise models that suggest each data vector (or signal) actually has "few degrees of freedom" relative to its size N. This is the motivation for a variety of dimensionality reduction techniques for data processing that attempt to reduce or eliminate the impact of the ambient dimension N on computational or storage requirements. As an example, many signals can be expressed as a sparse linear combination of elements from some dictionary. The sparsity of the representation directly reflects the conciseness of the model and permits efficient techniques such as Compressed Sensing (CS), an emerging theory for sparse signal recovery requiring only a small number of nonadaptive, random linear measurements. In other cases, the conciseness of the signal model may dictate that the signal class forms a low-dimensional manifold as a subset of the high-dimensional ambient space R^N. This type of geometric structure may not be neatly reflected in a sparse representation. Instead, dimensionality reduction techniques for manifold-modeled data typically involve "learning" the manifold structure from a collection of data points, often by constructing nonlinear mappings from R^N to R^M for some M < N that are adapted to the training data and intended to preserve some characteristic property of the manifold. In this paper...
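As a hedged illustration of the "small number of nonadaptive, random linear measurements" idea, the sketch below projects synthetic manifold data from R^N to R^M with an i.i.d. Gaussian matrix. The dimensions and data are made up, and the paper's actual theoretical guarantees are not reproduced here.

```python
import numpy as np

def random_projection(X, M, seed=0):
    """Project rows of X from R^N down to R^M with a random Gaussian matrix.

    Entries are i.i.d. N(0, 1/M), so pairwise distances are preserved in
    expectation (Johnson-Lindenstrauss style); the paper's point is that such
    nonadaptive projections also preserve the structure of smooth manifolds.
    """
    N = X.shape[1]
    rng = np.random.default_rng(seed)
    Phi = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))
    return X @ Phi.T

# Toy manifold: points on a circle embedded in R^1000, projected to R^20.
t = np.linspace(0, 2 * np.pi, 200)
basis = np.random.default_rng(1).normal(size=(2, 1000))
X = np.column_stack([np.cos(t), np.sin(t)]) @ basis
Y = random_projection(X, M=20)

# Pairwise distances are roughly preserved after projection.
d_before = np.linalg.norm(X[0] - X[100])
d_after = np.linalg.norm(Y[0] - Y[100])
print(d_before, d_after, d_after / d_before)
```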

Signal and Information Processing for Wireless Communication Systems

Bhashyam, Srikrishna
Source: Rice University Publisher: Rice University
Type: Doctoral Thesis
Portuguese
Search Relevance
35.72%
PhD Thesis; Next generation wireless communication systems need to support access to multimedia data available on the internet. This universal wireless access to multimedia data requires data rates and quality of service that are orders of magnitude better than those available in current wireless systems. The two major problems posed by the wireless channel, multipath fading and multiple access interference, are addressed here. The time-varying and shared nature of the wireless channel leads to multipath fading and multiple access interference (MAI), respectively. In this work, signal and information processing algorithms are developed to combat these problems and, consequently, increase data rates and improve performance. First, multiuser signal processing algorithms that combat MAI are developed for code division multiple access (CDMA) systems. Although multiuser signal processing has been extensively studied in the past, very little attention has focused on the practically important case of CDMA with long spreading codes that forms the basis of next generation wireless cellular communications. This work proposes new multiuser channel estimation and tracking algorithms for long code CDMA and demonstrates the significant gains achievable in performance. We also propose new channel estimation algorithms for short code CDMA systems that support multiple data rates to achieve multimedia communication. The new problems posed by the multirate nature of this system...
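The thesis's long-code estimation and tracking algorithms are not spelled out in the abstract; purely as a generic illustration of pilot-aided channel estimation, the sketch below solves a least-squares problem for a short multipath channel given a known spreading sequence. All signal parameters are invented, and this is a textbook estimator rather than the author's method.

```python
import numpy as np

def ls_channel_estimate(rx, code, L):
    """Least-squares estimate of an L-tap channel from a known spreading code.

    Model: rx = conv(code, h) + noise. Build the convolution matrix of the
    known chip sequence and solve for h in the least-squares sense.
    """
    n = len(code)
    S = np.zeros((n + L - 1, L))
    for k in range(L):
        S[k:k + n, k] = code          # column k is the code delayed by k chips
    h_hat, *_ = np.linalg.lstsq(S, rx, rcond=None)
    return h_hat

# Toy example: random +/-1 chips through a 3-tap channel with noise.
rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=256)
h_true = np.array([1.0, 0.5, -0.2])
rx = np.convolve(code, h_true) + 0.05 * rng.normal(size=256 + 2)
print(ls_channel_estimate(rx, code, L=3))   # close to h_true
```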

Automated meteorological and oceanographic data collection and distribution in support of C4I, weapons, and remote sensing systems

Nisley, William Hughes.
Source: Monterey, California. Naval Postgraduate School Publisher: Monterey, California. Naval Postgraduate School
Type: Doctoral Thesis
Portuguese
Search Relevance
35.64%
On-scene characterization of the battlespace environment is critical toward providing the warfighter with an effective understanding of the environment and its impact on weapon systems and sensors, and requires the rapid acquisition and dissemination of on-scene meteorological and oceanographic (METOC) measurements. The current practice of manually observing and recording METOC data is labor intensive, outdated, and no longer capable of satisfying the requirements for higher temporal and spatial observations. This study reviews the current methodology to characterize the battlespace environment, summarizes relevant Navy needs, and describes the results of integrating a prototype small combatant integrated METOC system (SCIMS), developed by the Naval Postgraduate School, with a prototype data processing and distribution system (Weather Viewer) developed by SPAWARSYSCEN San Diego. The at-sea demonstration included the acquisition, encoding, transmission and retrieval of real-time observations to/from shore-based METOC data servers at Fleet Numerical Meteorology and Oceanography Center via commercial telephone access to the Internet. The demonstration further served as the basis for development of a PC-based prototype shipboard METOC archive and reports system called SMART Log. The study concludes with particular recommendations for updating and improving the system of environmental data collection...

Cálculo de la eficiencia energética en centros de procesos de datos; Calculation of energy efficiency of data center

Vila López, Manuel
Source: University of Cantabria Publisher: University of Cantabria
Type: Undergraduate Final Project
Portuguese
Search Relevance
35.71%
ABSTRACT: The objective of this final degree project is to create a software tool for measuring and monitoring energy efficiency in data centers. The tool is developed at the company CIC Consulting Informático, with measurements and tests carried out on that company's data center. The standard metric for energy efficiency is the PUE, a variable defined by 'The Green Grid' consortium. The PUE is calculated as the total input power of the data center divided by the power supplied to the IT equipment. The tool measures energy consumption at each of the key points required to calculate the PUE in its different categories and displays both instantaneous and historical values, together with relevant status information on the data center components. Data capture at the key points uses the SNMP application-layer protocol and toroid-based measurement probes with data acquisition cards, while visualization and storage of the data are handled through a web tool called "IDbox"...
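The abstract defines PUE as the total input power of the data center divided by the power delivered to the IT equipment; a minimal sketch of that calculation follows, with the meter names and readings invented rather than taken from the IDbox tool.

```python
def compute_pue(facility_kw, it_kw):
    """PUE = total data-center input power / power delivered to IT equipment.

    A PUE of 1.0 would mean every watt goes to the IT load; real facilities
    are higher because of cooling, power distribution losses, lighting, etc.
    """
    if it_kw <= 0:
        raise ValueError("IT load must be positive")
    return facility_kw / it_kw

# Illustrative readings (kW) such as might come from SNMP-polled meters.
readings = {"utility_feed": 180.0, "it_load": 100.0}
print(compute_pue(readings["utility_feed"], readings["it_load"]))  # 1.8
```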

Cost/benefit analysis of Commander, Naval Surface Force, U.S. Pacific Fleet's Supply Maintenance Training Team

Buzon, Carlos D., II; Huggins, Michael D.
Source: Monterey, California. Naval Postgraduate School Publisher: Monterey, California. Naval Postgraduate School
Type: Doctoral Thesis Format: 116 p.
Portuguese
Search Relevance
35.64%
Approved for public release; distribution is unlimited; This thesis is an attempt to accomplish a cost/benefit analysis of Commander, Naval Surface Force, U.S. Pacific Fleet's (COMNAVSURFPAC) Supply Maintenance Training Team (SMTT). The effectiveness of the SMTT program is also evaluated. Data were gathered from surveys of current and former Supply Officers of COMNAVSURFPAC ships with the Shipboard Non-tactical Automated Data Processing II system installed and which had received all or part of an SMTT assistance visit. Interviews were conducted with selected Navy and civilian contractor members of the SMTT staff. Data were also gathered from various records and reports maintained by the SMTT staff and by COMNAVSURFPAC's Supply Assistance Center. The nature of the cost/benefit data of the SMTT program did not lend itself to a homogeneous comparison of costs to benefits. The authors determined that an attempt to "homogenize" the data with the use of economic "shadow prices" was of little value in meeting the "measurable performance" criteria of the cost/benefit analysis. Therefore, in strict terms, a cost/effectiveness analysis was accomplished. The analysis indicated that the SMTT program has resulted in positive gains in afloat supply operations. Many intangible benefits are derived from the assistance visit, and there are indications of tangible benefits in the form of dollar and man-hour savings. The trend of the data shows the program to be valuable to the fleet. Although not definitive due to data limitations...

Long-Range Dependence: Now you see it now you don't!

Karagiannis, Thomas; Faloutsos, Michalis; Riedi, Rudolf H.
Source: Rice University Publisher: Rice University
Type: Conference paper; Text
Portuguese
Search Relevance
35.67%
Conference Paper; Over the last few years, the network community has started to rely heavily on the use of novel concepts such as self-similarity and Long-Range Dependence (LRD). Despite their wide use, there is still much confusion regarding the identification of such phenomena in real network traffic data. In this paper, we show that estimating Long-Range Dependence is not straightforward: there is no systematic or definitive methodology. There exist several estimation methodologies, but they can give misleading and conflicting estimates. More specifically, we arrive at several conclusions that could provide guidelines for a systematic approach to LRD. First, long-range dependence may exist even if the estimators disagree on its value. Second, long-range dependence is unlikely to exist if several estimators do not "converge" statistically to a value. Third, we show that periodicity can obscure the analysis of a signal, giving partial evidence of long-range dependence. Fourth, the Whittle estimator is the most accurate in finding the exact value when LRD exists, but it can be fooled easily by periodicity. As a case study, we analyze real round-trip time data. We find and remove a periodic component from the signal...
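The paper surveys several LRD estimators without reproducing their formulas; as one concrete, generic example, the sketch below implements the aggregated-variance (variance-time) estimator of the Hurst parameter. The block sizes and test data are chosen arbitrarily, and this should not be read as the estimator the authors recommend.

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes=None):
    """Estimate the Hurst parameter H with the aggregated-variance method.

    For each block size m, average the series over non-overlapping blocks and
    compute the variance of the block means; for LRD series this variance
    scales as m**(2H - 2), so H is read off the slope of a log-log fit.
    """
    x = np.asarray(x, dtype=float)
    if block_sizes is None:
        block_sizes = np.unique(
            np.logspace(0.5, np.log10(len(x) // 10), 20).astype(int))
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        if n_blocks > 1 and means.var() > 0:
            log_m.append(np.log(m))
            log_var.append(np.log(means.var()))
    slope = np.polyfit(log_m, log_var, 1)[0]
    return 1.0 + slope / 2.0

# Sanity check on white noise, which has no LRD (H should be near 0.5).
rng = np.random.default_rng(0)
print(hurst_aggregated_variance(rng.normal(size=100_000)))
```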

Network Traffic Modeling using a Multifractal Wavelet Model

Riedi, Rudolf H.; Crouse, Matthew; Ribeiro, Vinay Joseph; Baraniuk, Richard G.
Source: Rice University Publisher: Rice University
Type: Conference paper; Text
Portuguese
Search Relevance
35.68%
Conference Paper; In this paper, we describe a new multiscale model for characterizing positive-valued and long-range dependent data. The model uses the Haar wavelet transform and puts a constraint on the wavelet coefficients to guarantee positivity, which results in a swift O(N) algorithm to synthesize N-point data sets. We elucidate our model's ability to capture the covariance structure of real data, study its multifractal properties, and derive a scheme for matching it to real data observations. We demonstrate the model's utility by applying it to network traffic synthesis. The flexibility and accuracy of the model and fitting procedure result in a close match to the real data statistics (variance-time plots) and queuing behavior.
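Based on the abstract's description (a Haar transform plus a positivity constraint on the wavelet coefficients), the sketch below reproduces the usual multiplicative-cascade form of such a model; the symmetric-beta multiplier and its parameter are assumptions for illustration, not the fitted values from the paper.

```python
import numpy as np

def mwm_synthesize(n_scales, root=1.0, beta_p=2.0, seed=0):
    """Multifractal wavelet model (MWM)-style synthesis sketch using Haar.

    Wavelet coefficients are tied to scaling coefficients by d = A * c with
    A drawn from a symmetric beta on [-1, 1]; since |d| <= c, the inverse
    Haar recursion keeps every scaling coefficient (and hence the synthesized
    trace) non-negative, and the whole synthesis is O(N) for N output points.
    """
    rng = np.random.default_rng(seed)
    c = np.array([root])                      # coarsest scaling coefficient
    for _ in range(n_scales):
        A = 2.0 * rng.beta(beta_p, beta_p, size=c.size) - 1.0   # in [-1, 1]
        d = A * c                             # |d| <= c guarantees positivity
        left = (c + d) / np.sqrt(2.0)
        right = (c - d) / np.sqrt(2.0)
        c = np.empty(2 * c.size)
        c[0::2], c[1::2] = left, right
    return c                                  # N = 2**n_scales samples, >= 0

trace = mwm_synthesize(n_scales=14)           # 16384-point positive trace
print(trace.min() >= 0, trace.mean())
```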

Building Results Frameworks for Safety Nets Projects

Rubio, Gloria M.
Source: World Bank, Washington, DC Publisher: World Bank, Washington, DC
Type: Publications & Research :: Working Paper; Publications & Research
Portuguese
Search Relevance
35.66%
Results chains are useful tools to clarify safety nets programs' objectives, verify the program internal logic, and guide the selection of indicators. Although the recent trend has been to focus mostly on outcome indicators, indicators are needed at all levels of the results chain to better understand program performance. Ideally, safety nets performance monitoring systems should build upon reliable program records and be complemented with a combination of tailor-made data sources and national surveys. The latter should be considered not only for monitoring purposes but within a program evaluation agenda. In any case, it is very important that adequate resources and technical assistance are channeled to strengthen institutional capacity for safety nets results-based management.

The Bolocam Galactic Plane Survey: Survey Description and Data Reduction

Aguirre, James E.; Ginsburg, Adam G.; Dunham, Miranda K.; Drosback, Meredith M.; Bally, John; Battersby, Cara; Bradley, Eric Todd; Cyganowski, Claudia; Dowell, Darren; Evans II, Neal J.; Glenn, Jason; Harvey, Paul; Rosolowsky, Erik; Stringfellow, Guy S.;
Source: Cornell University Publisher: Cornell University
Type: Scientific Journal Article
Published on 02/11/2010 Portuguese
Search Relevance
35.65%
We present the Bolocam Galactic Plane Survey (BGPS), a 1.1 mm continuum survey at 33" effective resolution of 170 square degrees of the Galactic Plane visible from the northern hemisphere. The survey is contiguous over the range -10.5 < l < 90.5, |b| < 0.5 and encompasses 133 square degrees, including some extended regions |b| < 1.5. In addition to the contiguous region, four targeted regions in the outer Galaxy were observed: IC1396, a region towards the Perseus Arm, W3/4/5, and Gem OB1. The BGPS has detected approximately 8400 clumps over the entire area to a limiting non-uniform 1-sigma noise level in the range 11 to 53 mJy/beam in the inner Galaxy. The BGPS source catalog is presented in a companion paper (Rosolowsky et al. 2010). This paper details the survey observations and data reduction methods for the images. We discuss in detail the determination of astrometric and flux density calibration uncertainties and compare our results to the literature. Data processing algorithms that separate astronomical signals from time-variable atmospheric fluctuations in the data time-stream are presented. These algorithms reproduce the structure of the astronomical sky over a limited range of angular scales and produce artifacts in the vicinity of bright sources. Based on simulations...
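The BGPS cleaning algorithms themselves are described in the paper, not here; as a first-order illustration of separating astronomical signal from time-variable atmosphere, the sketch below removes a per-sample common-mode template from synthetic bolometer timestreams. The detector counts, noise levels, and source model are invented.

```python
import numpy as np

def remove_common_mode(timestreams):
    """Subtract a per-sample common-mode (atmospheric) template.

    timestreams: array of shape (n_detectors, n_samples). The atmosphere is
    assumed to be seen nearly identically by all bolometers at each instant,
    so the median across detectors at each sample is used as the template
    and removed from every detector. Real pipelines use iterated, PCA-like
    versions of this idea; this is only the first-order step.
    """
    ts = np.asarray(timestreams, dtype=float)
    template = np.median(ts, axis=0)          # one value per time sample
    return ts - template[None, :]

# Toy data: 50 detectors, a shared slow drift, and one detector with a "source".
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
atmosphere = 5.0 * np.sin(2 * np.pi * 0.7 * t)
data = atmosphere + 0.1 * rng.normal(size=(50, t.size))
data[17] += np.exp(-0.5 * ((t - 0.5) / 0.01) ** 2)   # compact source
cleaned = remove_common_mode(data)
print(cleaned[17].max())   # source survives; the common drift is removed
```

Note that common-mode removal of this kind also suppresses emission shared across detectors, i.e. structure on large angular scales, which is consistent with the abstract's caveat that the processed images reproduce the sky only over a limited range of angular scales.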

High Volume Computing: Identifying and Characterizing Throughput Oriented Workloads in Data Centers

Zhan, Jianfeng; Zhang, Lixin; Sun, Ninghui; Wang, Lei; Jia, Zhen; Luo, Chunjie
Source: Cornell University Publisher: Cornell University
Type: Scientific Journal Article
Portuguese
Search Relevance
35.64%
For the first time, this paper systematically identifies three categories of throughput oriented workloads in data centers: services, data processing applications, and interactive real-time applications, whose targets are to increase the volume of throughput in terms of processed requests or data, or supported maximum number of simultaneous subscribers, respectively, and we coin a new term high volume computing (in short HVC) to describe those workloads and data center computer systems designed for them. We characterize and compare HVC with other computing paradigms, e.g., high throughput computing, warehouse-scale computing, and cloud computing, in terms of levels, workloads, metrics, coupling degree, data scales, and number of jobs or service instances. We also preliminarily report our ongoing work on the metrics and benchmarks for HVC systems, which is the foundation of designing innovative data center computer systems for HVC workloads.; Comment: 10 pages

Strong-Motion Instrumental data on the San Fernando Earthquake of Feb. 9, 1971

Hudson, Donald E.
Source: California Institute of Technology Publisher: California Institute of Technology
Type: Report or Paper; PeerReviewed Format: application/pdf
Published on 01/01/1971 Portuguese
Search Relevance
35.66%
The San Fernando Earthquake of February 9, 1971 occurred virtually at the center of the Southern California strong-motion earthquake instrumentation network, and provided an unprecedented amount of valuable data on strong earthquake-generated ground motions. These data will be of key significance in interpreting the severe damage that occurred to many modern engineering structures, and mark a major development in the field of earthquake engineering. It was evident immediately after the event that the problems of recovering field records, of processing the information, and of disseminating the results as quickly and as widely as possible would severely tax the available resources. Fortunately, the close cooperation which had been built up over the years between the Seismological Field Survey of the U. S. Department of Commerce and the Earthquake Engineering Research Laboratory of the California Institute of Technology provided an operating group which could be quickly expanded to meet the challenge. In the days and weeks following the earthquake, each of these organizations issued numerous preliminary reports aimed at the quickest possible distribution of information. The present report updates and brings together a number of these initial releases...

Refinement of the Spitzer Space Telescope Pointing History Based on Image Registration Corrections from Multiple Data Channels

McCallon, Howard L.; Fowler, John W.; Laher, Russ R.; Masci, Frank J.; Moshir, Mehrdad
Source: Astronomical Society of the Pacific Publisher: Astronomical Society of the Pacific
Type: Article; PeerReviewed Format: application/pdf
Published on /11/2007 Portuguese
Search Relevance
35.69%
Position reconstruction for images acquired by the Infrared Array Camera (IRAC), one of the science instruments onboard the Spitzer Space Telescope, is a multistep procedure that is part of the routine processing done at the Spitzer Science Center (SSC). The IRAC instrument simultaneously images two different sky footprints, each with two independent infrared passbands (channels). The accuracy of the initial Spitzer pointing reconstruction is typically slightly better than 1". The well‐known technique of position matching imaged point sources to even more accurate star catalogs to refine the pointing further is implemented for SSC processing of IRAC data as well. Beyond that, the optimal processing of redundant pointing information from multiple instrument channels to yield an even better solution is also performed at the SSC. Our multichannel data processing approach is particularly beneficial when the star‐catalog matches are sparse in one channel but copious in others. A thorough review of the algorithm as implemented for the Spitzer mission reveals that the mathematical formalism can be fairly easily generalized for application to other astronomy missions. The computation of pointing uncertainties, the interpolation of pointing corrections and their uncertainties between measurements...
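The SSC's multichannel refinement involves more than a single weighted average, but the core statistical step can be illustrated as an inverse-variance combination of per-channel pointing corrections; the offsets and uncertainties below are invented for the example and are not Spitzer values.

```python
import numpy as np

def combine_corrections(offsets, sigmas):
    """Inverse-variance weighted combination of per-channel pointing offsets.

    offsets: (n_channels, 2) array of (d_ra, d_dec) corrections in arcsec.
    sigmas:  (n_channels,) 1-sigma uncertainties of each channel's solution
             (e.g. large when few catalog matches were available).
    Returns the combined offset and its formal uncertainty.
    """
    offsets = np.asarray(offsets, float)
    w = 1.0 / np.asarray(sigmas, float) ** 2
    combined = (w[:, None] * offsets).sum(axis=0) / w.sum()
    return combined, 1.0 / np.sqrt(w.sum())

# Example: channel 1 has copious matches (tight), channel 2 only a few (loose).
offsets = [[0.30, -0.10], [0.60, 0.20]]
sigmas = [0.05, 0.40]
print(combine_corrections(offsets, sigmas))
```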

A Computational package for the evaluation of centrifugal turbopumps

Knebel, Albert
Source: Rochester Institute of Technology Publisher: Rochester Institute of Technology
Type: Doctoral Thesis
Portuguese
Search Relevance
35.68%
Future space missions will require reusable engine platforms, which in some instances will necessitate deep-engine throttling thrust capability. This is accomplished through the off-design operation of the engine fuel pumps. At off-design conditions, there are additional losses associated with centrifugal turbopump operation, leading to inefficiencies and poor pump performance. Future turbopump designs will incorporate various design changes and innovations to address these inefficiencies. Therefore a means to accurately evaluate proposed designs over extended operating ranges is required. The Centrifugal Pump Analysis Code (CPAC) provides this means. CPAC is a one-dimensional meanline analysis code which provides design and off-design performance predictions based on pump geometry and operating conditions. CPAC is based on the Loss Isolation Code (LSISO), which was written in the early 1970's for NASA Lewis Research Center. The CPAC user interface is a menu-driven, treed format with an on-line user manual and user help screens. Several enhancements (additional pump elements, a node-based modeling scheme, individual or multiple element analysis, constant or variable fluid properties, English or SI unit input/output, a user-friendly interface incorporating various input options and online input editing, and graphical output analysis including test data comparison capability), along with online help and an accompanying user's manual, make CPAC a versatile tool for turbopump design and performance evaluation. Based on comparison to experimental test data...
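CPAC's loss models and element library are not given in the abstract; as a hedged reminder of the kind of meanline relation such a code is built around, the sketch below evaluates the Euler head of a centrifugal impeller with a fixed slip factor, using invented geometry and flow values rather than anything from CPAC.

```python
import math

def euler_head(rpm, r2_m, b2_m, beta2_deg, flow_m3s, slip=0.9, g=9.81):
    """Ideal meanline head rise of a centrifugal impeller (Euler equation).

    H = slip * U2 * Cu2 / g with zero inlet swirl, where U2 is the tip speed
    and Cu2 the tangential component of the absolute exit velocity derived
    from the exit blade angle and the meridional velocity. A real code such
    as CPAC layers incidence, friction, and recirculation losses on top of
    relations like this; the numbers below are purely illustrative.
    """
    omega = rpm * 2.0 * math.pi / 60.0
    U2 = omega * r2_m                                   # impeller tip speed
    Cm2 = flow_m3s / (2.0 * math.pi * r2_m * b2_m)      # meridional velocity
    Cu2 = U2 - Cm2 / math.tan(math.radians(beta2_deg))  # exit swirl velocity
    return slip * U2 * Cu2 / g

# Example: 20000 rpm, 60 mm tip radius, 6 mm exit width, 25 deg blade angle.
print(euler_head(20000, 0.060, 0.006, 25.0, flow_m3s=0.02))
```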

3D head motion, point-of-regard and encoded gaze fixations in real scenes: next-generation portable video-based monocular eye tracking

Munn, Susan M.
Source: Rochester Institute of Technology Publisher: Rochester Institute of Technology
Type: Dissertation
Portuguese
Search Relevance
35.68%
Portable eye trackers allow us to see where a subject is looking when performing a natural task with free head and body movements. These eye trackers include headgear containing a camera directed at one of the subject's eyes (the eye camera) and another camera (the scene camera) positioned above the same eye directed along the subject's line-of-sight. The output video includes the scene video with a crosshair depicting where the subject is looking -- the point-of-regard (POR) -- that is updated for each frame. This video may be the desired final result or it may be further analyzed to obtain more specific information about the subject's visual strategies. A list of the calculated POR positions in the scene video can also be analyzed. The goals of this project are to expand the information that we can obtain from a portable video-based monocular eye tracker and to minimize the amount of user interaction required to obtain and analyze this information. This work includes offline processing of both the eye and scene videos to obtain robust 2D PORs in scene video frames, identify gaze fixations from these PORs, obtain 3D head motion and ray trace fixations through volumes-of-interest (VOIs) to determine what is being fixated, when and where (3D POR). To avoid the redundancy of ray tracing a 2D POR in every video frame and to group these POR data meaningfully...
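The thesis's VOI geometry and head-pose pipeline are not reproduced here; as a minimal illustration of ray-tracing a fixation through volumes-of-interest, the sketch below tests a gaze ray against spherical VOIs. The scene layout and VOI names are invented for the example.

```python
import numpy as np

def first_voi_hit(origin, direction, vois):
    """Return the name of the nearest volume-of-interest hit by a gaze ray.

    origin, direction: 3D eye position and (unnormalized) gaze direction in
    the scene frame. vois: list of (name, center, radius) spherical volumes.
    Uses the standard ray-sphere intersection test; returns None on a miss.
    """
    o = np.asarray(origin, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    best = (None, np.inf)
    for name, center, radius in vois:
        oc = o - np.asarray(center, float)
        b = np.dot(oc, d)
        disc = b * b - (np.dot(oc, oc) - radius ** 2)
        if disc < 0:
            continue                       # ray misses this sphere
        t = -b - np.sqrt(disc)             # nearest intersection distance
        if 0 < t < best[1]:
            best = (name, t)
    return best[0]

vois = [("monitor", (0.0, 0.0, 1.0), 0.25), ("keyboard", (0.0, -0.4, 0.6), 0.15)]
print(first_voi_hit(origin=(0, 0, 0), direction=(0, -0.05, 1.0), vois=vois))
```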

An Investigation in printing from a remote field location using wireless communications

Beden, Brenda
Source: Rochester Institute of Technology Publisher: Rochester Institute of Technology
Type: Doctoral Thesis
Portuguese
Search Relevance
35.66%
Computers have changed the way our society works. Everyday life is somehow affected by a computer. It has changed the way many industries do their business. The business world is now a global community. The Graphic Arts industry has been impacted by these changes. With computers, documents are now found in digital form. Instead of being prepared by hand, they are compiled within the computer realm. By using modems, these documents can travel from one location and be printed at several different locations, even worldwide. As the computer evolves, it is also becoming more portable, so that our mobile society is not tied to one location. Along with this mobility, there is a strong trend toward communications that are also mobile. Wireless technologies are advancing at a rapid rate to keep up with customer demand. By combining these emerging technologies, is it possible that a person with some knowledge of computers and peripherals, desktop publishing, and digital photography can transmit documents using cellular communications from a field location to a digital press to produce a finished product? A digital camera was used to capture images for a test document. The images were then downloaded to a laptop computer. From there, changes were made to the images to fit the parameters of the final output device. Using Quark XPress, the test document was prepared. It included four of the images taken with the digital camera. When the document was complete, it was saved to a PostScript file. The transmission of the file was made possible by a PCMCIA fax/modem card installed in the portable computer. This was connected using a special cellular phone adapter with a Motorola Elite cellular phone. Two transmission tests were attempted. The first test used the internet as a means to connect to the file server at the Digital Publishing Center at Rochester Institute of Technology. The second test used Xerox proprietary software, Launch, to link the computer to the file server at Suttons' Printing in Grand Junction...

Enabling user-oriented data access in a satellite data portal

Kalyanam, Rajesh; Zhao, Lan; Park, Taezoon; Biehl, Larry; Song, Carol
Source: Grid Computing Environments (GCE) Publisher: Grid Computing Environments (GCE)
Type: Proceedings
Portuguese
Search Relevance
35.67%
This paper presents the design and implementation of a web-enabled platform for enhancing user data access experience in a satellite data portal. The platform integrates a number of user-oriented capabilities including the subscription, production, dissemination, and visualization of custom satellite remote sensing data hosted at the Purdue Terrestrial Observatory (PTO). Different from a traditional remote sensing system, the PTO data portal aims at enabling access to real-time satellite remote sensing data products for a large, diverse user community. Central to this system is a user-driven publish/subscribe model that empowers users with the ability to specify, control, and receive satellite data products without having to go through the time-consuming and error-prone manual process of configuring the system to generate the requested data. The user-oriented capabilities are implemented on top of a service-oriented architecture (SOA) backed by an existing satellite data receiving and processing backend. To control the processing of user data subscription requests, a workflow is created by leveraging the SOA interfaces and mechanisms. The user-oriented platform also embraces the recent Web 2.0 technologies in bringing the rich access...
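The portal's SOA workflow is only summarized in the abstract; the sketch below illustrates the user-driven publish/subscribe idea in its simplest form, matching an incoming satellite pass against registered subscriptions by product and bounding box. The data structures, names, and example values are assumptions for illustration, not the PTO implementation.

```python
from dataclasses import dataclass

@dataclass
class Subscription:
    user: str
    product: str             # e.g. "NDVI", "true-color"
    bbox: tuple              # (min_lon, min_lat, max_lon, max_lat)

@dataclass
class Pass:
    product: str
    bbox: tuple

def overlaps(a, b):
    """Axis-aligned bounding-box overlap test in lon/lat."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def match_subscribers(new_pass, subscriptions):
    """Return the users whose subscriptions match a newly ingested pass."""
    return [s.user for s in subscriptions
            if s.product == new_pass.product and overlaps(s.bbox, new_pass.bbox)]

subs = [Subscription("alice", "NDVI", (-88, 38, -84, 42)),
        Subscription("bob", "true-color", (-88, 38, -84, 42))]
incoming = Pass("NDVI", (-90, 35, -80, 45))
print(match_subscribers(incoming, subs))   # ['alice']
```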