Summary Isothermal titration microcalorimeters are immersed in a thermostat held at constant temperature, so in principle the baseline should not vary; in practice, however, measurements show that in some cases a jump in the baseline occurs after the liquid injection. This paper studies the origin of this baseline variation in order to avoid it where possible, or else to correct it, and thus to determine with minimum error the amounts of energy or power developed in the process under study. The experimental results presented support the hypothesis that the baseline jump in an isothermal microcalorimeter is caused by a change in the thermal coupling between the contents of the mixing cell and the stirrer shaft, which is in turn coupled to a region at a temperature slightly lower (in this case) than that of the thermostat. The jump is independent of the dissipation and always has the same sign. Its magnitude varies, depending mainly on the level reached by the liquid in the cell and on the position of the stirrer within the cell, which can change during handling.
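A step of constant sign superimposed on an otherwise flat baseline can be corrected numerically. The following sketch is illustrative only, not the authors' procedure; the function name, window lengths, and settling time are all hypothetical. It estimates the step height from averaging windows before and after the injection transient and subtracts it from the post-injection record:

```python
import numpy as np

def correct_baseline_jump(t, p, t_inj, settle=50.0, window=50.0):
    """Remove a constant baseline step appearing after an injection.

    t, p   : time (s) and measured power (uW) arrays
    t_inj  : injection time (s)
    settle : time (s) allowed for the injection transient to decay
    window : length (s) of the averaging window on each side
    """
    pre = (t > t_inj - window) & (t < t_inj)
    post = (t > t_inj + settle) & (t < t_inj + settle + window)
    jump = p[post].mean() - p[pre].mean()  # estimated step height
    corrected = p.copy()
    corrected[t >= t_inj] -= jump          # shift post-injection data back
    return corrected, jump
```

On a synthetic record with a -0.3 uW step at t = 200 s, the routine recovers the step height and restores a flat baseline.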
Authors: Malgorzata Wislowska, Dominik P. J. Heib, Hermann Griessenberger, Kerstin Hoedlmoser, and Manuel Schabus
In this paper, we sought to explore the influence of baseline memory performance (BMP) – i.e., a measure of encoding efficiency of new declarative material – on forgetting or the
Various types of transformations require different baselines reflecting the specifics of these transitions. The present work deals with the case in which the degree of transformation is directly proportional to the heat consumed or released. For this case, the baseline is called an integral baseline and is traditionally constructed with unnecessary simplifications. A fast and robust computational alternative for baseline construction is proposed, based on interpolating cubic splines. The method is self-consistent in the sense that it is free of needless assumptions and preserves linearity between the degree of transformation and the heat measured.
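As a minimal illustration of the spline idea (a sketch under assumptions, not the paper's self-consistent algorithm: the anchor temperatures are supplied by the user rather than determined iteratively), a cubic-spline baseline can be drawn through regions where no transformation occurs, and the degree of transformation obtained from the cumulative excess heat:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def spline_baseline(T, signal, anchors):
    """Cubic-spline baseline through user-chosen anchor temperatures.

    T, signal : temperature and heat-flow arrays (uniform T grid assumed)
    anchors   : temperatures outside the transformation peak where the
                signal is assumed to lie on the baseline
    """
    y = np.interp(anchors, T, signal)         # signal values at the anchors
    base = CubicSpline(anchors, y)(T)         # interpolating cubic spline
    excess = signal - base                    # heat flow above the baseline
    alpha = np.cumsum(excess) / excess.sum()  # degree of transformation
    return base, alpha
```

With a Gaussian peak on a linear background, the recovered baseline matches the background and alpha rises from 0 to 1, passing through roughly 0.5 at the peak centre.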
A collection of coauthored papers is the new norm for doctoral dissertations in the natural and biomedical sciences, yet there
is no consensus on how to partition authorship credit between PhD candidates and their coauthors. Guidelines for PhD programs
vary but tend to specify only a suggested range for the number of papers to be submitted for evaluation, sometimes supplemented
with a requirement for the PhD candidate to be the principal author on the majority of submitted papers. Here I use harmonic
counting to quantify the actual amount of authorship credit attributable to individual PhD graduates from two Scandinavian
universities in 2008. Harmonic counting corrects for the inherent inflationary and equalizing biases of routine counting methods,
thereby allowing the bibliometrically identifiable amount of authorship credit in approved dissertations to be analyzed with
unprecedented accuracy. Unbiased partitioning of authorship credit between graduates and their coauthors provides a post hoc
bibliometric measure of current PhD requirements, and sets a de facto baseline for the requisite scientific productivity of
these contemporary PhDs at a median value of approximately 1.6 undivided papers per dissertation. Comparison with previous
census data suggests that the baseline has shifted over the past two decades as a result of a decrease in the number of submitted
papers per candidate and an increase in the number of coauthors per paper. A simple solution to this shifting baseline syndrome
would be to benchmark the amount of unbiased authorship credit deemed necessary for successful completion of a specific PhD
program, and then monitor for departures from this level over time. Harmonic partitioning of authorship credit also facilitates
cross-disciplinary and inter-institutional analysis of the scientific output from different PhD programs. Juxtaposing bibliometric
benchmarks with current baselines may thus assist the development of harmonized guidelines and transparent transnational quality
assurance procedures for doctoral programs by providing a robust and meaningful standard for further exploration of the causes
of intra- and inter-institutional variation in the amount of unbiased authorship credit per dissertation.
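Harmonic counting has a closed form: the i-th of N coauthors receives credit (1/i) / (1 + 1/2 + ... + 1/N), so shares decline with byline rank and sum to one paper. A minimal sketch (function names are illustrative, not from the paper):

```python
def harmonic_credit(n_authors):
    """Harmonic authorship credit for each byline position.

    The i-th of N coauthors receives (1/i) / (1/1 + 1/2 + ... + 1/N),
    so credit decreases with rank and sums to exactly one paper.
    """
    norm = sum(1.0 / j for j in range(1, n_authors + 1))
    return [(1.0 / i) / norm for i in range(1, n_authors + 1)]

def graduate_credit(papers):
    """Total harmonic credit for one graduate.

    papers: list of (byline_rank, n_authors) tuples, one per paper.
    """
    return sum(harmonic_credit(n)[rank - 1] for rank, n in papers)
```

For a first author among three, the share is (1/1)/(1 + 1/2 + 1/3) = 6/11, about 0.545; a graduate's total credit is then summed over the dissertation's papers and can be compared against the median reported above.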
Authors: S. Alcay, C. Inal, C. Yigit, and M. Yetkin
Nowadays, GPS is the best-established positioning system thanks to its full constellation, but with recent launches the number of GLONASS satellites has also reached the number required for positioning. With the recent revitalization of GLONASS, a great number of high-precision GLONASS and GPS/GLONASS receivers have been produced. In this paper, the baselines of two networks are analyzed in order to assess the usability of GLONASS for global positioning. In both networks, the repeatability of the results was investigated using GPS, GLONASS, and combined GPS/GLONASS data. The results reveal that the repeatability of the baselines obtained from GLONASS-only observations is not consistent with that of GPS and GPS/GLONASS.
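Baseline repeatability is commonly summarized as the scatter of repeated session solutions of each baseline component about their mean. The sketch below uses the conventional sample standard deviation; this is an assumption, not necessarily the exact statistic used in the paper:

```python
import math

def repeatability(values):
    """Sample standard deviation of repeated baseline-component
    estimates (e.g. daily solutions of the east component, in mm)
    about their mean -- a common repeatability measure."""
    n = len(values)
    mean = sum(values) / n
    return math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
```

For three east-component solutions of 10.0, 12.0, and 11.0 mm, the repeatability is 1.0 mm; comparing such values per constellation (GPS, GLONASS, GPS/GLONASS) quantifies the consistency discussed above.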
A simple method is described for use with the differential scanning calorimeter for baseline interpolation in continual processes over a wide temperature interval. For the process of water desorption from the synthetic zeolite LiA with coverage q = 1.5, the measured heat of desorption was Q = 11.2 ± 0.5 kcal/mole.
Authors: V. García-Cuello, J. Moreno-Pirajan, L. Giraldo-Gutiérrez, K. Sapag, and G. Zgrablich
This work shows the results obtained in determining the noise in the baseline of a specially designed Tian-Calvet-type adsorption microcalorimeter. The results show that noise levels vary from 0.5 to 10 μV, evaluated by varying the electrical work and the temperature of the microcalorimeter's surroundings. Relationships can be seen between the variables employed in the observation of stability, temperature, power levels, and the noise generated.
Authors: Eric Iversen, Magnus Gulbrandsen, and Antje Klitkou
As the commercialization of academic research has risen as a target area in many countries, the need for better empirical
data collection to evaluate policy changes on this front has increasingly been recognized. This need is exemplified in the
Norwegian case where legislative changes went into effect in 2003 expressly to encourage greater commercialization through
patenting research results. This policy ambition faces the problem that no record of the patenting activity of academic researchers
is available before 2003 when the country’s “professor’s privilege” was phased out. This article addresses the fundamental
difficulty of how to empirically test the effect of such policy aims. It develops a methodology that can be used to establish a reliable baseline for changes in the extent and focus of academic patents. The purpose is to describe the empirical approach and results,
while also providing insight into the changes in Norwegian policy on this front and their context.