On the Optimum Architecture of the Biologically Inspired Hierarchical Temporal Memory Model Applied to the Hand-Written Digit Recognition
In this paper, we describe the basic functions of the Hierarchical Temporal Memory (HTM) network, which is based on a novel biologically inspired model of the large-scale structure of the mammalian neocortex. The focus of this paper is a systematic exploration of how to optimize the important controlling parameters of the HTM model applied to the classification of hand-written digits from the USPS database. The statistical properties of this database are analyzed using the permutation test, which employs a randomization distribution of the training and testing data. Based on the notion of homogeneous usage of the input image pixels, a methodology for HTM parameter optimization is proposed. In order to study the effects of two substantial architectural parameters, the patch size and the overlap, in more detail, we have restricted ourselves to single-level HTM networks. A novel method for constructing the training sequences by ordering series of static images is developed. A novel method for estimating the parameter maxDist, based on the box-counting method, is proposed. The parameter sigma of the inference Gaussian is optimized by maximizing the entropy of the belief distribution. Both optimization algorithms can equally be applied to multi-level HTM networks. The influence of the parameters transitionMemory and requestedGroupCount on the HTM network performance has also been explored. Altogether, we have investigated 2736 different HTM network configurations. The obtained classification accuracy results have been benchmarked against the published results of several conventional classifiers.
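To make the box-counting step concrete, the sketch below shows a generic way to count how many hypercubes of side epsilon are occupied by a set of training patterns; how the resulting curve is mapped onto a value of maxDist follows the procedure described in the paper. This is an illustrative Python sketch only, and the function names and random patches are hypothetical rather than taken from the USPS experiments.

```python
import numpy as np

def box_count(points, epsilon):
    """Number of hypercubes of side `epsilon` occupied by `points`
    (rows are patterns, e.g. flattened image patches)."""
    cells = np.floor(points / epsilon).astype(int)
    return len({tuple(c) for c in cells})

def box_counting_curve(points, epsilons):
    """Occupied-box counts N(eps) for a range of box sizes."""
    return np.array([box_count(points, e) for e in epsilons])

# Illustrative usage on random 4x4 binary patches (not USPS data).
rng = np.random.default_rng(0)
patches = rng.integers(0, 2, size=(500, 16)).astype(float)
print(box_counting_curve(patches, np.array([0.25, 0.5, 1.0, 2.0])))
```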
Generic System for Remote Testing and Calibration of Measuring Instruments: Security Architecture
Testing and calibration of laboratory instruments and reference standards is a routine activity and a resource- and time-consuming process. Since many modern instruments include communication interfaces, it is possible to create a remote calibration system. This approach addresses a wide range of possible applications and makes it possible to drive a number of different devices. On the other hand, the remote calibration process involves a number of security issues arising from the recommendations specified in the ISO/IEC 17025 standard, since it is not under the full control of the calibration laboratory personnel who will sign the calibration certificate. This implies that the traceability and integrity of the calibration process depend directly on the collected measurement data. Reliable and secure remote control and monitoring of instruments is therefore a crucial aspect of an internet-enabled calibration procedure.
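As one illustration of the kind of data-integrity measure such a security architecture relies on, the sketch below signs a measurement record with an HMAC so that tampering can be detected before the data are used for a calibration certificate. This is a generic Python sketch under assumed conditions (a pre-shared key and a JSON record format); it does not reproduce the actual protocol of the described system.

```python
import hashlib
import hmac
import json

# Shared secret between the instrument-side agent and the calibration
# laboratory (hypothetical; the system's real key management is not shown).
SECRET_KEY = b"example-shared-secret"

def sign_record(record: dict) -> str:
    """Attach an HMAC-SHA256 tag so tampering with measurement data is detectable."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def verify_record(record: dict, tag: str) -> bool:
    """Check the tag before the data are used for a calibration certificate."""
    return hmac.compare_digest(sign_record(record), tag)

measurement = {"instrument": "DMM-01", "quantity": "V", "value": 9.99987}
tag = sign_record(measurement)
print(verify_record(measurement, tag))  # True unless the record was altered
```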
A Method for ADC Error Testing and its Compensation in Ratiometric Measurements
Errors induced in ratiometric measurements are discussed, and a simplified compensation method is proposed to reduce the various static errors of the ADC, voltage reference errors in ratiometry, and resistance mismatch errors. Curve fitting is performed on the error samples, and the system is modelled in comparison with an ideal system. The static and other ratiometric errors modelled in this way are combined into a corrective equation referenced to an errorless system. Implementation of the proposed method for a resistance measurement system is discussed and analyzed. This paper also discusses the use of the proposed system with successive-approximation ADCs for ratiometric measurement operations, as opposed to the conventional requirement of dual-slope ADCs for the same purpose.
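A minimal Python sketch of the curve-fitting idea is given below, assuming the error is sampled at known reference points and approximated by a low-order polynomial; the reference values, the simulated error shape, and the function names are hypothetical and only illustrate the general correction scheme, not the paper's specific corrective equation.

```python
import numpy as np

# Illustrative calibration data (hypothetical): ADC ratio readings taken at
# known reference resistances, so the static/ratiometric error can be sampled.
ideal = np.linspace(0.1, 0.9, 9)                # ideal ratios at the references
measured = ideal + 0.002 * ideal**2 - 0.0008    # readings with a made-up error

# Fit a low-order polynomial to the sampled error (measured - ideal).
error_model = np.polyfit(measured, measured - ideal, deg=2)

def correct(reading):
    """Apply the fitted corrective equation to a raw ratio reading."""
    return reading - np.polyval(error_model, reading)

print(correct(0.55))  # corrected ratio for a raw reading of 0.55
```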
Counting Method for Measuring and Linearity Checking Photometry Devices
The possibility of calibrating and checking the linearity of optical detectors over a wide dynamic range by the method of counting separate portions of energy is investigated. The counting method permits a nonselective change of the measured optical pulse energy by more than 10^5 times. The construction of a working radiation source based on a GaAs LED is described. Results of the experimental investigation are presented.
Prediction of Spirometric Forced Expiratory Volume (FEV1) Data Using Support Vector Regression
In this work, prediction of forced expiratory volume in one second (FEV1) in pulmonary function testing is carried out using a spirometer and support vector regression analysis. Pulmonary function data are measured with a flow-volume spirometer from volunteers (N=175) using a standard data acquisition protocol. The acquired data are then used to predict FEV1. Support vector machines with polynomial kernel functions of four different orders were employed to predict the values of FEV1. The performance is evaluated by computing the average prediction accuracy for normal and abnormal cases. Results show that support vector machines are capable of predicting FEV1 in both normal and abnormal cases, and that the average prediction accuracy for normal subjects was higher than that for abnormal subjects. Prediction accuracy was found to be high for a regularization constant of C=10. Since FEV1 is the most significant parameter in the analysis of spirometric data, it appears that this method of assessment is useful for diagnosing pulmonary abnormalities from incomplete or poorly recorded data.
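The regression setup can be sketched as follows, assuming tabular spirometric features as inputs. The sketch uses scikit-learn's polynomial-kernel SVR with C=10 and the four kernel orders mentioned above, but the placeholder data and feature dimensionality are hypothetical, since the actual N=175 dataset is not reproduced here.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the spirometric features and FEV1 targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(175, 5))
y = 2.5 + X @ rng.normal(size=5) * 0.3 + rng.normal(scale=0.1, size=175)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Polynomial-kernel SVR for four kernel orders, regularization constant C=10.
for degree in (1, 2, 3, 4):
    model = make_pipeline(StandardScaler(),
                          SVR(kernel="poly", degree=degree, C=10))
    model.fit(X_train, y_train)
    print(degree, round(model.score(X_test, y_test), 3))  # R^2 on held-out data
```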
Inertia Compensation While Scanning Screw Threads on Coordinate Measuring Machines
The use of scanning coordinate-measuring machines for the inspection of screw threads has become common practice. Compared to touch-trigger probing, scanning capabilities allow the measuring process to be sped up while still maintaining high accuracy. However, in some cases the accuracy depends strongly on the scanning speed. In this paper, a compensation method is proposed that reduces the influence of the inertia of the probing system while scanning screw threads on coordinate-measuring machines.