Super-resolution Imaging via Uncertainty Models
Kathiravan S, Kanakaraj J  Korea Academia-Industrial Cooperation Society  2011 SmartCR Vol.1 No.1
Super-resolution is the process of converting many similar low-resolution images or video sequences into a single high-resolution image. Super-resolution can be used for superior automatic target or object detection. An uncertainty is always associated with this process, yet it is rarely considered during analysis. In many cases, the uncertainties are noisy signals, and processing them leads to invalid results. This paper reviews selected super-resolution methodologies and presents two uncertainty-model-based super-resolution techniques. The presented models account for the uncertainty factor and establish an absolute probability distribution. The first model is based on analytical computation, while the second is based on a statistical algorithm. These algorithms are presented as separate methods for image analysis, and they improve performance compared with existing automatic object recognition super-resolution methods.
An Overview of SR Techniques Applied to Images, Videos and Magnetic Resonance Images
Kathiravan S, Kanakaraj J  Korea Academia-Industrial Cooperation Society  2014 SmartCR Vol.4 No.3
This paper summarizes the statistical theory underlying image reconstruction and discusses super-resolution as an inverse problem. It then surveys the contributions of various techniques for improving the spatial resolution of images, video, and magnetic resonance images using super-resolution methods, and presents the major findings of the study. Additionally, it gives a glimpse of the numerous factors influencing the performance of super-resolution.
A Review of Magnetic Resonance Imaging Techniques
Kathiravan S, Kanakaraj J  Korea Academia-Industrial Cooperation Society  2013 SmartCR Vol.3 No.5
Magnetic resonance imaging (MRI) is the preferred imaging modality for visualization of intracranial soft tissues. Surgical planning and, increasingly, surgical navigation use high-resolution three-dimensional patient-specific structural maps of the brain. Such maps are commonly provided by pre-operatively acquired MRI scans, stereotactically co-registered with a rigid frame attached to the patient's head. This paper illustrates magnetic resonance techniques such as fast MRI, spiral imaging, and parallel imaging. Subsequently, image quality evaluation techniques for magnetic resonance images and perceptual difference models are discussed.
Kathiravan Srinivasan, Chuan-Yu Chang, Chao-Hsi Huang, Min-Hao Chang, Anant Sharma, Avinash Ankur  Korea Information Processing Society  2018 Journal of information processing systems Vol.14 No.4
Rapid advances in science and technology, with the exponential development of smart mobile devices, workstations, supercomputers, smart gadgets, and network servers, have been witnessed over the past few years. The sudden increase in the Internet population and the manifold growth in Internet speeds have occasioned the generation of an enormous amount of data, now termed 'big data'. Given this scenario, storage of data on local servers or a personal computer is an issue, which can be resolved by utilizing cloud computing. At present, there are several cloud computing service providers available to resolve the big data issues. This paper establishes a framework that builds Hadoop clusters on the new single-board computer (SBC) Mobile Raspberry Pi. Moreover, these clusters offer facilities for storage as well as computing. Regular data centers require large amounts of energy for operation, need cooling equipment, and occupy prime real estate. However, this energy consumption and these physical space constraints can be addressed by employing a Mobile Raspberry Pi with Hadoop clusters, which provides a cost-effective, low-power, high-speed solution along with micro-data center support for big data. Hadoop provides the required modules for the distributed processing of big data by deploying map-reduce programming approaches. In this work, the performance of SBC clusters and a single computer were compared. The experimental data show that the SBC clusters outperform a single computer by around 20%. Furthermore, the cluster processing speed for large volumes of data can be enhanced by increasing the number of SBC nodes. Data storage is accomplished using the Hadoop Distributed File System (HDFS), which offers more flexibility and greater scalability than a single computer system.
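The map-reduce programming approach the abstract mentions can be sketched in miniature. The following is a single-process Python illustration of the map and reduce phases (function names and sample records are invented for illustration; real Hadoop distributes these phases across HDFS-backed nodes):

```python
from collections import defaultdict

def map_phase(records):
    # Mapper: emit a (word, 1) pair for every word in each input record.
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle/reduce: group pairs by key and sum the counts per key.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

records = ["big data needs distributed processing",
           "Hadoop processes big data"]
word_counts = reduce_phase(map_phase(records))
print(word_counts["big"])   # 2
print(word_counts["data"])  # 2
```

In a real Hadoop cluster the mapper and reducer run as separate tasks on different nodes, with the framework handling the shuffle between them; the data flow, however, is the same as in this sketch.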
Super-resolution Techniques based on Temporal Recursion using Optimal and Real Parameters
S. Kathiravan, J. Kanakaraj  Korea Academia-Industrial Cooperation Society  2014 SmartCR Vol.4 No.1
The major goal of a super-resolution image reconstruction method is to construct a single detailed high-resolution image from a group of numerous low-resolution images of the scene taken from diverse positions. Since every low-resolution image retains a different view of the scene, it is possible to reconstruct a detailed high-resolution image. Image super-resolution is thus a key to overcoming the physical limits of hardware capability. An enormous amount of video is still in traditional formats or at an even lower resolution, and some of it also has severe coding artifacts. Hence, there is a need for techniques that can improve video quality and display all traditional and low-resolution videos on panels with high-resolution grids. Iterative super-resolution reconstruction algorithms can accomplish this demanding task by using internal image models and an incorporated feedback loop to control output quality, thereby improving resolution and lessening artifacts. This article surveys the prospects of iterative reconstruction algorithms and establishes a new super-resolution algorithm that is computationally efficient and robust against motion estimation errors.
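The abstract does not give the algorithm's equations, but the incorporated feedback loop it describes resembles classical iterative back-projection. Below is a minimal NumPy sketch under that assumption: simple block averaging stands in for the camera's downsampling model, and the loop repeatedly back-projects the low-resolution residual (all parameters are illustrative, not from the paper):

```python
import numpy as np

def downsample(img, factor):
    # Simulated imaging model: average-pool factor x factor blocks.
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def iterative_backprojection(low_res, factor=2, iters=50, step=1.0):
    # Start from a blank estimate; the feedback loop drives the estimate
    # toward consistency with the observed low-resolution image.
    h, w = low_res.shape
    high_res = np.zeros((h * factor, w * factor))
    ones = np.ones((factor, factor))
    for _ in range(iters):
        residual = low_res - downsample(high_res, factor)  # error in LR space
        high_res += step * np.kron(residual, ones) / factor**2  # back-project
    return high_res

rng = np.random.default_rng(0)
truth = rng.random((8, 8))
lr = downsample(truth, 2)
sr = iterative_backprojection(lr, factor=2)
# The reconstruction is consistent with the observation model:
print(np.allclose(downsample(sr, 2), lr, atol=1e-4))  # True
```

Real super-resolution methods fuse several shifted low-resolution frames and use a more faithful blur/warp model; this single-frame loop only illustrates the feedback structure.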
Mathur Nadarajan Kathiravan, Keun-Ho Kim, Jae-Won Ryu, Pyung-Il Kim, Chul-Won Lee, Si-Wouk Kim  The Microbiological Society of Korea  2015 The journal of microbiology Vol.53 No.11
In this study, novel DNA extraction and purification methods were developed to obtain high yields of reliable-quality DNA from the microbial community of agricultural yellow loess soil samples. The efficiencies of five different soil DNA extraction protocols were evaluated on the basis of DNA yield, quality, and DNA shearing. Our suggested extraction method, which used CTAB, EDTA, and cell-membrane lytic enzymes in the extraction followed by DNA precipitation using isopropanol, yielded a maximum DNA content of 42.28 ± 5.59 μg/g soil. In addition, among the five different purification protocols, the acid-treated polyvinylpolypyrrolidone (PVPP) spin-column purification method yielded high-quality DNA and recovered 91% of the crude DNA. Spectrophotometry revealed that the ultraviolet A260/A230 and A260/A280 absorbance ratios of the purified DNA were 1.82 ± 0.03 and 1.94 ± 0.05, respectively. PCR-based 16S rRNA amplification showed clear bands at ~1.5 kb with acid-treated PVPP-purified DNA templates. In conclusion, our suggested extraction and purification protocols can be used to recover high-concentration, high-purity, high-molecular-weight DNA from clay- and silica-rich agricultural soil samples.
Mathur Nadarajan Kathiravan, Keun-Ho Kim, Jae-Won Ryu, Ki-Hwan Han, Si-Wouk Kim  The Korean Society for Biotechnology and Bioengineering  2015 Biotechnology and Bioprocess Engineering Vol.20 No.2
A novel hexavalent chromium (Cr(VI))-removing Bacillus sp. was isolated from leather-industry wastewater-contaminated soil. This isolate was subjected to Cr(VI) removal in free and immobilized states in a stirred batch reactor (SBR). Two biokinetic parameters, Vmax and Km, were calculated from Lineweaver-Burk and Eadie-Hofstee plots, and the effective diffusivity (De) was determined for various bead sizes. With respect to bead size, De decreased significantly from a maximum of 3.024 × 10−6 cm2/sec in the 0.20 cm bead to 2.948 × 10−6, 1.775 × 10−7, and 1.144 × 10−7 cm2/sec in the 0.40, 0.60, and 0.80 cm beads, respectively. Additionally, steady-state and unsteady-state modeling of diffusional mass transfer into the immobilized beads was conducted to determine the mass transfer rate as a function of time and the beads' radial profile. Furthermore, the space-time yield (STY) was modeled as a function of the reactor's residence time. The reactor's STY was reasonable and could be further boosted by increasing the biocatalyst fraction.
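The Lineweaver-Burk estimation of Vmax and Km mentioned above can be illustrated with synthetic Michaelis-Menten data (the kinetic constants and substrate concentrations below are invented for illustration, not taken from the paper):

```python
import numpy as np

# Michaelis-Menten rate law: v = Vmax * S / (Km + S)
Vmax_true, Km_true = 2.0, 0.5                   # illustrative constants
S = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])   # substrate concentrations
v = Vmax_true * S / (Km_true + S)               # noiseless rates

# Lineweaver-Burk linearization: 1/v = (Km/Vmax) * (1/S) + 1/Vmax,
# so a straight-line fit of 1/v against 1/S recovers both parameters.
slope, intercept = np.polyfit(1.0 / S, 1.0 / v, 1)
Vmax_est = 1.0 / intercept
Km_est = slope * Vmax_est
print(round(Vmax_est, 3), round(Km_est, 3))  # 2.0 0.5
```

The Eadie-Hofstee plot (v against v/S) is an alternative linearization of the same rate law and, on noiseless data, recovers the same Vmax and Km.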
Brindha Appavu, Kathiravan Kannan, Sivakumar Thiripuranthagan  The Korean Society of Industrial and Engineering Chemistry  2016 Journal of Industrial and Engineering Chemistry Vol.36 No.-
We report a new photocatalytic material to address the twin issues of titania-based photocatalysts, namely recombination of electron/hole pairs and visible-light inactivity. A template-free hydrothermal method was adapted to synthesize mesoporous nitrogen-doped reduced graphene oxide/titania composite catalysts (N-[x% rGO-TiO2], where x = 2.5, 5, 7.5, and 10%). Detailed physicochemical studies of the catalysts were done and correlated with their photocatalytic activities. Simultaneous nitrogen doping on both reduced graphene oxide and titania was confirmed by X-ray photoelectron spectroscopy (XPS). The photocatalytic activities of the synthesized composite catalysts were evaluated towards the photodegradation of the organic dyes methylene blue and congo red under visible-light irradiation. Among them, the composite N-[5% rGO/TiO2] showed the highest photocatalytic activity. Doping of nitrogen makes graphene a better charge transporter, whereas nitrogen in the TiO2 lattice makes it visible-light active (red shift).
A Review of Manchester, Miller, and FM0 Encoding Techniques
Lalitha V, Kathiravan S  Korea Academia-Industrial Cooperation Society  2014 SmartCR Vol.4 No.6
Encoding techniques are becoming increasingly important in communication. Techniques such as Miller, Manchester, and FM0 encoding are used in various applications, each with operating principles suited to its needs, and each encoding scheme should be applied without losing any of its defining parameters. A finite state machine can implement all of these encodings, since each input bit immediately yields the corresponding output, which increases encoding speed. A fully reused VLSI architecture for the FM0, Manchester, and Miller encoders reduces the transistor count while maintaining DC balance. Xilinx simulation results confirm correct operation.
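The state-machine view of these line codes can be sketched as follows. The Python below models Manchester and FM0 encoding at the bit level (polarity conventions vary between standards; the mappings here are one common choice, shown for illustration only):

```python
def manchester_encode(bits):
    # One common Manchester convention: 0 -> (1,0), 1 -> (0,1).
    # Every bit contains a mid-bit transition, so the output is DC-balanced.
    out = []
    for b in bits:
        out += [0, 1] if b else [1, 0]
    return out

def fm0_encode(bits, level=1):
    # FM0 (bi-phase space): the line level inverts at every bit boundary;
    # a 0 bit adds a second inversion at mid-bit, a 1 bit does not.
    # `level` is the line state carried between bits -- the FSM state.
    out = []
    for b in bits:
        level ^= 1                 # boundary transition (always)
        first = level
        if b == 0:
            level ^= 1             # mid-bit transition only for a 0 bit
        out += [first, level]
    return out

print(manchester_encode([1, 0, 1]))  # [0, 1, 1, 0, 0, 1]
print(fm0_encode([1, 0, 1]))         # [0, 0, 1, 0, 1, 1]
```

Because FM0's output for each bit depends on the level left by the previous bit, while Manchester's does not, a shared hardware implementation keeps the line level as its single state variable, which is what enables the fully reused architecture the abstract describes.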