ISSN: 2582-9734

Past Issue

Quantum State Mapping and Measurement Techniques: Foundations for Quantum Computing and Communication

Yudhvir, Dr. Vandana Yadav, Dr. B. Ramesh



This paper explores quantum state mapping techniques critical for understanding and utilizing quantum systems in computing and communication. It focuses on methods such as quantum state tomography, phase-space distributions, homodyne and heterodyne detection, weak measurements, and process tomography. These techniques allow for indirect yet precise reconstruction of quantum states, overcoming challenges such as wavefunction collapse and noise. Using synthetic data, the study demonstrates close alignment between theoretical probabilities and simulated outcomes, validating the reliability of quantum measurement protocols.
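The core idea of quantum state tomography is reconstructing a density matrix from measured expectation values. A minimal single-qubit sketch, using made-up expectation values rather than the paper's data, follows the standard identity rho = (I + <X>X + <Y>Y + <Z>Z) / 2:

```python
# Pauli matrices as 2x2 nested lists of complex numbers
I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]

def reconstruct(ex, ey, ez):
    """Rebuild a single-qubit density matrix from Pauli expectation values."""
    rho = [[0j, 0j], [0j, 0j]]
    for coeff, P in ((1.0, I), (ex, X), (ey, Y), (ez, Z)):
        for r in range(2):
            for c in range(2):
                rho[r][c] += 0.5 * coeff * P[r][c]
    return rho

# Example: the |+> state has <X> = 1, <Y> = 0, <Z> = 0,
# which reconstructs to [[0.5, 0.5], [0.5, 0.5]]
rho = reconstruct(1.0, 0.0, 0.0)
```

In practice the expectation values come from repeated measurements in the X, Y, and Z bases, which is exactly where the noise and collapse challenges mentioned above enter.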

Enhancing Contextual Emotion Recognition in Dialogues Using Transformer-Based Architectures

Mayuri Madhvi, Dr. Jitender Rai



This study highlights the effectiveness of transformer-based models in emotion recognition within dialogue-driven environments. By leveraging contextual dependencies across conversation turns, these models capture nuanced emotional expressions often missed by traditional approaches. Despite achieving high accuracy, the study stresses that accuracy alone may not reflect true model performance because of class imbalances and emotional subtleties.
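The abstract's caution about accuracy under class imbalance can be made concrete with a toy example (the labels below are illustrative, not the study's dataset): a classifier that always predicts the majority emotion scores high accuracy yet a poor macro-averaged F1, because it never recognizes the minority class.

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true label."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores, so rare classes count equally."""
    labels = set(y_true) | set(y_pred)
    f1s = []
    for lab in labels:
        tp = sum(t == lab and p == lab for t, p in zip(y_true, y_pred))
        fp = sum(t != lab and p == lab for t, p in zip(y_true, y_pred))
        fn = sum(t == lab and p != lab for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

y_true = ["neutral"] * 9 + ["anger"]   # 90/10 class imbalance
y_pred = ["neutral"] * 10              # majority-class baseline

acc = accuracy(y_true, y_pred)   # 0.9 despite learning nothing about "anger"
mf1 = macro_f1(y_true, y_pred)   # well below 0.5
```

This is why macro-averaged metrics are the usual complement to accuracy in emotion recognition benchmarks.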

Role of Blue Light and Photosensitizers in the Reduction of Foodborne Bacteria

Sajesh T, Dr. Nirmal Sharma



This research seeks to determine whether Escherichia coli O157, Staphylococcus aureus, and Salmonella enterica can be inactivated by combining blue light (405 nm) with the natural photosensitizers riboflavin and chlorophyll. The bacterial cultures were exposed to blue light and photosensitizers at concentrations of 5 µM for different durations (5, 10, 15, and 20 minutes). The results showed that the combined treatment substantially reduced bacterial populations: at 20 minutes, the maximal log reductions for E. coli, S. aureus, and S. enterica were 5.6, 5.8, and 5.4 log CFU/ml, respectively.
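The reported figures are log10 reductions in viable counts, so a 5.6-log reduction means the population fell by a factor of 10^5.6. A minimal check of the arithmetic, with illustrative CFU/ml counts rather than the paper's raw data:

```python
import math

def log_reduction(n_initial, n_final):
    """Log10 reduction between initial and surviving CFU/ml counts."""
    return math.log10(n_initial / n_final)

# e.g. 1e8 CFU/ml before treatment, with survivors reduced by 10**5.6 after 20 min
red = log_reduction(1e8, 1e8 / 10**5.6)
# red == 5.6 (within floating-point error)
```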

Strengthening Patient Data Privacy and System Efficiency with MEHR in Smart Health Monitoring

Aparna Datta, Dr. Narendra Chaudhari



Smart health monitoring technologies are advancing at a rapid pace, transforming patient care by allowing continuous health surveillance, individualized therapy, and real-time data collection. These advances, however, raise serious concerns about data privacy and system efficiency. The study takes a quantitative approach, simulating a hospital infrastructure to evaluate five major e-health schemes: IoT Healthcare 4.0, IoT Network Forensics, Health Insurance Barriers, Cloud Storage in e-Health, and MEHR. The metrics measured were transmission cost, encryption computing time, and decryption computing time. Using a standardized simulation platform that includes the PBC (pairing-based cryptography) library and Android-based applications, the results show that MEHR offers the best efficiency and data integrity in healthcare data exchange: it achieves the lowest decryption time and transmission cost, with only slightly higher encryption costs due to its robust security mechanisms. The results demonstrate that MEHR holds great promise as a secure and scalable option for EHRs in contemporary healthcare systems that prioritize patient privacy.
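The efficiency metrics named above (encryption and decryption computing time) are typically collected with a wall-clock harness around each cryptographic call. A minimal sketch follows; the XOR transform is a deliberately toy stand-in, not the MEHR scheme or any pairing-based primitive the study evaluates:

```python
import time

def xor_transform(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for an encrypt/decrypt step (XOR is its own inverse)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def timed(fn, *args):
    """Return (result, elapsed_seconds) for one call."""
    start = time.perf_counter()
    out = fn(*args)
    return out, time.perf_counter() - start

record = b"patient-record-" * 1000
key = b"secret"

ct, enc_time = timed(xor_transform, record, key)   # "encryption" time
pt, dec_time = timed(xor_transform, ct, key)       # "decryption" time
```

Averaging such timings over many records, alongside ciphertext size as a proxy for transmission cost, gives comparable per-scheme numbers of the kind the study reports.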

Integrating Data Mining and Seismic Modelling for Natural Hazard Assessment in Tectonic Zones

Mrudula Manish Gudadhe, Dr. Narendra Chaudhari



This research offers an integrated strategy that combines data mining approaches with seismic modeling to improve the evaluation of natural hazards in tectonically active zones. Conventional seismic models often fail to capture nonlinear relationships and dynamic tectonic activity, especially given the growing volume and complexity of geospatial data. To improve forecast accuracy and early warning capabilities, data mining can serve as a nonparametric analytical strategy for uncovering hidden patterns and correlations within vast, heterogeneous datasets. Using machine learning methods such as clustering, regression, and classification, data mining helps identify seismic precursors, estimate event magnitudes, and delineate high-risk zones. Combined with probabilistic and deterministic seismic models, this hybrid approach provides a more thorough understanding of potential hazards. The research focuses on dynamic risk mapping, multi-hazard interaction modeling, and real-time hazard monitoring, and further shows how these methods, taken together, can support stronger infrastructure and better policy choices. Despite obstacles such as data quality, computational demands, and the need for multidisciplinary collaboration, the integrated model holds great promise for enhancing disaster preparedness and risk mitigation in earthquake-prone regions. The findings point to data-driven, adaptable approaches as the way forward for seismic risk assessment.
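The clustering step mentioned above can be illustrated by grouping epicenter coordinates into spatial clusters that flag high-activity zones. This sketch runs plain k-means on two synthetic (lat, lon) swarms; a real pipeline would also weight magnitude, depth, and time, and would use a library implementation:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Basic k-means on (lat, lon) tuples; returns (centers, groups)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared Euclidean distance)
            i = min(range(k),
                    key=lambda i: (p[0] - centers[i][0]) ** 2 + (p[1] - centers[i][1]) ** 2)
            groups[i].append(p)
        # move each center to the mean of its group (keep old center if group is empty)
        centers = [
            (sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g)) if g else c
            for g, c in zip(groups, centers)
        ]
    return centers, groups

# Two synthetic epicenter swarms, one around (35, 139) and one around (28, 84)
swarm_a = [(35.0 + 0.01 * i, 139.0 - 0.01 * i) for i in range(10)]
swarm_b = [(28.0 + 0.01 * i, 84.0 + 0.01 * i) for i in range(10)]
centers, groups = kmeans(swarm_a + swarm_b, k=2)
# the two recovered centers sit near the two swarm centroids
```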

A Deep Learning-Based Approach for Robust Text Detection and Recognition in Natural Scene Images

Somnath Saha, Dr. Narendra Chaudhari



Scene text detection and recognition research has matured in recent years, with an increased focus on accuracy driven by the field's growing real-world relevance. Complicated backgrounds in natural scenes, variable font layouts, lighting changes, and similar factors make it difficult for existing methods to satisfy application needs. This study addresses text detection and recognition in complex natural scene images by introducing a highly effective deep learning approach. The proposed system uses a ResNet-based convolutional neural network (CNN) for deep visual feature extraction and a Bidirectional Long Short-Term Memory (BLSTM) network to capture contextual information in character sequences. The DenseNet architecture is used for text recognition because of its strong feature propagation and reuse, and a Softmax classifier performs the final character classification. The VOCdevkit and MSRA-TD500 datasets, partitioned into training, validation, and test sets in a 3:1:1 ratio, are used to train and assess the model. Evaluation metrics included F-measure, Precision, and Recall, and training ran for 50 epochs on an NVIDIA Tesla M40 GPU with Keras and TensorFlow. Experimental findings show that the approach outperforms current methods on the benchmark datasets ICDAR 2011 and ICDAR 2013. In addition, DenseNet-based recognition reached a peak accuracy of 94%, demonstrating that the system is reliable and resilient in recognizing English and Chinese characters across a variety of challenging visual environments.
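The final classification stage described above maps per-character scores to class probabilities with a Softmax function. A minimal stand-alone sketch; the logits below are made-up values, not the model's output:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)                            # subtract max to avoid overflow
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical logits for three character classes, e.g. ("a", "b", "c")
probs = softmax([2.0, 1.0, 0.1])
pred = max(range(len(probs)), key=probs.__getitem__)   # index of predicted character
```

In the full pipeline this runs over every character position in the sequence emitted by the BLSTM, with one logit per character class.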

Call For Papers: August 2025

Publication: 31-August-2025