Search alternatives:
preprocessing process » preprocessing steps, preprocessing phase
process involves » process involved, processes involved, proteins involved
-
61
Performance of various methods on MERFISH.
Published 2025. “…To address these challenges, we propose a novel deep-learning method called DMGCN for domain identification. The process begins with preprocessing that constructs two types of graphs: a spatial graph based on Euclidean distance and a feature graph based on Cosine distance. …”
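The graph-construction step quoted in this result lends itself to a short illustration. The sketch below builds a spatial k-nearest-neighbour graph from spot coordinates (Euclidean distance) and a feature graph from expression profiles (cosine distance); the variable names, the choice of k, and the use of scikit-learn are assumptions for illustration, not details taken from the DMGCN paper.

```python
# Minimal sketch of the two-graph preprocessing described in the abstract:
# a spatial k-NN graph (Euclidean distance on coordinates) and a feature
# k-NN graph (cosine distance on expression profiles). k values are assumed.
import numpy as np
from sklearn.neighbors import kneighbors_graph

def build_graphs(coords, features, k_spatial=6, k_feature=15):
    """Return sparse adjacency matrices (spatial graph, feature graph)."""
    # Spatial graph: connect each spot to its k nearest neighbours in physical space.
    adj_spatial = kneighbors_graph(coords, n_neighbors=k_spatial,
                                   metric="euclidean", include_self=False)
    # Feature graph: connect spots whose expression profiles are most similar
    # (cosine distance = 1 - cosine similarity).
    adj_feature = kneighbors_graph(features, n_neighbors=k_feature,
                                   metric="cosine", include_self=False)
    # Symmetrise both graphs so edges are undirected.
    adj_spatial = adj_spatial.maximum(adj_spatial.T)
    adj_feature = adj_feature.maximum(adj_feature.T)
    return adj_spatial, adj_feature

# Random data standing in for spot coordinates and gene expression.
coords = np.random.rand(100, 2)
features = np.random.rand(100, 50)
A_spatial, A_feature = build_graphs(coords, features)
```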
-
62
Performance of various methods on DLPFC.
Published 2025. “…To address these challenges, we propose a novel deep-learning method called DMGCN for domain identification. The process begins with preprocessing that constructs two types of graphs: a spatial graph based on Euclidean distance and a feature graph based on Cosine distance. …”
-
63
Parameter configuration of all datasets.
Published 2025. “…To address these challenges, we propose a novel deep-learning method called DMGCN for domain identification. The process begins with preprocessing that constructs two types of graphs: a spatial graph based on Euclidean distance and a feature graph based on Cosine distance. …”
-
64
Seismic data set.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
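The preprocessing chain quoted in this result (acoustic impedance, reflection coefficients, CWT feature extraction) can be sketched briefly. The formulas AI = ρ·v and RC_i = (AI_{i+1} − AI_i)/(AI_{i+1} + AI_i) are the standard definitions; the Morlet wavelet, the scale range, and the synthetic well-log inputs below are illustrative assumptions, not the study's actual settings.

```python
# Minimal sketch: acoustic impedance from density and velocity logs,
# reflection coefficients from impedance contrasts, and a Continuous
# Wavelet Transform (CWT) of the resulting series as a feature map.
import numpy as np
import pywt

def acoustic_impedance(density, velocity):
    """AI = rho * v, computed sample by sample along the well log."""
    return density * velocity

def reflection_coefficients(ai):
    """RC_i = (AI_{i+1} - AI_i) / (AI_{i+1} + AI_i) at each interface."""
    return (ai[1:] - ai[:-1]) / (ai[1:] + ai[:-1])

def cwt_features(series, scales=np.arange(1, 64), wavelet="morl"):
    """Return the CWT scalogram (scales x samples) used as a feature map."""
    coeffs, _freqs = pywt.cwt(series, scales, wavelet)
    return np.abs(coeffs)

# Synthetic stand-ins for well-log curves (units noted for context only).
density = np.random.uniform(2.0, 2.6, size=500)     # g/cm^3
velocity = np.random.uniform(2500, 4500, size=500)  # m/s
ai = acoustic_impedance(density, velocity)
rc = reflection_coefficients(ai)
scalogram = cwt_features(rc)
```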
-
65
Performance evaluation of different deep models.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
66
AlexNet architecture.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
67
CNN architecture.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
68
LeNet training hyperparameters.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
69
AlexNet training hyperparameters.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
70
pone.0331952.t005
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
71
Proposed method flow diagram.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
72
CNN training hyperparameters.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
73
LeNet architecture.
Published 2025. “…This study addresses these challenges by employing deep learning approaches, specifically LeNet, AlexNet, and conventional CNN architectures, to improve seismic resolution and synthetic seismogram generation. The methodology involves preprocessing seismic and well-log data, calculating acoustic impedance and reflection coefficients, and applying Continuous Wavelet Transform (CWT) for feature extraction. …”
-
74
Flowchart of the proposed approach.
Published 2025. “…The features extracted by both models are fused and optimized through two sophisticated feature selection techniques: Dragonfly and Genetic Algorithm (GA). The optimization process involves rigorous experimentation with 5- and 10-fold cross-validation to evaluate performance across various feature sets. …”
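The GA-with-cross-validation step quoted in this result can be illustrated with a minimal sketch: chromosomes are binary feature masks and fitness is mean k-fold accuracy. The classifier, population size, mutation rate, and data below are assumptions for illustration; the study's actual parameters are given in its tables (e.g., result 78), and the Dragonfly variant is not shown.

```python
# Minimal sketch of GA-based feature selection scored with k-fold cross-validation.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.datasets import make_classification

rng = np.random.default_rng(0)

def fitness(mask, X, y, cv=5):
    """Mean k-fold accuracy of a classifier trained on the selected features."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(), X[:, mask.astype(bool)], y, cv=cv).mean()

def ga_select(X, y, pop_size=20, generations=10, mutation_rate=0.05, cv=5):
    n_features = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_features))  # binary feature masks
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y, cv) for ind in pop])
        # Keep the better half as parents (simple truncation selection).
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            point = rng.integers(1, n_features)              # one-point crossover
            child = np.concatenate([a[:point], b[point:]])
            flip = rng.random(n_features) < mutation_rate    # bit-flip mutation
            child[flip] ^= 1
            children.append(child)
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind, X, y, cv) for ind in pop])
    return pop[scores.argmax()].astype(bool)

# Synthetic fused-feature matrix standing in for the extracted deep features.
X, y = make_classification(n_samples=200, n_features=30, random_state=0)
best_mask = ga_select(X, y, cv=5)  # 5-fold CV; the study also reports 10-fold
```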
-
75
CM of highest results obtained.
Published 2025. “…The features extracted by both models are fused and optimized through two sophisticated feature selection techniques: Dragonfly and Genetic Algorithm (GA). The optimization process involves rigorous experimentation with 5- and 10-fold cross-validation to evaluate performance across various feature sets. …”
-
76
A list of abbreviations used in this paper.
Published 2025. “…The features extracted by both models are fused and optimized through two sophisticated feature selection techniques: Dragonfly and Genetic Algorithm (GA). The optimization process involves rigorous experimentation with 5- and 10-fold cross-validation to evaluate performance across various feature sets. …”
-
77
Performance evaluation of all experiments.
Published 2025. “…The features extracted by both models are fused and optimized through two sophisticated feature selection techniques: Dragonfly and Genetic Algorithm (GA). The optimization process involves rigorous experimentation with 5- and 10-fold cross-validation to evaluate performance across various feature sets. …”
-
78
Parameter values used for GA optimization.
Published 2025. “…The features extracted by both models are fused and optimized through two sophisticated feature selection techniques: Dragonfly and Genetic Algorithm (GA). The optimization process involves rigorous experimentation with 5- and 10-fold cross-validation to evaluate performance across various feature sets. …”
-
79
ROC of highest results obtained.
Published 2025. “…The features extracted by both models are fused and optimized through two sophisticated feature selection techniques: Dragonfly and Genetic Algorithm (GA). The optimization process involves rigorous experimentation with 5- and 10-fold cross-validation to evaluate performance across various feature sets. …”
-
80
Accuracy over selected features.
Published 2025. “…The features extracted by both models are fused and optimized through two sophisticated feature selection techniques: Dragonfly and Genetic Algorithm (GA). The optimization process involves rigorous experimentation with 5- and 10-fold cross-validation to evaluate performance across various feature sets. …”