Search alternatives:
process optimization » model optimization
based optimization » whale optimization
data process » data processing, damage process, data access
binary rate » binary image
binary data » primary data, dietary data
rate based » rule based, made based, game based
-
1
Proposed Algorithm.
Published 2025 “…To enhance the offloading decision-making process, the algorithm incorporates the Newton-Raphson method for fast and efficient optimization of the computation rate under energy constraints. …”
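The snippet mentions Newton-Raphson optimization of a computation rate under energy constraints, but the paper's actual objective is not shown here. The sketch below is therefore a generic Newton-Raphson iteration applied to a hypothetical concave rate function; the objective r(x), the cost weight c, and the starting point are all illustrative assumptions.

```python
def newton_raphson(grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton-Raphson iteration on the stationarity condition grad(x) = 0,
    which maximizes a smooth concave objective."""
    x = x0
    for _ in range(max_iter):
        step = grad(x) / hess(x)
        x -= step
        if abs(step) < tol:  # converged when the Newton step is tiny
            break
    return x

# Hypothetical rate objective r(x) = log(1 + x) - c * x, where c stands in for an
# energy-cost weight. Its maximizer solves r'(x) = 1/(1+x) - c = 0, i.e. x* = 1/c - 1.
c = 0.25
x_star = newton_raphson(lambda x: 1.0 / (1.0 + x) - c,       # r'(x)
                        lambda x: -1.0 / (1.0 + x) ** 2,     # r''(x)
                        x0=1.0)
```

For this toy objective the closed-form optimum is 1/c - 1 = 3, which the iteration reaches in a handful of steps thanks to Newton's quadratic local convergence.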
-
2
Optimized Bayesian regularization-back propagation neural network using data-driven intrusion detection system in Internet of Things
Published 2025 “…Then, the data is pre-processed using Variational Bayesian-based Maximum Correntropy Cubature Kalman Filtering (VBMCCKF) for noise removal and data enhancement. …”
-
3
Comparisons between ADAM and NADAM optimizers.
Published 2025
-
4
Normalized computation rate for N = 10.
Published 2025
-
5
-
6
An Example of a WPT-MEC Network.
Published 2025
-
7
Related Work Summary.
Published 2025
-
8
Simulation parameters.
Published 2025
-
9
Training losses for N = 10.
Published 2025
-
10
Summary of Notations Used in this paper.
Published 2025
-
11
Improved support vector machine classification algorithm based on adaptive feature weight updating in the Hadoop cluster environment
Published 2019 “…This result reflects the effectiveness of the algorithm, which provides a basis for the effective analysis and processing of image big data.…”
-
12
Algoritmo de clasificación de expresiones de odio por tipos en español (Algorithm for classifying hate expressions by type in Spanish)
Published 2024 “…Model Architecture: the model is based on pysentimiento/robertuito-base-uncased, with a dense classification layer added over the base model; it takes input IDs and attention masks as inputs and produces a multi-class classification over 5 hate categories.
Dataset: HATEMEDIA, a custom hate-speech dataset categorized by type, with 5 hate-type labels (0-4). Preprocessing: null values removed from text and labels; reindexing and relabeling (original labels adjusted by subtracting 1); exclusion of category 2 during training; conversion of category 5 to category 2.
Training configuration: batch size 128; 5 epochs; learning rate 2e-5 with 10% warmup steps; early stopping with patience=2; class weights balanced to handle class imbalance.
Custom metrics: recall for specific classes (focus on class 2); precision for specific classes (focus on class 3); weighted F1-score; AUC-PR; recall at precision=0.6 (class 3); precision at recall=0.6 (class 2).
Evaluation: macro recall, precision, and F1-score; one-vs-rest AUC; accuracy; per-class metrics; confusion matrix; full classification report.
Data handling: tokenization to a maximum length of 128 tokens (truncation and padding); one-hot label encoding for multi-class classification; 80% training, 10% validation, 10% testing split.
Optimization: Adam with linear warmup scheduling; categorical cross-entropy loss (from_logits=True); class weights computed automatically to handle imbalance.
Requirements: TensorFlow, Transformers, scikit-learn, pandas, datasets, matplotlib, seaborn, numpy.
Usage: input is a CSV file or pandas DataFrame with a required string column "text" and an optional integer label column (0-4) for evaluation; text is tokenized automatically to at most 128 tokens, long texts are truncated, and special characters, URLs, and emojis are handled. The model classifies hate speech into 5 categories (0-4); category 0 is political hatred, i.e. expressions directed against individuals or groups based on political orientation.…”
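The label-preprocessing steps listed in this model card (subtract 1 from the original labels, exclude category 2 during training, convert category 5 to category 2) can be sketched as a small helper. The function name and the assumption that the steps run in the card's listed order are mine, not the card's.

```python
def relabel(raw_labels):
    """Apply the card's label preprocessing in its listed order (an assumption):
    1) shift original labels down by 1,
    2) exclude shifted category 2 from the training set,
    3) remap shifted category 5 onto the now-free slot 2.
    Returns the labels kept for training.
    """
    shifted = [y - 1 for y in raw_labels]           # step 1: original labels start at 1
    kept = [y for y in shifted if y != 2]           # step 2: drop category 2
    return [2 if y == 5 else y for y in kept]       # step 3: category 5 -> category 2

# One raw example per original category 1..6: category 3 (shifted 2) is dropped
# and category 6 (shifted 5) becomes 2, yielding labels in 0-4 as the card states.
result = relabel([1, 2, 3, 4, 5, 6])
```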
-
13
GSE96058 information.
Published 2024 “…Subsequently, feature selection was conducted using ANOVA and binary Particle Swarm Optimization (PSO). During the analysis phase, the discriminative power of the selected features was evaluated using machine learning classification algorithms. …”
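The snippet describes an ANOVA filter followed by binary PSO. As a rough illustration of the ANOVA step only (the PSO wrapper is omitted), the one-way F-statistic used to rank each feature can be computed as below; the toy feature vectors and labels are invented.

```python
def anova_f(feature, labels):
    """One-way ANOVA F-statistic of a single feature across class labels:
    between-group variance over within-group variance."""
    groups = {}
    for x, y in zip(feature, labels):
        groups.setdefault(y, []).append(x)
    n, k = len(feature), len(groups)
    grand_mean = sum(feature) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups.values())
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups.values() for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# A feature whose values separate the two classes scores far higher
# than one whose values overlap across classes.
f_separated = anova_f([1.0, 1.1, 2.0, 2.1], [0, 0, 1, 1])
f_overlapping = anova_f([1.0, 2.0, 1.1, 2.1], [0, 0, 1, 1])
```

A filter step would keep the top-k features by this score before handing the reduced set to the binary PSO search.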
-
14
The performance of classifiers.
Published 2024
-
15
Contextual Dynamic Pricing with Strategic Buyers
Published 2024 “…We then establish an O(√T) regret upper bound of our proposed policy and an Ω(√T) regret lower bound for any pricing policy within our problem setting. This underscores the rate optimality of our policy. Importantly, our policy is not a mere amalgamation of existing dynamic pricing policies and strategic behavior handling algorithms. …”
-
16
Machine Learning-Ready Dataset for Cytotoxicity Prediction of Metal Oxide Nanoparticles
Published 2025 “…Details on the data sourcing process, prompt engineering strategies for large language model (LLM)-based extraction, and validation protocols are provided in the Supplementary Information section.…”