Showing 1 - 6 results of 6 for search '(( ((algorithm steps) OR (algorithm models)) function ) OR ( algorithm python function ))~', query time: 0.41s
  1.
  2.
  3.
  4.

    Code and Data for 'Fabrication and testing of lensed fiber optic probes for distance sensing using common path low coherence interferometry' by Radu Stancu (21165068)

    Published 2025
    “…'model.py' contains the function 'grin_model', which models the working distance and focused spot FWHM based on the lengths of the NCF and GRIN fibre. …”
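    The snippet names 'grin_model' but this excerpt does not show its implementation. Below is a minimal, hypothetical sketch of how such a model is commonly built: Gaussian-beam q-parameter propagation through the ABCD matrices of the SMF-NCF-GRIN chain. The function name, signature, and every default value (wavelength, mode-field radius, refractive indices, GRIN gradient constant) are assumptions for illustration, not Radu Stancu's code.

        # Illustrative sketch only -- NOT the 'grin_model' from model.py.
        # Gaussian-beam ABCD propagation through an SMF -> NCF -> GRIN lensed-fibre probe.
        import numpy as np

        def grin_model_sketch(l_ncf, l_grin, wavelength=1.31e-6, w_smf=4.6e-6,
                              n_silica=1.447, n0=1.49, g=5.7e3):
            """Return (working distance, focused-spot intensity FWHM), both in metres.

            l_ncf, l_grin : lengths of the no-core and GRIN fibre sections [m]
            w_smf         : SMF mode-field radius at the splice [m]        (assumed value)
            n0, g         : GRIN on-axis index and gradient constant [1/m] (assumed values)
            """
            # Beam waist at the SMF/NCF splice: q = i * Rayleigh range in silica.
            q = 1j * np.pi * n_silica * w_smf**2 / wavelength

            # ABCD matrices: NCF slab, NCF->GRIN interface, GRIN section, GRIN->air interface.
            m_ncf  = np.array([[1.0, l_ncf], [0.0, 1.0]])
            m_in   = np.array([[1.0, 0.0], [0.0, n_silica / n0]])
            m_grin = np.array([[np.cos(g * l_grin), np.sin(g * l_grin) / g],
                               [-g * np.sin(g * l_grin), np.cos(g * l_grin)]])
            m_out  = np.array([[1.0, 0.0], [0.0, n0 / 1.0]])
            a, b, c, d = (m_out @ m_grin @ m_in @ m_ncf).ravel()

            # Transform q to the probe tip, then locate the focus in air.
            q_tip = (a * q + b) / (c * q + d)
            working_distance = -q_tip.real                 # tip-to-focus distance
            w0 = np.sqrt(wavelength * q_tip.imag / np.pi)  # focused waist radius in air
            fwhm = w0 * np.sqrt(2.0 * np.log(2.0))         # intensity FWHM of the spot
            return working_distance, fwhm

    Sweeping l_ncf and l_grin over fabrication-relevant ranges with such a model reproduces the working-distance versus spot-size trade-off that the dataset description refers to.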
  5.

    Rethinking neighbourhood boundaries for urban planning: A data-driven framework for perception-based delineation by Shubham Pawar (22471285)

    Published 2025
    “…Input:
      - svi_module/svi_data/svi_info.csv - Image metadata from Step 1
      - perception_module/trained_models/ - Pre-trained models
    Command:
      python -m perception_module.pred \
        --model-weights .…”
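    The excerpt shows only the start of the prediction command. As a loose, hypothetical illustration of how such a CLI step is usually wired up (not Shubham Pawar's perception_module code), the sketch below parses a --model-weights flag, reads the Step-1 metadata CSV, and writes one score per image; the extra flag names, the CSV column names, and the stand-in scoring function are all assumptions.

        # Hypothetical sketch of a CLI like 'perception_module.pred' -- not the repository's code.
        import argparse
        import csv

        def score_image(image_path, weights_path):
            # Stand-in for inference with the pre-trained perception models.
            return 0.0

        def main():
            parser = argparse.ArgumentParser(description="Perception-score prediction (sketch)")
            parser.add_argument("--model-weights", required=True,
                                help="Path to pre-trained model weights")
            parser.add_argument("--svi-info", default="svi_module/svi_data/svi_info.csv",
                                help="Image metadata produced in Step 1 (assumed flag name)")
            parser.add_argument("--out", default="predictions.csv")
            args = parser.parse_args()

            with open(args.svi_info, newline="") as f_in, open(args.out, "w", newline="") as f_out:
                reader = csv.DictReader(f_in)
                writer = csv.writer(f_out)
                writer.writerow(["image_id", "perception_score"])
                for row in reader:
                    # Column names below are assumptions about the metadata schema.
                    score = score_image(row.get("image_path", ""), args.model_weights)
                    writer.writerow([row.get("image_id", ""), score])

        if __name__ == "__main__":
            main()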
  6.

    An Ecological Benchmark of Photo Editing Software: A Comparative Analysis of Local vs. Cloud Workflows by Pierre-Alexis DELAROCHE (22092572)

    Published 2025
    “…Performance Profiling Algorithms
    Energy Measurement Methodology
        # Pseudo-algorithmic representation of measurement protocol
        def capture_energy_metrics(workflow_type: WorkflowEnum,
                                   asset_vector: List[PhotoAsset]) -> EnergyProfile:
            baseline_power = sample_idle_power_draw(duration=30)
            with PowerMonitoringContext() as pmc:
                start_timestamp = rdtsc()  # Read time-stamp counter
                if workflow_type == WorkflowEnum.LOCAL:
                    result = execute_local_pipeline(asset_vector)
                elif workflow_type == WorkflowEnum.CLOUD:
                    result = execute_cloud_pipeline(asset_vector)
                end_timestamp = rdtsc()
            energy_profile = EnergyProfile(
                duration=cycles_to_seconds(end_timestamp - start_timestamp),
                peak_power=pmc.get_peak_consumption(),
                average_power=pmc.get_mean_consumption(),
                total_energy=integrate_power_curve(pmc.get_power_trace())
            )
            return energy_profile
    Statistical Analysis Framework
    Our analytical pipeline employs advanced statistical methodologies including:
      - Variance Decomposition: ANOVA with nested factors for hardware configuration effects
      - Regression Analysis: Generalized Linear Models (GLM) with log-link functions for energy modeling
      - Temporal Analysis: Fourier transform-based frequency-domain analysis of power consumption patterns
      - Cluster Analysis: K-means clustering with Euclidean distance metrics for workflow classification
    Data Validation and Quality Assurance
    Measurement Uncertainty Quantification
    All energy measurements incorporate systematic and random error propagation analysis:
      - Instrument Precision: ±0.1 W for CPU power, ±0.5 W for GPU power
      - Temporal Resolution: 1 ms sampling with Nyquist frequency considerations
      - Calibration Protocol: NIST-traceable power standards with periodic recalibration
      - Environmental Controls: Temperature-compensated measurements in a climate-controlled facility
    Outlier Detection Algorithms
    Statistical outliers are identified using the Interquartile Range (IQR) method with Tukey's fence criteria (Q₁ - 1.5×IQR, Q₃ + 1.5×IQR). …”
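    The statistical framework above is described only in prose in this excerpt. A minimal sketch, in Python, of how three of the named pieces (one-way ANOVA across hardware configurations, Fourier analysis of a power trace, K-means clustering of per-run features) are commonly assembled with numpy, scipy and scikit-learn; all data here are synthetic placeholders, and a nested-factor ANOVA or a log-link GLM would need statsmodels rather than the simple calls shown.

        # Illustrative sketch only -- not the benchmark's analysis code; data are synthetic.
        import numpy as np
        from scipy import stats
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)

        # Placeholder energy samples (joules) for three hardware configurations.
        cfg_a = rng.gamma(9.0, 2.0, 50)
        cfg_b = rng.gamma(10.0, 2.0, 50)
        cfg_c = rng.gamma(11.0, 2.0, 50)

        # One-way ANOVA across configurations (a nested design or Gamma GLM with a
        # log link would use statsmodels instead).
        f_stat, p_value = stats.f_oneway(cfg_a, cfg_b, cfg_c)
        print(f"ANOVA across configurations: F={f_stat:.2f}, p={p_value:.3g}")

        # Frequency-domain view of a power trace sampled at 1 kHz (1 ms resolution).
        power_trace = rng.normal(45.0, 2.0, 4096)            # watts, placeholder
        spectrum = np.abs(np.fft.rfft(power_trace - power_trace.mean()))
        freqs = np.fft.rfftfreq(power_trace.size, d=1e-3)    # Hz
        print("dominant spectral component at", freqs[np.argmax(spectrum)], "Hz")

        # K-means (Euclidean) on per-run features, e.g. (mean power, duration, total energy).
        features = rng.normal(size=(150, 3))                  # placeholder feature matrix
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
        print("cluster sizes:", np.bincount(labels))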
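    The Tukey-fence rule quoted at the end of the excerpt is concrete enough to show directly. A small sketch, with made-up readings, that flags values outside (Q1 - 1.5×IQR, Q3 + 1.5×IQR):

        # Sketch of the IQR / Tukey-fence outlier rule described above (sample data are made up).
        import numpy as np

        def tukey_outlier_mask(samples, k=1.5):
            """Return a boolean mask marking values outside (Q1 - k*IQR, Q3 + k*IQR)."""
            q1, q3 = np.percentile(samples, [25, 75])
            iqr = q3 - q1
            lower, upper = q1 - k * iqr, q3 + k * iqr
            return (samples < lower) | (samples > upper)

        energy_joules = np.array([48.2, 47.9, 49.1, 48.5, 47.7, 63.8, 48.0, 31.2])
        mask = tukey_outlier_mask(energy_joules)
        print(energy_joules[mask])   # readings flagged for exclusion or inspection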