API reference

Data model

The data model.

datamodel Data classes and acceptable variables as defined by the SolarForecastArbiter Data Model document.

There are two kinds of sites:

datamodel.Site(name, latitude, longitude, …) Class for keeping track of Site metadata.
datamodel.SolarPowerPlant(name, latitude, …) Class for keeping track of metadata associated with solar power plant Sites.

Several modeling parameter classes are associated with the solar power plant site:

datamodel.PVModelingParameters(ac_capacity, …) Class for keeping track of generic PV modeling parameters.
datamodel.FixedTiltModelingParameters(…) A class based on PVModelingParameters that has additional parameters for fixed tilt PV systems.
datamodel.SingleAxisModelingParameters(…) A class based on PVModelingParameters that has additional parameters for single axis tracking systems.

The Observation and Forecast classes:

datamodel.Observation(name, variable, …) A class for keeping track of metadata associated with an observation.
datamodel.Forecast(name, issue_time_of_day, …) A class to hold metadata for Forecast objects.

All datamodel objects have from_dict and to_dict methods:

datamodel.BaseModel.from_dict(input_dict[, …]) Construct a dataclass from the given dict, matching keys with the class fields.
datamodel.BaseModel.to_dict() Convert the dataclass into a dictionary suitable for uploading to the API.
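
All datamodel classes share this round-trip contract. As a rough sketch using a stand-in dataclass (StandInSite and its field list are illustrative, not the real Site signature, which also includes elevation, timezone, and other metadata):

```python
from dataclasses import dataclass, fields, asdict

@dataclass(frozen=True)
class StandInSite:
    # Stand-in for datamodel.Site; the real class defines more fields.
    name: str
    latitude: float
    longitude: float

    @classmethod
    def from_dict(cls, input_dict):
        # Match keys with the class fields and ignore extras -- the
        # contract BaseModel.from_dict describes.
        names = {f.name for f in fields(cls)}
        return cls(**{k: v for k, v in input_dict.items() if k in names})

    def to_dict(self):
        # Convert back into a plain dict suitable for an API payload.
        return asdict(self)

site = StandInSite.from_dict(
    {'name': 'Plant A', 'latitude': 32.2, 'longitude': -110.9,
     'extra_api_field': 'ignored'})
round_tripped = site.to_dict()
```

Filtering unknown keys in from_dict lets API responses carry extra fields without breaking object construction.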

PV modeling

The pvmodel module contains functions closely associated with PV modeling.

pvmodel Calculate AC power and modeling intermediates from system metadata, times, and weather data.

Several utility functions wrap pvlib functions:

pvmodel.calculate_solar_position(latitude, …) Calculates solar position using pvlib’s implementation of NREL SPA.
pvmodel.complete_irradiance_components(ghi, …) Uses the Erbs model to calculate DNI and DHI from GHI.
pvmodel.calculate_clearsky(latitude, …) Calculates clear sky irradiance using the Ineichen model and the SoDa climatological turbidity data set.

Three functions are useful for determining AOI, surface tilt, and surface azimuth. aoi_func_factory() is helpful for standardizing the calculations for tracking and fixed systems. See, for example, calculate_poa_effective().

pvmodel.aoi_func_factory(modeling_parameters) Create a function to calculate AOI, surface tilt, and surface azimuth from system modeling_parameters.
pvmodel.aoi_fixed(surface_tilt, …) Calculate AOI for fixed system, bundle return with tilt, azimuth for consistency with similar tracker function.
pvmodel.aoi_tracking(axis_tilt, …) Calculate AOI, surface tilt, and surface azimuth for tracking system.
pvmodel.calculate_poa_effective_explicit(…) Calculate effective plane of array irradiance from system metadata, solar position, and irradiance components.
pvmodel.calculate_poa_effective(aoi_func, …) Calculate effective plane of array irradiance from system metadata, solar position, and irradiance components.
pvmodel.calculate_power(dc_capacity, …[, …]) Calculate AC power from system metadata, plane of array irradiance, and weather data using the PVWatts model.
pvmodel.irradiance_to_power(…[, temp_air, …]) Calculate AC power from system metadata, solar position, and ghi, dni, dhi.
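
The factory pattern can be sketched as follows for a fixed-tilt system. The names echo the table above, but the signatures are simplified assumptions (the real aoi_func_factory takes a modeling_parameters object, and the real functions operate on array-like inputs rather than plain floats):

```python
import math

def aoi_fixed(surface_tilt, surface_azimuth, solar_zenith, solar_azimuth):
    # Standard angle-of-incidence formula for a fixed plane;
    # all angles in degrees.
    t, z = math.radians(surface_tilt), math.radians(solar_zenith)
    d = math.radians(solar_azimuth - surface_azimuth)
    cos_aoi = (math.cos(z) * math.cos(t)
               + math.sin(z) * math.sin(t) * math.cos(d))
    aoi = math.degrees(math.acos(max(-1.0, min(1.0, cos_aoi))))
    # Bundle tilt and azimuth with the return value for consistency
    # with a tracking equivalent that computes them from solar position.
    return surface_tilt, surface_azimuth, aoi

def aoi_func_factory(surface_tilt, surface_azimuth):
    # Bind the fixed-system geometry so downstream code (e.g. a POA
    # calculation) can call the returned function with solar position only.
    def aoi_func(solar_zenith, solar_azimuth):
        return aoi_fixed(surface_tilt, surface_azimuth,
                         solar_zenith, solar_azimuth)
    return aoi_func

aoi_func = aoi_func_factory(surface_tilt=30, surface_azimuth=180)
tilt, azimuth, aoi = aoi_func(solar_zenith=30, solar_azimuth=180)
```

Closing over the system geometry is what lets one POA routine serve both fixed and tracking systems: only the factory differs.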

Reference forecasts

Entry points

High-level functions for NWP and persistence forecasts.

reference_forecasts.main.run_nwp(forecast, …) Calculate benchmark irradiance and power forecasts for a Forecast.
reference_forecasts.main.run_persistence(…) Run a persistence forecast for an observation.
reference_forecasts.main.find_reference_nwp_forecasts(…) Sort through all forecasts to find those that should be generated by the Arbiter from NWP models.
reference_forecasts.main.process_nwp_forecast_groups(…) Groups NWP forecasts based on piggyback_on, calculates the forecast as appropriate for run_time, and uploads the values to the API.
reference_forecasts.main.make_latest_nwp_forecasts(…) Make all reference NWP forecasts for run_time that are within issue_buffer of the next issue time for the forecast.

NWP models

reference_forecasts.models.hrrr_subhourly_to_subhourly_instantaneous(…) Subhourly (15 min) instantaneous HRRR forecast.
reference_forecasts.models.hrrr_subhourly_to_hourly_mean(…) Hourly mean HRRR forecast.
reference_forecasts.models.rap_ghi_to_instantaneous(…) Hourly instantaneous RAP forecast.
reference_forecasts.models.rap_ghi_to_hourly_mean(…) Take hourly RAP instantaneous irradiance and convert it to hourly average forecasts.
reference_forecasts.models.rap_cloud_cover_to_hourly_mean(…) Take hourly RAP instantaneous cloud cover and convert it to hourly average forecasts.
reference_forecasts.models.gfs_quarter_deg_3hour_to_hourly_mean(…) Take 3 hr GFS and convert it to hourly average data.
reference_forecasts.models.gfs_quarter_deg_hourly_to_hourly_mean(…) Take 1 hr GFS and convert it to hourly average data.
reference_forecasts.models.gfs_quarter_deg_to_hourly_mean(…) Hourly average forecasts derived from GFS 1, 3, and 12 hr frequency output.
reference_forecasts.models.nam_12km_hourly_to_hourly_instantaneous(…) Hourly instantaneous forecast.
reference_forecasts.models.nam_12km_cloud_cover_to_hourly_mean(…) Hourly average forecast.

Forecast processing

Functions that process forecast data.

reference_forecasts.forecast.cloud_cover_to_ghi_linear(…) Convert cloud cover to GHI using a linear relationship.
reference_forecasts.forecast.cloud_cover_to_irradiance_ghi_clear(…) Estimates irradiance from cloud cover and clear-sky GHI.
reference_forecasts.forecast.cloud_cover_to_irradiance(…) Estimates irradiance from cloud cover.
reference_forecasts.forecast.resample(arg[, …]) Resamples an argument, allowing for None.
reference_forecasts.forecast.interpolate(arg) Interpolates an argument, allowing for None.
reference_forecasts.forecast.unmix_intervals(mixed) Convert mixed interval averages into pure interval averages.
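
The linear cloud-cover-to-GHI relationship can be sketched as below. The 0.35 offset is illustrative (the library exposes the offset as a parameter), and the real function works on pandas Series rather than scalars:

```python
def cloud_cover_to_ghi_linear(cloud_cover, ghi_clear, offset=0.35):
    # Linear mapping: 0% cloud cover leaves clear-sky GHI unchanged;
    # 100% cloud cover scales it down to offset * ghi_clear.
    # cloud_cover is in percent (0-100).
    return (offset + (1 - offset) * (1 - cloud_cover / 100)) * ghi_clear

clear = cloud_cover_to_ghi_linear(0, 800.0)      # no clouds
overcast = cloud_cover_to_ghi_linear(100, 800.0)  # full cover
```

The nonzero floor reflects that diffuse irradiance persists under overcast skies, so forecast GHI never collapses to zero during daytime.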


Persistence

reference_forecasts.persistence.persistence_scalar(…) Make a persistence forecast using the mean value of the observation from data_start to data_end.
reference_forecasts.persistence.persistence_interval(…) Make a persistence forecast for an observation using the mean values of each interval_length bin from data_start to data_end.
reference_forecasts.persistence.persistence_scalar_index(…) Calculate a persistence forecast using the mean value of the observation clear sky index or AC power index from data_start to data_end.
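
The persistence-of-the-mean idea behind persistence_scalar can be sketched with plain lists; the real function operates on an Observation and pandas Series with datetime indices, so this signature is illustrative only:

```python
from statistics import mean

def persistence_scalar(obs_values, forecast_length):
    # Average the observation over the lookback window
    # (data_start to data_end) and repeat that scalar for
    # every interval of the forecast horizon.
    scalar = mean(obs_values)
    return [scalar] * forecast_length

fx = persistence_scalar([10.0, 12.0, 14.0], forecast_length=4)
```

persistence_scalar_index applies the same idea to the clear-sky index instead of the raw value, which removes the predictable diurnal trend before persisting.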

Fetching external data


ARM

io.fetch.arm.request_arm_file_list(user_id, …) Make an http request to the ARM live API for filenames between start and end.
io.fetch.arm.list_arm_filenames(user_id, …) Get a list of filenames from ARM for the given datastream between start and end.
io.fetch.arm.request_arm_file(user_id, …) Get a file from ARM live in the form of a stream so that the python netCDF4 module can read it.
io.fetch.arm.retrieve_arm_dataset(user_id, …) Request a file from the ARM Live API and return a netCDF4 Dataset.
io.fetch.arm.extract_arm_variables(nc_file, …) Extracts variables and datetime index from an ARM netcdf.
io.fetch.arm.fetch_arm(user_id, api_key, …) Gets data from the ARM API and concatenates requested datastreams into a single pandas DataFrame.


NWP

io.fetch.nwp.get_with_retries(get_func, *args) Call get_func and retry if the request fails.
io.fetch.nwp.get_available_dirs(session, model) Get the available date/date+init_hr directories.
io.fetch.nwp.check_next_inittime(session, …) Check if data from the next model initialization time is available.
io.fetch.nwp.get_filename(basepath, …)
io.fetch.nwp.files_to_retrieve(session, …) Generator to return the parameters of the available files for download.
io.fetch.nwp.process_grib_to_netcdf(folder, …)
io.fetch.nwp.optimize_netcdf(nctmpfile, …) Compress the netCDF file and adjust the chunking for fast time-series access.
io.fetch.nwp.sleep_until_inittime(inittime, …)
io.fetch.nwp.startup_find_next_runtime(…) Find the next model run to get based on what is available on NOMADS and what .nc files are present locally.
io.fetch.nwp.next_run_time(inittime, …)
io.fetch.nwp.run(basepath, model_name, chunksize)
io.fetch.nwp.optimize_only(path_to_files, …)
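
The retry wrapper can be sketched synchronously as below; the retries/backoff parameters and the catch-all policy are illustrative assumptions, not the library's actual signature:

```python
import time

def get_with_retries(get_func, *args, retries=3, backoff=0.0):
    # Call get_func with args, retrying on any exception up to
    # `retries` attempts and sleeping `backoff` seconds between tries.
    last_exc = None
    for _attempt in range(retries):
        try:
            return get_func(*args)
        except Exception as exc:
            last_exc = exc
            time.sleep(backoff)
    raise last_exc

attempts = []
def flaky(url):
    # Simulated transient failure: succeeds on the third call.
    attempts.append(url)
    if len(attempts) < 3:
        raise ConnectionError('transient failure')
    return 'payload'

result = get_with_retries(flaky, 'https://example.invalid/file', retries=5)
```

Passing the getter as a function keeps the retry policy independent of whether the underlying call fetches a directory listing or a GRIB file.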


DOE RTC

io.fetch.rtc.request_doe_rtc_data(location, …) Makes a request to the DOE RTC PV dashboard with the provided parameters.
io.fetch.rtc.fetch_doe_rtc(location, …) Requests and concatenates data from the DOE RTC PV dashboard API into a single DataFrame.

Reference observations

io.reference_observations.midc_config Contains a config dictionary mapping SFA variable names to MIDC fields.
io.reference_observations.reference_data A CLI tool for creating reference sites and observations and updating them with data from their respective API.
io.reference_observations.surfrad Functions for creating and updating NOAA SURFRAD related objects within the SolarForecastArbiter.



SFA API

Get an API token.

io.api.request_cli_access_token(user, password)

API Session

Class for communicating with the Solar Forecast Arbiter API.

io.api.APISession(access_token[, …]) Subclass of requests.Session to handle requests to the SolarForecastArbiter API.
io.api.APISession.request(method, url, …) Modify the default Session.request to add in the default timeout and make requests relative to the base_url.
io.api.APISession.get_site(site_id) Retrieve site metadata for site_id from the API and process into the proper model.
io.api.APISession.list_sites() List all the sites available to a user.
io.api.APISession.create_site(site) Create a new site in the API with the given Site model.
io.api.APISession.get_observation(observation_id) Get the metadata from the API for a given observation_id in an Observation object.
io.api.APISession.list_observations() List the observations a user has access to.
io.api.APISession.create_observation(observation) Create a new observation in the API with the given Observation model.
io.api.APISession.get_forecast(forecast_id) Get Forecast metadata from the API for the given forecast_id.
io.api.APISession.list_forecasts() List all Forecasts a user has access to.
io.api.APISession.create_forecast(forecast) Create a new forecast in the API with the given Forecast model.
io.api.APISession.get_observation_values(…) Get observation values from start to end for observation_id from the API.
io.api.APISession.get_forecast_values(…[, …]) Get forecast values from start to end for forecast_id.
io.api.APISession.post_observation_values(…) Upload the given observation values to the appropriate observation_id of the API.
io.api.APISession.post_forecast_values(…) Upload the given forecast values to the appropriate forecast_id of the API.
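
The session pattern can be sketched with a stand-in class that builds the request instead of sending it. The base URL, timeout values, and header layout here are illustrative assumptions; the real APISession subclasses requests.Session and performs actual HTTP:

```python
from urllib.parse import urljoin

class SketchSession:
    # Stand-in for io.api.APISession: stores a token and base_url,
    # resolves relative paths, and applies a default timeout.
    def __init__(self, access_token,
                 base_url='https://api.solarforecastarbiter.org'):
        self.access_token = access_token
        self.base_url = base_url
        self.default_timeout = (10, 60)  # (connect, read); illustrative

    def request(self, method, url, timeout=None):
        # Make requests relative to base_url and fall back to the
        # default timeout, as APISession.request's summary describes.
        full_url = urljoin(self.base_url + '/', url.lstrip('/'))
        return {'method': method, 'url': full_url,
                'timeout': timeout or self.default_timeout,
                'headers': {'Authorization': 'Bearer ' + self.access_token}}

session = SketchSession('TOKEN')
req = session.request('GET', '/sites/')
```

Centralizing the base URL and timeout in request() means every convenience method (get_site, list_forecasts, …) can pass short relative paths.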


Validation

Functions to perform validation.

validation.validator.check_ghi_limits_QCRad(…) Tests for physical limits on GHI using the QCRad criteria.
validation.validator.check_dhi_limits_QCRad(…) Tests for physical limits on DHI using the QCRad criteria.
validation.validator.check_dni_limits_QCRad(…) Tests for physical limits on DNI using the QCRad criteria.
validation.validator.check_irradiance_limits_QCRad(…) Tests for physical limits on GHI, DHI or DNI using the QCRad criteria.
validation.validator.check_irradiance_consistency_QCRad(…) Checks consistency of GHI, DHI and DNI.
validation.validator.check_temperature_limits(…) Checks for extreme temperatures.
validation.validator.check_wind_limits(…) Checks for extreme wind speeds.
validation.validator.check_rh_limits(rh[, …]) Checks for extreme relative humidity.
validation.validator.check_ghi_clearsky(ghi, …) Flags GHI values greater than clearsky values.
validation.validator.check_poa_clearsky(…) Flags plane of array irradiance values greater than clearsky values.
validation.validator.check_irradiance_day_night(…) Checks for day/night periods based on solar zenith.
validation.validator.check_timestamp_spacing(…) Checks if spacing between times conforms to freq.
validation.validator.detect_stale_values(x) Detects stale data.
validation.validator.detect_interpolation(x) Detects sequences of data which appear linear.
validation.validator.detect_levels(x[, …]) Detects plateau levels in data.
validation.validator.detect_clipping(ac_power) Detects clipping in a series of AC power.
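
As a simplified sketch of stale-value detection (the window semantics and flagging convention here are illustrative; the real function operates on a pandas Series):

```python
def detect_stale_values(x, window=3):
    # Flag values that repeat unchanged for at least `window`
    # consecutive samples.
    flags = [False] * len(x)
    run_start = 0
    for i in range(1, len(x) + 1):
        # A run ends at the end of the data or when the value changes.
        if i == len(x) or x[i] != x[run_start]:
            if i - run_start >= window:
                for j in range(run_start, i):
                    flags[j] = True
            run_start = i
    return flags

flags = detect_stale_values([1.0, 5.0, 5.0, 5.0, 2.0])
```

detect_interpolation is the same idea applied to first differences: a run of identical differences suggests linearly filled data.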


Perform a sequence of validation steps. Used by the API to initiate validation.

validation.tasks.validate_ghi(observation, …) Run validation checks on a GHI observation.
validation.tasks.validate_dni(observation, …) Run validation checks on a DNI observation.
validation.tasks.validate_dhi(observation, …) Run validation checks on a DHI observation.
validation.tasks.validate_poa_global(…) Run validation checks on a POA observation.
validation.tasks.validate_air_temperature(…) Run validation checks on an air temperature observation.
validation.tasks.validate_wind_speed(…) Run validation checks on a wind speed observation.
validation.tasks.validate_relative_humidity(…) Run validation checks on a relative humidity observation.
validation.tasks.validate_timestamp(…) Run validation checks on an observation.
validation.tasks.validate_daily_ghi(…) Run validation on a daily timeseries of GHI.
validation.tasks.validate_daily_dc_power(…) Run validation on a daily timeseries of DC power.
validation.tasks.validate_daily_ac_power(…) Run a number of validation checks on a daily timeseries of AC power.
validation.tasks.immediate_observation_validation(…) Task that will run immediately after Observation values are uploaded to the API to validate the data.
validation.tasks.daily_single_observation_validation(…) Task that expects a longer, likely daily timeseries of Observation values that will be validated.
validation.tasks.daily_observation_validation(…) Run the daily observation validation for all observations that the user has access to.
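
The shape of these task functions can be sketched as below: run independent checks and collect one boolean flag list per check. The check names and limits are illustrative; the real tasks call the validator functions above and post bitmasked flags to the API:

```python
def check_limits(values, lower=0.0, upper=1500.0):
    # True means flagged (outside the physical limits).
    return [not (lower <= v <= upper) for v in values]

def check_day_night(values, is_night):
    # Flag positive irradiance reported at night.
    return [night and v > 0 for v, night in zip(values, is_night)]

def validate_ghi_sketch(values, is_night):
    # Run each check and keep the results keyed by check name; a real
    # task would then map these to the quality-flag bitmask.
    return {
        'LIMITS EXCEEDED': check_limits(values),
        'NIGHTTIME POSITIVE': check_day_night(values, is_night),
    }

flags = validate_ghi_sketch([0.0, 50.0, 1600.0],
                            is_night=[True, True, False])
```

Keeping one flag series per check, rather than a single pass/fail, is what makes the bitmask storage in quality_mapping possible.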

Quality flag mapping

Functions to handle the translation of validation results and database storage.

validation.quality_mapping.mask_flags(…[, …]) Decorator that will convert a boolean pandas object into an integer, bitmasked object when _return_mask=True.
validation.quality_mapping.has_data_been_validated(flags) Return True (or a boolean series) if flags has been validated.
validation.quality_mapping.get_version(flag) Extract the version from flag.
validation.quality_mapping.check_if_single_value_flagged(…) Check if the single integer flag has been flagged for flag_description.
validation.quality_mapping.which_data_is_ok(flags) Return True for flags that have been validated and are OK.
validation.quality_mapping.check_for_all_descriptions(flag) Return a boolean Series indicating the checks a flag represents.
validation.quality_mapping.convert_mask_into_dataframe(…) Convert flag_series into a boolean DataFrame indicating which checks the flags represent.
validation.quality_mapping.convert_flag_frame_to_strings(…) Convert the flag_frame output of convert_mask_into_dataframe() into a pandas.Series of strings which are the active flag names separated by sep.
validation.quality_mapping.check_if_series_flagged(…) Check if flag_series has been flagged for the checks given by flag_description.
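
The bitmask idea can be sketched as below. The bit positions, check names, and the "validated" bit are illustrative assumptions, not the library's actual (versioned) mapping:

```python
# Illustrative mapping of check descriptions to bit positions.
DESCRIPTION_BITS = {
    'USER FLAGGED': 0,
    'NIGHTTIME': 1,
    'CLEARSKY EXCEEDED': 2,
    'STALE VALUES': 3,
}
VALIDATED_BIT = 15  # assumption: a high bit marks "has been validated"

def has_data_been_validated(flag):
    # A flag counts as validated only if the validated bit is set.
    return bool(flag & (1 << VALIDATED_BIT))

def check_for_all_descriptions(flag):
    # Decompose an integer flag into the named checks it represents.
    return {name: bool(flag & (1 << bit))
            for name, bit in DESCRIPTION_BITS.items()}

flag = (1 << VALIDATED_BIT) | (1 << 1) | (1 << 3)
active = check_for_all_descriptions(flag)
```

Packing all checks into one integer per timestamp keeps database storage compact while remaining losslessly decomposable into a per-check DataFrame.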



Plotting

Time series plotting.

plotting.timeseries.build_figure_title(…) Builds a title for the plot.
plotting.timeseries.make_quality_bars(…) Make figures to display whether a time is flagged for any of the columns in source.
plotting.timeseries.add_hover_tool(fig, …) Add a hover tool to fig.
plotting.timeseries.make_basic_timeseries(…) Make a basic timeseries plot (with either a step or line) and add a hover tool.
plotting.timeseries.generate_forecast_figure(…) Creates a bokeh timeseries figure for forecast data.
plotting.timeseries.generate_observation_figure(…) Creates a bokeh figure from API responses for an observation.


Utility functions for plotting.

plotting.utils.format_variable_name(variable) Make a human readable name, with units, for the variable.
plotting.utils.align_index(df, interval_length) Align the index to the specified interval_length, inserting NaNs as appropriate.
plotting.utils.line_or_step(interval_label) For a given interval_label, determine the plot_method of the data, any kwargs for that plot method, and kwargs for adding a hovertool for the data.
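
The dispatch on interval_label can be sketched as below. The labels assumed here ('instant', 'beginning', 'ending') and the string return value are illustrative; the real line_or_step also returns plotting and hover-tool kwargs:

```python
def line_or_step(interval_label):
    # Instantaneous data is drawn as a line; interval averages are
    # drawn as steps anchored at the labeled edge of the interval.
    if interval_label == 'instant':
        return 'line'
    elif interval_label == 'beginning':
        return 'step'  # step anchored at the start of each interval
    elif interval_label == 'ending':
        return 'step'  # step anchored at the end of each interval
    raise ValueError('unknown interval_label %r' % (interval_label,))

method = line_or_step('beginning')
```

Dispatching on interval_label keeps the plot faithful to the data's averaging convention instead of silently interpolating interval means.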