• Home
  • Features
  • Pricing
  • Docs
  • Announcements
  • Sign In

NREL / bifacial_radiance / 10778060626

09 Sep 2024 05:18PM UTC coverage: 72.218% (+0.06%) from 72.155%
10778060626

Pull #543

github

cdeline
add `results` property to RadianceObj.  rename demo.CompiledResults to demo.compiledResults
Pull Request #543: 512 cec performance

119 of 135 new or added lines in 4 files covered. (88.15%)

7 existing lines in 1 file now uncovered.

3699 of 5122 relevant lines covered (72.22%)

1.44 hits per line

Source File
Press 'n' to go to next uncovered line, 'b' for previous

76.48
/bifacial_radiance/main.py
1
#!/usr/bin/env python
2

3
"""
1✔
4
@author: cdeline
5

6
bifacial_radiance.py - module to develop radiance bifacial scenes, including gendaylit and gencumulativesky
7
7/5/2016 - test script based on G173_journal_height
8
5/1/2017 - standalone module
9

10
Pre-requisites:
11
    This software is written for Python >3.6 leveraging many Anaconda tools (e.g. pandas, numpy, etc)
12

13
    *RADIANCE software should be installed from https://github.com/NREL/Radiance/releases
14

15
    *If you want to use gencumulativesky, move 'gencumulativesky.exe' from
16
    'bifacial_radiance\data' into your RADIANCE source directory.
17

18
    *If using a Windows machine you should download the Jaloxa executables at
19
    http://www.jaloxa.eu/resources/radiance/radwinexe.shtml#Download
20

21
    * Installation of  bifacial_radiance from the repo:
22
    1. Clone the repo
23
    2. Navigate to the directory using the command prompt
24
    3. run `pip install -e . `
25

26
Overview:
27
    Bifacial_radiance includes several helper functions to make it easier to evaluate
28
    different PV system orientations for rear bifacial irradiance.
29
    Note that this is simply an optical model - identifying available rear irradiance under different conditions.
30

31
    For a detailed demonstration example, look at the .ipnyb notebook in \docs\
32

33
    There are two solar resource modes in bifacial_radiance: `gendaylit` uses hour-by-hour solar
34
    resource descriptions using the Perez diffuse tilted plane model.
35
    `gencumulativesky` is an annual average solar resource that combines hourly
36
    Perez skies into one single solar source, and computes an annual average.
37

38
    bifacial_radiance includes five object-oriented classes:
39

40
    RadianceObj:  top level class to work on radiance objects, keep track of filenames,
41
    sky values, PV module type etc.
42

43
    GroundObj:    details for the ground surface and reflectance
44

45
    SceneObj:    scene information including array configuration (row spacing, clearance or hub height)
46

47
    MetObj: meteorological data from EPW (energyplus) file.
48
        Future work: include other file support including TMY files
49

50
    AnalysisObj: Analysis class for plotting and reporting
51

52
"""
53
import logging
2✔
54
logging.basicConfig()
2✔
55
LOGGER = logging.getLogger(__name__)
2✔
56
LOGGER.setLevel(logging.DEBUG)
2✔
57

58
import os, datetime
2✔
59
from subprocess import Popen, PIPE  # replacement for os.system()
2✔
60
import pandas as pd
2✔
61
import numpy as np 
2✔
62
import warnings
2✔
63

64

65

66
global DATA_PATH # path to data files including module.json.  Global context
67
DATA_PATH = os.path.abspath(os.path.join(os.path.dirname(__file__), 'data'))
2✔
68

69
def _findme(lst, a): #find string match in a list. script from stackexchange
2✔
70
    return [i for i, x in enumerate(lst) if x == a]
2✔
71

72
def _firstlist(l):  #find first not-none value in a list.  useful for checking multiple keys in dict 
2✔
73
    try:
2✔
74
        return next(item for item in l if item is not None)
2✔
75
    except StopIteration:
2✔
76
        return None
2✔
77

78
def _missingKeyWarning(dictype, missingkey, newvalue): # prints warnings 
2✔
79
    if type(newvalue) is bool:
2✔
80
        valueunit = ''
×
81
    else:
82
        valueunit = 'm'
2✔
83
    print("Warning: {} Dictionary Parameters passed, but {} is missing. ".format(dictype, missingkey))        
2✔
84
    print("Setting it to default value of {} {} to continue\n".format(newvalue, valueunit))
2✔
85

86
def _normRGB(r, g, b): #normalize by each color for human vision sensitivity
2✔
87
    return r*0.216+g*0.7152+b*0.0722
2✔
88

89
def _popen(cmd, data_in, data_out=PIPE):
2✔
90
    """
91
    Helper function subprocess.popen replaces os.system
92
    - gives better input/output process control
93
    usage: pass <data_in> to process <cmd> and return results
94
    based on rgbeimage.py (Thomas Bleicher 2010)
95
    """
96
    if type(cmd) == str:
2✔
97
        cmd = str(cmd) # gets rid of unicode oddities
2✔
98
        shell=True
2✔
99
    else:
100
        shell=False
2✔
101

102
    p = Popen(cmd, bufsize=-1, stdin=PIPE, stdout=data_out, stderr=PIPE, shell=shell) #shell=True required for Linux? quick fix, but may be security concern
2✔
103
    data, err = p.communicate(data_in)
2✔
104

105
    if err:
2✔
106
        if data:
2✔
UNCOV
107
            returntuple = (data.decode('latin1'), 'message: '+err.decode('latin1').strip())
×
108
        else:
109
            returntuple = (None, 'message: '+err.decode('latin1').strip())
2✔
110
    else:
111
        if data:
2✔
112
            returntuple = (data.decode('latin1'), None) #Py3 requires decoding
2✔
113
        else:
114
            returntuple = (None, None)
2✔
115

116
    return returntuple
2✔
117

118
def _interactive_load(title=None):
2✔
119
    # Tkinter file picker
120
    import tkinter
×
121
    from tkinter import filedialog
×
122
    root = tkinter.Tk()
×
123
    root.withdraw() #Start interactive file input
×
124
    root.attributes("-topmost", True) #Bring window into foreground
×
125
    return filedialog.askopenfilename(parent=root, title=title) #initialdir = data_dir
×
126

127
def _interactive_directory(title=None):
2✔
128
    # Tkinter directory picker.  Now Py3.6 compliant!
129
    import tkinter
×
130
    from tkinter import filedialog
×
131
    root = tkinter.Tk()
×
132
    root.withdraw() #Start interactive file input
×
133
    root.attributes("-topmost", True) #Bring to front
×
134
    return filedialog.askdirectory(parent=root, title=title)
×
135

136
def _modDict(originaldict, moddict, relative=False):
2✔
137
    '''
138
    Compares keys in originaldict with moddict and updates values of 
139
    originaldict to moddict if existing.
140
    
141
    Parameters
142
    ----------
143
    originaldict : dictionary
144
        Original dictionary calculated, for example frontscan or backscan dictionaries.
145
    moddict : dictionary
146
        Modified dictinoary, for example modscan['xstart'] = 0 to change position of x.
147
    relative : Bool
148
        if passing modscanfront and modscanback to modify dictionarie of positions,
149
        this sets if the values passed to be updated are relative or absolute. 
150
        Default is absolute value (relative=False)
151
            
152
    Returns
153
    -------
154
    originaldict : dictionary
155
        Updated original dictionary with values from moddict.
156
    '''
157
    newdict = originaldict.copy()
2✔
158

159
    for key in moddict:
2✔
160
        try:
2✔
161
            if relative:
2✔
162
                newdict[key] = moddict[key] + newdict[key]
×
163
            else:
164
                newdict[key] = moddict[key]
2✔
165
        except:
×
166
            print("Wrong key in modified dictionary")
×
167
    
168
    return newdict
2✔
169

170
def _heightCasesSwitcher(sceneDict, preferred='hub_height', nonpreferred='clearance_height',
2✔
171
                         suppress_warning=False):
172
        """
173
        
174
        Parameters
175
        ----------
176
        sceneDict : dictionary
177
            Dictionary that might contain more than one way of defining height for 
178
            the array: `clearance_height`, `hub_height`, `height`*
179
            * height deprecated from sceneDict. This function helps choose
180
            * which definition to use.  
181
        preferred : str, optional
182
            When sceneDict has hub_height and clearance_height, or it only has height,
183
            it will leave only the preferred option.. The default is 'hub_height'.
184
        nonpreferred : TYPE, optional
185
            When sceneDict has hub_height and clearance_height, 
186
            it wil ldelete this nonpreferred option. The default is 'clearance_height'.
187
        suppress_warning  :  Bool, default False
188
            If both heights passed in SceneDict, suppress the warning        
189
        
190
        Returns
191
        -------
192
        sceneDict : TYPE
193
            Dictionary now containing the appropriate definition for system height. 
194
        use_clearanceheight : Bool
195
            Helper variable to specify if dictionary has only clearancehet for
196
            use inside `makeScene1axis`. Will get deprecated once that internal
197
            function is streamlined.
198
    
199
        """
200
        # TODO: When we update to python 3.9.0, this could be a Switch Cases (Structural Pattern Matching):
201
    
202
            
203
        heightCases = '_'
2✔
204
        if 'height' in sceneDict:
2✔
205
            heightCases = heightCases+'height__'
2✔
206
        if 'clearance_height' in sceneDict:
2✔
207
            heightCases = heightCases+'clearance_height__'
2✔
208
        if 'hub_height' in sceneDict:
2✔
209
            heightCases = heightCases+'hub_height__'
2✔
210
        
211
        use_clearanceheight = False
2✔
212
        # CASES:
213
        if heightCases == '_height__':
2✔
214
            print("sceneDict Warning: 'height' is being deprecated. "+
2✔
215
                                  "Renaming as "+preferred)
216
            sceneDict[preferred]=sceneDict['height']
2✔
217
            del sceneDict['height']
2✔
218
        
219
        elif heightCases == '_clearance_height__':
2✔
220
            #print("Using clearance_height.")
221
            use_clearanceheight = True
2✔
222
            
223
        elif heightCases == '_hub_height__':
2✔
224
            #print("Using hub_height.'")
225
            pass
2✔
226
        elif heightCases == '_height__clearance_height__':  
2✔
227
            print("sceneDict Warning: 'clearance_height and 'height' "+
2✔
228
                  "(deprecated) are being passed. removing 'height' "+
229
                  "from sceneDict for this tracking routine")
230
            del sceneDict['height']
2✔
231
            use_clearanceheight = True
2✔
232
                            
233
        elif heightCases == '_height__hub_height__':     
2✔
234
            print("sceneDict Warning: 'height' is being deprecated. Using 'hub_height'")
2✔
235
            del sceneDict['height']
2✔
236
        
237
        elif heightCases == '_height__clearance_height__hub_height__':       
2✔
238
            print("sceneDict Warning: 'hub_height', 'clearance_height'"+
×
239
                  ", and 'height' are being passed. Removing 'height'"+
240
                  " (deprecated) and "+ nonpreferred+ ", using "+preferred)
241
            del sceneDict[nonpreferred]
×
242
        
243
        elif heightCases == '_clearance_height__hub_height__':  
2✔
244
            if not suppress_warning:
2✔
245
                print("sceneDict Warning: 'hub_height' and 'clearance_height'"+
2✔
246
                      " are being passed. Using "+preferred+
247
                      " and removing "+ nonpreferred)
248
            del sceneDict[nonpreferred]
2✔
249
    
250
        else: 
251
            print ("sceneDict Error! no argument in sceneDict found "+
×
252
                   "for 'hub_height', 'height' nor 'clearance_height'. "+
253
                   "Exiting routine.")
254
            
255
        return sceneDict, use_clearanceheight
2✔
256

257
def _is_leap_and_29Feb(s): # Removes Feb. 29 if it a leap year.
2✔
258
    return (s.index.year % 4 == 0) & \
2✔
259
           ((s.index.year % 100 != 0) | (s.index.year % 400 == 0)) & \
260
           (s.index.month == 2) & (s.index.day == 29)
261

262
def _subhourlydatatoGencumskyformat(gencumskydata, label='right'):
2✔
263
    # Subroutine to resample, pad, remove leap year and get data in the
264
    # 8760 hourly format
265
    # for saving the temporary files for gencumsky in _saveTempTMY and
266
    # _makeTrackerCSV
267
    
268

269
    #Resample to hourly. Gencumsky wants right-labeled data.
270
    try:
2✔
271
        gencumskydata = gencumskydata.resample('60min', closed='right', label='right').mean()  
2✔
272
    except TypeError: # Pandas 2.0 error
1✔
273
        gencumskydata = gencumskydata.resample('60min', closed='right', label='right').mean(numeric_only=True) 
1✔
274
    
275
    if label == 'left': #switch from left to right labeled by adding an hour
2✔
276
        gencumskydata.index = gencumskydata.index + pd.to_timedelta('1H')
×
277
                     
278

279
    # Padding
280
    tzinfo = gencumskydata.index.tzinfo
2✔
281
    padstart = pd.to_datetime('%s-%s-%s %s:%s' % (gencumskydata.index.year[0],1,1,1,0 ) ).tz_localize(tzinfo)
2✔
282
    padend = pd.to_datetime('%s-%s-%s %s:%s' % (gencumskydata.index.year[0]+1,1,1,0,0) ).tz_localize(tzinfo)
2✔
283
    gencumskydata.iloc[0] = 0  # set first datapt to zero to forward fill w zeros
2✔
284
    gencumskydata.iloc[-1] = 0  # set last datapt to zero to forward fill w zeros
2✔
285
    # check if index exists. I'm sure there is a way to do this backwards.
286
    if any(gencumskydata.index.isin([padstart])):
2✔
287
        print("Data starts on Jan. 01")
2✔
288
    else:
289
        #gencumskydata=gencumskydata.append(pd.DataFrame(index=[padstart]))
290
        gencumskydata=pd.concat([gencumskydata,pd.DataFrame(index=[padstart])])
2✔
291
    if any(gencumskydata.index.isin([padend])):
2✔
292
        print("Data ends on Dec. 31st")
2✔
293
    else:
294
        #gencumskydata=gencumskydata.append(pd.DataFrame(index=[padend]))
295
        gencumskydata=pd.concat([gencumskydata, pd.DataFrame(index=[padend])])
2✔
296
    gencumskydata.loc[padstart]=0
2✔
297
    gencumskydata.loc[padend]=0
2✔
298
    gencumskydata=gencumskydata.sort_index() 
2✔
299
    # Fill empty timestamps with zeros
300
    gencumskydata = gencumskydata.resample('60min').asfreq().fillna(0)
2✔
301
    # Mask leap year
302
    leapmask =  ~(_is_leap_and_29Feb(gencumskydata))
2✔
303
    gencumskydata = gencumskydata[leapmask]
2✔
304

305
    if (gencumskydata.index.year[-1] == gencumskydata.index.year[-2]+1) and len(gencumskydata)>8760:
2✔
306
        gencumskydata = gencumskydata[:-1]
×
307
    return gencumskydata
2✔
308
    # end _subhourlydatatoGencumskyformat   
309

310
def _checkRaypath():
2✔
311
    # Ensure that os.environ['RAYPATH'] exists and contains current directory '.'     
312
    if os.name == 'nt':
2✔
313
        splitter = ';'
×
314
    else:
315
        splitter = ':'
2✔
316
    try:
2✔
317
        raypath = os.getenv('RAYPATH', default=None)
2✔
318
        if not raypath:
2✔
319
            raise KeyError()
2✔
320
        raysplit = raypath.split(splitter)
2✔
321
        if not '.' in raysplit:
2✔
322
            os.environ['RAYPATH'] = splitter.join(filter(None, raysplit + ['.'+splitter]))
2✔
323
    except (KeyError, AttributeError, TypeError):
2✔
324
        raise Exception('No RAYPATH set for RADIANCE.  Please check your RADIANCE installation.')
2✔
325

326
class SuperClass:
2✔
327
      def __repr__(self):
2✔
328
          return str(type(self)) + ' : ' + str({key: self.__dict__[key] for key in self.columns})    
×
329
          #return str(self.__dict__)
330
      @property
2✔
331
      def columns(self):
2✔
332
          return [attr for attr in dir(self) if not (attr.startswith('_') or attr.startswith('methods') 
2✔
333
                                               or attr.startswith('columns') or callable(getattr(self,attr)))]
334
      @property
2✔
335
      def methods(self): 
2✔
336
          return [attr for attr in dir(self) if (not (attr.startswith('_') or attr.startswith('methods') 
2✔
337
                                                      or  attr.startswith('columns')) and callable(getattr(self,attr)))]
338
  
339
    
340

341
class RadianceObj(SuperClass):
2✔
342
    """
343
    The RadianceObj top level class is used to work on radiance objects, 
344
    keep track of filenames,  sky values, PV module configuration, etc.
345

346
    Parameters
347
    ----------
348
    name : text to append to output files
349
    filelist : list of Radiance files to create oconv
350
    nowstr : current date/time string
351
    path : working directory with Radiance materials and objects
352

353
    Methods
354
    -------
355
    __init__ : initialize the object
356
    _setPath : change the working directory
357

358
    """
359
    @property
2✔
360
    def results(self):
2✔
361
        """
362
        Iterate over trackerdict and return irradiance results
363
        following analysis1axis runs
364

365
        Returns
366
        -------
367
        results : Pandas.DataFrame
368
            dataframe containing irradiance scan results.
369
        """
370
        from bifacial_radiance.load import getResults
2✔
371
        
372
        if getattr(self, 'trackerdict', None) is None:
2✔
373
            return None
2✔
374

375
        return getResults(self.trackerdict, self.cumulativesky)
2✔
376
    
377
    def __repr__(self):
2✔
378
        #return str(self.__dict__)  
379
        return str(type(self)) + ' : ' + str({key: self.__dict__[key] for key in self.columns if (key != 'trackerdict') &  (key != 'results') }) 
2✔
380
    def __init__(self, name=None, path=None, hpc=False):
2✔
381
        '''
382
        initialize RadianceObj with path of Radiance materials and objects,
383
        as well as a basename to append to
384

385
        Parameters
386
        ----------
387
        name: string, append temporary and output files with this value
388
        path: location of Radiance materials and objects
389
        hpc:  Keeps track if User is running simulation on HPC so some file 
390
              reading routines try reading a bit longer and some writing 
391
              routines (makeModule) that overwrite themselves are inactivated.
392

393
        Returns
394
        -------
395
        none
396
        '''
397

398
        self.metdata = {}        # data from epw met file
2✔
399
        self.data = {}           # data stored at each timestep
2✔
400
        self.path = ""             # path of working directory
2✔
401
        self.name = ""         # basename to append
2✔
402
        #self.filelist = []         # list of files to include in the oconv
403
        self.materialfiles = []    # material files for oconv
2✔
404
        self.skyfiles = []          # skyfiles for oconv
2✔
405
        #self.radfiles = []      # scene rad files for oconv, compiled from self.scenes
406
        self.scenes = []        # array of scenefiles to be compiled
2✔
407
        self.octfile = []       #octfile name for analysis
2✔
408
        self.Wm2Front = 0       # cumulative tabulation of front W/m2
2✔
409
        self.Wm2Back = 0        # cumulative tabulation of rear W/m2
2✔
410
        self.backRatio = 0      # ratio of rear / front Wm2
2✔
411
        #self.nMods = None        # number of modules per row
412
        #self.nRows = None        # number of rows per scene
413
        self.hpc = hpc           # HPC simulation is being run. Some read/write functions are modified
2✔
414
        self.compiledResults = pd.DataFrame(None) # DataFrame of cumulative results, output from self.calculatePerformance1axis()
2✔
415
        
416
        now = datetime.datetime.now()
2✔
417
        self.nowstr = str(now.date())+'_'+str(now.hour)+str(now.minute)+str(now.second)
2✔
418
        _checkRaypath()       # make sure we have RADIANCE path set up correctly
2✔
419

420
        # DEFAULTS
421

422
        if name is None:
2✔
423
            self.name = self.nowstr  # set default filename for output files
×
424
        else:
425
            self.name = name
2✔
426
        self.basename = name # add backwards compatibility for prior versions
2✔
427
        #self.__name__ = self.name  #optional info
428
        #self.__str__ = self.__name__   #optional info
429
        if path is None:
2✔
430
            self._setPath(os.getcwd())
2✔
431
        else:
432
            self._setPath(path)
2✔
433
        # load files in the /materials/ directory
434
        self.materialfiles = self.returnMaterialFiles('materials')
2✔
435

436

437
    def _setPath(self, path):
2✔
438
        """
439
        setPath - move path and working directory
440

441
        """
442
        self.path = os.path.abspath(path)
2✔
443

444
        print('path = '+ path)
2✔
445
        try:
2✔
446
            os.chdir(self.path)
2✔
447
        except OSError as exc:
2✔
448
            LOGGER.error('Path doesn''t exist: %s' % (path))
2✔
449
            LOGGER.exception(exc)
2✔
450
            raise(exc)
2✔
451

452
        # check for path in the new Radiance directory:
453
        def _checkPath(path):  # create the file structure if it doesn't exist
2✔
454
            if not os.path.exists(path):
2✔
455
                os.makedirs(path)
2✔
456
                print('Making path: '+path)
2✔
457

458
        _checkPath('images'); _checkPath('objects')
2✔
459
        _checkPath('results'); _checkPath('skies'); _checkPath('EPWs')
2✔
460
        # if materials directory doesn't exist, populate it with ground.rad
461
        # figure out where pip installed support files.
462
        from shutil import copy2
2✔
463

464
        if not os.path.exists('materials'):  #copy ground.rad to /materials
2✔
465
            os.makedirs('materials')
2✔
466
            print('Making path: materials')
2✔
467

468
            copy2(os.path.join(DATA_PATH, 'ground.rad'), 'materials')
2✔
469
        # if views directory doesn't exist, create it with two default views - side.vp and front.vp
470
        if not os.path.exists('views'):
2✔
471
            os.makedirs('views')
2✔
472
            with open(os.path.join('views', 'side.vp'), 'w') as f:
2✔
473
                f.write('rvu -vtv -vp -10 1.5 3 -vd 1.581 0 -0.519234 '+
2✔
474
                        '-vu 0 0 1 -vh 45 -vv 45 -vo 0 -va 0 -vs 0 -vl 0')
475
            with open(os.path.join('views', 'front.vp'), 'w') as f:
2✔
476
                f.write('rvu -vtv -vp 0 -3 5 -vd 0 0.894427 -0.894427 '+
2✔
477
                        '-vu 0 0 1 -vh 45 -vv 45 -vo 0 -va 0 -vs 0 -vl 0')
478
            with open(os.path.join('views', 'module.vp'), 'w') as f:
2✔
479
                f.write('rvu -vtv -vp -3 -3 0.3 -vd 0.8139 0.5810 0.0 '+
2✔
480
                        '-vu 0 0 1 -vh 45 -vv 45 -vo 0 -va 0 -vs 0 -vl 0')
481
    def getfilelist(self):
2✔
482
        """ 
483
        Return concat of matfiles, radfiles and skyfiles
484
        """
485

486
        return self.materialfiles + self.skyfiles + self._getradfiles()
2✔
487
    
488
    def _getradfiles(self, scenelist=None):
2✔
489
        """
490
        iterate over self.scenes to get the radfiles
491

492
        Returns
493
        -------
494
        None.
495

496
        """
497
        if scenelist is None:
2✔
498
            scenelist = self.scenes
2✔
499
        a = []
2✔
500
        for scene in scenelist:
2✔
501
            if type(scene.radfiles) == list:
2✔
502
                for f in scene.radfiles:
2✔
503
                    a.append(f) 
2✔
504
            else:
505
                a.append(scene.radfiles)
2✔
506
        return a
2✔
507
        
508
    def save(self, savefile=None):
2✔
509
        """
510
        Pickle the radiance object for further use.
511
        Very basic operation - not much use right now.
512

513
        Parameters
514
        ----------
515
        savefile : str
516
            Optional savefile name, with .pickle extension.
517
            Otherwise default to save.pickle
518

519
        """
520
        
521
        import pickle
2✔
522

523
        if savefile is None:
2✔
524
            savefile = 'save.pickle'
×
525

526
        with open(savefile, 'wb') as f:
2✔
527
            pickle.dump(self, f)
2✔
528
        print('Saved to file {}'.format(savefile))
2✔
529

530
    #def setHPC(self, hpc=True):
531
    #    self.hpc = hpc
532
        
533
    def addMaterial(self, material, Rrefl, Grefl, Brefl, materialtype='plastic', 
2✔
534
                    specularity=0, roughness=0, material_file=None, comment=None, rewrite=True):
535
        """
536
        Function to add a material in Radiance format. 
537

538

539
        Parameters
540
        ----------
541
        material : str
542
            DESCRIPTION.
543
        Rrefl : str
544
            Reflectivity for first wavelength, or 'R' bin.
545
        Grefl : str
546
            Reflecstrtivity for second wavelength, or 'G' bin.
547
        Brefl : str
548
            Reflectivity for third wavelength, or 'B' bin.
549
        materialtype : str, optional
550
            Type of material. The default is 'plastic'. Others can be mirror,
551
            trans, etc. See RADIANCe documentation. 
552
        specularity : str, optional
553
            Ratio of reflection that is specular and not diffuse. The default is 0.
554
        roughness : str, optional
555
            This is the microscopic surface roughness: the more jagged the 
556
            facets are, the rougher it is and more blurry reflections will appear.
557
        material_file : str, optional
558
            DESCRIPTION. The default is None.
559
        comment : str, optional
560
            DESCRIPTION. The default is None.
561
        rewrite : str, optional
562
            DESCRIPTION. The default is True.
563

564
        Returns
565
        -------
566
        None. Just adds the material to the material_file specified or the 
567
        default in ``materials\ground.rad``.
568

569
        References:
570
            See examples of documentation for more materialtype details.
571
            http://www.jaloxa.eu/resources/radiance/documentation/docs/radiance_tutorial.pdf page 10
572
     
573
            Also, you can use https://www.jaloxa.eu/resources/radiance/colour_picker.shtml 
574
            to have a sense of how the material would look with the RGB values as 
575
            well as specularity and roughness.
576

577
            To understand more on reflectivity, specularity and roughness values
578
            https://thinkmoult.com/radiance-specularity-and-roughness-value-examples.html
579
            
580
        """
581
        if material_file is None:
2✔
582
            material_file = 'ground.rad'    
2✔
583
    
584
        matfile = os.path.join('materials', material_file)
2✔
585
        
586
        with open(matfile, 'r') as fp:
2✔
587
            buffer = fp.readlines()
2✔
588
                
589
        # search buffer for material matching requested addition
590
        found = False
2✔
591
        for i in buffer:
2✔
592
            if materialtype and material in i:
2✔
593
                loc = buffer.index(i)
2✔
594
                found = True
2✔
595
                break
2✔
596
        if found:
2✔
597
            if rewrite:            
2✔
598
                print('Material exists, overwriting...\n')
2✔
599
                if comment is None:
2✔
600
                    pre = loc - 1
×
601
                else:
602
                    pre = loc - 2            
2✔
603
                # commit buffer without material match
604
                with open(matfile, 'w') as fp:
2✔
605
                    for i in buffer[0:pre]:
2✔
606
                        fp.write(i)
2✔
607
                    for i in buffer[loc+4:]:
2✔
608
                        fp.write(i)
×
609
        if (found and rewrite) or (not found):
2✔
610
            # append -- This will create the file if it doesn't exist
611
            file_object = open(matfile, 'a')
2✔
612
            file_object.write("\n\n")
2✔
613
            if comment is not None:
2✔
614
                file_object.write("#{}".format(comment))
2✔
615
            file_object.write("\nvoid {} {}".format(materialtype, material))
2✔
616
            if materialtype == 'glass' or materialtype =='mirror':
2✔
617
                file_object.write("\n0\n0\n3 {} {} {}".format(Rrefl, Grefl, Brefl))
×
618
            else:
619
                file_object.write("\n0\n0\n5 {} {} {} {} {}".format(Rrefl, Grefl, Brefl, specularity, roughness))
2✔
620
            file_object.close()
2✔
621
            print('Added material {} to file {}'.format(material, material_file))
2✔
622
        if (found and not rewrite):
2✔
623
            print('Material already exists\n')
2✔
624

625
    def exportTrackerDict(self, trackerdict=None,
2✔
626
                          savefile=None, reindex=None):
627
        """
628
        Use :py:func:`~bifacial_radiance.load._exportTrackerDict` to save a
629
        TrackerDict output as a csv file.
630

631
        Parameters
632
        ----------
633
            trackerdict
634
                The tracker dictionary to save
635
            savefile : str 
636
                path to .csv save file location
637
            reindex : bool
638
                True saves the trackerdict in TMY format, including rows for hours
639
                where there is no sun/irradiance results (empty)
640
                
641
        """
642
        
643
        import bifacial_radiance.load
2✔
644

645
        if trackerdict is None:
2✔
646
            trackerdict = self.trackerdict
2✔
647

648
        if savefile is None:
2✔
649
            savefile = _interactive_load(title='Select a .csv file to save to')
×
650

651
        if reindex is not None:
2✔
652
            reindex = False
2✔
653

654
        if self.cumulativesky is True and reindex is True:
2✔
655
            # don't re-index for cumulativesky,
656
            # which has angles for index
657
            print ("\n Warning: For cumulativesky simulations, exporting the "
×
658
                   "TrackerDict requires reindex = False. Setting reindex = "
659
                   "False and proceeding")
660
            reindex = False
×
661

662
        monthlyyearly = True
2✔
663
        if self.cumulativesky is True:
2✔
664
            monthlyyearly = False
2✔
665
            
666
        bifacial_radiance.load._exportTrackerDict(trackerdict, savefile,
2✔
667
                                                 cumulativesky=self.cumulativesky,
668
                                                 reindex=reindex, monthlyyearly=monthlyyearly)
669

670
    
671
    # loadtrackerdict not updated to match new trackerdict configuration
672
    def loadtrackerdict(self, trackerdict=None, fileprefix=None):
2✔
673
        """
674
        Use :py:class:`bifacial_radiance.load._loadtrackerdict` 
675
        to browse the results directory and load back any results saved in there.
676

677
        Parameters
678
        ----------
679
        trackerdict
680
        fileprefix : str
681

682
        """
683
        from bifacial_radiance.load import loadTrackerDict
2✔
684
        if trackerdict is None:
2✔
685
            trackerdict = self.trackerdict
×
686
        (trackerdict, totaldict) = loadTrackerDict(trackerdict, fileprefix)
2✔
687
        self.Wm2Front = totaldict['Wm2Front']
2✔
688
        self.Wm2Back = totaldict['Wm2Back']
2✔
689
    
690
    def returnOctFiles(self):
2✔
691
        """
692
        Return files in the root directory with `.oct` extension
693

694
        Returns
695
        -------
696
        oct_files : list
697
            List of .oct files
698
        
699
        """
700
        oct_files = [f for f in os.listdir(self.path) if f.endswith('.oct')]
×
701
        #self.oct_files = oct_files
702
        return oct_files
×
703

704
    def returnMaterialFiles(self, material_path=None):
2✔
705
        """
706
        Return files in the Materials directory with .rad extension
707
        appends materials files to the oconv file list
708

709
        Parameters
710
        ----------
711
        material_path : str
712
            Optional parameter to point to a specific materials directory.
713
            otherwise /materials/ is default
714

715
        Returns
716
        -------
717
        material_files : list
718
            List of .rad files
719

720
        """
721
        
722
        if material_path is None:
2✔
723
            material_path = 'materials'
×
724

725
        material_files = [f for f in os.listdir(os.path.join(self.path,
2✔
726
                                                             material_path)) if f.endswith('.rad')]
727

728
        materialfilelist = [os.path.join(material_path, f) for f in material_files]
2✔
729
        self.materialfiles = materialfilelist
2✔
730
        return materialfilelist
2✔
731

732
    '''
2✔
733
    def getResults(self, trackerdict=None):  #DEPRECATED IN FAVOR OF self.results
734
        """
735
        Iterate over trackerdict and return irradiance results
736
        following analysis1axis runs
737

738
        Parameters
739
        ----------
740
        trackerdict : dict, optional
741
            trackerdict, after analysis1axis has been run
742

743
        Returns
744
        -------
745
        results : Pandas.DataFrame
746
            dataframe containing irradiance scan results.
747

748
        """
749
        from bifacial_radiance.load import getResults
750
        
751
        if trackerdict is None:
752
            trackerdict = self.trackerdict
753

754
        return getResults(trackerdict, self.cumulativesky)
755
    '''
756
    
757
    def sceneNames(self, scenes=None):
2✔
758
        if scenes is None: scenes = self.scenes
2✔
759
        return [scene.name for scene in scenes]
2✔
760
    
761
    def setGround(self, material=None, material_file=None):
2✔
762
        """ 
763
        Use GroundObj constructor class and return a ground object
764
        
765
        Parameters
766
        ------------
767
        material : numeric or str
768
            If number between 0 and 1 is passed, albedo input is assumed and assigned.    
769
            If string is passed with the name of the material desired. e.g. 'litesoil',
770
            properties are searched in `material_file`.
771
            Default Material names to choose from: litesoil, concrete, white_EPDM, 
772
            beigeroof, beigeroof_lite, beigeroof_heavy, black, asphalt
773
        material_file : str
774
            Filename of the material information. Default `ground.rad`
775
    
776
        Returns
777
        -------
778
        self.ground : tuple
779
            self.ground.normval : numeric
780
            Normalized color value
781
            self.ground.ReflAvg : numeric
782
            Average reflectance
783
        """
784

785
        if material is None:
2✔
786
            try:
×
787
                if self.metdata.albedo is not None:
×
788
                    material = self.metdata.albedo
×
789
                    print(" Assigned Albedo from metdata.albedo")
×
790
            except:
×
791
                pass
×
792
            
793
        self.ground = GroundObj(material, material_file)
2✔
794

795

796
    def getEPW(self, lat=None, lon=None, GetAll=False):
2✔
797
        """
798
        Subroutine to download nearest epw files to latitude and longitude provided,
799
        into the directory \EPWs\
800
        based on github/aahoo.
801
        
802
        .. warning::
803
            verify=false is required to operate within NREL's network.
804
            to avoid annoying warnings, insecurerequestwarning is disabled
805
            currently this function is not working within NREL's network.  annoying!
806
        
807
        Parameters
808
        ----------
809
        lat : decimal 
810
            Used to find closest EPW file.
811
        lon : decimal 
812
            Longitude value to find closest EPW file.
813
        GetAll : boolean 
814
            Download all available files. Note that no epw file will be loaded into memory
815
        
816
        
817
        """
818

819
        import requests, re
2✔
820
        from requests.packages.urllib3.exceptions import InsecureRequestWarning
2✔
821
        requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
2✔
822
        hdr = {'User-Agent' : "Magic Browser",
2✔
823
               'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'
824
               }
825

826
        path_to_save = 'EPWs' # create a directory and write the name of directory here
2✔
827
        if not os.path.exists(path_to_save):
2✔
828
            os.makedirs(path_to_save)
×
829

830
        def _returnEPWnames():
2✔
831
            ''' return a dataframe with the name, lat, lon, url of available files'''
832
            r = requests.get('https://github.com/NREL/EnergyPlus/raw/develop/weather/master.geojson', verify=False)
2✔
833
            data = r.json() #metadata for available files
2✔
834
            #download lat/lon and url details for each .epw file into a dataframe
835
            df = pd.DataFrame({'url':[], 'lat':[], 'lon':[], 'name':[]})
2✔
836
            for location in data['features']:
2✔
837
                match = re.search(r'href=[\'"]?([^\'" >]+)', location['properties']['epw'])
2✔
838
                if match:
2✔
839
                    url = match.group(1)
2✔
840
                    name = url[url.rfind('/') + 1:]
2✔
841
                    lontemp = location['geometry']['coordinates'][0]
2✔
842
                    lattemp = location['geometry']['coordinates'][1]
2✔
843
                    dftemp = pd.DataFrame({'url':[url], 'lat':[lattemp], 'lon':[lontemp], 'name':[name]})
2✔
844
                    #df = df.append(dftemp, ignore_index=True)
845
                    df = pd.concat([df, dftemp], ignore_index=True)
2✔
846
            return df
2✔
847

848
        def _findClosestEPW(lat, lon, df):
2✔
849
            #locate the record with the nearest lat/lon
850
            errorvec = np.sqrt(np.square(df.lat - lat) + np.square(df.lon - lon))
2✔
851
            index = errorvec.idxmin()
2✔
852
            url = df['url'][index]
2✔
853
            name = df['name'][index]
2✔
854
            return url, name
2✔
855

856
        def _downloadEPWfile(url, path_to_save, name):
2✔
857
            r = requests.get(url, verify=False, headers=hdr)
2✔
858
            if r.ok:
2✔
859
                filename = os.path.join(path_to_save, name)
2✔
860
                # py2 and 3 compatible: binary write, encode text first
861
                with open(filename, 'wb') as f:
2✔
862
                    f.write(r.text.encode('ascii', 'ignore'))
2✔
863
                print(' ... OK!')
2✔
864
            else:
865
                print(' connection error status code: %s' %(r.status_code))
×
866
                r.raise_for_status()
×
867

868
        # Get the list of EPW filenames and lat/lon
869
        df = _returnEPWnames()
2✔
870

871
        # find the closest EPW file to the given lat/lon
872
        if (lat is not None) & (lon is not None) & (GetAll is False):
2✔
873
            url, name = _findClosestEPW(lat, lon, df)
2✔
874

875
            # download the EPW file to the local drive.
876
            print('Getting weather file: ' + name)
2✔
877
            _downloadEPWfile(url, path_to_save, name)
2✔
878
            self.epwfile = os.path.join('EPWs', name)
2✔
879

880
        elif GetAll is True:
×
881
            if input('Downloading ALL EPW files available. OK? [y/n]') == 'y':
×
882
                # get all of the EPW files
883
                for index, row in df.iterrows():
×
884
                    print('Getting weather file: ' + row['name'])
×
885
                    _downloadEPWfile(row['url'], path_to_save, row['name'])
×
886
            self.epwfile = None
×
887
        else:
888
            print('Nothing returned. Proper usage: epwfile = getEPW(lat,lon)')
×
889
            self.epwfile = None
×
890

891
        return self.epwfile
2✔
892
      
893

894

895
    def readWeatherFile(self, weatherFile=None, starttime=None, 
2✔
896
                        endtime=None, label=None, source=None,
897
                        coerce_year=None, tz_convert_val=None):
898
        """
899
        Read either a EPW or a TMY file, calls the functions 
900
        :py:class:`~bifacial_radiance.readTMY` or
901
        :py:class:`~bifacial_radiance.readEPW` 
902
        according to the weatherfile extention and returns a 
903
        :py:class:`~bifacial_radiance.MetObj` .
904
        
905
        Parameters
906
        ----------
907
        weatherFile : str
908
            File containing the weather information. EPW, TMY or solargis accepted.
909
        starttime : str
910
            Limited start time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
911
        endtime : str
912
            Limited end time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
913
        daydate : str  DEPRECATED
914
            For single day in 'MM/DD' or MM_DD format.  Now use starttime and 
915
            endtime set to the same date.
916
        label : str
917
            'left', 'right', or 'center'. For data that is averaged, defines if
918
            the timestamp refers to the left edge, the right edge, or the 
919
            center of the averaging interval, for purposes of calculating 
920
            sunposition. For example, TMY3 data is right-labeled, so 11 AM data 
921
            represents data from 10 to 11, and sun position is calculated 
922
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval 
923
            data and NSRDB uses centered.
924
        source : str
925
            To help identify different types of .csv files. If None, it assumes
926
            it is a TMY3-style formated data. Current options: 'TMY3', 
927
            'solargis', 'EPW', 'SAM'
928
        coerce_year : int
929
            Year to coerce weather data to in YYYY format, ie 2021. 
930
            If more than one year of data in the  weather file, year is NOT coerced.
931
        tz_convert_val : int 
932
            Convert timezone to this fixed value, following ISO standard 
933
            (negative values indicating West of UTC.)
934
        """
935
        #from datetime import datetime
936
        #import warnings
937
        
938
        if weatherFile is None:
2✔
939
            if hasattr(self,'epwfile'):
×
940
                weatherFile = self.epwfile
×
941
            else:
942
                try:
×
943
                    weatherFile = _interactive_load('Select EPW or TMY3 climate file')
×
944
                except:
×
945
                    raise Exception('Interactive load failed. Tkinter not supported'+
×
946
                                    'on this system. Try installing X-Quartz and reloading')
947
        if coerce_year is not None:
2✔
948
            coerce_year = int(coerce_year)
2✔
949
            if str(coerce_year).__len__() != 4:
2✔
950
                warnings.warn('Incorrect coerce_year. Setting to None')
×
951
                coerce_year = None
×
952
                
953

954
        if source is None:
2✔
955
    
956
            if weatherFile[-3:].lower() == 'epw':
2✔
957
                source = 'EPW'
2✔
958
            else:
959
                print('Warning: CSV file passed for input. Assuming it is TMY3'+
2✔
960
                      'style format') 
961
                source = 'TMY3'
2✔
962
            if label is None:
2✔
963
                label = 'right' # EPW and TMY are by deffault right-labeled.
2✔
964

965
        if source.lower() == 'solargis':
2✔
966
            if label is None:
2✔
967
                label = 'center'
2✔
968
            metdata, metadata = self._readSOLARGIS(weatherFile, label=label)
2✔
969

970
        if source.lower() =='epw':
2✔
971
            if label is None:
2✔
972
                label = 'right'
×
973
            metdata, metadata = self._readEPW(weatherFile, label=label)
2✔
974

975
        if source.lower() =='tmy3':
2✔
976
            if label is None:
2✔
977
                label = 'right'
×
978
            metdata, metadata = self._readTMY(weatherFile, label=label)
2✔
979

980
        if source.lower() =='sam':
2✔
981
            if label is None:
×
982
                label = 'left'
×
983
            metdata, metadata = self._readSAM(weatherFile)
×
984
            
985
        self.metdata = self.readWeatherData(metadata, metdata, starttime=starttime, 
2✔
986
                            endtime=endtime,
987
                            coerce_year=coerce_year, label=label,
988
                            tz_convert_val=tz_convert_val)
989
        
990
        return self.metdata
2✔
991
        
992

993
    def readWeatherData(self, metadata, metdata, starttime=None, 
2✔
994
                        endtime=None,
995
                        coerce_year=None, label='center',
996
                        tz_convert_val=None):
997
        """
998
        Intermediate function to read in metadata and metdata objects from 
999
        :py:class:`~bifacial_radiance.readWeatherFile` and export a 
1000
        :py:class:`~bifacial_radiance.MetObj` 
1001
        
1002
        Parameters
1003
        ----------
1004
        metadata : dict
1005
            Dictionary with metadata stats. keys required: 'lat', 'lon', 'altitude',
1006
            'TZ'
1007
        metdata : Pandas DataFrame
1008
            Dataframe with meteo timeseries. Index needs to be datetimelike and TZ-aware.
1009
            columns required: 'DNI', 'DHI','GHI', 'Alb'
1010
        starttime : str, optional
1011
            Limited start time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
1012
        endtime : str, optional
1013
            Limited end time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
1014
        label : str
1015
            'left', 'right', or 'center'. For data that is averaged, defines if
1016
            the timestamp refers to the left edge, the right edge, or the 
1017
            center of the averaging interval, for purposes of calculating 
1018
            sunposition. For example, TMY3 data is right-labeled, so 11 AM data 
1019
            represents data from 10 to 11, and sun position is calculated 
1020
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval 
1021
            data and NSRDB uses centered.
1022
        coerce_year : int
1023
            Year to coerce weather data to in YYYY format, ie 2021. 
1024
            If more than one year of data in the  weather file, year is NOT coerced.
1025
        tz_convert_val : int 
1026
            Convert timezone to this fixed value, following ISO standard 
1027
            (negative values indicating West of UTC.)
1028
        """
1029
        
1030
        def _parseTimes(t, hour, coerce_year):
2✔
1031
            '''
1032
            parse time input t which could be string mm_dd_HH or YYYY-mm-dd_HHMM
1033
            or datetime.datetime object.  Return pd.datetime object.  Define
1034
            hour as hour input if not passed directly.
1035
            '''
1036
            import re
2✔
1037
            
1038
            if type(t) == str:
2✔
1039
                try:
2✔
1040
                    tsplit = re.split('-|_| ', t)
2✔
1041
                    
1042
                    #mm_dd format
1043
                    if tsplit.__len__() == 2 and t.__len__() == 5: 
2✔
1044
                        if coerce_year is None:
2✔
1045
                                coerce_year = 2021 #default year. 
2✔
1046
                        tsplit.insert(0,str(coerce_year))
2✔
1047
                        tsplit.append(str(hour).rjust(2,'0')+'00')
2✔
1048
                        
1049
                    #mm_dd_hh or YYYY_mm_dd format
1050
                    elif tsplit.__len__() == 3 :
2✔
1051
                        if tsplit[0].__len__() == 2:
2✔
1052
                            if coerce_year is None:
2✔
1053
                                coerce_year = 2021 #default year. 
2✔
1054
                            tsplit.insert(0,str(coerce_year))
2✔
1055
                        elif tsplit[0].__len__() == 4:
2✔
1056
                            tsplit.append(str(hour).rjust(2,'0')+'00')
2✔
1057
                            
1058
                    #YYYY-mm-dd_HHMM  format
1059
                    if tsplit.__len__() == 4 and tsplit[0].__len__() == 4:
2✔
1060
                        t_out = pd.to_datetime(''.join(tsplit).ljust(12,'0') ) 
2✔
1061
                    
1062
                    else:
1063
                        raise Exception(f'incorrect time string passed {t}.'
×
1064
                                        'Valid options: mm_dd, mm_dd_HH, '
1065
                                        'mm_dd_HHMM, YYYY-mm-dd_HHMM')  
1066
                except Exception as e:
×
1067
                    # Error for incorrect string passed:
1068
                    raise(e)
×
1069
            else:  #datetime or timestamp
1070
                try:
2✔
1071
                    t_out = pd.to_datetime(t)
2✔
1072
                except pd.errors.ParserError:
×
1073
                    print('incorrect time object passed.  Valid options: '
×
1074
                          'string or datetime.datetime or pd.timeIndex. You '
1075
                          f'passed {type(t)}.')
1076
            return t_out, coerce_year
2✔
1077
        # end _parseTimes
1078
        
1079
        def _tz_convert(metdata, metadata, tz_convert_val):
2✔
1080
            """
1081
            convert metdata to a different local timzone.  Particularly for 
1082
            SolarGIS weather files which are returned in UTC by default.
1083
            ----------
1084
            tz_convert_val : int
1085
                Convert timezone to this fixed value, following ISO standard 
1086
                (negative values indicating West of UTC.)
1087
            Returns: metdata, metadata  
1088
            """
1089
            import pytz
2✔
1090
            if (type(tz_convert_val) == int) | (type(tz_convert_val) == float):
2✔
1091
                metadata['TZ'] = tz_convert_val
2✔
1092
                metdata = metdata.tz_convert(pytz.FixedOffset(tz_convert_val*60))
2✔
1093
            return metdata, metadata
2✔
1094
        # end _tz_convert
1095
        
1096
        def _correctMetaKeys(m):
2✔
1097
            # put correct keys on m = metadata dict
1098

1099
            m['altitude'] = _firstlist([m.get('altitude'), m.get('elevation')])
2✔
1100
            m['TZ'] = _firstlist([m.get('TZ'), m.get('Time Zone'), m.get('timezone')])
2✔
1101
           
1102
            if not m.get('city'):
2✔
1103
                try:
2✔
1104
                    m['city'] = (m['county'] + ',' + m['state'] +
2✔
1105
                                        ',' + m['country'])
1106
                except KeyError:
2✔
1107
                    m['city'] = '-'
2✔
1108
            m['Name'] = _firstlist([m.get('Name'), m.get('city'), m.get('county'), 
2✔
1109
                                    f"nsrdb_{m.get('Location ID')}"])    
1110
            
1111
            return m
2✔
1112
        
1113
        
1114
        metadata = _correctMetaKeys(metadata)
2✔
1115

1116
        metdata.rename(columns={'dni': 'DNI',
2✔
1117
                                'dhi': 'DHI',
1118
                                'ghi': 'GHI',
1119
                                'air_temperature': 'DryBulb',
1120
                                'wind_speed': 'Wspd',
1121
                                'surface_albedo': 'Alb'
1122
                                }, inplace=True)        
1123
        
1124
    
1125
        metdata, metadata = _tz_convert(metdata, metadata, tz_convert_val)
2✔
1126
        tzinfo = metdata.index.tzinfo
2✔
1127
        tempMetDatatitle = 'metdata_temp.csv'
2✔
1128

1129
        # Parse the start and endtime strings. 
1130
        if starttime is not None:
2✔
1131
            starttime, coerce_year = _parseTimes(starttime, 1, coerce_year)
2✔
1132
            starttime = starttime.tz_localize(tzinfo)
2✔
1133
        if endtime is not None:
2✔
1134
            endtime, coerce_year = _parseTimes(endtime, 23, coerce_year)
2✔
1135
            endtime = endtime.tz_localize(tzinfo)
2✔
1136
        '''
2✔
1137
        #TODO: do we really need this check?
1138
        if coerce_year is not None and starttime is not None:
1139
            if coerce_year != starttime.year or coerce_year != endtime.year:
1140
                print("Warning: Coerce year does not match requested sampled "+
1141
                      "date(s)'s years. Setting Coerce year to None.")
1142
                coerce_year = None
1143
        '''        
1144

1145
        tmydata_trunc = self._saveTempTMY(metdata, filename=tempMetDatatitle, 
2✔
1146
                                          starttime=starttime, endtime=endtime, 
1147
                                          coerce_year=coerce_year,
1148
                                          label=label)
1149

1150
        if tmydata_trunc.__len__() > 0:
2✔
1151
            self.metdata = MetObj(tmydata_trunc, metadata, label = label)
2✔
1152
        else:
1153
            self.metdata = None
×
1154
            raise Exception('Weather file returned zero points for the '
×
1155
                  'starttime / endtime  provided')
1156
        
1157
        
1158
        return self.metdata
2✔
1159
        
1160

1161

1162
    def _saveTempTMY(self, tmydata, filename=None, starttime=None, endtime=None, 
2✔
1163
                     coerce_year=None, label=None):
1164
        '''
1165
        private function to save part or all of tmydata into /EPWs/ for use 
1166
        in gencumsky -G mode and return truncated  tmydata. Gencumsky 8760
1167
        starts with Jan 1, 1AM and ends Dec 31, 2400
1168
        
1169
        starttime:  tz-localized pd.TimeIndex
1170
        endtime:    tz-localized pd.TimeIndex
1171
        
1172
        returns: tmydata_truncated  : subset of tmydata based on start & end
1173
        '''
1174
        
1175
        
1176
        if filename is None:
2✔
1177
            filename = 'temp.csv'
×
1178
                        
1179
        gencumskydata = None
2✔
1180
        gencumdict = None
2✔
1181
        if len(tmydata) == 8760: 
2✔
1182
            print("8760 line in WeatherFile. Assuming this is a standard hourly"+
2✔
1183
                  " WeatherFile for the year for purposes of saving Gencumulativesky"+
1184
                  " temporary weather files in EPW folder.")
1185
            if coerce_year is None and starttime is not None:
2✔
1186
                coerce_year = starttime.year
2✔
1187

1188
            elif coerce_year is None and len(tmydata.index[:-1].year.unique())>1:
2✔
1189
                coerce_year = 2021                
2✔
1190
            
1191
            if coerce_year:
2✔
1192
                print(f"Coercing year to {coerce_year}")
2✔
1193
                tz = tmydata.index.tz
2✔
1194
                year_vector = np.full(shape=len(tmydata), fill_value=coerce_year)
2✔
1195
                year_vector[-1] = coerce_year+1
2✔
1196
                tmydata.index =  pd.to_datetime({
2✔
1197
                                    'year': year_vector,
1198
                                    'month': tmydata.index.month,
1199
                                    'day': tmydata.index.day,
1200
                                    'hour': tmydata.index.hour})
1201
                
1202
                tmydata = tmydata.tz_localize(tz)
2✔
1203

1204

1205

1206
            # FilterDates
1207
            filterdates = None
2✔
1208
            if starttime is not None and endtime is not None:
2✔
1209
                starttime
2✔
1210
                filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
2✔
1211
            else:
1212
                if starttime is not None:
2✔
1213
                    filterdates = (tmydata.index >= starttime)
2✔
1214
                if endtime is not None:
2✔
1215
                    filterdates = (tmydata.index <= endtime)
×
1216
            
1217
            if filterdates is not None:
2✔
1218
                print("Filtering dates")
2✔
1219
                tmydata[~filterdates] = 0
2✔
1220
        
1221
            gencumskydata = tmydata.copy()
2✔
1222
            
1223
        else:
1224
            if len(tmydata.index.year.unique()) == 1:
2✔
1225
                if coerce_year:
×
1226
                    # TODO: check why subhourly data still has 0 entries on the next day on _readTMY3
1227
                    # in the meantime, let's make Silvana's life easy by just deletig 0 entries
1228
                    tmydata = tmydata[~(tmydata.index.hour == 0)] 
×
1229
                    print(f"Coercing year to {coerce_year}")
×
1230
                    # TODO: this coercing shows a python warning. Turn it off or find another method? bleh.
1231
                    tmydata.index.values[:] = tmydata.index[:] + pd.DateOffset(year=(coerce_year))
×
1232
        
1233
                # FilterDates
1234
                filterdates = None
×
1235
                if starttime is not None and endtime is not None:
×
1236
                    filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
×
1237
                else:
1238
                    if starttime is not None:
×
1239
                        filterdates = (tmydata.index >= starttime)
×
1240
                    if endtime is not None:
×
1241
                        filterdates = (tmydata.index <= endtime)
×
1242
                
1243
                if filterdates is not None:
×
1244
                    print("Filtering dates")
×
1245
                    tmydata[~filterdates] = 0
×
1246
        
1247
                gencumskydata = tmydata.copy()
×
1248
                gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata, 
×
1249
                                                                label=label)
1250
        
1251
            else:
1252
                if coerce_year:
2✔
1253
                    print("More than 1 year of data identified. Can't do coercing")
×
1254
                
1255
                # Check if years are consecutive
1256
                l = list(tmydata.index.year.unique())
2✔
1257
                if l != list(range(min(l), max(l)+1)):
2✔
1258
                    print("Years are not consecutive. Won't be able to use Gencumsky"+
×
1259
                          " because who knows what's going on with this data.")
1260
                else:
1261
                    print("Years are consecutive. For Gencumsky, make sure to select"+
2✔
1262
                          " which yearly temporary weather file you want to use"+
1263
                          " else they will all get accumulated to same hour/day")
1264
                    
1265
                    # FilterDates
1266
                    filterdates = None
2✔
1267
                    if starttime is not None and endtime is not None:
2✔
1268
                        filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
×
1269
                    else:
1270
                        if starttime is not None:
2✔
1271
                            filterdates = (tmydata.index >= starttime)
×
1272
                        if endtime is not None:
2✔
1273
                            filterdates = (tmydata.index <= endtime)
×
1274
                    
1275
                    if filterdates is not None:
2✔
1276
                        print("Filtering dates")
×
1277
                        tmydata = tmydata[filterdates] # Reducing years potentially
×
1278
        
1279
                    # Checking if filtering reduced to just 1 year to do usual savin.
1280
                    if len(tmydata.index.year.unique()) == 1:
2✔
1281
                        gencumskydata = tmydata.copy()
×
1282
                        gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata,
×
1283
                                                                        label=label)
1284

1285
                    else:
1286
                        gencumdict = [g for n, g in tmydata.groupby(pd.Grouper(freq='Y'))]
2✔
1287
                        
1288
                        for ii in range(0, len(gencumdict)):
2✔
1289
                            gencumskydata = gencumdict[ii]
2✔
1290
                            gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata,
2✔
1291
                                                                            label=label)
1292
                            gencumdict[ii] = gencumskydata
2✔
1293
                        
1294
                        gencumskydata = None # clearing so that the dictionary style can be activated.
2✔
1295
        
1296
        
1297
        # Let's save files in EPWs folder for Gencumsky     
1298
        if gencumskydata is not None:
2✔
1299
            csvfile = os.path.join('EPWs', filename)
2✔
1300
            print('Saving file {}, # points: {}'.format(csvfile, gencumskydata.__len__()))
2✔
1301
            gencumskydata.to_csv(csvfile, index=False, header=False, sep=' ', columns=['GHI','DHI'])
2✔
1302
            self.gencumsky_metfile = csvfile
2✔
1303
        
1304
        if gencumdict is not None:
2✔
1305
            self.gencumsky_metfile = []
2✔
1306
            for ii in range (0, len(gencumdict)):
2✔
1307
                gencumskydata = gencumdict[ii]
2✔
1308
                newfilename = filename.split('.')[0]+'_year_'+str(ii)+'.csv'
2✔
1309
                csvfile = os.path.join('EPWs', newfilename)
2✔
1310
                print('Saving file {}, # points: {}'.format(csvfile, gencumskydata.__len__()))
2✔
1311
                gencumskydata.to_csv(csvfile, index=False, header=False, sep=' ', columns=['GHI','DHI'])
2✔
1312
                self.gencumsky_metfile.append(csvfile)
2✔
1313

1314
        return tmydata
2✔
1315

1316
        
1317
    def _readTMY(self, tmyfile=None, label = 'right', coerce_year=None):
2✔
1318
        '''
1319
        use pvlib to read in a tmy3 file.
1320
        Note: pvlib 0.7 does not currently support sub-hourly files. Until
1321
        then, use _readTMYdate() to create the index
1322

1323
        Parameters
1324
        ------------
1325
        tmyfile : str
1326
            Filename of tmy3 to be read with pvlib.tmy.readtmy3
1327
        label : str
1328
            'left', 'right', or 'center'. For data that is averaged, defines if
1329
            the timestamp refers to the left edge, the right edge, or the 
1330
            center of the averaging interval, for purposes of calculating 
1331
            sunposition. For example, TMY3 data is right-labeled, so 11 AM data 
1332
            represents data from 10 to 11, and sun position is calculated 
1333
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval 
1334
            data and NSRDB uses centered.
1335
        coerce_year : int
1336
            Year to coerce to. Default is 2021. 
1337
        
1338
        Returns
1339
        -------
1340
        metdata - MetObj collected from TMY3 file
1341
        '''
1342
        def _convertTMYdate(data, meta):
2✔
1343
            ''' requires pvlib 0.8, updated to handle subhourly timestamps '''
1344
            # get the date column as a pd.Series of numpy datetime64
1345
            data_ymd = pd.to_datetime(data['Date (MM/DD/YYYY)'])
2✔
1346
            # shift the time column so that midnite is 00:00 instead of 24:00
1347
            shifted_hour = data['Time (HH:MM)'].str[:2].astype(int) % 24
2✔
1348
            minute = data['Time (HH:MM)'].str[3:].astype(int) 
2✔
1349
            # shift the dates at midnite so they correspond to the next day
1350
            data_ymd[shifted_hour == 0] += datetime.timedelta(days=1)
2✔
1351
            # NOTE: as of pandas>=0.24 the pd.Series.array has a month attribute, but
1352
            # in pandas-0.18.1, only DatetimeIndex has month, but indices are immutable
1353
            # so we need to continue to work with the panda series of dates `data_ymd`
1354
            data_index = pd.DatetimeIndex(data_ymd)
2✔
1355
            # use indices to check for a leap day and advance it to March 1st
1356
            leapday = (data_index.month == 2) & (data_index.day == 29)
2✔
1357
            data_ymd[leapday] += datetime.timedelta(days=1)
2✔
1358
            # shifted_hour is a pd.Series, so use pd.to_timedelta to get a pd.Series of
1359
            # timedeltas
1360
            # NOTE: as of pvlib-0.6.3, min req is pandas-0.18.1, so pd.to_timedelta
1361
            # unit must be in (D,h,m,s,ms,us,ns), but pandas>=0.24 allows unit='hour'
1362
            data.index = (data_ymd + pd.to_timedelta(shifted_hour, unit='h') +
2✔
1363
                         pd.to_timedelta(minute, unit='min') )
1364

1365
            data = data.tz_localize(int(meta['TZ'] * 3600))
2✔
1366
            
1367
            return data
2✔
1368
        
1369
        
1370
        import pvlib
2✔
1371
        #(tmydata, metadata) = pvlib.tmy.readtmy3(filename=tmyfile) #pvlib<=0.6
1372
        try:
2✔
1373
            (tmydata, metadata) = pvlib.iotools.tmy.read_tmy3(filename=tmyfile,
2✔
1374
                                                          coerce_year=coerce_year,
1375
                                                          map_variables=True)
1376
        except TypeError:  # pvlib < 0.10
1✔
1377
            (tmydata, metadata) = pvlib.iotools.tmy.read_tmy3(filename=tmyfile,
1✔
1378
                                                          coerce_year=coerce_year)
1379
        
1380
        try:
2✔
1381
            tmydata = _convertTMYdate(tmydata, metadata) 
2✔
1382
        except KeyError:
×
1383
            print('PVLib >= 0.8.0 is required for sub-hourly data input')
×
1384
        
1385
        tmydata.rename(columns={'dni':'DNI',
2✔
1386
                                'dhi':'DHI',
1387
                                'temp_air':'DryBulb',
1388
                                'wind_speed':'Wspd',
1389
                                'ghi':'GHI',
1390
                                'albedo':'Alb'
1391
                                }, inplace=True)  #as of v0.11, PVLib changed tmy3 column names..
1392

1393
        return tmydata, metadata
2✔
1394

1395
    def _readSAM(self, SAMfile=None):
2✔
1396
        '''
1397
        use pvlib to read in a tmy3 file.
1398
        Note: pvlib 0.7 does not currently support sub-hourly files. Until
1399
        then, use _readTMYdate() to create the index
1400

1401
        Parameters
1402
        ------------
1403
        tmyfile : str
1404
            Filename of tmy3 to be read with pvlib.tmy.readtmy3
1405

1406
        Returns
1407
        -------
1408
        tmydata - Weather dataframe
1409
        metdata - MetObj collected from SAM file
1410
        '''
1411
        
1412
        # Will only work with latest PVLIB Release once htey accept my push..
1413
        # Note Oct. 10
1414
        # import pvlib
1415
        #(tmydata, metadata) = pvlib.iotools.tmy.read_psm3(filename=SAMfile,
1416
        #                                                  map_variables=True)
1417
        with open(SAMfile) as myfile:
×
1418
            head = next(myfile)#
×
1419
            meta = next(myfile)
×
1420
        
1421
        meta2=meta.split(',')
×
1422
        meta2[-1] = meta2[-1][:-1] # Remove the carryover sig
×
1423
        
1424
        head2 = head.split(',')
×
1425
        head2[-1] = head2[-1][:-1] 
×
1426
        
1427
        res = {head2[i]: meta2[i] for i in range(len(head2))}
×
1428
        
1429

1430
        data = pd.read_csv(SAMfile, skiprows=2)
×
1431
        
1432
        metadata = {}
×
1433
        metadata['TZ'] = float(res['Time Zone'])
×
1434
        metadata['latitude'] = float(res['Latitude'])
×
1435
        metadata['longitude'] = float(res['Longitude'])
×
1436
        metadata['altitude'] = float(res['Elevation'])
×
1437
        metadata['city'] = res['Source']
×
1438
        
1439
        allcaps = False
×
1440
        if 'Year' in data.columns:
×
1441
            allcaps = True
×
1442
            
1443
        if allcaps:
×
1444
            if 'Minute' in data.columns:
×
1445
                dtidx = pd.to_datetime(
×
1446
                    data[['Year', 'Month', 'Day', 'Hour', 'Minute']])
1447
            else: 
1448
                dtidx = pd.to_datetime(
×
1449
                    data[['Year', 'Month', 'Day', 'Hour']])
1450
        else:
1451
            if 'minute' in data.columns:
×
1452
                dtidx = pd.to_datetime(
×
1453
                    data[['year', 'month', 'day', 'hour', 'minute']])
1454
            else: 
1455
                dtidx = pd.to_datetime(
×
1456
                    data[['year', 'month', 'day', 'hour']])
1457

1458
        # in USA all timezones are integers
1459
        tz = 'Etc/GMT%+d' % -metadata['TZ']
×
1460
        data.index = pd.DatetimeIndex(dtidx).tz_localize(tz)
×
1461

1462
    
1463

1464
        data.rename(columns={'Temperature':'temp_air'}, inplace=True) 
×
1465
        data.rename(columns={'Surface Albedo':'Alb'}, inplace=True) 
×
1466
        data.rename(columns={'wspd':'wind_speed'}, inplace=True) 
×
1467
        data.rename(columns={'Wind Speed':'wind_speed'}, inplace=True) 
×
1468
        data.rename(columns={'Pressure':'pressure'}, inplace=True) 
×
1469
        data.rename(columns={'Dew Point':'dewpoint'}, inplace=True) 
×
1470

1471
        data.rename(columns={'tdry':'DryBulb'}, inplace=True) 
×
1472
        data.rename(columns={'Tdry':'DryBulb'}, inplace=True) 
×
1473
        data.rename(columns={'dni':'DNI'}, inplace=True) 
×
1474
        data.rename(columns={'dhi':'DHI'}, inplace=True) 
×
1475
        data.rename(columns={'ghi':'GHI'}, inplace=True) 
×
1476

1477
        data.rename(columns={'pres':'atmospheric_pressure'}, inplace=True) 
×
1478
        data.rename(columns={'Tdew':'temp_dew'}, inplace=True) 
×
1479
        data.rename(columns={'albedo':'Alb'}, inplace=True) 
×
1480
        
1481
        print("COLUMN DATAS", data.keys())
×
1482

1483
        tmydata = data
×
1484
        
1485
        return tmydata, metadata
×
1486

1487
    def _readEPW(self, epwfile=None, label = 'right', coerce_year=None):
2✔
1488
        """
1489
        Uses readepw from pvlib>0.6.1 but un-do -1hr offset and
1490
        rename columns to match TMY3: DNI, DHI, GHI, DryBulb, Wspd
1491
    
1492
        Parameters
1493
        ------------
1494
        epwfile : str
1495
            Direction and filename of the epwfile. If None, opens an interactive
1496
            loading window.
1497
        label : str
1498
            'left', 'right', or 'center'. For data that is averaged, defines if
1499
            the timestamp refers to the left edge, the right edge, or the 
1500
            center of the averaging interval, for purposes of calculating 
1501
            sunposition. For example, TMY3 data is right-labeled, so 11 AM data 
1502
            represents data from 10 to 11, and sun position is calculated 
1503
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval 
1504
            data and NSRDB uses centered.
1505
        coerce_year : int
1506
            Year to coerce data to.
1507
        
1508
        """
1509
        
1510
        import pvlib
2✔
1511
        #import re
1512
        
1513
        '''
2✔
1514
        NOTE: In PVLib > 0.6.1 the new epw.read_epw() function reads in time 
1515
        with a default -1 hour offset.  This is reflected in our existing
1516
        workflow. 
1517
        '''
1518
        #(tmydata, metadata) = readepw(epwfile) #
1519
        (tmydata, metadata) = pvlib.iotools.epw.read_epw(epwfile, 
2✔
1520
                                                         coerce_year=coerce_year) #pvlib>0.6.1
1521
        #pvlib uses -1hr offset that needs to be un-done. 
1522
        tmydata.index = tmydata.index+pd.Timedelta(hours=1) 
2✔
1523

1524
        # rename different field parameters to match output from 
1525
        # pvlib.tmy.readtmy: DNI, DHI, DryBulb, Wspd
1526
        tmydata.rename(columns={'dni':'DNI',
2✔
1527
                                'dhi':'DHI',
1528
                                'temp_air':'DryBulb',
1529
                                'wind_speed':'Wspd',
1530
                                'ghi':'GHI',
1531
                                'albedo':'Alb'
1532
                                }, inplace=True)    
1533

1534
        return tmydata, metadata
2✔
1535

1536

1537
    def _readSOLARGIS(self, filename=None, label='center'):
2✔
1538
        """
1539
        Read solarGIS data file which is timestamped in UTC.
1540
        rename columns to match TMY3: DNI, DHI, GHI, DryBulb, Wspd
1541
        Timezone is always returned as UTC. Use tz_convert in readWeatherFile
1542
        to manually convert to local time
1543
    
1544
        Parameters
1545
        ------------
1546
        filename : str
1547
            filename of the solarGIS file. 
1548
        label : str
1549
            'left', 'right', or 'center'. For data that is averaged, defines if
1550
            the timestamp refers to the left edge, the right edge, or the 
1551
            center of the averaging interval. SolarGis default style is center,
1552
            unless user requests a right label. 
1553
       
1554
        """
1555
        # file format: anything with # preceding is in the header
1556
        header = []; lat = None; lon = None; elev = None; name = None
2✔
1557
        with open(filename, 'r') as result:
2✔
1558
            for line in result:
2✔
1559
                if line.startswith('#'):
2✔
1560
                    header.append(line)
2✔
1561
                    if line.startswith('#Latitude:'):
2✔
1562
                        lat = line[11:]
2✔
1563
                    if line.startswith('#Longitude:'):
2✔
1564
                        lon = line[12:]
2✔
1565
                    if line.startswith('#Elevation:'):
2✔
1566
                        elev = line[12:17]
2✔
1567
                    if line.startswith('#Site name:'):
2✔
1568
                        name = line[12:-1]
2✔
1569
                else:
1570
                    break
2✔
1571
        metadata = {'latitude':float(lat),
2✔
1572
                    'longitude':float(lon),
1573
                    'altitude':float(elev),
1574
                    'Name':name,
1575
                    'TZ':0.0}
1576
        # read in remainder of data
1577
        data = pd.read_csv(filename,skiprows=header.__len__(), delimiter=';')
2✔
1578

1579
        # rename different field parameters to match output from 
1580
        # pvlib.tmy.readtmy: DNI, DHI, DryBulb, Wspd
1581
        data.rename(columns={'DIF':'DHI',
2✔
1582
                             'TEMP':'DryBulb',
1583
                             'WS':'Wspd',
1584
                             }, inplace=True)    
1585

1586
        # Generate index from Date (DD.HH.YYYY) and Time
1587
        data.index = pd.to_datetime(data.Date + ' ' +  data.Time, 
2✔
1588
                                    dayfirst=True, utc=True,
1589
                                    infer_datetime_format = True)
1590

1591
        
1592
        return data, metadata
2✔
1593

1594

1595
    def getSingleTimestampTrackerAngle(self, timeindex, metdata=None, gcr=None, 
2✔
1596
                                       azimuth=180, axis_tilt=0, 
1597
                                       limit_angle=45, backtrack=True):
1598
        """
1599
        Helper function to calculate a tracker's angle for use with the 
1600
        fixed tilt routines of bifacial_radiance. It calculates tracker angle for
1601
        sun position at the timeindex passed (no left or right time offset, 
1602
        label = 'center')
1603
        
1604
        Parameters
1605
        ----------
1606
        timeindex : int
1607
            Index between 0 to ~4000 indicating hour to simulate.
1608
        metdata : :py:class:`~bifacial_radiance.MetObj` 
1609
            Meterological object to set up geometry. Usually set automatically by
1610
            `bifacial_radiance` after running :py:class:`bifacial_radiance.readepw`. 
1611
            Default = self.metdata
1612
        gcr : float
1613
            Ground coverage ratio for calculation backtracking. Defualt [1.0/3.0] 
1614
        azimuth : float or int
1615
            Orientation axis of tracker torque tube. Default North-South (180 deg)
1616
        axis_tilt : float or int
1617
            Default 0. Axis tilt -- not implemented in sensors locations so it's pointless
1618
            at this release to change it.
1619
        limit_angle : float or int
1620
            Limit angle (+/-) of the 1-axis tracker in degrees. Default 45
1621
        backtrack : boolean
1622
            Whether backtracking is enabled (default = True)
1623
        
1624
        """
1625
        '''
2✔
1626
        elev = metdata.elevation
1627
        lat = metdata.latitude
1628
        lon = metdata.longitude
1629
        timestamp = metdata.datetime[timeindex]
1630
        '''
1631
        
1632
        import pvlib
2✔
1633
        
1634
        if not metdata:
2✔
1635
            metdata = self.metdata        
×
1636
        solpos = metdata.solpos.iloc[timeindex]
2✔
1637
        sunzen = float(solpos.apparent_zenith)
2✔
1638
        sunaz = float(solpos.azimuth) # not substracting the 180 
2✔
1639
        
1640
        trackingdata = pvlib.tracking.singleaxis(sunzen, sunaz,
2✔
1641
                                             axis_tilt, azimuth,
1642
                                             limit_angle, backtrack, gcr)
1643
        
1644
        tracker_theta = float(np.round(trackingdata['tracker_theta'],2))
2✔
1645
        tracker_theta = tracker_theta*-1 # bifacial_radiance uses East (morning) theta as positive
2✔
1646
            
1647
        return tracker_theta
2✔
1648

1649

1650
    def gendaylit(self, timeindex, metdata=None, debug=False):
2✔
1651
        """
1652
        Sets and returns sky information using gendaylit.
1653
        Uses PVLIB for calculating the sun position angles instead of
1654
        using Radiance internal sun position calculation (for that use gendaylit function)
1655
        
1656
        Parameters
1657
        ----------
1658
        timeindex : int
1659
            Index from 0 to ~4000 of the MetObj (daylight hours only)
1660
        metdata : ``MetObj``
1661
            MetObj object with list of dni, dhi, ghi and location
1662
        debug : bool
1663
            Flag to print output of sky DHI and DNI
1664

1665
        Returns
1666
        -------
1667
        skyname : str
1668
            Sets as a self.skyname and returns filename of sky in /skies/ directory. 
1669
            If errors exist, such as DNI = 0 or sun below horizon, this skyname is None
1670

1671
        """
1672
        #import warnings
1673
 
1674
        if metdata is None:
2✔
1675
            try:
2✔
1676
                metdata = self.metdata
2✔
1677
            except:
×
1678
                print('usage: pass metdata, or run after running ' +
×
1679
                      'readWeatherfile() ') 
1680
                return
×
1681

1682
        ground = self.ground
2✔
1683
        
1684
        locName = metdata.city
2✔
1685
        dni = metdata.dni[timeindex]
2✔
1686
        dhi = metdata.dhi[timeindex]
2✔
1687
        ghi = metdata.ghi[timeindex]
2✔
1688
        elev = metdata.elevation
2✔
1689
        lat = metdata.latitude
2✔
1690
        lon = metdata.longitude
2✔
1691

1692
        # Assign Albedos
1693
        try:
2✔
1694
            if ground.ReflAvg.shape == metdata.dni.shape:
2✔
1695
                groundindex = timeindex  
2✔
1696
            elif self.ground.ReflAvg.shape[0] == 1: # just 1 entry
2✔
1697
                groundindex = 0
2✔
1698
            else:
1699
                warnings.warn("Shape of ground Albedos and TMY data do not match.")
×
1700
                return
×
1701
        except:
×
1702
            print('usage: make sure to run setGround() before gendaylit()')
×
1703
            return
×
1704

1705
        if debug is True:
2✔
1706
            print('Sky generated with Gendaylit, with DNI: %0.1f, DHI: %0.1f' % (dni, dhi))
2✔
1707
            print("Datetime TimeIndex", metdata.datetime[timeindex])
2✔
1708

1709

1710

1711
        #Time conversion to correct format and offset.
1712
        #datetime = metdata.sunrisesetdata['corrected_timestamp'][timeindex]
1713
        #Don't need any of this any more. Already sunrise/sunset corrected and offset by appropriate interval
1714

1715
        # get solar position zenith and azimuth based on site metadata
1716
        #solpos = pvlib.irradiance.solarposition.get_solarposition(datetimetz,lat,lon,elev)
1717
        solpos = metdata.solpos.iloc[timeindex]
2✔
1718
        sunalt = float(solpos.elevation)
2✔
1719
        # Radiance expects azimuth South = 0, PVlib gives South = 180. Must substract 180 to match.
1720
        sunaz = float(solpos.azimuth)-180.0
2✔
1721

1722
        sky_path = 'skies'
2✔
1723

1724
        if dhi <= 0:
2✔
1725
            self.skyfiles = [None]
×
1726
            return None
×
1727
        # We should already be filtering for elevation >0. But just in case...
1728
        if sunalt <= 0:
2✔
1729
            sunalt = np.arcsin((ghi-dhi)/(dni+.001))*180/np.pi # reverse engineer elevation from ghi, dhi, dni
×
1730
            print('Warning: negative sun elevation at '+
×
1731
                  '{}.  '.format(metdata.datetime[timeindex])+
1732
                  'Re-calculated elevation: {:0.2}'.format(sunalt))
1733

1734
        # Note - -W and -O1 option is used to create full spectrum analysis in units of Wm-2
1735
         #" -L %s %s -g %s \n" %(dni/.0079, dhi/.0079, self.ground.ReflAvg) + \
1736
        skyStr = ("# start of sky definition for daylighting studies\n" + \
2✔
1737
            "# location name: " + str(locName) + " LAT: " + str(lat)
1738
            +" LON: " + str(lon) + " Elev: " + str(elev) + "\n"
1739
            "# Sun position calculated w. PVLib\n" + \
1740
            "!gendaylit -ang %s %s" %(sunalt, sunaz)) + \
1741
            " -W %s %s -g %s -O 1 \n" %(dni, dhi, ground.ReflAvg[groundindex]) + \
1742
            "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n" + \
1743
            "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" + \
1744
            ground._makeGroundString(index=groundindex, cumulativesky=False)
1745

1746
        time = metdata.datetime[timeindex]
2✔
1747
        #filename = str(time)[2:-9].replace('-','_').replace(' ','_').replace(':','_')
1748
        filename = time.strftime('%Y-%m-%d_%H%M')
2✔
1749
        skyname = os.path.join(sky_path,"sky2_%s_%s_%s.rad" %(lat, lon, filename))
2✔
1750

1751
        skyFile = open(skyname, 'w')
2✔
1752
        skyFile.write(skyStr)
2✔
1753
        skyFile.close()
2✔
1754

1755
        self.skyfiles = [skyname]
2✔
1756

1757
        return skyname
2✔
1758

1759
    def gendaylit2manual(self, dni, dhi, sunalt, sunaz):
2✔
1760
        """
1761
        Sets and returns sky information using gendaylit.
1762
        Uses user-provided data for sun position and irradiance.
1763
        
1764
        .. warning::
1765
            This generates the sky at the sun altitude&azimuth provided, make 
1766
            sure it is the right position relative to how the weather data got
1767
            created and read (i.e. label right, left or center).
1768
            
1769
     
1770
        Parameters
1771
        ------------
1772
        dni: int or float
1773
           Direct Normal Irradiance (DNI) value, in W/m^2
1774
        dhi : int or float
1775
           Diffuse Horizontal Irradiance (DHI) value, in W/m^2 
1776
        sunalt : int or float
1777
           Sun altitude (degrees) 
1778
        sunaz : int or float
1779
           Sun azimuth (degrees) 
1780

1781
        Returns
1782
        -------
1783
        skyname : string
1784
           Filename of sky in /skies/ directory
1785
        """
1786

1787
        
1788
        print('Sky generated with Gendaylit 2 MANUAL, with DNI: %0.1f, DHI: %0.1f' % (dni, dhi))
2✔
1789

1790
        sky_path = 'skies'
2✔
1791

1792
        if sunalt <= 0 or dhi <= 0:
2✔
1793
            self.skyfiles = [None]
×
1794
            return None
×
1795
        
1796
                # Assign Albedos
1797
        try:
2✔
1798
            if self.ground.ReflAvg.shape[0] == 1: # just 1 entry
2✔
1799
                groundindex = 0
2✔
1800
            else:
1801
                print("Ambiguous albedo entry, Set albedo to single value "
×
1802
                      "in setGround()")
1803
                return
×
1804
        except:
×
1805
            print('usage: make sure to run setGround() before gendaylit()')
×
1806
            return
×
1807
        
1808
        
1809
        # Note: -W and -O1 are used to create full spectrum analysis in units of Wm-2       
1810
         #" -L %s %s -g %s \n" %(dni/.0079, dhi/.0079, self.ground.ReflAvg) + \
1811
        skyStr =   ("# start of sky definition for daylighting studies\n" + \
2✔
1812
            "# Manual inputs of DNI, DHI, SunAlt and SunAZ into Gendaylit used \n" + \
1813
            "!gendaylit -ang %s %s" %(sunalt, sunaz)) + \
1814
            " -W %s %s -g %s -O 1 \n" %(dni, dhi, self.ground.ReflAvg[groundindex]) + \
1815
            "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n" + \
1816
            "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" + \
1817
            self.ground._makeGroundString(index=groundindex, cumulativesky=False)
1818

1819
        skyname = os.path.join(sky_path, "sky2_%s.rad" %(self.name))
2✔
1820

1821
        skyFile = open(skyname, 'w')
2✔
1822
        skyFile.write(skyStr)
2✔
1823
        skyFile.close()
2✔
1824

1825
        self.skyfiles = [skyname]
2✔
1826

1827
        return skyname
2✔
1828

1829
    def genCumSky(self, gencumsky_metfile=None, savefile=None):
2✔
1830
        """ 
1831
        Generate Skydome using gencumsky. 
1832
        
1833
        .. warning::
1834
            gencumulativesky.exe is required to be installed,
1835
            which is not a standard radiance distribution.
1836
            You can find the program in the bifacial_radiance distribution directory
1837
            in \Lib\site-packages\bifacial_radiance\data
1838
            
1839
 
1840
        Use :func:`readWeatherFile(filename, starttime='YYYY-mm-dd_HHMM', endtime='YYYY-mm-dd_HHMM')` 
1841
        to limit gencumsky simulations instead.
1842

1843
        Parameters
1844
        ------------
1845
        gencumsky_metfile : str
1846
            Filename with path to temporary created meteorological file usually created
1847
            in EPWs folder. This csv file has no headers, no index, and two
1848
            space separated columns with values for GHI and DNI for each hour 
1849
            in the year, and MUST have 8760 entries long otherwise gencumulativesky.exe cries. 
1850
        savefile : string
1851
            If savefile is None, defaults to "cumulative"
1852
            
1853
        Returns
1854
        --------
1855
        skyname : str
1856
            Filename of the .rad file containing cumulativesky info
1857
            
1858
        """
1859
        
1860
        # TODO:  error checking and auto-install of gencumulativesky.exe
1861
        # TODO: add check if readWeatherfile has not be done
1862
        # TODO: check if it fails if gcc module has been loaded? (common hpc issue)
1863
        
1864
        #import datetime
1865
        
1866
        if gencumsky_metfile is None:
2✔
1867
            gencumsky_metfile = self.gencumsky_metfile
×
1868
            if isinstance(gencumsky_metfile, str):
×
1869
                print("Loaded ", gencumsky_metfile)
×
1870
                
1871
        if isinstance(gencumsky_metfile, list):
2✔
1872
            print("There are more than 1 year of gencumsky temporal weather file saved."+
×
1873
                  "You can pass which file you want with gencumsky_metfile input. Since "+
1874
                  "No year was selected, defaulting to using the first year of the list")
1875
            gencumsky_metfile = gencumsky_metfile[0] 
×
1876
            print("Loaded ", gencumsky_metfile)
×
1877

1878

1879
        if savefile is None:
2✔
1880
            savefile = "cumulative"
2✔
1881
        sky_path = 'skies'
2✔
1882
        lat = self.metdata.latitude
2✔
1883
        lon = self.metdata.longitude
2✔
1884
        timeZone = self.metdata.timezone
2✔
1885
        '''
2✔
1886
        cmd = "gencumulativesky +s1 -h 0 -a %s -o %s -m %s %s " %(lat, lon, float(timeZone)*15, filetype) +\
1887
            "-time %s %s -date %s %s %s %s %s" % (startdt.hour, enddt.hour+1,
1888
                                                  startdt.month, startdt.day,
1889
                                                  enddt.month, enddt.day,
1890
                                                  gencumsky_metfile)
1891
        '''
1892
        cmd = (f"gencumulativesky +s1 -h 0 -a {lat} -o {lon} -m "
2✔
1893
               f"{float(timeZone)*15} -G {gencumsky_metfile}" )
1894
               
1895
        with open(savefile+".cal","w") as f:
2✔
1896
            _,err = _popen(cmd, None, f)
2✔
1897
            if err is not None:
2✔
1898
                print(err)
2✔
1899

1900
        # Assign Albedos
1901
        try:
2✔
1902
            groundstring = self.ground._makeGroundString(cumulativesky=True)
2✔
1903
        except:
×
1904
            raise Exception('Error: ground reflection not defined.  '
×
1905
                            'Run RadianceObj.setGround() first')
1906
            return
1907
        
1908

1909

1910
        skyStr = "#Cumulative Sky Definition\n" +\
2✔
1911
            "void brightfunc skyfunc\n" + \
1912
            "2 skybright " + "%s.cal\n" % (savefile) + \
1913
            "0\n" + \
1914
            "0\n" + \
1915
            "\nskyfunc glow sky_glow\n" + \
1916
            "0\n" + \
1917
            "0\n" + \
1918
            "4 1 1 1 0\n" + \
1919
            "\nsky_glow source sky\n" + \
1920
            "0\n" + \
1921
            "0\n" + \
1922
            "4 0 0 1 180\n" + \
1923
            groundstring
1924
            
1925
        skyname = os.path.join(sky_path, savefile+".rad")
2✔
1926

1927
        skyFile = open(skyname, 'w')
2✔
1928
        skyFile.write(skyStr)
2✔
1929
        skyFile.close()
2✔
1930

1931
        self.skyfiles = [skyname]#, 'SunFile.rad' ]
2✔
1932

1933
        return skyname
2✔
1934

1935
    def set1axis(self, metdata=None, azimuth=180, limit_angle=45,
2✔
1936
                 angledelta=5, backtrack=True, gcr=1.0 / 3, cumulativesky=True,
1937
                 fixed_tilt_angle=None, useMeasuredTrackerAngle=False,
1938
                 axis_azimuth=None):
1939
        """
1940
        Set up geometry for 1-axis tracking.  Pull in tracking angle details from
1941
        pvlib, create multiple 8760 metdata sub-files where datetime of met data
1942
        matches the tracking angle.  Returns 'trackerdict' which has keys equal to
1943
        either the tracker angles (gencumsky workflow) or timestamps (gendaylit hourly
1944
        workflow)
1945

1946
        Parameters
1947
        ------------
1948
         metdata : :py:class:`~bifacial_radiance.MetObj` 
1949
            Meterological object to set up geometry. Usually set automatically by
1950
            `bifacial_radiance` after running :py:class:`bifacial_radiance.readepw`. 
1951
            Default = self.metdata
1952
        azimuth : numeric
1953
            Orientation axis of tracker torque tube. Default North-South (180 deg).
1954
            For fixed-tilt configuration, input is fixed azimuth (180 is south)
1955
        limit_angle : numeric
1956
            Limit angle (+/-) of the 1-axis tracker in degrees. Default 45
1957
        angledelta : numeric
1958
            Degree of rotation increment to parse irradiance bins. Default 5 degrees.
1959
            (0.4 % error for DNI).  Other options: 4 (.25%), 2.5 (0.1%).
1960
            Note: the smaller the angledelta, the more simulations must be run.
1961
        backtrack : bool
1962
            Whether backtracking is enabled (default = True)
1963
        gcr : float
1964
            Ground coverage ratio for calculation backtracking. Defualt [1.0/3.0] 
1965
        cumulativesky : bool
1966
            [True] Wether individual csv files are
1967
            created with constant tilt angle for the cumulativesky approach.
1968
            if false, the gendaylit tracking approach must be used.
1969
        fixed_tilt_angle : numeric
1970
            If passed, this changes to a fixed tilt simulation where each hour 
1971
            uses fixed_tilt_angle and axis_azimuth as the tilt and azimuth
1972
        useMeasuredTrackerAngle: Bool
1973
            If True, and data for tracker angles has been passed by being included
1974
            in the WeatherFile object (column name 'Tracker Angle (degrees)'),
1975
            then tracker angles will be set to these values instead of being calculated.
1976
            NOTE that the value for azimuth passed to set1axis must be surface 
1977
            azimuth in the morning and not the axis_azimuth 
1978
            (i.e. for a N-S HSAT, azimuth = 90).
1979
        axis_azimuth : numeric
1980
            DEPRECATED.  returns deprecation warning. Pass the tracker 
1981
            axis_azimuth through to azimuth input instead.
1982

1983

1984
        Returns
1985
        -------
1986
        trackerdict : dictionary 
1987
            Keys represent tracker tilt angles (gencumsky) or timestamps (gendaylit)
1988
            and list of csv metfile, and datetimes at that angle
1989
            trackerdict[angle]['csvfile';'surf_azm';'surf_tilt';'UTCtime']
1990
            - or -
1991
            trackerdict[time]['tracker_theta';'surf_azm';'surf_tilt']
1992
        """
1993

1994
        # Documentation check:
1995
        # Removed         Internal variables
1996
        # -------
1997
        # metdata.solpos          dataframe with solar position data
1998
        # metdata.surface_azimuth list of tracker azimuth data
1999
        # metdata.surface_tilt    list of tracker surface tilt data
2000
        # metdata.tracker_theta   list of tracker tilt angle
2001
        #import warnings
2002
        
2003
        if metdata == None:
2✔
2004
            metdata = self.metdata
2✔
2005

2006
        if metdata == {}:
2✔
2007
            raise Exception("metdata doesnt exist yet.  "+
×
2008
                            "Run RadianceObj.readWeatherFile() ")
2009

2010
        if axis_azimuth:
2✔
2011
            azimuth = axis_azimuth
×
2012
            warnings.warn("axis_azimuth is deprecated in set1axis; use azimuth "
×
2013
                          "input instead.", DeprecationWarning)
2014
            
2015
        #backtrack = True   # include backtracking support in later version
2016
        #gcr = 1.0/3.0       # default value - not used if backtrack = False.
2017

2018

2019
        # get 1-axis tracker angles for this location, rounded to nearest 'angledelta'
2020
        trackerdict = metdata._set1axis(cumulativesky=cumulativesky,
2✔
2021
                                       azimuth=azimuth,
2022
                                       limit_angle=limit_angle,
2023
                                       angledelta=angledelta,
2024
                                       backtrack=backtrack,
2025
                                       gcr=gcr,
2026
                                       fixed_tilt_angle=fixed_tilt_angle,
2027
                                       useMeasuredTrackerAngle=useMeasuredTrackerAngle
2028
                                       )
2029
        self.trackerdict = trackerdict
2✔
2030
        self.cumulativesky = cumulativesky
2✔
2031

2032
        return trackerdict
2✔
2033

2034
    def gendaylit1axis(self, metdata=None, trackerdict=None, startdate=None,
2✔
2035
                       enddate=None, debug=False):
2036
        """
2037
        1-axis tracking implementation of gendaylit.
2038
        Creates multiple sky files, one for each time of day.
2039

2040
        Parameters
2041
        ------------
2042
        metdata
2043
            MetObj output from readWeatherFile.  Needs to have 
2044
            RadianceObj.set1axis() run on it first.
2045
        startdate : str 
2046
            DEPRECATED, does not do anything now.
2047
            Recommended to downselect metdata when reading Weather File.
2048
        enddate : str
2049
            DEPRECATED, does not do anything now.
2050
            Recommended to downselect metdata when reading Weather File.
2051
        trackerdict : dictionary
2052
            Dictionary with keys for tracker tilt angles (gencumsky) or timestamps (gendaylit)
2053
        
2054
        Returns
2055
        -------
2056
        Updated trackerdict dictionary 
2057
            Dictionary with keys for tracker tilt angles (gencumsky) or timestamps (gendaylit)
2058
            with the additional dictionary value ['skyfile'] added
2059

2060
        """
2061
        
2062
        if metdata is None:
2✔
2063
            metdata = self.metdata
2✔
2064
        if trackerdict is None:
2✔
2065
            try:
2✔
2066
                trackerdict = self.trackerdict
2✔
2067
            except AttributeError:
×
2068
                print('No trackerdict value passed or available in self')
×
2069

2070
        if startdate is not None or enddate is not None:
2✔
2071
            print("Deprecation Warning: gendyalit1axis no longer downselects"+
×
2072
                  " entries by stardate and enddate. Downselect your data"+
2073
                  " when loading with readWeatherFile")
2074
            return
×
2075
            
2076
        try:
2✔
2077
            metdata.tracker_theta  # this may not exist
2✔
2078
        except AttributeError:
×
2079
            print("metdata.tracker_theta doesn't exist. Run RadianceObj.set1axis() first")
×
2080

2081
        if debug is False:
2✔
2082
            print('Creating ~%d skyfiles. '%(len(trackerdict.keys())))
2✔
2083
        count = 0  # counter to get number of skyfiles created, just for giggles
2✔
2084

2085
        trackerdict2={}
2✔
2086
        for i in range(0, len(trackerdict.keys())):
2✔
2087
            try:
2✔
2088
                time = metdata.datetime[i]
2✔
2089
            except IndexError:  #out of range error
×
2090
                break  # 
×
2091
            #filename = str(time)[5:-12].replace('-','_').replace(' ','_')
2092
            filename = time.strftime('%Y-%m-%d_%H%M')
2✔
2093
            self.name = filename
2✔
2094

2095
            #check for GHI > 0
2096
            #if metdata.ghi[i] > 0:
2097
            if (metdata.ghi[i] > 0) & (~np.isnan(metdata.tracker_theta[i])):  
2✔
2098
                skyfile = self.gendaylit(metdata=metdata,timeindex=i, debug=debug)
2✔
2099
                # trackerdict2 reduces the dict to only the range specified.
2100
                trackerdict2[filename] = trackerdict[filename]  
2✔
2101
                trackerdict2[filename]['skyfile'] = skyfile
2✔
2102
                count +=1
2✔
2103

2104
        print('Created {} skyfiles in /skies/'.format(count))
2✔
2105
        self.trackerdict = trackerdict2
2✔
2106
        return trackerdict2
2✔
2107

2108
    def genCumSky1axis(self, trackerdict=None):
2✔
2109
        """
2110
        1-axis tracking implementation of gencumulativesky.
2111
        Creates multiple .cal files and .rad files, one for each tracker angle.
2112

2113
        Use :func:`readWeatherFile` to limit gencumsky simulations
2114
        
2115
        
2116
        Parameters
2117
        ------------
2118
        trackerdict : dictionary
2119
            Trackerdict generated as output by RadianceObj.set1axis()
2120
            
2121
        Returns
2122
        -------
2123
        trackerdict : dictionary
2124
            Trackerdict dictionary with new entry trackerdict.skyfile  
2125
            Appends 'skyfile'  to the 1-axis dict with the location of the sky .radfile
2126

2127
        """
2128
        
2129
        if trackerdict == None:
2✔
2130
            try:
2✔
2131
                trackerdict = self.trackerdict
2✔
2132
            except AttributeError:
×
2133
                print('No trackerdict value passed or available in self')
×
2134

2135
        for theta in sorted(trackerdict):  
2✔
2136
            # call gencumulativesky with a new .cal and .rad name
2137
            csvfile = trackerdict[theta]['csvfile']
2✔
2138
            savefile = '1axis_%s'%(theta)  #prefix for .cal file and skies\*.rad file
2✔
2139
            skyfile = self.genCumSky(gencumsky_metfile=csvfile, savefile=savefile)
2✔
2140
            trackerdict[theta]['skyfile'] = skyfile
2✔
2141
            print('Created skyfile %s'%(skyfile))
2✔
2142
        # delete default skyfile (not strictly necessary)
2143
        self.skyfiles = None
2✔
2144
        self.trackerdict = trackerdict
2✔
2145
        return trackerdict
2✔
2146

2147

2148
    def makeOct(self, filelist=None, octname=None):
2✔
2149
        """
2150
        Combine everything together into a .oct file
2151

2152
        Parameters
2153
        ----------
2154
        filelist : list 
2155
            Files to include.  otherwise takes self.filelist
2156
        octname : str
2157
            filename (without .oct extension)
2158

2159

2160
        Returns
2161
        -------
2162
        octname : str
2163
            filename of .oct file in root directory including extension
2164
        err : str
2165
            Error message returned from oconv (if any)
2166
        """
2167
        
2168
        if filelist is None:
2✔
2169
            filelist = self.getfilelist()
2✔
2170
        if octname is None:
2✔
2171
            octname = self.name
2✔
2172

2173
        debug = False
2✔
2174
        #JSS. With the way that the break is handled now, this will wait the 10 for all the hours
2175
        # that were not generated sky files.
2176
        if self.hpc :
2✔
2177
            import time
2✔
2178
            time_to_wait = 10
2✔
2179
            time_counter = 0
2✔
2180
            for file in filelist:
2✔
2181
                if debug:
2✔
2182
                    print("HPC Checking for file %s" % (file))
×
2183
                if None in filelist:  # are we missing any files? abort!
2✔
2184
                    print('Missing files, skipping...')
×
2185
                    self.octfile = None
×
2186
                    return None
×
2187
                #Filesky is being saved as 'none', so it crashes !
2188
                while not os.path.exists(file):
2✔
2189
                    time.sleep(1)
×
2190
                    time_counter += 1
×
2191
                if time_counter > time_to_wait:
2✔
2192
                    print ("filenotfound")
×
2193
                    break
×
2194

2195
        #os.system('oconv '+ ' '.join(filelist) + ' > %s.oct' % (octname))
2196
        if None in filelist:  # are we missing any files? abort!
2✔
2197
            print('Missing files, skipping...')
×
2198
            self.octfile = None
×
2199
            return None
×
2200

2201
        #cmd = 'oconv ' + ' '.join(filelist)
2202
        filelist.insert(0,'oconv')
2✔
2203
        with open('%s.oct' % (octname), "w") as f:
2✔
2204
            _,err = _popen(filelist, None, f)
2✔
2205
            #TODO:  exception handling for no sun up
2206
            if err is not None:
2✔
UNCOV
2207
                if err[0:5] == 'error':
×
2208
                    raise Exception(err[7:])
×
UNCOV
2209
                if err[0:7] == 'message':
×
UNCOV
2210
                    warnings.warn(err[9:], Warning)
×
2211
                    
2212

2213
        #use rvu to see if everything looks good. 
2214
        # use cmd for this since it locks out the terminal.
2215
        #'rvu -vf views\side.vp -e .01 monopanel_test.oct'
2216
        print("Created %s.oct" % (octname))
2✔
2217
        self.octfile = '%s.oct' % (octname)
2✔
2218
        return '%s.oct' % (octname)
2✔
2219

2220
    def makeOct1axis(self, trackerdict=None, singleindex=None, customname=None):
2✔
2221
        """
2222
        Combine files listed in trackerdict into multiple .oct files
2223

2224
        Parameters
2225
        ------------
2226
        trackerdict 
2227
            Output from :py:class:`~bifacial_radiance.RadianceObj.makeScene1axis`
2228
        singleindex : str
2229
            Single index for trackerdict to run makeOct1axis in single-value mode,
2230
            format 'YYYY-MM-DD_HHMM'.
2231
        customname : str 
2232
            Custom text string added to the end of the OCT file name.
2233

2234
        Returns
2235
        -------
2236
        trackerdict
2237
            Append 'octfile'  to the 1-axis dict with the location of the scene .octfile
2238
        """
2239

2240
        if customname is None:
2✔
2241
            customname = ''
2✔
2242

2243
        if trackerdict is None:
2✔
2244
            try:
×
2245
                trackerdict = self.trackerdict
×
2246
            except AttributeError:
×
2247
                print('No trackerdict value passed or available in self')
×
2248
        if singleindex is None:   # loop through all values in the tracker dictionary
2✔
2249
            indexlist = trackerdict.keys()
2✔
2250
        else:  # just loop through one single index in tracker dictionary
2251
            indexlist = [singleindex]
2✔
2252

2253
        print('\nMaking {} octfiles in root directory.'.format(indexlist.__len__()))
2✔
2254
        for index in sorted(indexlist):  # run through either entire key list of trackerdict, or just a single value
2✔
2255
            try:  #TODO: check if this works
2✔
2256
                filelist = self.materialfiles + [trackerdict[index]['skyfile']] + self._getradfiles(trackerdict[index]['scenes'])
2✔
2257
                octname = '1axis_%s%s'%(index, customname)
2✔
2258
                trackerdict[index]['octfile'] = self.makeOct(filelist, octname)
2✔
2259
            except KeyError as e:
×
2260
                print('Trackerdict key error: {}'.format(e))
×
2261
                
2262
        self.trackerdict = trackerdict
2✔
2263
        return trackerdict
2✔
2264

2265
    
2266
    def makeModule(self, name=None, x=None, y=None, z=None,  modulefile=None, 
2✔
2267
                 text=None, customtext='',  xgap=0.01, ygap=0.0, 
2268
                 zgap=0.1, numpanels=1, rewriteModulefile=True, 
2269
                 glass=False, modulematerial=None, bifi=1,  **kwargs):
2270
        """
2271
        pass module generation details into ModuleObj(). See ModuleObj() 
2272
        docstring for more details
2273
        """
2274
        from bifacial_radiance import ModuleObj
2✔
2275

2276
        if name is None:
2✔
2277
            print("usage:  makeModule(name,x,y,z, modulefile = '\objects\*.rad', "+
2✔
2278
                  " zgap = 0.1 (module offset)"+
2279
                  "numpanels = 1 (# of panels in portrait), ygap = 0.05 "+
2280
                  "(slope distance between panels when arrayed), "+
2281
                  "rewriteModulefile = True (or False), bifi = 1")
2282
            print("You can also override module_type info by passing 'text'"+
2✔
2283
                  "variable, or add on at the end for racking details with "+
2284
                  "'customtext'. See function definition for more details")
2285
            print("Optional: tubeParams={} (torque tube details including "
2✔
2286
                  "diameter (torque tube dia. in meters), tubetype='Round' "
2287
                  "(or 'square', 'hex'), material='Metal_Grey' (or 'black')"
2288
                  ", axisofrotation=True (does scene rotate around tube)")
2289
            print("Optional: cellModule={} (create cell-level module by "+
2✔
2290
                  " passing in dictionary with keys 'numcellsx'6 (#cells in "+
2291
                  "X-dir.), 'numcellsy', 'xcell' (cell size in X-dir. in meters),"+
2292
                  "'ycell', 'xcellgap' (spacing between cells in X-dir.), 'ycellgap'")
2293
            print("Optional: omegaParams={} (create the support structure omega by "+
2✔
2294
                  "passing in dictionary with keys 'omega_material' (the material of "+
2295
                  "omega), 'mod_overlap'(the length of the module adjacent piece of"+
2296
                  " omega that overlaps with the module),'x_omega1', 'y_omega' (ideally same"+
2297
                  " for all the parts of omega),'z_omega1', 'x_omega2' (X-dir length of the"+
2298
                  " vertical piece), 'x_omega3', z_omega3")
2299

2300
            return
2✔
2301
        
2302
        """
2✔
2303
        # TODO: check for deprecated torquetube and axisofrotationTorqueTube in
2304
          kwargs.  
2305
        """
2306
        if 'tubeParams' in kwargs:
2✔
2307
            tubeParams = kwargs.pop('tubeParams')
2✔
2308
        else:
2309
            tubeParams = None
2✔
2310
        if 'torquetube' in kwargs:
2✔
2311
            torquetube = kwargs.pop('torquetube')
×
2312
            print("\nWarning: boolean input `torquetube` passed into makeModule"
×
2313
                  ". Starting in v0.4.0 this boolean parameter is deprecated."
2314
                  " Use module.addTorquetube() with `visible` parameter instead.")
2315
            if tubeParams:
×
2316
                tubeParams['visible'] =  torquetube
×
2317
            elif (tubeParams is None) & (torquetube is True):
×
2318
                tubeParams = {'visible':True} # create default TT
×
2319
            
2320
        if 'axisofrotationTorqueTube' in kwargs:
2✔
2321
            axisofrotation = kwargs.pop('axisofrotationTorqueTube')
×
2322
            print("\nWarning: input boolean `axisofrotationTorqueTube` passed "
×
2323
                "into makeModule. Starting in v0.4.0 this boolean parameter is"
2324
                " deprecated. Use module.addTorquetube() with `axisofrotation`"
2325
                "parameter instead.")
2326
            if tubeParams:  #this kwarg only does somehting if there's a TT.
×
2327
                tubeParams['axisofrotation'] = axisofrotation
×
2328
        
2329
        if self.hpc:  # trigger HPC simulation in ModuleObj
2✔
2330
            kwargs['hpc']=True
2✔
2331
            
2332
        self.module = ModuleObj(name=name, x=x, y=y, z=z, bifi=bifi, modulefile=modulefile,
2✔
2333
                   text=text, customtext=customtext, xgap=xgap, ygap=ygap, 
2334
                   zgap=zgap, numpanels=numpanels, 
2335
                   rewriteModulefile=rewriteModulefile, glass=glass, 
2336
                   modulematerial=modulematerial, tubeParams=tubeParams,
2337
                   **kwargs)
2338
        return self.module
2✔
2339
    
2340
    
2341
    
2342
    def makeCustomObject(self, name=None, text=None):
2✔
2343
        """
2344
        Function for development and experimenting with extraneous objects in the scene.
2345
        This function creates a `name.rad` textfile in the objects folder
2346
        with whatever text that is passed to it.
2347
        It is up to the user to pass the correct radiance format.
2348
        
2349
        For example, to create a box at coordinates 0,0 (with its bottom surface
2350
        on the plane z=0):
2351
            
2352
        .. code-block:
2353
        
2354
            name = 'box'
2355
            text='! genbox black PVmodule 0.5 0.5 0.5 | xform -t -0.25 -0.25 0'
2356

2357
        Parameters
2358
        ----------
2359
        name : str
2360
            String input to name the module type
2361
        text : str
2362
            Text used in the radfile to generate the module
2363
        
2364
        """
2365

2366
        customradfile = os.path.join('objects', '%s.rad'%(name)) # update in 0.2.3 to shorten radnames
2✔
2367
        # py2 and 3 compatible: binary write, encode text first
2368
        with open(customradfile, 'wb') as f:
2✔
2369
            f.write(text.encode('ascii'))
2✔
2370

2371
        print("\nCustom Object Name", customradfile)
2✔
2372
        #self.customradfile = customradfile
2373
        return customradfile
2✔
2374

2375

2376
    def printModules(self):
2✔
2377
        # print available module types from ModuleObj
2378
        from bifacial_radiance import ModuleObj
2✔
2379
        modulenames = ModuleObj().readModule()
2✔
2380
        print('Available module names: {}'.format([str(x) for x in modulenames]))
2✔
2381
        return modulenames
2✔
2382
    
2383
        
2384
    def makeScene(self, module=None, sceneDict=None, radname=None,
2✔
2385
                  customtext=None, append=False, 
2386
                  moduletype=None, appendtoScene=None):
2387
        """
2388
        Create a SceneObj which contains details of the PV system configuration including
2389
        tilt, row pitch, height, nMods per row, nRows in the system. Append to
2390
        self.scenes list
2391

2392
        Parameters
2393
        ----------
2394
        module : str or ModuleObj
2395
            String name of module created with makeModule()
2396
        sceneDict : dictionary
2397
            Dictionary with keys: `tilt`, `clearance_height`*, `pitch`,
2398
            `azimuth`, `nMods`, `nRows`, `hub_height`*, `height`*
2399
            * height deprecated from sceneDict. For makeScene (fixed systems)
2400
            if passed it is assumed it reffers to clearance_height.
2401
            `clearance_height` recommended for fixed_tracking systems.
2402
            `hub_height` can also be passed as a possibility.
2403
        radname : str
2404
            Gives a custom name to the scene file. Useful when parallelizing.
2405
        customtext : str
2406
            Appends to the scene a custom text pointing to a custom object
2407
            created by the user; format of the text should start with the rad 
2408
            file path and name, and then any other geometry transformations 
2409
            native to Radiance necessary.
2410
        append : bool, default False
2411
            If multiple scenes exist (makeScene called multiple times), either 
2412
            overwrite the existing scene (default) or append a new SceneObj to
2413
            self.scenes
2414
        moduletype: DEPRECATED. use the `module` kwarg instead.
2415
        appendtoScene : DEPRECATED.  use the `customtext` kwarg instead.
2416

2417
        
2418
        Returns
2419
        -------
2420
        SceneObj 
2421
            'scene' with configuration details
2422
            
2423
        """
2424
        if appendtoScene is not None:
2✔
2425
            customtext = appendtoScene
×
2426
            print("Warning:  input `appendtoScene` is deprecated. Use kwarg "
×
2427
                  "`customtext` instead")
2428
        if moduletype is not None:
2✔
2429
            module = moduletype
×
2430
            print("Warning:  input `moduletype` is deprecated. Use kwarg "
×
2431
                  "`module` instead")
2432
        if module is None:
2✔
2433
            try:
×
2434
                module = self.module
×
2435
                print(f'Using last saved module, name: {module.name}')
×
2436
            except AttributeError:
×
2437
                print('makeScene(module, sceneDict, nMods, nRows).  '+\
×
2438
                          'Available moduletypes: ' )
2439
                self.printModules() #print available module types
×
2440
                return
×
2441
        scene = SceneObj(module, hpc=self.hpc, name=f'Scene{self.scenes.__len__()}')
2✔
2442
        if self.scenes.__len__() >=1:
2✔
2443
            print(f"Additional scene {scene.name} created! See list of names with RadianceObj.scenes and sceneNames")
2✔
2444

2445
        if sceneDict is None:
2✔
2446
            print('makeScene(moduletype, sceneDict, nMods, nRows).  '+\
×
2447
                  'sceneDict inputs: .tilt .clearance_height .pitch .azimuth')
2448
            self.scenes.append(scene)
×
2449
            return scene
×
2450

2451
        if 'azimuth' not in sceneDict:
2✔
2452
            sceneDict['azimuth'] = 180
2✔
2453

2454
        if 'nRows' not in sceneDict:
2✔
2455
            sceneDict['nRows'] = 7
2✔
2456

2457
        if 'nMods' not in sceneDict:
2✔
2458
            sceneDict['nMods'] = 20
2✔
2459

2460
        # Fixed tilt routine
2461
        # Preferred: clearance_height,
2462
        # If only height is passed, it is assumed to be clearance_height.
2463
        
2464
        sceneDict, use_clearanceheight  = _heightCasesSwitcher(sceneDict, 
2✔
2465
                                                                preferred='clearance_height', 
2466
                                                                nonpreferred='hub_height')
2467
        
2468
        #self.nMods = sceneDict['nMods']
2469
        #self.nRows = sceneDict['nRows']
2470
        sceneRAD = scene._makeSceneNxR(sceneDict=sceneDict,
2✔
2471
                                                 radname=radname)
2472

2473
        # TODO: deprecate this section in favor of multiple sceneObjs?
2474
        # This functionality allows additional radfiles to be added to the same
2475
        # sceneObj, so it's somewhat distinct from making new sceneObjs...        
2476
        if 'appendRadfile' not in sceneDict:
2✔
2477
            appendRadfile = False
2✔
2478
        else:
2479
            appendRadfile = sceneDict['appendRadfile']
×
2480

2481
        if appendRadfile:
2✔
2482
            debug = False
×
2483
            try:
×
2484
                scene.radfiles.append(sceneRAD)
×
2485
                if debug:
×
2486
                    print( "Radfile APPENDED!")
×
2487
            except:
×
2488
                #TODO: Manage situation where radfile was created with
2489
                #appendRadfile to False first..
2490
                scene.radfiles=[]
×
2491
                scene.radfiles.append(sceneRAD)
×
2492
                if debug:
×
2493
                    print( "Radfile APPENDAGE created!")
×
2494
        else:
2495
            scene.radfiles = [sceneRAD]
2✔
2496
        #
2497
        if customtext is not None:
2✔
2498
            self.appendtoScene(radfile=scene.radfiles[0], customObject = customtext)
×
2499
            
2500
        # default behavior: overwrite. (backwards compatible behavior.)
2501
        if append:
2✔
2502
            self.scenes.append(scene)
×
2503
        else:
2504
            self.scenes = [scene]
2✔
2505
        return scene
2✔
2506

2507
    def appendtoScene(self, radfile=None, customObject=None, text=''):
2✔
2508
        """
2509
        Appends to the `Scene radfile` in folder `\objects` the text command in Radiance
2510
        lingo created by the user.
2511
        Useful when using addCustomObject to the scene.
2512
        
2513
        DEPRECATED: use the identical version in SceneObj instead
2514

2515
        Parameters
2516
        ----------
2517
        radfile: str
2518
            Directory and name of where .rad scene file is stored
2519
        customObject : str
2520
            Directory and name of custom object .rad file is stored, and any geometry
2521
            modifications needed for it.
2522
        text : str 
2523
            Command to be appended to the radfile which specifies its position 
2524
            in the scene. Do not leave empty spaces at the end.
2525

2526
        Returns
2527
        -------
2528
        Nothing, the radfile must already be created and assigned when running this.
2529
        
2530
        """        
2531
        warnings.warn('RadObj.appendtoScene is deprecated.  Use the equivalent'
2✔
2532
              ' functionality in SceneObj.appendtoScene.', DeprecationWarning)
2533
        # py2 and 3 compatible: binary write, encode text first
2534
        text2 = '\n!xform -rx 0 ' + text + ' ' + customObject
2✔
2535
        
2536
        debug = False
2✔
2537
        if debug:
2✔
2538
            print (text2)
×
2539

2540
        with open(radfile, 'a+') as f:
2✔
2541
            f.write(text2)
2✔
2542

2543

2544
    
2545
    def makeScene1axis(self, trackerdict=None, module=None, sceneDict=None,
2✔
2546
                       cumulativesky=None, customtext=None, append=False, 
2547
                       moduletype=None, appendtoScene=None):
2548
        """
2549
        Creates a SceneObj for each tracking angle which contains details of the PV
2550
        system configuration including row pitch, hub_height, nMods per row, nRows in the system...
2551

2552
        Parameters
2553
        ------------
2554
        trackerdict
2555
            Output from GenCumSky1axis
2556
        module : str or ModuleObj
2557
            Name or ModuleObj created with makeModule()
2558
        sceneDict : 
2559
            Dictionary with keys:`tilt`, `hub_height`, `pitch` (or GCR), `azimuth`,
2560
            optional: 'originx', 'originy'
2561
        cumulativesky : bool
2562
            Defines if sky will be generated with cumulativesky or gendaylit.
2563
        customtext : str
2564
            Appends to each scene a custom text pointing to a custom object
2565
            created by the user; format of the text should start with the rad 
2566
            file path and name, and then any other geometry transformations 
2567
            native to Radiance necessary. e.g '!xform -rz 90 '+self.makeCustomObject()
2568
        append : bool, default False
2569
            If multiple scenes exist (makeScene called multiple times), either 
2570
            overwrite the existing scene (default) or append a new SceneObj to
2571
            self.scenes
2572
        moduletype: DEPRECATED. use the `module` kwarg instead.
2573
        appendtoScene : DEPRECATED. use the `customtext` kwarg instead
2574
            
2575
        Returns
2576
        --------
2577
        trackerdict 
2578
            Append the following keys
2579
                'scene'
2580
                    SceneObj for each tracker theta
2581
                'clearance_height'
2582
                    Calculated ground clearance based on
2583
                    `hub height`, `tilt` angle and overall collector width `sceney`
2584
                
2585
        """
2586
        
2587
        import math, copy
2✔
2588

2589
        if sceneDict is None:
2✔
2590
            print('usage: makeScene1axis(module, sceneDict, nMods, nRows).'+
×
2591
                  'sceneDict inputs: .hub_height .azimuth .nMods .nRows'+
2592
                  'and .pitch or .gcr')
2593
            return
×
2594
        
2595
        if appendtoScene is not None: #kwarg is deprecated.
2✔
2596
            customtext = appendtoScene
×
2597
            warnings.warn("Warning:  input `appendtoScene` is deprecated. Use kwarg "
×
2598
                  "`customtext` instead", DeprecationWarning)
2599
        # If no nRows or nMods assigned on deprecated variable or dictionary,
2600
        # assign default.
2601
        if 'nRows' not in sceneDict:
2✔
2602
            sceneDict['nRows'] = 7
2✔
2603
        if 'nMods' not in sceneDict:
2✔
2604
            sceneDict['nMods'] = 20
2✔
2605

2606
        if trackerdict is None:
2✔
2607
            try:
2✔
2608
                trackerdict = self.trackerdict
2✔
2609
            except AttributeError:
×
2610
                print('No trackerdict value passed or available in self')
×
2611

2612
        if cumulativesky is None:
2✔
2613
            try:
2✔
2614
                # see if cumulativesky = False was set earlier,
2615
                # e.g. in RadianceObj.set1axis
2616
                cumulativesky = self.cumulativesky
2✔
2617
            except AttributeError:
×
2618
                # default cumulativesky = true to maintain backward compatibility.
2619
                cumulativesky = True
×
2620

2621

2622
        if moduletype is not None:
2✔
2623
            module = moduletype
×
2624
            print("Warning:  input `moduletype` is deprecated. Use kwarg "
×
2625
                  "`module` instead")
2626
        if module is None:
2✔
2627
            try:
2✔
2628
                module = self.module
2✔
2629
                print(f'Using last saved module, name: {module.name}')
2✔
2630
            except AttributeError:
×
2631
                print('usage:  makeScene1axis(trackerdict, module, '+
×
2632
                      'sceneDict, nMods, nRows). ')
2633
                self.printModules() #print available module types
×
2634
                return
×
2635

2636
        if 'orientation' in sceneDict:
2✔
2637
            raise Exception('\n\n ERROR: Orientation format has been '
×
2638
                'deprecated since version 0.2.4. If you want to flip your '
2639
                'modules, on makeModule switch the x and y values.\n\n')
2640
       
2641
        # 1axis routine
2642
        # Preferred hub_height
2643
        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict, 
2✔
2644
                                                        preferred='hub_height', 
2645
                                                        nonpreferred='clearance_height')
2646

2647
        if use_clearanceheight:
2✔
2648
            simplefix = 0
2✔
2649
            hubheight = sceneDict['clearance_height'] # Not really, but this is the fastest 
2✔
2650
            # to make it work with the simplefix as below the actual clearnace height
2651
            # gets calculated and the 0 sets the cosine correction to 0. 
2652
            # TODO CLEAN THIS UP.
2653
            
2654
        else:
2655
            #the hub height is the tracker height at center of rotation.
2656
            hubheight = sceneDict['hub_height']
2✔
2657
            simplefix = 1
2✔
2658

2659
        # we no longer need sceneDict['hub_height'] - it'll be replaced by 'clearance_height' below
2660
        sceneDict.pop('hub_height',None)
2✔
2661
        if cumulativesky is True:        # cumulativesky workflow
2✔
2662
            print('\nMaking .rad files for cumulativesky 1-axis workflow')
2✔
2663
            for theta in trackerdict:
2✔
2664
                scene = SceneObj(module, hpc=self.hpc)
2✔
2665
                if trackerdict[theta]['surf_azm'] >= 180:
2✔
2666
                    trackerdict[theta]['surf_azm'] = trackerdict[theta]['surf_azm']-180
2✔
2667
                    trackerdict[theta]['surf_tilt'] = trackerdict[theta]['surf_tilt']*-1
2✔
2668
                radname = '1axis%s_'%(theta,)
2✔
2669

2670
                # Calculating clearance height for this theta.
2671
                height = hubheight - simplefix*0.5* math.sin(abs(theta) * math.pi / 180) \
2✔
2672
                        * scene.module.sceney + scene.module.offsetfromaxis \
2673
                        * math.sin(abs(theta)*math.pi/180)
2674
                # Calculate the ground clearance height based on the hub height. Add abs(theta) to avoid negative tilt angle errors
2675
                #trackerdict[theta]['clearance_height'] = height
2676

2677
                sceneDict.update({'tilt' : trackerdict[theta]['surf_tilt'],
2✔
2678
                                 'clearance_height' :  height,
2679
                                 'azimuth' : trackerdict[theta]['surf_azm'],
2680
                                 'modulez' :  scene.module.z})
2681

2682
                radfile = scene._makeSceneNxR(sceneDict=(sceneDict),
2✔
2683
                                             radname=radname, addhubheight=True)
2684
                #trackerdict[theta]['radfile'] = radfile
2685
                # TODO: determine radfiles dynamically from scenes
2686
                try:
2✔
2687
                    name=f"Scene{trackerdict[theta]['scenes'].__len__()}"
2✔
2688
                    scene.name = name
2✔
2689
                    if customtext is not None:
2✔
2690
                        scene.appendtoScene(customObject = customtext)
2✔
2691

2692
                    if append:
2✔
2693
                        trackerdict[theta]['scenes'].append(scene)
2✔
2694
                    else:
2695
                        trackerdict[theta]['scenes'] = [scene]
2✔
2696
                except KeyError: #either KeyError or maybe IndexError?  
2✔
2697
                    trackerdict[theta]['scenes'] = [scene]
2✔
2698

2699
            print('{} Radfiles created in /objects/'.format(trackerdict.__len__()))
2✔
2700

2701
        else:  #gendaylit workflow
2702
            print('\nMaking ~%s .rad files for gendaylit 1-axis workflow (this takes a minute..)' % (len(trackerdict)))
2✔
2703
            count = 0
2✔
2704
            for time in trackerdict:
2✔
2705
                scene = SceneObj(module, hpc=self.hpc)
2✔
2706

2707
                if trackerdict[time]['surf_azm'] >= 180:
2✔
2708
                    trackerdict[time]['surf_azm'] = trackerdict[time]['surf_azm']-180
×
2709
                    trackerdict[time]['surf_tilt'] = trackerdict[time]['surf_tilt']*-1
×
2710
                theta = trackerdict[time]['theta']
2✔
2711
                radname = '1axis%s_'%(time,)
2✔
2712

2713
                # Calculating clearance height for this time.
2714
                height = hubheight - simplefix*0.5* math.sin(abs(theta) * math.pi / 180) \
2✔
2715
                        * scene.module.sceney + scene.module.offsetfromaxis \
2716
                        * math.sin(abs(theta)*math.pi/180)
2717

2718
                if trackerdict[time]['ghi'] > 0:
2✔
2719

2720

2721
                    sceneDict.update({'tilt' : trackerdict[time]['surf_tilt'],
2✔
2722
                                     'clearance_height' :  height,
2723
                                     'azimuth' : trackerdict[time]['surf_azm'],
2724
                                     'modulez' :  scene.module.z})
2725

2726
                    # if sceneDict isn't copied, it will change inside the SceneObj since dicts are mutable!
2727
                    radfile = scene._makeSceneNxR(sceneDict=(sceneDict),
2✔
2728
                                                 radname=radname, addhubheight=True)
2729
                    
2730
                    #try:
2731
                    if customtext is not None:
2✔
2732
                        scene.appendtoScene(customObject = customtext)
2✔
2733
                        
2734
                    if ('scenes' in trackerdict[time]) and append:
2✔
2735
                        scene.name=f"Scene{trackerdict[time]['scenes'].__len__()}"
2✔
2736
                        trackerdict[time]['scenes'].append(scene)
2✔
2737
                    else:
2738
                        scene.name="Scene0"
2✔
2739
                        trackerdict[time]['scenes'] = [scene]
2✔
2740
                    
2741
                    count+=1
2✔
2742
            print('{} Radfiles created in /objects/'.format(count))
2✔
2743

2744

2745

2746
        self.trackerdict = trackerdict
2✔
2747
        self.hub_height = hubheight
2✔
2748
        
2749
        return trackerdict
2✔
2750

2751

2752
    def analysis1axis(self, trackerdict=None, singleindex=None, accuracy='low',
2✔
2753
                      customname=None, modWanted=None, rowWanted=None, 
2754
                      sensorsy=9, sensorsx=1,  
2755
                      modscanfront = None, modscanback = None, relative=False, 
2756
                      debug=False, sceneNum=0, append=True, 
2757
                      frontsurfaceoffset = None, backsurfaceoffset=None):
2758
        """
2759
        Loop through trackerdict and runs linescans for each scene and scan in there.
2760
        If multiple scenes exist in the trackerdict, only ONE scene can be analyzed at a 
2761
        time.  
2762
        TODO: how to run calculateResults with array of multiple results
2763

2764
        Parameters
2765
        ----------------
2766
        trackerdict 
2767
        singleindex : str
2768
            For single-index mode, just the one index we want to run (new in 0.2.3).
2769
            Example format '21_06_14_12_30' for 2021 June 14th 12:30 pm
2770
        accuracy : str
2771
            'low' or 'high', resolution option used during _irrPlot and rtrace
2772
        customname : str
2773
            Custom text string to be added to the file name for the results .CSV files
2774
        modWanted : int or list
2775
            Module to be sampled. Index starts at 1.
2776
        rowWanted : int or list
2777
            Row to be sampled. Index starts at 1. (row 1)
2778
        sensorsy : int or list 
2779
            Number of 'sensors' or scanning points along the collector width 
2780
            (CW) of the module(s). If multiple values are passed, first value
2781
            represents number of front sensors, second value is number of back sensors
2782
        sensorsx : int or list 
2783
            Number of 'sensors' or scanning points along the length, the side perpendicular 
2784
            to the collector width (CW) of the module(s) for the back side of the module. 
2785
            If multiple values are passed, first value represents number of 
2786
            front sensors, second value is number of back sensors.
2787
        modscanfront : dict
2788
            dictionary with one or more of the following key: xstart, ystart, zstart, 
2789
            xinc, yinc, zinc, Nx, Ny, Nz, orient. All of these keys are ints or 
2790
            floats except for 'orient' which takes x y z values as string 'x y z'
2791
            for example '0 0 -1'. These values will overwrite the internally
2792
            calculated frontscan dictionary for the module & row selected. If modifying 
2793
            Nx, Ny or Nz, make sure to modify on modscanback to avoid issues on 
2794
            results writing stage. 
2795
        modscanback : dict
2796
            dictionary with one or more of the following key: xstart, ystart, zstart, 
2797
            xinc, yinc, zinc, Nx, Ny, Nz, orient. All of these keys are ints or 
2798
            floats except for 'orient' which takes x y z values as string 'x y z'
2799
            for example '0 0 -1'. These values will overwrite the internally
2800
            calculated frontscan dictionary for the module & row selected.  If modifying 
2801
            Nx, Ny or Nz, make sure to modify on modscanback to avoid issues on 
2802
            results writing stage. 
2803
        relative : Bool
2804
            if passing modscanfront and modscanback to modify dictionarie of positions,
2805
            this sets if the values passed to be updated are relative or absolute. 
2806
            Default is absolute value (relative=False)
2807
        debug : Bool
2808
            Activates internal printing of the function to help debugging.
2809
        sceneNum : int
2810
            Index of the scene number in the list of scenes per trackerdict. default 0
2811
        append : Bool (default True)
2812
            Append trackerdict['AnalysisObj'] to list.  Otherwise over-write any
2813
            AnalysisObj's and start 1axis analysis from scratch
2814

2815
        Returns
2816
        -------
2817
        trackerdict is returned with :py:class:`bifacial_radiance.AnalysisObj`  
2818
            for each timestamp:
2819
    
2820
        trackerdict.key.'AnalysisObj'  : analysis object for this tracker theta
2821
            to get a dictionary of results, run :py:class:`bifacial_radiance.AnalysisObj`.results
2822
        :py:class:`bifacial_radiance.AnalysisObj`.results returns the following df:
2823
            'name', 'modNum', 'rowNum', 'sceneNum', 'x','y','z', 'mattype', 'rearMat', 
2824
            'Wm2Front'     : np.array of front Wm-2 irradiances, len=sensorsy_back
2825
            'Wm2Back'      : np.array of rear Wm-2 irradiances, len=sensorsy_back
2826
            'backRatio'    : np.array of rear irradiance ratios, len=sensorsy_back
2827

2828
        """
2829
        
2830
        import warnings, itertools
2✔
2831

2832
        if customname is None:
2✔
2833
            customname = ''
2✔
2834

2835
        if trackerdict == None:
2✔
2836
            try:
2✔
2837
                trackerdict = self.trackerdict
2✔
2838
            except AttributeError:
×
2839
                print('No trackerdict value passed or available in self')
×
2840
        
2841
        if not append:
2✔
2842
            warnings.warn('Append=False. Over-writing any existing `AnalysisObj` in trackerdict.')
×
2843
            for key in trackerdict:
×
2844
                trackerdict[key]['AnalysisObj'] = []
×
2845
    
2846
        if singleindex is None:  # run over all values in trackerdict
2✔
2847
            trackerkeys = sorted(trackerdict.keys())
2✔
2848
        else:                   # run in single index mode.
2849
            trackerkeys = [singleindex]
×
2850
    
2851
        if modWanted == None:
2✔
2852
            modWanted = round(trackerdict[trackerkeys[0]]['scenes'][sceneNum].sceneDict['nMods'] / 1.99)
2✔
2853
        if rowWanted == None:
2✔
2854
            rowWanted = round(trackerdict[trackerkeys[0]]['scenes'][sceneNum].sceneDict['nRows'] / 1.99)
2✔
2855
    
2856
    
2857
        #frontWm2 = 0 # container for tracking front irradiance across module chord. Dynamically size based on first analysis run
2858
        #backWm2 = 0 # container for tracking rear irradiance across module chord.
2859
        for index in trackerkeys:   # either full list of trackerdict keys, or single index
2✔
2860
            octfile = trackerdict[index]['octfile']
2✔
2861
            scene = trackerdict[index]['scenes'][sceneNum]
2✔
2862
            name = '1axis_%s%s_%s'%(index, customname, scene.name)
2✔
2863
            if not trackerdict[index].get('AnalysisObj'):
2✔
2864
                trackerdict[index]['AnalysisObj'] = []
2✔
2865
            if octfile is None:
2✔
2866
                continue  # don't run analysis if the octfile is none
×
2867
            # loop over rowWanted and modWanted.  Need to listify it first
2868
            if type(rowWanted)!=list:   rowWanted = [rowWanted]
2✔
2869
            if type(modWanted)!=list:   modWanted = [modWanted]
2✔
2870
    
2871
            row_mod_pairs = list(itertools.product(rowWanted,modWanted))
2✔
2872
            for (r,m) in row_mod_pairs:  
2✔
2873
                #Results = {'rowWanted':r,'modWanted':m, 'sceneNum':sceneNum}
2874
                #if customname: Results['customname'] = customname
2875
                try:  # look for missing data
2✔
2876
                    analysis = AnalysisObj(octfile,name)
2✔
2877
                    analysis.sceneNum = sceneNum
2✔
2878
                    #name = '1axis_%s%s_%s'%(index, customname, scene.name) #defined above
2879
                    frontscanind, backscanind = analysis.moduleAnalysis(scene=scene, modWanted=m, 
2✔
2880
                                                    rowWanted=r, 
2881
                                                    sensorsy=sensorsy, 
2882
                                                    sensorsx=sensorsx, 
2883
                                                    modscanfront=modscanfront, modscanback=modscanback,
2884
                                                    relative=relative, debug=debug,
2885
                                                    frontsurfaceoffset=frontsurfaceoffset, 
2886
                                                    backsurfaceoffset=backsurfaceoffset)
2887
                    analysis.analysis(octfile=octfile,name=name,frontscan=frontscanind,backscan=backscanind,accuracy=accuracy)                
2✔
2888
                    trackerdict[index]['AnalysisObj'].append(analysis)
2✔
2889
                except Exception as e: # problem with file. TODO: only catch specific error types here.
×
2890
                    warnings.warn('Index: {}. Problem with file. Error: {}. Skipping'.format(index,e), Warning)
×
2891
                    return
×
2892
    
2893
                #combine cumulative front and back irradiance for each tracker angle
2894
                """
2✔
2895
                try:  #on error, trackerdict[index] is returned empty
2896
                    Results['Wm2Front'] = analysis.Wm2Front
2897
                    Results['Wm2Back'] = analysis.Wm2Back
2898
                    Results['backRatio'] = analysis.backRatio
2899
                except AttributeError as  e:  # no key Wm2Front.
2900
                    warnings.warn('Index: {}. Trackerdict key not found: {}. Skipping'.format(index,e), Warning)
2901
                    return
2902
                trackerdict[index]['Results'].append(Results)
2903
                """
2904
                try:
2✔
2905
                    print('Index: {}. Wm2Front: {}. Wm2Back: {}'.format(index,
2✔
2906
                      np.mean(analysis.Wm2Front), np.mean(analysis.Wm2Back)))
2907
                except AttributeError:  #no Wm2Front
×
2908
                    warnings.warn('AnalysisObj not successful.')
×
2909

2910
        self.trackerdict = trackerdict
2✔
2911
        return trackerdict
2✔
2912

2913
    def analysis1axisground(self, trackerdict=None, singleindex=None, accuracy='low',
2✔
2914
                      customname=None, modWanted=None, rowWanted=None, sensorsground=None, 
2915
                      sensorsgroundx=1, sceneNum=0, append=True):
2916
        """
2917
        uses :py:class:`bifacial_radiance.AnalysisObj`.groundAnalysis to run a
2918
        single ground scan along the entire row-row pitch. 
2919

2920
        Parameters
2921
        ----------
2922
        trackerdict : optional
2923
        singleindex : str
2924
            For single-index mode, just the one index we want to run (new in 0.2.3).
2925
            Example format '21_06_14_12_30' for 2021 June 14th 12:30 pm
2926
        accuracy : str
2927
            'low' (default) or 'high', resolution option used during _irrPlot and rtrace
2928
        customname : str
2929
            Custom text string to be added to the file name for the results .CSV files
2930
        modWanted : int
2931
            Module to be sampled. Index starts at 1.
2932
        rowWanted : int
2933
            Row to be sampled. Index starts at 1. (row 1)
2934
        sensorsground : int (default None)
2935
            Number of scan points along the scene pitch.  Default every 20cm
2936
        sensorsgroundx : int (default 1)
2937
            Number of scans in the x dimension
2938
        sceneNum : int
2939
            Index of the scene number in the list of scenes per trackerdict. default 0
2940
        append : Bool (default True)
2941
            Append trackerdict['AnalysisObj'] to list.  Otherwise over-write any
2942
            AnalysisObj's and start 1axis analysis from scratch
2943

2944
        Returns
2945
        -------
2946
        trackerdict is returned with :py:class:`bifacial_radiance.AnalysisObj`  
2947
            for each timestamp:
2948
    
2949
        trackerdict.key.'AnalysisObj'  : analysis object for this tracker theta
2950
            to get a dictionary of results, run :py:class:`bifacial_radiance.AnalysisObj`.results
2951
        :py:class:`bifacial_radiance.AnalysisObj`.results returns the following keys:
2952
            'Wm2Ground'     : np.array of Wm-2 irradiances along the ground, len=sensorsground
2953
            'sensorsground'      : int of number of ground scan points
2954

2955
        """
2956
        
2957
        import warnings, itertools
2✔
2958

2959
        if customname is None:
2✔
2960
            customname = ''
2✔
2961

2962
        if trackerdict == None:
2✔
2963
            try:
2✔
2964
                trackerdict = self.trackerdict
2✔
2965
            except AttributeError:
×
2966
                print('No trackerdict value passed or available in self')
×
2967
        
2968
        if not append:
2✔
2969
            warnings.warn('Append=False. Over-writing any existing `AnalysisObj` in trackerdict.')
×
2970
            for key in trackerdict:
×
2971
                trackerdict[key]['AnalysisObj'] = []
×
2972
    
2973
        if singleindex is None:  # run over all values in trackerdict
2✔
2974
            trackerkeys = sorted(trackerdict.keys())
2✔
2975
        else:                   # run in single index mode.
2976
            trackerkeys = [singleindex]
×
2977

2978
        for index in trackerkeys:   # either full list of trackerdict keys, or single index
2✔
2979
            octfile = trackerdict[index]['octfile']
2✔
2980
            scene = trackerdict[index]['scenes'][sceneNum]
2✔
2981
            name = '1axis_groundscan_%s%s'%(index,customname)
2✔
2982
            trackerdict[index]['Results'] = []
2✔
2983
            if octfile is None:
2✔
2984
                continue  # don't run analysis if the octfile is none
×
2985
        
2986
            #Results = {'Groundscan':customname}
2987
            try:  # look for missing data
2✔
2988
                analysis = AnalysisObj(octfile,name)
2✔
2989
                analysis.sceneNum = sceneNum
2✔
2990
                #name = '1axis_%s%s'%(index,customname)
2991
                groundscanid = analysis.groundAnalysis(scene=scene, modWanted=modWanted,
2✔
2992
                                                       rowWanted=rowWanted,
2993
                                                       sensorsground=sensorsground)
2994
                analysis.analysis(octfile=octfile,name=name,
2✔
2995
                                  frontscan=groundscanid, accuracy=accuracy)
2996
                #Results['AnalysisObj']=analysis
2997
                # try to push Wm2Ground and sensorsground into the AnalysisObj...
2998
                analysis.Wm2Ground = analysis.Wm2Front
2✔
2999
                del analysis.Wm2Front
2✔
3000
                analysis.sensorsground = analysis.Wm2Ground.__len__()
2✔
3001
                trackerdict[index]['AnalysisObj'].append(analysis)
2✔
3002
            except Exception as e: # problem with file. TODO: only catch specific error types here.
×
3003
                warnings.warn('Index: {}. Problem with file. Error: {}. Skipping'.format(index,e), Warning)
×
3004
                return
×
3005
            """
2✔
3006
            try:  #on error, trackerdict[index] is returned empty
3007
                Results['Wm2Ground'] = analysis.Wm2Front
3008
                Results['sensorsground'] = analysis.Wm2Front.__len__()
3009
            except AttributeError as  e:  # no key Wm2Front.
3010
                warnings.warn('Index: {}. Trackerdict key not found: {}. Skipping'.format(index,e), Warning)
3011
                return
3012
            trackerdict[index]['Results'].append(Results)
3013
            """
3014
            try:
2✔
3015
                print('Index: {}. Wm2Ground: {}. sensorsground: {}'.format(index,
2✔
3016
                    np.mean(analysis.Wm2Ground), sensorsground))
3017
            except AttributeError:  #no Wm2Front
×
3018
                warnings.warn('AnalysisObj not successful.')
×
3019
        return trackerdict
2✔
3020

3021

3022
    def calculatePerformance1axis(self, trackerdict=None, module=None,
2✔
3023
                             CECMod2=None, agriPV=False):
3024
            '''
3025
            Loops through all results in trackerdict and calculates performance, 
3026
            considering electrical mismatch, using
3027
            PVLib. Cell temperature is calculated 
3028

3029

3030
            Parameters
3031
             ----------
3032
            module: ModuleObj from scene.module
3033
                It's best to set this in advance in the ModuleObj. 
3034
                If passed in here, it overrides the value that may be set in the
3035
                trackerdict already.
3036
            CEcMod2 : Dict
3037
                Dictionary with CEC Module Parameters for a Monofacial module. If None,
3038
                same module as CECMod is used for the BGE calculations, but just 
3039
                using the front irradiance (Gfront). 
3040
                
3041
            Returns
3042
            -------
3043
            trackerdict 
3044
                Trackerdict with new entries for each key of irradiance and Power 
3045
                Output for the module.
3046
                POA_eff: mean of [(mean of clean Gfront) + clean Grear * bifaciality factor]
3047
                Gfront_mean: mean of clean Gfront
3048
                Grear_mean: mean of clean Grear
3049
                Mismatch: mismatch calculated from the MAD distribution of 
3050
                          POA_total
3051
                Pout_raw: power output calculated from POA_total, considers 
3052
                      wind speed and temp_amb if in trackerdict.
3053
                Pout: power output considering electrical mismatch
3054
            '''
3055
            
3056
            from bifacial_radiance import performance
2✔
3057
            import pandas as pd
2✔
3058
            
3059
            if trackerdict is None:
2✔
3060
                trackerdict = self.trackerdict
2✔
3061

3062
            keys = list(trackerdict.keys())
2✔
3063
            
3064
            def _trackerMeteo(tracker_item):
2✔
3065
                keylist = ['dni', 'ghi', 'dhi', 'temp_air', 'wind_speed' ]
2✔
3066
                return {k: v for k, v in tracker_item.items() if k in keylist}
2✔
3067
                
3068
            def _printRow(analysisobj, key):
2✔
3069
                if self.cumulativesky:
2✔
UNCOV
3070
                    keyname = 'theta'
×
3071
                else:
3072
                    keyname = 'timestamp'
2✔
3073
                return pd.concat([pd.DataFrame({keyname:key},index=[0]),
2✔
3074
                                 analysisobj.results], axis=1)
3075
        
3076
                
3077

3078
            # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!1``34
3079
            # TODO IMPORTANT: ADD CUMULATIVE CHEck AND WHOLE OTHER PROCESSING OPTION
3080
            # TO EMULATE WHAT HAPPENED BEFORE WITH GENCUMSKY1AXIS when trackerdict = cumulative = True
3081
            # if cumulative:
3082
            #    print("Add HERE gencusky1axis results for each tracekr angle")
3083

3084
            #else:
3085
            # loop over module and row values in 'Results'
3086
            keys_all = []
2✔
3087
            self.compiledResults = pd.DataFrame(None)
2✔
3088

3089
            if not self.cumulativesky:
2✔
3090
                                
3091
                for key in keys:
2✔
3092
            
3093
                    meteo_data = _trackerMeteo(trackerdict[key])
2✔
3094
                    
3095
                    # TODO HERE: SUM all keys for rows that have the same rowWanted/modWanted
3096
                    try:
2✔
3097
                        for analysis in trackerdict[key]['AnalysisObj']: # loop over multiple row & module in trackerDict['AnalysisObj']
2✔
3098
                            keys_all.append(key)
2✔
3099
                            # Search for module object 
3100
                            if module is None:
2✔
3101
                                module_local = trackerdict[key]['scenes'][analysis.sceneNum].module
2✔
3102
                            else:
NEW
3103
                                module_local = module
×
3104
                            analysis.calculatePerformance(meteo_data=meteo_data, 
2✔
3105
                                                          module=module_local,
3106
                                                          cumulativesky=self.cumulativesky,   
3107
                                                           CECMod2=CECMod2, 
3108
                                                          agriPV=agriPV)
3109
                            self.compiledResults = pd.concat([self.compiledResults, 
2✔
3110
                                                              _printRow(analysis, key)], ignore_index=True)
3111
                    except KeyError:
×
3112
                        pass
×
3113
                    
3114
                            
3115
                        
3116
                
3117
            else:
3118
                if module is None:
2✔
NEW
3119
                    for key in keys:  # loop over trackerdict to find first available module
×
NEW
3120
                        try:
×
NEW
3121
                            for analysis in trackerdict[key]['AnalysisObj']:
×
NEW
3122
                                module_local = trackerdict[key]['scenes'][analysis.sceneNum].module
×
NEW
3123
                                break
×
NEW
3124
                        except (KeyError, AttributeError, IndexError):
×
NEW
3125
                            pass
×
3126
                else:
3127
                    module_local = module
2✔
3128
                self.compiledResults = performance.calculatePerformanceGencumsky(results=self.results,
2✔
3129
                                           bifacialityfactor=module_local.bifi,
3130
                                           fillcleanedSensors=True, agriPV=False)
3131
               
3132
                self.compiledResults.to_csv(os.path.join('results', 'Cumulative_Results.csv'),
2✔
3133
                                            float_format='%0.3f')
3134
                
3135
            self.trackerdict = trackerdict    
2✔
3136
            return self.compiledResults
2✔
3137
        
3138
            
3139
    def generate_spectra(self, metdata=None, simulation_path=None, ground_material=None, scale_spectra=False,
2✔
3140
                         scale_albedo=False, scale_albedo_nonspectral_sim=False, scale_upper_bound=2500):
3141
        '''
3142
        Generate spectral irradiance files for spectral simulations using pySMARTS
3143
        Or
3144
        Generate an hourly albedo weighted by pySMARTS spectral irradiances
3145

3146
        Parameters
3147
        ----------
3148
        metdata : radianceObject.metdata, optional
3149
            DESC
3150
        simulation_path : path object or string, optional
3151
            path of current simulation directory
3152
        ground_material : str or (R,G,B), optional
3153
            ground material string from pySMARTS glossary or compatible
3154
            (R,G,B) tuple.
3155
        scale_spectra : boolean, default=False
3156
            Apply a simple scaling to the generated spectra. Scales by
3157
            integrated irradiance below specified upper wavelength bound
3158
        scale_albedo : boolean, default=False
3159
            Apply a scaling factor to generated spectral albedo.
3160
            Scales by mean value below specified upper wavelength bound
3161
        scale_albedo_nonspectral_sim : boolean, default=False
3162
            You intend to run a non-spectral simulation. This will scale
3163
            the albedo read from the weather file by a calculation
3164
            on measured and generated spectra and the spectral responsivity
3165
            of device (spectral responsivity currently not implemented)
3166
        scale_upper_bound : integer, optional
3167
            Sets an upper bound for the wavelength in all scaling
3168
            calculations. Limits the bounds of integration for spectral DNI,
3169
            DHI, and GHI. Limits the domain over which spectral albedo
3170
            is averaged.
3171
        
3172
        Returns
3173
        -------
3174
        spectral_alb : spectral_property class
3175
            spectral_alb.data:  dataframe with frequency and magnitude data.
3176
            returns as None when scale_albedo_nonspectral_sim == True
3177
        spectral_dni : spectral_property class
3178
            spectral_dni.data:  dataframe with frequency and magnitude data.
3179
        spectral_dhi : spectral_property class
3180
            spectral_dhi.data:  dataframe with frequency and magnitude data.
3181
        weighted_alb : pd.series
3182
            datetime-indexed series of weighted albedo values
3183
            returns as None when scale_albedo_nonspectral_sim == False
3184
        '''
3185
        if metdata == None:
2✔
3186
            metdata = self.metdata                        
2✔
3187
        if simulation_path == None:
2✔
3188
            simulation_path = self.path
2✔
3189

3190
        from bifacial_radiance import spectral_utils as su
2✔
3191
      
3192
        spectra_path = 'spectra'
2✔
3193
        if not os.path.exists(spectra_path):
2✔
3194
            os.mkdir(spectra_path)
×
3195
        
3196
        (spectral_alb, spectral_dni, spectral_dhi, weighted_alb) = su.generate_spectra(metdata=metdata,
2✔
3197
                            simulation_path=simulation_path,
3198
                            ground_material=ground_material,
3199
                            spectra_folder=spectra_path,
3200
                            scale_spectra=scale_spectra,
3201
                            scale_albedo=scale_albedo,
3202
                            scale_albedo_nonspectral_sim=scale_albedo_nonspectral_sim,
3203
                            scale_upper_bound=scale_upper_bound)
3204
        
3205
        if scale_albedo_nonspectral_sim:
2✔
3206
            self.metdata.albedo = weighted_alb.values
2✔
3207
        return (spectral_alb, spectral_dni, spectral_dhi, weighted_alb)
2✔
3208

3209
    def generate_spectral_tmys(self, wavelengths, weather_file, location_name, spectra_folder=None,
2✔
3210
                               output_folder=None):
3211
        """
3212
        Generate a series of TMY-like files with per-wavelength irradiance. There will be one file
3213
        per wavelength. These are necessary to run a spectral simulation with gencumsky
3214
        
3215
        Paramters:
3216
        ----------
3217
        wavelengths: (np.array or list)
3218
            array or list of integer wavelengths to simulate, in units [nm]. example: [300,325,350]
3219
        spectra_folder: (path or str)
3220
            File path or path-like string pointing to the folder contained the SMARTS generated spectra
3221
        weather_file: (path or str)
3222
            File path or path-like string pointing to the weather file used for spectra generation.
3223
            The structure of this file, and it's meta-data, will be copied into the new files.
3224
        location_name: 
3225
            _description_
3226
        output_folder: 
3227
            File path or path-like string pointing to the destination folder for spectral TMYs
3228
        """
3229
        from bifacial_radiance import spectral_utils as su
×
3230

3231
        if spectra_folder is None:
×
3232
            spectra_folder = 'spectra'
×
3233
        
3234
        if output_folder is None:
×
3235
            output_folder = os.path.join('data','spectral_tmys')
×
3236
        if not os.path.exists(output_folder):
×
3237
            os.makedirs(output_folder, exist_ok=True)
×
3238
        
3239
        su.generate_spectral_tmys(wavelengths=wavelengths, spectra_folder=spectra_folder,
×
3240
                                  weather_file=weather_file, location_name=location_name,
3241
                                  output_folder=output_folder)
3242

3243
# End RadianceObj definition
3244

3245
class GroundObj(SuperClass):
2✔
3246
    """
3247
    Class to set and return details for the ground surface materials and reflectance.
3248
    If 1 albedo value is passed, it is used as default.
3249
    If 3 albedo values are passed, they are assigned to each of the three wavelength placeholders (RGB),
3250
    
3251
    If material type is known, it is used to get reflectance info.  
3252
    if material type isn't known, material_info.list is returned
3253

3254
    Parameters
3255
    ------------
3256
    materialOrAlbedo : numeric or str
3257
        If number between 0 and 1 is passed, albedo input is assumed and assigned.    
3258
        If string is passed with the name of the material desired. e.g. 'litesoil',
3259
        properties are searched in `material_file`.
3260
        Default Material names to choose from: litesoil, concrete, white_EPDM, 
3261
        beigeroof, beigeroof_lite, beigeroof_heavy, black, asphalt
3262
    material_file : str
3263
        Filename of the material information. Default `ground.rad`
3264
    silent       :  bool   
3265
        suppress print statements. Default False  
3266

3267
    Returns
3268
    -------
3269

3270
    """
3271
    def __repr__(self):
2✔
3272
        return str(self.__dict__)   
×
3273
    def __init__(self, materialOrAlbedo=None, material_file=None, silent=False):
2✔
3274
        #import warnings
3275
        from numbers import Number
2✔
3276
        
3277
        self.normval = None
2✔
3278
        self.ReflAvg = None
2✔
3279
        self.Rrefl = None
2✔
3280
        self.Grefl = None
2✔
3281
        self.Brefl = None
2✔
3282

3283
        self.ground_type = 'custom'        
2✔
3284

3285
        if material_file is None:
2✔
3286
            material_file = 'ground.rad'
2✔
3287
            
3288
        self.material_file = material_file           
2✔
3289
        if materialOrAlbedo is None: # Case where it's none.
2✔
3290
            print('\nInput albedo 0-1, or string from ground.printGroundMaterials().'
×
3291
            '\nAlternatively, run setGround after readWeatherData()'
3292
            'and setGround will read metdata.albedo if available')
3293
            return
×
3294
            
3295
        if isinstance(materialOrAlbedo, str) :
2✔
3296
            self.ground_type = materialOrAlbedo  
2✔
3297
            # Return the RGB albedo for material ground_type
3298
            materialOrAlbedo = self.printGroundMaterials(self.ground_type)
2✔
3299
            
3300
        # Check for double and int. 
3301
        if isinstance(materialOrAlbedo, Number):
2✔
3302
            materialOrAlbedo = np.array([[materialOrAlbedo, 
2✔
3303
                                          materialOrAlbedo, materialOrAlbedo]])
3304
        
3305
        if isinstance(materialOrAlbedo, list):
2✔
3306
            materialOrAlbedo = np.asarray(materialOrAlbedo)
×
3307
        
3308
        # By this point, materialOrAlbedo should be a np.ndarray:
3309
        if isinstance(materialOrAlbedo, np.ndarray):
2✔
3310

3311
            if materialOrAlbedo.ndim == 0:
2✔
3312
            # numpy array of one single value, i.e. np.array(0.62)
3313
            # after this if, np.array([0.62])
3314
                materialOrAlbedo = materialOrAlbedo.reshape([1])
×
3315
                
3316
            if materialOrAlbedo.ndim == 1:
2✔
3317
            # If np.array is ([0.62]), this repeats it so at the end it's
3318
            # np.array ([0.62, 0.62, 0.62])
3319
                materialOrAlbedo = np.repeat(np.array([materialOrAlbedo]), 
2✔
3320
                                             3, axis=1).reshape(
3321
                                                     len(materialOrAlbedo),3)
3322
            
3323
            if (materialOrAlbedo.ndim == 2) & (materialOrAlbedo.shape[1] > 3): 
2✔
3324
                    warnings.warn("Radiance only raytraces 3 wavelengths at "
2✔
3325
                                  "a time. Trimming albedo np.array input to "
3326
                                  "3 wavelengths.")
3327
                    materialOrAlbedo = materialOrAlbedo[:,0:3]
2✔
3328
        # By this point we should have np.array of dim=2 and shape[1] = 3.        
3329
        # Check for invalid values
3330
        if (materialOrAlbedo > 1).any() or (materialOrAlbedo < 0).any():
2✔
3331
            if not silent:
2✔
3332
                print('Warning: albedo values greater than 1 or less than 0. '
2✔
3333
                      'Constraining to [0..1]')
3334
            materialOrAlbedo = materialOrAlbedo.clip(min=0, max=1)
2✔
3335
        try:
2✔
3336
            self.Rrefl = materialOrAlbedo[:,0]
2✔
3337
            self.Grefl = materialOrAlbedo[:,1]
2✔
3338
            self.Brefl = materialOrAlbedo[:,2]
2✔
3339
            self.normval = _normRGB(materialOrAlbedo[:,0],materialOrAlbedo[:,1],
2✔
3340
                                    materialOrAlbedo[:,2])
3341
            self.ReflAvg = np.round(np.mean(materialOrAlbedo, axis=1),4)
2✔
3342
            if not silent:
2✔
3343
                print(f'Loading albedo, {self.ReflAvg.__len__()} value(s), '
2✔
3344
                      f'{self._nonzeromean(self.ReflAvg):0.3f} avg\n'
3345
                      f'{self.ReflAvg[self.ReflAvg != 0].__len__()} nonzero albedo values.')
3346
        except IndexError as e:
×
3347
            print('albedo.shape should be 3 column (N x 3)')
×
3348
            raise e
×
3349
    
3350
    def printGroundMaterials(self, materialString=None):
2✔
3351
        """
3352
        printGroundMaterials(materialString=None)
3353
        
3354
        input: None or materialString.  If None, return list of acceptable
3355
        material types from ground.rad.  If valid string, return RGB albedo
3356
        of the material type selected.
3357
        """
3358
        
3359
        #import warnings
3360
        material_path = 'materials'
2✔
3361
        
3362
        f = open(os.path.join(material_path, self.material_file))
2✔
3363
        keys = [] #list of material key names
2✔
3364
        Rreflall = []; Greflall=[]; Breflall=[] #RGB material reflectance  
2✔
3365
        temp = f.read().split()
2✔
3366
        f.close()
2✔
3367
        #return indices for 'plastic' definition
3368
        index = _findme(temp,'plastic')
2✔
3369
        for i in index:
2✔
3370
            keys.append(temp[i+1])# after plastic comes the material name
2✔
3371
            Rreflall.append(float(temp[i+5]))#RGB reflectance comes a few more down the list
2✔
3372
            Greflall.append(float(temp[i+6]))
2✔
3373
            Breflall.append(float(temp[i+7]))
2✔
3374
        
3375
        if materialString is not None:
2✔
3376
            try:
2✔
3377
                index = _findme(keys,materialString)[0]
2✔
3378
            except IndexError:
×
3379
                warnings.warn('Error - materialString not in '
×
3380
                              f'{self.material_file}: {materialString}')
3381
            return(np.array([[Rreflall[index], Greflall[index], Breflall[index]]]))
2✔
3382
        else:
3383
            return(keys)
2✔
3384
            
3385
    def _nonzeromean(self, val):
2✔
3386
        '''  array mean excluding zero. return zero if everything's zero'''
3387
        tempmean = np.nanmean(val)
2✔
3388
        if tempmean > 0:
2✔
3389
            tempmean = np.nanmean(val[val !=0])
2✔
3390
        return tempmean     
2✔
3391
        
3392
    def _makeGroundString(self, index=0, cumulativesky=False):
2✔
3393
        '''
3394
        create string with ground reflectance parameters for use in 
3395
        gendaylit and gencumsky.
3396
        
3397
        Parameters
3398
        -----------
3399
        index : integer
3400
            Index of time for time-series albedo. Default 0
3401
        cumulativesky:  Boolean
3402
            If true, set albedo to average of time series values.
3403

3404
        Returns
3405
        -------
3406
        groundstring:  text with albedo details to append to sky.rad in
3407
                       gendaylit
3408
        '''
3409
         
3410
        
3411
        try:  
2✔
3412
            if cumulativesky is True:
2✔
3413
                Rrefl = self._nonzeromean(self.Rrefl) 
2✔
3414
                Grefl = self._nonzeromean(self.Grefl) 
2✔
3415
                Brefl = self._nonzeromean(self.Brefl)
2✔
3416
                normval = _normRGB(Rrefl, Grefl, Brefl)
2✔
3417
            else:
3418
                Rrefl = self.Rrefl[index]
2✔
3419
                Grefl = self.Grefl[index]
2✔
3420
                Brefl = self.Brefl[index]
2✔
3421
                normval = _normRGB(Rrefl, Grefl, Brefl)
2✔
3422

3423
            # Check for all zero albedo case
3424
            if normval == 0:
2✔
3425
                normval = 1
×
3426
            
3427
            groundstring = ( f'\nskyfunc glow ground_glow\n0\n0\n4 ' 
2✔
3428
                f'{Rrefl/normval} {Grefl/normval} {Brefl/normval} 0\n' 
3429
                '\nground_glow source ground\n0\n0\n4 0 0 -1 180\n' 
3430
                f'\nvoid plastic {self.ground_type}\n0\n0\n5 '
3431
                f'{Rrefl:0.3f} {Grefl:0.3f} {Brefl:0.3f} 0 0\n'
3432
                f"\n{self.ground_type} ring groundplane\n" 
3433
                '0\n0\n8\n0 0 -.01\n0 0 1\n0 100' )
3434
        except IndexError as err:
×
3435
            print(f'Index {index} passed to albedo with only '
×
3436
                  f'{self.Rrefl.__len__()} values.'   )
3437
            raise err
×
3438
        return groundstring
2✔
3439

3440
        
3441

3442
class SceneObj(SuperClass):
2✔
3443
    '''
3444
    Scene information including PV module type, bifaciality, array info
3445
    pv module orientation defaults: Azimuth = 180 (south)
3446
    pv module origin: z = 0 bottom of frame. y = 0 lower edge of frame. 
3447
    x = 0 vertical centerline of module
3448

3449
    scene includes module details (x,y,bifi, sceney (collector_width), scenex)
3450
    
3451
    Parameters
3452
    ------------
3453
    module : str or ModuleObj
3454
            String name of module created with makeModule()
3455
    name : str
3456
           Identifier of scene in case of multiple scenes. Default `Scene0'.
3457
           Automatically increments if makeScene is run multiple times.
3458
    
3459
    '''
3460

3461
    def __init__(self, module=None, name=None, hpc=False):
2✔
3462
        ''' initialize SceneObj
3463
        '''
3464
        from bifacial_radiance import ModuleObj
2✔
3465
        # should sceneDict be initialized here? This is set in _makeSceneNxR
3466
        if module is None:
2✔
3467
            return
×
3468
        elif type(module) == str:
2✔
3469
            self.module = ModuleObj(name=module)
2✔
3470

3471

3472
        elif str(type(module)) == "<class 'bifacial_radiance.module.ModuleObj'>": # try moduleObj
2✔
3473
            self.module = module
2✔
3474

3475
        #self.moduleDict = self.module.getDataDict()
3476
        #self.scenex = self.module.scenex
3477
        #self.sceney = self.module.sceney
3478
        #self.offsetfromaxis = self.moduleDict['offsetfromaxis']
3479
        
3480
        self.modulefile = self.module.modulefile
2✔
3481
        self.hpc = hpc  #default False.  Set True by makeScene after sceneobj created.
2✔
3482
        if name is None:
2✔
3483
            self.name = 'Scene0'
2✔
3484
        else:
3485
            self.name = name
2✔
3486

3487
    def _makeSceneNxR(self, modulename=None, sceneDict=None, radname=None, addhubheight=False):
2✔
3488
        """
3489
        Arrange module defined in :py:class:`bifacial_radiance.SceneObj` into a N x R array.
3490
        Returns a :py:class:`bifacial_radiance.SceneObj` which contains details 
3491
        of the PV system configuration including `tilt`, `row pitch`, `hub_height`
3492
        or `clearance_height`, `nMod`s per row, `nRows` in the system.
3493

3494
        The returned scene has (0,0) coordinates centered at the module at the
3495
        center of the array. For 5 rows, that is row 3, for 4 rows, that is
3496
        row 2 also (rounds down). For 5 modules in the row, that is module 3,
3497
        for 4 modules in the row, that is module 2 also (rounds down)
3498

3499
        Parameters
3500
        ------------
3501
        modulename: str 
3502
            Name of module created with :py:class:`~bifacial_radiance.RadianceObj.makeModule`.
3503
        sceneDict : dictionary 
3504
            Dictionary of scene parameters.
3505
                clearance_height : numeric
3506
                    (meters). 
3507
                pitch : numeric
3508
                    Separation between rows
3509
                tilt : numeric 
3510
                    Valid input ranges -90 to 90 degrees
3511
                azimuth : numeric 
3512
                    A value denoting the compass direction along which the
3513
                    axis of rotation lies. Measured in decimal degrees East
3514
                    of North. [0 to 180) possible.
3515
                nMods : int 
3516
                    Number of modules per row (default = 20)
3517
                nRows : int 
3518
                    Number of rows in system (default = 7)
3519
        radname : str
3520
            String for name for radfile.
3521
        addhubheight : Bool, default False
3522
            Add hubheight back to the sceneDict since it was stripped out 
3523
            by makeScene1axis
3524

3525

3526
        Returns
3527
        -------
3528
        radfile : str
3529
             Filename of .RAD scene in /objects/
3530
        scene : :py:class:`~bifacial_radiance.SceneObj `
3531
             Returns a `SceneObject` 'scene' with configuration details
3532

3533
        """
3534
        import copy
2✔
3535
        
3536
        if modulename is None:
2✔
3537
            modulename = self.module.name
2✔
3538

3539
        if sceneDict is None:
2✔
3540
            print('makeScene(modulename, sceneDict, nMods, nRows).  sceneDict'
×
3541
                  ' inputs: .tilt .azimuth .nMods .nRows' 
3542
                  ' AND .tilt or .gcr ; AND .hub_height or .clearance_height')
3543
        else: sceneDict = copy.deepcopy(sceneDict)
2✔
3544

3545

3546
        if 'orientation' in sceneDict:
2✔
3547
            raise Exception('\n\n ERROR: Orientation format has been '
×
3548
                'deprecated since version 0.2.4. If you want to flip your '
3549
                'modules, on makeModule switch the x and y values.\n\n')
3550

3551
        if 'azimuth' not in sceneDict:
2✔
3552
            sceneDict['azimuth'] = 180
×
3553

3554
        if 'axis_tilt' not in sceneDict:
2✔
3555
            sceneDict['axis_tilt'] = 0
2✔
3556

3557
        if 'originx' not in sceneDict:
2✔
3558
            sceneDict['originx'] = 0
2✔
3559

3560
        if 'originy' not in sceneDict:
2✔
3561
            sceneDict['originy'] = 0
2✔
3562

3563
        if radname is None:
2✔
3564
            radname =  str(self.module.name).strip().replace(' ', '_')
2✔
3565

3566
        # loading variables
3567
        tilt = round(sceneDict['tilt'], 2)
2✔
3568
        azimuth = round(sceneDict['azimuth'], 2)
2✔
3569
        nMods = sceneDict['nMods']
2✔
3570
        nRows = sceneDict['nRows']
2✔
3571
        axis_tilt = sceneDict['axis_tilt']
2✔
3572
        originx = sceneDict ['originx']
2✔
3573
        originy = sceneDict['originy']
2✔
3574

3575
        # hub_height, clearance_height and height logic.
3576
        # this routine uses hub_height to move the panels up so it's important 
3577
        # to have a value for that, either obtianing from clearance_height 
3578
        # (if coming from makeScene) or from hub_height itself.
3579
        # it is assumed that if no clearance_height or hub_height is passed,
3580
        # hub_height = height.
3581

3582
        
3583
        sceneDict, use_clearanceheight  = _heightCasesSwitcher(sceneDict, preferred='hub_height', 
2✔
3584
                                                     nonpreferred='clearance_height')
3585
        
3586
        if use_clearanceheight :
2✔
3587
            hubheight = sceneDict['clearance_height'] + 0.5* np.sin(abs(tilt) * np.pi / 180) \
2✔
3588
            * self.module.sceney - self.module.offsetfromaxis*np.sin(abs(tilt)*np.pi/180)
3589

3590
            title_clearance_height = sceneDict['clearance_height'] 
2✔
3591
            if addhubheight:
2✔
3592
                sceneDict['hub_height'] = np.round(hubheight,3)
2✔
3593
        else:
3594
            hubheight = sceneDict['hub_height'] 
2✔
3595
            # this calculates clearance_height, used for the title
3596
            title_clearance_height = sceneDict['hub_height'] - 0.5* np.sin(abs(tilt) * np.pi / 180) \
2✔
3597
            * self.module.sceney + self.module.offsetfromaxis*np.sin(abs(tilt)*np.pi/180)
3598

3599
        try: 
2✔
3600
            if sceneDict['pitch'] >0:
2✔
3601
                pitch = sceneDict['pitch'] 
2✔
3602
            else:
3603
                raise Exception('default to gcr')
×
3604
            
3605
        except:
2✔
3606

3607
            if 'gcr' in sceneDict:
2✔
3608
                pitch = np.round(self.module.sceney/sceneDict['gcr'],3)
2✔
3609
            else:
3610
                raise Exception('No valid `pitch` or `gcr` in sceneDict')
×
3611

3612

3613

3614
        ''' INITIALIZE VARIABLES '''
2✔
3615
        text = '!xform '
2✔
3616

3617
        text += '-rx %s -t %s %s %s ' %(tilt, 0, 0, hubheight)
2✔
3618
        
3619
        # create nMods-element array along x, nRows along y. 1cm module gap.
3620
        text += '-a %s -t %s 0 0 -a %s -t 0 %s 0 ' %(nMods, self.module.scenex, nRows, pitch)
2✔
3621

3622
        # azimuth rotation of the entire shebang. Select the row to scan here based on y-translation.
3623
        # Modifying so center row is centered in the array. (i.e. 3 rows, row 2. 4 rows, row 2 too)
3624
        # Since the array is already centered on row 1, module 1, we need to increment by Nrows/2-1 and Nmods/2-1
3625

3626
        text += (f'-i 1 -t {-self.module.scenex*(round(nMods/1.999)*1.0-1)} '
2✔
3627
                 f'{-pitch*(round(nRows / 1.999)*1.0-1)} 0 -rz {180-azimuth} '
3628
                 f'-t {originx} {originy} 0 ' )
3629
        
3630
        #axis tilt only working for N-S trackers
3631
        if axis_tilt != 0 and azimuth == 90:  
2✔
3632
            print("Axis_Tilt is still under development. The scene will be "
×
3633
                  "created with the proper axis tilt, and the tracking angle"
3634
                  "will consider the axis_tilt, but the sensors for the "
3635
                  "analysis might not fall in the correct surfaces unless you"
3636
                  " manually position them for this version. Sorry! :D ")
3637
                  
3638
            text += (f'-rx {axis_tilt} -t 0 0 %s ' %(
×
3639
                self.module.scenex*(round(nMods/1.99)*1.0-1)*np.sin(
3640
                        axis_tilt * np.pi/180) ) )
3641

3642
        filename = (f'{radname}_C_{title_clearance_height:0.2f}_rtr_{pitch:0.2f}_tilt_{tilt:0.0f}_'
2✔
3643
                    f'{nMods}modsx{nRows}rows_origin{originx},{originy}.rad' )
3644
        
3645
        if self.hpc:
2✔
3646
            text += f'"{os.path.join(os.getcwd(), self.modulefile)}"'
2✔
3647
            radfile = os.path.join(os.getcwd(), 'objects', filename)
2✔
3648
        else:
3649
            text += f'"{os.path.join(self.modulefile)}"'
2✔
3650
            radfile = os.path.join('objects',filename)
2✔
3651

3652
        # py2 and 3 compatible: binary write, encode text first
3653
        with open(radfile, 'wb') as f:
2✔
3654
            f.write(text.encode('ascii'))
2✔
3655

3656
        self.gcr = self.module.sceney / pitch
2✔
3657
        self.text = text
2✔
3658
        self.radfiles = radfile
2✔
3659
        self.sceneDict = sceneDict
2✔
3660
#        self.hub_height = hubheight
3661
        return radfile
2✔
3662
    
3663
    def appendtoScene(self, radfile=None, customObject=None,  text=''):
2✔
3664
        """
3665
        Appends to the `Scene radfile` in folder `\objects` the text command in Radiance
3666
        lingo created by the user.
3667
        Useful when using addCustomObject to the scene.
3668

3669
        Parameters
3670
        ----------
3671
        radfile: str, optional
3672
            Directory and name of where .rad scene file is stored. Default: self.radfiles
3673
        customObject : str
3674
            Directory and name of custom object .rad file is stored, and any geometry
3675
            modifications needed for it.
3676
        text : str, optional 
3677
            Command to be appended to the radfile which specifies its position 
3678
            in the scene. Do not leave empty spaces at the end.
3679

3680

3681
        Returns
3682
        -------
3683
        Nothing, the radfile must already be created and assigned when running this.
3684
        
3685
        """
3686
        
3687
        # py2 and 3 compatible: binary write, encode text first
3688

3689
        if not radfile: #by default, append to the first radfile in the list
2✔
3690
            if type(self.radfiles) == list:
2✔
3691
                radfile = self.radfiles[0]
2✔
3692
            elif type(self.radfiles) == str:
2✔
3693
                radfile = self.radfiles
2✔
3694
            else:
3695
                raise Exception('SceneObj.radfiles set improperly')
×
3696

3697
        if customObject:
2✔
3698
            text2 = '\n!xform -rx 0 ' + text + ' ' + customObject
2✔
3699
            
3700
            debug = False
2✔
3701
            if debug:
2✔
3702
                print (text2)
×
3703
    
3704
            with open(radfile, 'a+') as f:
2✔
3705
                f.write(text2)
2✔
3706
    
3707
   
3708
    def showScene(self):
2✔
3709
        """ 
3710
        Method to call objview on the scene included in self
3711
            
3712
        """
3713
        cmd = 'objview %s %s' % (os.path.join('materials', 'ground.rad'),
×
3714
                                         self.radfiles)
3715
        print('Rendering scene. This may take a moment...')
×
3716
        _,err = _popen(cmd,None)
×
3717
        if err is not None:
×
3718
            print('Error: {}'.format(err))
×
3719
            print('possible solution: install radwinexe binary package from '
×
3720
                  'http://www.jaloxa.eu/resources/radiance/radwinexe.shtml'
3721
                  ' into your RADIANCE binaries path')
3722
            return
×
3723

3724
    def saveImage(self, filename=None, view=None):
2✔
3725
        """
3726
        Save an image of the scene to /images/. A default ground (concrete material) 
3727
        and sun (due East or West azimuth and 65 elevation) are created. 
3728

3729
        Parameters:    
3730
            filename : string, optional. name for image file, defaults to scene name
3731
            view     : string, optional.  name for view file in /views. default to 'side.vp'  
3732
                      Input of 'XYZ' into view will do a zoomed out view of the whole scene              
3733

3734
        """
3735
        import tempfile
2✔
3736
        
3737
        temp_dir = tempfile.TemporaryDirectory()
2✔
3738
        pid = os.getpid()
2✔
3739
        if filename is None:
2✔
3740
            filename = f'{self.name}'
×
3741
            
3742
        if view is None:
2✔
3743
            view = 'side.vp'
2✔
3744

3745
        # fake lighting temporary .radfile.  Use 65 elevation and +/- 90 azimuth
3746
        # use a concrete ground surface
3747
        if (self.sceneDict['azimuth'] > 100 and self.sceneDict['tilt'] >= 0) or \
2✔
3748
            (self.sceneDict['azimuth'] <= 100 and self.sceneDict['tilt'] < 0):
3749
            sunaz = 90
×
3750
        else:
3751
            sunaz = -90
2✔
3752
        ground = GroundObj('concrete', silent=True) 
2✔
3753
        ltfile = os.path.join(temp_dir.name, f'lt{pid}.rad')
2✔
3754
        with open(ltfile, 'w') as f:
2✔
3755
            f.write("!gensky -ang %s %s +s\n" %(65, sunaz) + \
2✔
3756
            "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n" + \
3757
            "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" + \
3758
            ground._makeGroundString() )
3759
        
3760
        # make .rif and run RAD
3761
        riffile = os.path.join(temp_dir.name, f'ov{pid}.rif')
2✔
3762
        with open(riffile, 'w') as f:
2✔
3763
                f.write("scene= materials/ground.rad " +\
2✔
3764
                        f"{self.radfiles} {ltfile}\n".replace("\\",'/') +\
3765
                    f"EXPOSURE= .5\nUP= Z\nview= {view.replace('.vp','')} -vf views/{view}\n" +\
3766
                    f"oconv= -f\nPICT= images/{filename}")
3767
        _,err = _popen(["rad",'-s',riffile], None)
2✔
3768
        if err:
2✔
3769
            print(err)
×
3770
        else:
3771
            print(f"Scene image saved: images/{filename}_{view.replace('.vp','')}.hdr")
2✔
3772
        
3773
        temp_dir.cleanup()
2✔
3774

3775
    def addPiles(self, spacingPiles=6, pile_lenx=0.2, pile_leny=0.2, pile_height=None, debug=True):
2✔
3776
        '''
3777
        Function to add support piles at determined intervals throughout the rows.
3778
        TODO: enable functionality or check for scenes using 'clearance_height' ?
3779
        TODO: enable functionality with makeScene1axis (append radfile to each trackerdict entry)
3780
        
3781
        Parameters
3782
        ----------
3783
        spacingPiles : float
3784
            Distance between support piles.
3785
        pile_lenx : float
3786
            Dimension of the pile on the row-x direction, in meters. Default is 0.2
3787
        pile_leny: float
3788
            Dimension of the pile on the row-y direction, in meters. Defualt is 0.2
3789
        pile_height : float
3790
            Dimension of the pile on the z-direction, from the ground up. If None,
3791
            value of hub_height is used. Default: None.
3792
            
3793
        Returns
3794
        -------
3795
        None
3796
        
3797
        '''
3798

3799
        nMods = self.sceneDict['nMods'] 
2✔
3800
        nRows = self.sceneDict['nRows']           
2✔
3801
        module = self.module
2✔
3802

3803
        if pile_height is None:
2✔
3804
            pile_height = self.sceneDict['hub_height']
2✔
3805
            print("pile_height!", pile_height)
2✔
3806
            
3807
        rowlength = nMods * module.scenex
2✔
3808
        nPiles = np.floor(rowlength/spacingPiles) + 1
2✔
3809
        pitch = self.sceneDict['pitch']
2✔
3810
        azimuth=self.sceneDict['azimuth']
2✔
3811
        originx = self.sceneDict['originx']
2✔
3812
        originy = self.sceneDict['originy']
2✔
3813
    
3814
        text='! genbox black post {} {} {} '.format(pile_lenx, pile_leny, pile_height)
2✔
3815
        text+='| xform -t {} {} 0 '.format(pile_lenx/2.0, pile_leny/2.0)
2✔
3816

3817
        if self.hpc:
2✔
3818
            radfilePiles = os.path.join(os.getcwd(), 'objects', 'Piles.rad')
×
3819
        else:
3820
            radfilePiles = os.path.join('objects','post.rad')
2✔
3821

3822
        # py2 and 3 compatible: binary write, encode text first
3823
        with open(radfilePiles, 'wb') as f:
2✔
3824
            f.write(text.encode('ascii'))
2✔
3825
                    
3826
        
3827
        # create nPiles -element array along x, nRows along y. 1cm module gap.
3828
        text = '!xform -rx 0 -a %s -t %s 0 0 -a %s -t 0 %s 0 ' %(nPiles, spacingPiles, nRows, pitch)
2✔
3829

3830
        # azimuth rotation of the entire shebang. Select the row to scan here based on y-translation.
3831
        # Modifying so center row is centered in the array. (i.e. 3 rows, row 2. 4 rows, row 2 too)
3832
        # Since the array is already centered on row 1, module 1, we need to increment by Nrows/2-1 and Nmods/2-1
3833

3834
        text += (f'-i 1 -t {-self.module.scenex*(round(nMods/1.999)*1.0-1)} '
2✔
3835
                 f'{-pitch*(round(nRows / 1.999)*1.0-1)} 0 -rz {180-azimuth} '
3836
                 f'-t {originx} {originy} 0 ' )
3837

3838
        filename = (f'Piles_{spacingPiles}_{pile_lenx}_{pile_leny}_{pile_height}.rad')
2✔
3839

3840
        if self.hpc:
2✔
3841
            text += f'"{os.path.join(os.getcwd(), radfilePiles)}"'
×
3842
            scenePilesRad = os.path.join(os.getcwd(), 'objects', filename) 
×
3843
        else:
3844
            text += os.path.join(radfilePiles)
2✔
3845
            scenePilesRad = os.path.join('objects',filename ) 
2✔
3846

3847
        # py2 and 3 compatible: binary write, encode text first
3848
        with open(scenePilesRad, 'wb') as f:
2✔
3849
            f.write(text.encode('ascii'))
2✔
3850

3851
        try:
2✔
3852
            self.radfiles.append(scenePilesRad)
2✔
3853
            if debug:
2✔
3854
                print( "Piles Radfile Appended")
2✔
3855
        except:
×
3856
            #TODO: Manage situation where radfile was created with
3857
            #appendRadfile to False first..
3858
            self.radfiles=[]
×
3859
            self.radfiles.append(scenePilesRad)
×
3860
            
3861
        if debug:   
2✔
3862
            print("Piles Created and Appended Successfully.")
2✔
3863

3864
        return
2✔
3865

3866
# end of SceneObj
3867

3868

3869
        
3870
class MetObj(SuperClass):
2✔
3871
    """
3872
    Meteorological data from EPW file.
3873

3874
    Initialize the MetObj from tmy data already read in. 
3875
    
3876
    Parameters
3877
    -----------
3878
    tmydata : DataFrame
3879
        TMY3 output from :py:class:`~bifacial_radiance.RadianceObj.readTMY` or 
3880
        from :py:class:`~bifacial_radiance.RadianceObj.readEPW`.
3881
    metadata : Dictionary
3882
        Metadata output from output from :py:class:`~bifacial_radiance.RadianceObj.readTMY`` 
3883
        or from :py:class:`~bifacial_radiance.RadianceObj.readEPW`.
3884
    label : str
3885
        label : str
3886
        'left', 'right', or 'center'. For data that is averaged, defines if the
3887
        timestamp refers to the left edge, the right edge, or the center of the
3888
        averaging interval, for purposes of calculating sunposition. For
3889
        example, TMY3 data is right-labeled, so 11 AM data represents data from
3890
        10 to 11, and sun position should be calculated at 10:30 AM.  Currently
3891
        SAM and PVSyst use left-labeled interval data and NSRDB uses centered.
3892
        
3893
    Once initialized, the following parameters are available in the MetObj:
3894
        -latitude, longitude, elevation, timezone, city [scalar values]
3895
        
3896
        -datetime, ghi, dhi, dni, albedo, dewpoint, pressure, temp_air, 
3897
        wind_speed, meastracker_angle [numpy.array]
3898
        
3899
        -solpos [pandas dataframe of solar position]
3900

3901
    """
3902
    @property
2✔
3903
    def tmydata(self):
2✔
3904
        keys = ['ghi', 'dhi', 'dni', 'albedo', 'dewpoint', 'pressure', 
2✔
3905
                'temp_air', 'wind_speed', 'meastracker_angle', 'tracker_theta', 
3906
                'surface_tilt', 'surface_azimuth'] 
3907
        return pd.DataFrame({key:self.__dict__.get(key, None) for key in keys },
2✔
3908
                            index = self.__dict__['datetime']).dropna(axis=1)
3909
        
3910
    @property
2✔
3911
    def metadata(self):
2✔
3912
        keys = ['latitude', 'longitude', 'elevation', 'timezone', 'city', 'label', 
2✔
3913
                'timezone']
3914
        return {key:self.__dict__.get(key, None) for key in keys} 
2✔
3915
    
3916
    def __repr__(self):
2✔
3917
        # return metadata and tmydata stats...
3918
        import io
×
3919
        buf = io.StringIO()
×
3920
        self.tmydata.info(memory_usage=False, buf=buf)
×
3921
        tmyinfo = buf.getvalue()
×
3922
        buf.close()
×
3923
        return f"<class 'bifacial_radiance.main.MetObj'>.metadata:\n"\
×
3924
            f"{self.metadata}\n<class 'bifacial_radiance.main.MetObj'>.tmydata:\n {tmyinfo}\n"
3925

3926
    def __init__(self, tmydata, metadata, label = 'right'):
2✔
3927

3928
        import pytz
2✔
3929
        import pvlib
2✔
3930
        #import numpy as np
3931
        
3932
        #First prune all GHI = 0 timepoints.  New as of 0.4.0
3933
        # TODO: is this a good idea?  This changes default behavior...
3934
        tmydata = tmydata[tmydata.GHI > 0]
2✔
3935

3936
        #  location data.  so far needed:
3937
        # latitude, longitude, elevation, timezone, city
3938
        self.latitude = metadata['latitude']; lat=self.latitude
2✔
3939
        self.longitude = metadata['longitude']; lon=self.longitude
2✔
3940
        self.elevation = metadata['altitude']; elev=self.elevation
2✔
3941
        self.timezone = metadata['TZ']
2✔
3942

3943
        try:
2✔
3944
            self.city = metadata['Name'] # readepw version
2✔
3945
        except KeyError:
2✔
3946
            self.city = metadata['city'] # pvlib version
2✔
3947
        #self.location.state_province_region = metadata['State'] # unecessary
3948
        self.datetime = tmydata.index.tolist() # this is tz-aware.
2✔
3949
        self.ghi = np.array(tmydata.GHI)
2✔
3950
        self.dhi = np.array(tmydata.DHI)
2✔
3951
        self.dni = np.array(tmydata.DNI)
2✔
3952
        self.albedo = np.array(_firstlist([tmydata.get('Alb'), tmydata.get('albedo'), 
2✔
3953
                                           tmydata.get('Albedo')]) )
3954
        if pd.isnull(self.albedo).all():   self.albedo = None
2✔
3955
        
3956
        # Try and retrieve dewpoint and pressure
3957
        try:
2✔
3958
            self.dewpoint = np.array(tmydata['temp_dew'])
2✔
3959
        except KeyError:
2✔
3960
            self.dewpoint = None
2✔
3961

3962
        try:
2✔
3963
            self.pressure = np.array(tmydata['atmospheric_pressure'])
2✔
3964
        except KeyError:
2✔
3965
            self.pressure = None
2✔
3966

3967
        try:
2✔
3968
            self.temp_air = np.array(tmydata['temp_air'])
2✔
3969
        except KeyError:
2✔
3970
            self.temp_air = None
2✔
3971

3972
        if self.temp_air is None:
2✔
3973
            try:
2✔
3974
                self.temp_air = np.array(tmydata['DryBulb'])
2✔
3975
            except KeyError:
×
3976
                self.temp_air = None
×
3977

3978
        try:
2✔
3979
            self.wind_speed = np.array(tmydata['wind_speed'])
2✔
3980
        except KeyError:
2✔
3981
            self.wind_speed = None
2✔
3982
        
3983
        if self.wind_speed is None:
2✔
3984
            try:
2✔
3985
                self.wind_speed = np.array(tmydata['Wspd'])
2✔
3986
            except KeyError:
×
3987
                self.wind_speed = None
×
3988
            
3989
        # Try and retrieve TrackerAngle
3990
        try:
2✔
3991
            self.meastracker_angle = np.array(tmydata['Tracker Angle (degrees)'])
2✔
3992
        except KeyError:
2✔
3993
            self.meastracker_angle= None
2✔
3994
            
3995
            
3996
        #v0.2.5: initialize MetObj with solpos, sunrise/set and corrected time
3997
        datetimetz = pd.DatetimeIndex(self.datetime)
2✔
3998
        try:  # make sure the data is tz-localized.
2✔
3999
            datetimetz = datetimetz.tz_localize(pytz.FixedOffset(self.timezone*60))#  use pytz.FixedOffset (in minutes)
2✔
4000
        except TypeError:  # data is tz-localized already. Just put it in local time.
2✔
4001
            datetimetz = datetimetz.tz_convert(pytz.FixedOffset(self.timezone*60))
2✔
4002
        #check for data interval. default 1h.
4003
        try:
2✔
4004
            interval = datetimetz[1]-datetimetz[0]
2✔
4005
        except IndexError:
2✔
4006
            interval = pd.Timedelta('1h') # ISSUE: if 1 datapoint is passed, are we sure it's hourly data?
2✔
4007
            print ("WARNING: TMY interval was unable to be defined, so setting it to 1h.")
2✔
4008
        # TODO:  Refactor this into a subfunction. first calculate minutedelta 
4009
        # based on label and interval (-30, 0, +30, +7.5 etc) then correct all.        
4010
        if label.lower() == 'center':
2✔
4011
            print("Calculating Sun position for center labeled data, at exact timestamp in input Weather File")
2✔
4012
            sunup= pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon) #new for pvlib >= 0.6.1
2✔
4013
            sunup['corrected_timestamp'] = datetimetz
2✔
4014
        else:
4015
            if interval== pd.Timedelta('1h'):
2✔
4016

4017
                if label.lower() == 'right':
2✔
4018
                    print("Calculating Sun position for Metdata that is right-labeled ", 
2✔
4019
                          "with a delta of -30 mins. i.e. 12 is 11:30 sunpos")
4020
                    sunup= pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon) #new for pvlib >= 0.6.1
2✔
4021
                    sunup['minutedelta']= int(interval.seconds/2/60) # default sun angle 30 minutes before timestamp
2✔
4022
                    # vector update of minutedelta at sunrise
4023
                    sunrisemask = sunup.index.hour-1==sunup['sunrise'].dt.hour
2✔
4024
                    sunup['minutedelta'] = sunup['minutedelta'].mask(sunrisemask,np.floor((60-(sunup['sunrise'].dt.minute))/2))
2✔
4025
                    # vector update of minutedelta at sunset
4026
                    sunsetmask = sunup.index.hour-1==sunup['sunset'].dt.hour
2✔
4027
                    sunup['minutedelta'] = sunup['minutedelta'].mask(sunsetmask,np.floor((60-(sunup['sunset'].dt.minute))/2))
2✔
4028
                    # save corrected timestamp
4029
                    sunup['corrected_timestamp'] = sunup.index-pd.to_timedelta(sunup['minutedelta'], unit='m')
2✔
4030
        
4031
                elif label.lower() == 'left':        
2✔
4032
                    print("Calculating Sun position for Metdata that is left-labeled ",
2✔
4033
                          "with a delta of +30 mins. i.e. 12 is 12:30 sunpos.")
4034
                    sunup= pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon) 
2✔
4035
                    sunup['minutedelta']= int(interval.seconds/2/60) # default sun angle 30 minutes after timestamp
2✔
4036
                    # vector update of minutedelta at sunrise
4037
                    sunrisemask = sunup.index.hour==sunup['sunrise'].dt.hour
2✔
4038
                    sunup['minutedelta'] = sunup['minutedelta'].mask(sunrisemask,np.ceil((60+sunup['sunrise'].dt.minute)/2))
2✔
4039
                    # vector update of minutedelta at sunset
4040
                    sunsetmask = sunup.index.hour==sunup['sunset'].dt.hour
2✔
4041
                    sunup['minutedelta'] = sunup['minutedelta'].mask(sunsetmask,np.ceil((60+sunup['sunset'].dt.minute)/2))
2✔
4042
                    # save corrected timestamp
4043
                    sunup['corrected_timestamp'] = sunup.index+pd.to_timedelta(sunup['minutedelta'], unit='m')
2✔
4044
                else: raise ValueError('Error: invalid weather label passed. Valid inputs: right, left or center')
×
4045
            else:
4046
                minutedelta = int(interval.seconds/2/60)
×
4047
                print("Interval in weather data is less than 1 hr, calculating"
×
4048
                      f" Sun position with a delta of -{minutedelta} minutes.")
4049
                print("If you want no delta for sunposition, use "
×
4050
                      "readWeatherFile( label='center').")
4051
                #datetimetz=datetimetz-pd.Timedelta(minutes = minutedelta)   # This doesn't check for Sunrise or Sunset
4052
                #sunup= pvlib.irradiance.solarposition.get_sun_rise_set_transit(datetimetz, lat, lon) # deprecated in pvlib 0.6.1
4053
                sunup= pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon) #new for pvlib >= 0.6.1
×
4054
                sunup['corrected_timestamp'] = sunup.index-pd.Timedelta(minutes = minutedelta)
×
4055
    
4056
        self.solpos = pvlib.irradiance.solarposition.get_solarposition(sunup['corrected_timestamp'],lat,lon,elev)
2✔
4057
        self.sunrisesetdata=sunup
2✔
4058
        self.label = label
2✔
4059

4060
    def _set1axis(self, azimuth=180, limit_angle=45, angledelta=None, 
2✔
4061
                  backtrack=True, gcr=1.0/3.0, cumulativesky=True, 
4062
                  fixed_tilt_angle=None, axis_tilt=0, useMeasuredTrackerAngle=False):
4063

4064
        """
4065
        Set up geometry for 1-axis tracking cumulativesky.  Solpos data
4066
        already stored in `metdata.solpos`. Pull in tracking angle details from
4067
        pvlib, create multiple 8760 metdata sub-files where datetime of met
4068
        data matches the tracking angle.
4069

4070
        Parameters
4071
        ------------
4072
        cumulativesky : bool
4073
            Whether individual csv files are created
4074
            with constant tilt angle for the cumulativesky approach.
4075
            if false, the gendaylit tracking approach must be used.
4076
        azimuth : numerical
4077
            orientation axis of tracker torque tube. Default North-South (180 deg)
4078
            For fixed tilt simulations  this is the orientation azimuth
4079
        limit_angle : numerical
4080
            +/- limit angle of the 1-axis tracker in degrees. Default 45
4081
        angledelta : numerical
4082
            Degree of rotation increment to parse irradiance bins.
4083
            Default 5 degrees (0.4 % error for DNI).
4084
            Other options: 4 (.25%), 2.5 (0.1%).
4085
            (the smaller the angledelta, the more simulations)
4086
        backtrack : bool
4087
            Whether backtracking is enabled (default = True)
4088
        gcr : float
4089
            Ground coverage ratio for calculation backtracking. Defualt [1.0/3.0] 
4090
        axis_tilt : float
4091
            Tilt of the axis. While it can be considered for the tracking calculation,
4092
            the scene geometry creation of the trackers does not support tilte
4093
            axis_trackers yet (but can be done manuallyish. See Tutorials)
4094
        fixed_tilt_angle : numeric
4095
            If passed, this changes to a fixed tilt simulation where each hour
4096
            uses fixed_tilt_angle and azimuth as the tilt and azimuth
4097

4098
        Returns
4099
        -------
4100
        trackerdict : dictionary 
4101
            Keys for tracker tilt angles and
4102
            list of csv metfile, and datetimes at that angle
4103
            trackerdict[angle]['csvfile';'surf_azm';'surf_tilt';'UTCtime']
4104
        metdata.solpos : dataframe
4105
            Dataframe with output from pvlib solar position for each timestep
4106
        metdata.sunrisesetdata :
4107
            Pandas dataframe with sunrise, sunset and adjusted time data.
4108
        metdata.tracker_theta : list
4109
            Tracker tilt angle from pvlib for each timestep
4110
        metdata.surface_tilt : list
4111
            Tracker surface tilt angle from pvlib for each timestep
4112
        metdata.surface_azimuth : list
4113
            Tracker surface azimuth angle from pvlib for each timestep
4114
        """
4115
          
4116
        #axis_tilt = 0       # only support 0 tilt trackers for now
4117
        self.cumulativesky = cumulativesky   # track whether we're using cumulativesky or gendaylit
2✔
4118

4119
        if (cumulativesky is True) & (angledelta is None):
2✔
4120
            angledelta = 5  # round angle to 5 degrees for cumulativesky
×
4121

4122
        # get 1-axis tracker angles for this location,
4123
        # round to nearest 'angledelta'
4124
        if self.meastracker_angle is not None and useMeasuredTrackerAngle is True:
2✔
4125
            print("Tracking Data: Reading from provided Tracker Angles")
2✔
4126
        elif self.meastracker_angle is None and useMeasuredTrackerAngle is True:
2✔
4127
            useMeasuredTrackerAngle = False
×
4128
            print("Warning: Using Measured Tracker Angles was specified but DATA"+
×
4129
                  " for trackers has not yet been assigned. "+
4130
                  " Assign it by making it a column on your Weatherdata File "+
4131
                  "named 'Tracker Angle (degrees)' and run ReadWeatherFile again")
4132

4133
        trackingdata = self._getTrackingAngles(azimuth,
2✔
4134
                                               limit_angle,
4135
                                               angledelta,
4136
                                               axis_tilt = axis_tilt,
4137
                                               backtrack = backtrack,
4138
                                               gcr = gcr,
4139
                                               fixed_tilt_angle=fixed_tilt_angle,
4140
                                               useMeasuredTrackerAngle=useMeasuredTrackerAngle)
4141

4142
        # get list of unique rounded tracker angles
4143
        theta_list = trackingdata.dropna()['theta_round'].unique()
2✔
4144

4145
        if cumulativesky is True:
2✔
4146
            # create a separate metfile for each unique tracker theta angle.
4147
            # return dict of filenames and details
4148
            trackerdict = self._makeTrackerCSV(theta_list,trackingdata)
2✔
4149
        else:
4150
            # trackerdict uses timestamp as keys. return azimuth
4151
            # and tilt for each timestamp
4152
            #times = [str(i)[5:-12].replace('-','_').replace(' ','_') for i in self.datetime]
4153
            times = [i.strftime('%Y-%m-%d_%H%M') for i in self.datetime]
2✔
4154
            #trackerdict = dict.fromkeys(times)
4155
            trackerdict = {}
2✔
4156
            for i,time in enumerate(times) :
2✔
4157
                # remove NaN tracker theta from trackerdict
4158
                if (self.ghi[i] > 0) & (~np.isnan(self.tracker_theta[i])):
2✔
4159
                    trackerdict[time] = {
2✔
4160
                                        'surf_azm':self.surface_azimuth[i],
4161
                                        'surf_tilt':self.surface_tilt[i],
4162
                                        'theta':self.tracker_theta[i],
4163
                                        'dni':self.dni[i],
4164
                                        'ghi':self.ghi[i],
4165
                                        'dhi':self.dhi[i],
4166
                                        'temp_air':self.temp_air[i],
4167
                                        'wind_speed':self.wind_speed[i]
4168
                                        }
4169

4170
        return trackerdict
2✔
4171

4172

4173
    def _getTrackingAngles(self, azimuth=180, limit_angle=45,
2✔
4174
                           angledelta=None, axis_tilt=0, backtrack=True,
4175
                           gcr = 1.0/3.0, fixed_tilt_angle=None,
4176
                           useMeasuredTrackerAngle=False):
4177
        '''
4178
        Helper subroutine to return 1-axis tracker tilt and azimuth data.
4179

4180
        Parameters
4181
        ----------
4182
        same as pvlib.tracking.singleaxis, plus:
4183
        angledelta : degrees
4184
            Angle to round tracker_theta to.  This is for
4185
            cumulativesky simulations. Other input options: None (no 
4186
            rounding of tracker angle) 
4187
        fixed_tilt_angle : (Optional) degrees
4188
            This changes to a fixed tilt simulation where each hour uses 
4189
            fixed_tilt_angle and azimuth as the tilt and azimuth
4190

4191
        Returns
4192
        -------
4193
        DataFrame with the following columns:
4194
            * tracker_theta: The rotation angle of the tracker.
4195
                tracker_theta = 0 is horizontal, and positive rotation angles 
4196
                are clockwise.
4197
            * aoi: The angle-of-incidence of direct irradiance onto the
4198
                rotated panel surface.
4199
            * surface_tilt: The angle between the panel surface and the earth
4200
                surface, accounting for panel rotation.
4201
            * surface_azimuth: The azimuth of the rotated panel, determined by
4202
                projecting the vector normal to the panel's surface to the 
4203
                earth's  surface.
4204
            * 'theta_round' : tracker_theta rounded to the nearest 'angledelta'
4205
            If no angledelta is specified, it is rounded to the nearest degree.
4206
        '''
4207
        import pvlib
2✔
4208
        #import warnings
4209
        from pvlib.irradiance import aoi 
2✔
4210
        #import numpy as np
4211
        #import pandas as pd
4212
        
4213
        solpos = self.solpos
2✔
4214
        
4215
        #New as of 0.3.2:  pass fixed_tilt_angle and switches to FIXED TILT mode
4216

4217
        if fixed_tilt_angle is not None:
2✔
4218
            # system with fixed tilt = fixed_tilt_angle 
4219
            surface_tilt=fixed_tilt_angle
2✔
4220
            surface_azimuth=azimuth 
2✔
4221
            # trackingdata keys: 'tracker_theta', 'aoi', 'surface_azimuth', 'surface_tilt'
4222
            trackingdata = pd.DataFrame({'tracker_theta':fixed_tilt_angle,
2✔
4223
                                         'aoi':aoi(surface_tilt, surface_azimuth,
4224
                                                   solpos['zenith'], 
4225
                                                   solpos['azimuth']),
4226
                                         'surface_azimuth':azimuth,
4227
                                         'surface_tilt':fixed_tilt_angle})
4228
        elif useMeasuredTrackerAngle:           
2✔
4229
            # tracked system
4230
            surface_tilt=self.meastracker_angle
2✔
4231
            surface_azimuth=azimuth
2✔
4232

4233
            trackingdata = pd.DataFrame({'tracker_theta':self.meastracker_angle,
2✔
4234
                                         'aoi':aoi(surface_tilt, surface_azimuth,
4235
                                                   solpos['zenith'], 
4236
                                                   solpos['azimuth']),
4237
                                         'surface_azimuth':azimuth,
4238
                                         'surface_tilt':abs(self.meastracker_angle)})
4239

4240

4241
        else:
4242
            # get 1-axis tracker tracker_theta, surface_tilt and surface_azimuth
4243
            with warnings.catch_warnings():
2✔
4244
                warnings.filterwarnings("ignore", category=RuntimeWarning)
2✔
4245
                trackingdata = pvlib.tracking.singleaxis(solpos['zenith'],
2✔
4246
                                                     solpos['azimuth'],
4247
                                                     axis_tilt,
4248
                                                     azimuth,
4249
                                                     limit_angle,
4250
                                                     backtrack,
4251
                                                     gcr)
4252
            
4253
        # save tracker tilt information to metdata.tracker_theta,
4254
        # metdata.surface_tilt and metdata.surface_azimuth
4255
        self.tracker_theta = np.round(trackingdata['tracker_theta'],2).tolist()
2✔
4256
        self.surface_tilt = np.round(trackingdata['surface_tilt'],2).tolist()
2✔
4257
        self.surface_azimuth = np.round(trackingdata['surface_azimuth'],2).tolist()
2✔
4258
        # undo the  timestamp offset put in by solpos.
4259
        #trackingdata.index = trackingdata.index + pd.Timedelta(minutes = 30)
4260
        # It may not be exactly 30 minutes any more...
4261
        trackingdata.index = self.sunrisesetdata.index  #this has the original time data in it
2✔
4262

4263
        # round tracker_theta to increments of angledelta for use in cumulativesky
4264
        def _roundArbitrary(x, base=angledelta):
2✔
4265
            # round to nearest 'base' value.
4266
            # mask NaN's to avoid rounding error message
4267
            return base * (x/float(base)).round()
2✔
4268

4269
        if angledelta == 0:
2✔
4270
            raise ZeroDivisionError('Angledelta = 0. Use None instead')
×
4271
        elif angledelta is None: # don't round theta
2✔
4272
            trackingdata['theta_round'] = trackingdata['tracker_theta']
×
4273
        else:  # round theta
4274
            trackingdata['theta_round'] = \
2✔
4275
                _roundArbitrary(trackingdata['tracker_theta'], angledelta)
4276

4277
        return trackingdata
2✔
4278

4279
    def _makeTrackerCSV(self, theta_list, trackingdata):
2✔
4280
        '''
4281
        Create multiple new irradiance csv files with data for each unique
4282
        rounded tracker angle. Return a dictionary with the new csv filenames
4283
        and other details, Used for cumulativesky tracking
4284

4285
        Parameters
4286
        -----------
4287
        theta_list : array
4288
             Array of unique tracker angle values
4289

4290
        trackingdata : Pandas 
4291
             Pandas Series with hourly tracker angles from
4292
             :pvlib.tracking.singleaxis
4293

4294
        Returns
4295
        --------
4296
        trackerdict : dictionary
4297
              keys: *theta_round tracker angle  (default: -45 to +45 in
4298
                                                 5 degree increments).
4299
              sub-array keys:
4300
                  *datetime:  array of datetime strings in this group of angles
4301
                  *count:  number of datapoints in this group of angles
4302
                  *surf_azm:  tracker surface azimuth during this group of angles
4303
                  *surf_tilt:  tilt angle average during this group of angles
4304
                  *csvfile:  name of csv met data file saved in /EPWs/
4305
        '''
4306

4307
        dt = pd.to_datetime(self.datetime)
2✔
4308

4309
        trackerdict = dict.fromkeys(theta_list)
2✔
4310

4311
        for theta in sorted(trackerdict):  
2✔
4312
            trackerdict[theta] = {}
2✔
4313
            csvfile = os.path.join('EPWs', '1axis_{}.csv'.format(theta))
2✔
4314
            tempdata = trackingdata[trackingdata['theta_round'] == theta]
2✔
4315

4316
            #Set up trackerdict output for each value of theta
4317
            trackerdict[theta]['csvfile'] = csvfile
2✔
4318
            trackerdict[theta]['surf_azm'] = tempdata['surface_azimuth'].median()
2✔
4319
            trackerdict[theta]['surf_tilt'] = abs(theta)
2✔
4320
            datetimetemp = tempdata.index.strftime('%Y-%m-%d %H:%M:%S') #local time
2✔
4321
            trackerdict[theta]['datetime'] = datetimetemp
2✔
4322
            trackerdict[theta]['count'] = datetimetemp.__len__()
2✔
4323
            #Create new temp csv file with zero values for all times not equal to datetimetemp
4324
            # write 8760 2-column csv:  GHI,DHI
4325
            dni_temp = []
2✔
4326
            ghi_temp = []
2✔
4327
            dhi_temp = []
2✔
4328
            for g, d, time in zip(self.ghi, self.dhi,
2✔
4329
                                  dt.strftime('%Y-%m-%d %H:%M:%S')):
4330

4331
                # is this time included in a particular theta_round angle?
4332
                if time in datetimetemp:
2✔
4333
                    ghi_temp.append(g)
2✔
4334
                    dhi_temp.append(d)
2✔
4335
                else:
4336
                    # mask out irradiance at this time, since it
4337
                    # belongs to a different bin
4338
                    ghi_temp.append(0.0)
2✔
4339
                    dhi_temp.append(0.0)
2✔
4340
            # save in 2-column GHI,DHI format for gencumulativesky -G
4341
            savedata = pd.DataFrame({'GHI':ghi_temp, 'DHI':dhi_temp},
2✔
4342
                                    index = self.datetime).tz_localize(None)
4343
            # Fill partial year. Requires 2021 measurement year.
4344
            savedata = _subhourlydatatoGencumskyformat(savedata, 
2✔
4345
                                                       label=self.label)
4346
            print('Saving file {}, # points: {}'.format(
2✔
4347
                  trackerdict[theta]['csvfile'], datetimetemp.__len__()))
4348

4349
            savedata.to_csv(csvfile,
2✔
4350
                            index=False,
4351
                            header=False,
4352
                            sep=' ',
4353
                            columns=['GHI','DHI'])
4354

4355

4356
        return trackerdict
2✔
4357

4358

4359
class AnalysisObj(SuperClass):
2✔
4360
    """
4361
    Analysis class for performing raytrace to obtain irradiance measurements
4362
    at the array, as well as plotting and reporting results.
4363
    """
4364
    @property
2✔
4365
    def results(self):
2✔
4366
        """
4367
        go through the AnalysisObj and return a DF of irradiance result keys.
4368
        """
4369
        try:
2✔
4370
            keylist = ['rowWanted', 'modWanted', 'sceneNum', 'name', 'x', 'y','z',
2✔
4371
                        'Wm2Front', 'Wm2Back', 'Wm2Ground', 'backRatio', 'mattype', 'rearMat' ]
4372
            resultdict = {k: v for k, v in self.__dict__.items() if k in keylist}
2✔
4373
            results = pd.DataFrame.from_dict(resultdict, orient='index').T.rename(
2✔
4374
                columns={'modWanted':'modNum', 'rowWanted':'rowNum'})
4375
            if getattr(self, 'power_data', None) is not None:
2✔
4376
                return pd.concat([results, self.power_data], axis=1)
2✔
4377
            else:    
4378
                return results
2✔
NEW
4379
        except AttributeError:
×
NEW
4380
            return None
×
4381
    
4382
    def __printval__(self, attr):
2✔
4383
        try:
2✔
4384
            t = getattr(self,attr, None)[0]
2✔
4385
        except (TypeError, KeyError):
2✔
4386
            t = None
2✔
4387
        if isinstance(t, (np.floating, float)) : 
2✔
4388
            return np.array(getattr(self,attr)).round(3).tolist()
2✔
4389
        else:
4390
            return getattr(self,attr)
2✔
4391
                       
4392
    def __repr__(self):
2✔
4393
        return str(type(self)) + ' : ' +  str({key:  self.__printval__(key) for key in self.columns if key != 'results'})  
2✔
4394
    def __init__(self, octfile=None, name=None, hpc=False):
2✔
4395
        """
4396
        Initialize AnalysisObj by pointing to the octfile.  Scan information
4397
        is defined separately by passing scene details into AnalysisObj.moduleAnalysis()
4398
        
4399
        Parameters
4400
        ------------
4401
        octfile : string
4402
            Filename and extension of .oct file
4403
        name    :
4404
        hpc     : boolean, default False. Waits for octfile for a
4405
                  longer time if parallel processing.
4406
        modWanted  : Module used for analysis
4407
        rowWanted  : Row used for analysis 
4408
        sceneNum   : Which scene number (in case of multiple scenes)
4409
        """
4410

4411
        self.octfile = octfile
2✔
4412
        self.name = name
2✔
4413
        self.hpc = hpc
2✔
4414
        self.modWanted = None
2✔
4415
        self.rowWanted = None
2✔
4416
        self.sceneNum = 0 # should this be 0 or None by default??
2✔
4417
        self.power_data = None  # results from self.calculatePerformance() stored here
2✔
4418

4419
    """
2✔
4420
    def getResults(self):   ### REPLACED BY `results` PROPERTY
4421

4422
        #TODO (optional?) Merge power_data to returned values??
4423
        keylist = ['rowWanted', 'modWanted', 'sceneNum', 'name', 'x', 'y','z',
4424
                    'Wm2Front', 'Wm2Back', 'Wm2Ground', 'backRatio', 'mattype', 'rearMat' ]
4425
        resultdict = {k: v for k, v in self.__dict__.items() if k in keylist}
4426
        return pd.DataFrame.from_dict(resultdict, orient='index').T.rename(
4427
            columns={'modWanted':'modNum', 'rowWanted':'rowNum'})
4428
    """
4429

4430
        
4431
    def makeImage(self, viewfile, octfile=None, name=None):
2✔
4432
        """
4433
        Makes a visible image (rendering) of octfile, viewfile
4434
        """
4435
        
4436
        import time
2✔
4437

4438
        if octfile is None:
2✔
4439
            octfile = self.octfile
2✔
4440
        if name is None:
2✔
4441
            name = self.name
2✔
4442

4443
        #TODO: update this for cross-platform compatibility w/ os.path.join
4444
        if self.hpc :
2✔
4445
            time_to_wait = 10
2✔
4446
            time_counter = 0
2✔
4447
            filelist = [octfile, "views/"+viewfile]
2✔
4448
            for file in filelist:
2✔
4449
                while not os.path.exists(file):
2✔
4450
                    time.sleep(1)
×
4451
                    time_counter += 1
×
4452
                    if time_counter > time_to_wait:break
×
4453

4454
        print('Generating visible render of scene')
2✔
4455
        #TODO: update this for cross-platform compatibility w os.path.join
4456
        os.system("rpict -dp 256 -ar 48 -ms 1 -ds .2 -dj .9 -dt .1 "+
2✔
4457
                  "-dc .5 -dr 1 -ss 1 -st .1 -ab 3  -aa .1 "+
4458
                  "-ad 1536 -as 392 -av 25 25 25 -lr 8 -lw 1e-4 -vf views/"
4459
                  +viewfile+ " " + octfile +
4460
                  " > images/"+name+viewfile[:-3] +".hdr")
4461

4462
    def makeFalseColor(self, viewfile, octfile=None, name=None):
2✔
4463
        """
4464
        Makes a false-color plot of octfile, viewfile
4465
        
4466
        .. note::
4467
            For Windows requires installation of falsecolor.exe,
4468
            which is part of radwinexe-5.0.a.8-win64.zip found at
4469
            http://www.jaloxa.eu/resources/radiance/radwinexe.shtml
4470
        """
4471
        #TODO: error checking for installation of falsecolor.exe 
4472
        
4473
        if octfile is None:
2✔
4474
            octfile = self.octfile
2✔
4475
        if name is None:
2✔
4476
            name = self.name
2✔
4477

4478
        print('Generating scene in WM-2. This may take some time.')
2✔
4479
        #TODO: update and test this for cross-platform compatibility using os.path.join
4480
        cmd = "rpict -i -dp 256 -ar 48 -ms 1 -ds .2 -dj .9 -dt .1 "+\
2✔
4481
              "-dc .5 -dr 1 -ss 1 -st .1 -ab 3  -aa .1 -ad 1536 -as 392 " +\
4482
              "-av 25 25 25 -lr 8 -lw 1e-4 -vf views/"+viewfile + " " + octfile
4483

4484
        WM2_out,err = _popen(cmd,None)
2✔
4485
        if err is not None:
2✔
4486
            print('Error: {}'.format(err))
×
4487
            return
×
4488

4489
        # determine the extreme maximum value to help with falsecolor autoscale
4490
        extrm_out,err = _popen("pextrem",WM2_out.encode('latin1'))
2✔
4491
        # cast the pextrem string as a float and find the max value
4492
        WM2max = max(map(float,extrm_out.split()))
2✔
4493
        print('Saving scene in false color')
2✔
4494
        #auto scale false color map
4495
        if WM2max < 1100:
2✔
4496
            cmd = "falsecolor -l W/m2 -m 1 -s 1100 -n 11"
2✔
4497
        else:
4498
            cmd = "falsecolor -l W/m2 -m 1 -s %s"%(WM2max,)
×
4499
        with open(os.path.join("images","%s%s_FC.hdr"%(name,viewfile[:-3]) ),"w") as f:
2✔
4500
            data,err = _popen(cmd,WM2_out.encode('latin1'),f)
2✔
4501
            if err is not None:
2✔
4502
                print(err)
×
4503
                print('possible solution: install radwinexe binary package from '
×
4504
                      'http://www.jaloxa.eu/resources/radiance/radwinexe.shtml')
4505

4506
    def _linePtsArray(self, linePtsDict):
2✔
4507
        """
4508
        Helper function to just print the x y and z values in an array format,
4509
        just like they will show in the .csv result files.
4510
        
4511
        """
4512
        xstart = linePtsDict['xstart']
×
4513
        ystart = linePtsDict['ystart']
×
4514
        zstart = linePtsDict['zstart']
×
4515
        xinc = linePtsDict['xinc']
×
4516
        yinc = linePtsDict['yinc']
×
4517
        zinc = linePtsDict['zinc']
×
4518
        sx_xinc = linePtsDict['sx_xinc']
×
4519
        sx_yinc = linePtsDict['sx_yinc']
×
4520
        sx_zinc = linePtsDict['sx_zinc']
×
4521
        Nx = int(linePtsDict['Nx'])
×
4522
        Ny = int(linePtsDict['Ny'])
×
4523
        Nz = int(linePtsDict['Nz'])
×
4524

4525
        x = []
×
4526
        y = []
×
4527
        z = []
×
4528

4529
        for iz in range(0,Nz):
×
4530
            for ix in range(0,Nx):
×
4531
                for iy in range(0,Ny):
×
4532
                    x . append(xstart+iy*xinc+ix*sx_xinc)
×
4533
                    y . append(ystart+iy*yinc+ix*sx_yinc)
×
4534
                    z . append(zstart+iy*zinc+ix*sx_zinc)
×
4535

4536
        return x, y, z
×
4537
        
4538
    def _linePtsMakeDict(self, linePtsDict):
2✔
4539
        a = linePtsDict
2✔
4540
        linepts = self._linePtsMake3D(a['xstart'],a['ystart'],a['zstart'],
2✔
4541
                            a['xinc'], a['yinc'], a['zinc'],
4542
                            a['sx_xinc'], a['sx_yinc'], a['sx_zinc'],
4543
                            a['Nx'],a['Ny'],a['Nz'],a['orient'])
4544
        return linepts
2✔
4545

4546
    def _linePtsMake3D(self, xstart, ystart, zstart, xinc, yinc, zinc,
2✔
4547
                       sx_xinc, sx_yinc, sx_zinc,
4548
                      Nx, Ny, Nz, orient):
4549
        #create linepts text input with variable x,y,z.
4550
        #If you don't want to iterate over a variable, inc = 0, N = 1.
4551

4552
        linepts = ""
2✔
4553
        # make sure Nx, Ny, Nz are ints.
4554
        Nx = int(Nx)
2✔
4555
        Ny = int(Ny)
2✔
4556
        Nz = int(Nz)
2✔
4557

4558

4559
        for iz in range(0,Nz):
2✔
4560
            for ix in range(0,Nx):
2✔
4561
                for iy in range(0,Ny):
2✔
4562
                    xpos = xstart+iy*xinc+ix*sx_xinc
2✔
4563
                    ypos = ystart+iy*yinc+ix*sx_yinc
2✔
4564
                    zpos = zstart+iy*zinc+ix*sx_zinc
2✔
4565
                    linepts = linepts + str(xpos) + ' ' + str(ypos) + \
2✔
4566
                          ' '+str(zpos) + ' ' + orient + " \r"
4567
        return(linepts)
2✔
4568

4569
    def _irrPlot(self, octfile, linepts, mytitle=None, plotflag=None,
2✔
4570
                   accuracy='low'):
4571
        """
4572
        (plotdict) = _irrPlot(linepts,title,time,plotflag, accuracy)
4573
        irradiance plotting using rtrace
4574
        pass in the linepts structure of the view along with a title string
4575
        for the plots.  
4576

4577
        Parameters
4578
        ------------
4579
        octfile : string
4580
            Filename and extension of .oct file
4581
        linepts : 
4582
            Output from :py:class:`bifacial_radiance.AnalysisObj._linePtsMake3D`
4583
        mytitle : string
4584
            Title to append to results files
4585
        plotflag : Boolean
4586
            Include plot of resulting irradiance
4587
        accuracy : string
4588
            Either 'low' (default - faster) or 'high'
4589
            (better for low light)
4590

4591
        Returns
4592
        -------
4593
        out : dictionary
4594
            out.x,y,z  - coordinates of point
4595
            .r,g,b     - r,g,b values in Wm-2
4596
            .Wm2            - equal-weight irradiance
4597
            .mattype        - material intersected
4598
            .title      - title passed in
4599
        """
4600
        
4601
        if mytitle is None:
2✔
4602
            #mytitle = octfile[:-4]
4603
            mytitle = f'{octfile[:-4]}_{self.name}_Row{self.rowWanted}_Module{self.modWanted}'
×
4604

4605
        if plotflag is None:
2✔
4606
            plotflag = False
×
4607

4608
        
4609
        if self.hpc :
2✔
4610
            import time
2✔
4611
            time_to_wait = 10
2✔
4612
            time_counter = 0
2✔
4613
            while not os.path.exists(octfile):
2✔
4614
                time.sleep(1)
×
4615
                time_counter += 1
×
4616
                if time_counter > time_to_wait:
×
4617
                    print('Warning: OCTFILE NOT FOUND')
×
4618
                    break
×
4619

4620
        if octfile is None:
2✔
4621
            print('Analysis aborted. octfile = None' )
×
4622
            return None
×
4623

4624
        keys = ['Wm2','x','y','z','r','g','b','mattype']
2✔
4625
        out = {key: np.empty(0) for key in keys}
2✔
4626
        #out = dict.fromkeys(['Wm2','x','y','z','r','g','b','mattype','title'])
4627
        out['title'] = mytitle
2✔
4628
        print ('Linescan in process: %s' %(mytitle))
2✔
4629
        #rtrace ambient values set for 'very accurate':
4630
        #cmd = "rtrace -i -ab 5 -aa .08 -ar 512 -ad 2048 -as 512 -h -oovs "+ octfile
4631

4632
        if accuracy == 'low':
2✔
4633
            #rtrace optimized for faster scans: (ab2, others 96 is too coarse)
4634
            cmd = "rtrace -i -ab 2 -aa .1 -ar 256 -ad 2048 -as 256 -h -oovs "+ octfile
2✔
4635
        elif accuracy == 'high':
×
4636
            #rtrace ambient values set for 'very accurate':
4637
            cmd = "rtrace -i -ab 5 -aa .08 -ar 512 -ad 2048 -as 512 -h -oovs "+ octfile
×
4638
        else:
4639
            print('_irrPlot accuracy options: "low" or "high"')
×
4640
            return({})
×
4641

4642

4643

4644
        temp_out,err = _popen(cmd,linepts.encode())
2✔
4645
        if err is not None:
2✔
UNCOV
4646
            if err[0:5] == 'error':
×
4647
                raise Exception(err[7:])
×
4648
            else:
UNCOV
4649
                print(err)
×
4650

4651
        # when file errors occur, temp_out is None, and err message is printed.
4652
        if temp_out is not None:
2✔
4653
            for line in temp_out.splitlines():
2✔
4654
                temp = line.split('\t')
2✔
4655
                out['x'] = np.append(out['x'],float(temp[0]))
2✔
4656
                out['y'] = np.append(out['y'],float(temp[1]))
2✔
4657
                out['z'] = np.append(out['z'],float(temp[2]))
2✔
4658
                out['r'] = np.append(out['r'],float(temp[3]))
2✔
4659
                out['g'] = np.append(out['g'],float(temp[4]))
2✔
4660
                out['b'] = np.append(out['b'],float(temp[5]))
2✔
4661
                out['mattype'] = np.append(out['mattype'],temp[6])
2✔
4662
                out['Wm2'] = np.append(out['Wm2'], sum([float(i) for i in temp[3:6]])/3.0)
2✔
4663

4664

4665
            if plotflag is True:
2✔
4666
                import matplotlib.pyplot as plt
×
4667
                plt.figure()
×
4668
                plt.plot(out['Wm2'])
×
4669
                plt.ylabel('Wm2 irradiance')
×
4670
                plt.xlabel('variable')
×
4671
                plt.title(mytitle)
×
4672
                plt.show()
×
4673
        else:
4674
            out = None   # return empty if error message.
×
4675

4676
        return(out)
2✔
4677

4678
    def _saveResults(self, data=None, reardata=None, savefile=None, RGB = False):
2✔
4679
        """
4680
        Function to save output from _irrPlot
4681
        If rearvals is passed in, back ratio is saved
4682
        If data = None then only reardata is saved.
4683
    
4684
        Returns
4685
        --------
4686
        savefile : str
4687
            If set to None, will write to default .csv filename in results folder.
4688
        """
4689

4690
        if savefile is None:
2✔
4691
            savefile = data['title'] + '.csv'
×
4692
        
4693
        if data is None and reardata is not None: # only rear data is passed.
2✔
4694
            data = reardata
2✔
4695
            reardata = None
2✔
4696
            # run process like normal but swap labels at the end
4697
            rearswapflag = True  
2✔
4698
        else:
4699
            rearswapflag = False
2✔
4700
            
4701
        # make savefile dataframe and set self.attributes
4702
        
4703
        if RGB:
2✔
4704
            data_sub = {key:data[key] for key in ['x', 'y', 'z', 'mattype', 'Wm2','r', 'g', 'b' ]}
×
4705
        else:
4706
            data_sub = {key:data[key] for key in ['x', 'y', 'z', 'mattype','Wm2' ]}
2✔
4707
            
4708
        df = pd.DataFrame(data_sub)
2✔
4709
        df = df.rename(columns={'Wm2':'Wm2Front'})
2✔
4710
        
4711
        if reardata is not None:
2✔
4712
            df.insert(3, 'rearZ', reardata['z'])
2✔
4713
            df.insert(5, 'rearMat', reardata['mattype'])
2✔
4714
            df.insert(7, 'Wm2Back',  reardata['Wm2'])
2✔
4715
            # add 1mW/m2 to avoid dividebyzero
4716
            df.insert(8, 'Back/FrontRatio',  df['Wm2Back'] / (df['Wm2Front']+.001))
2✔
4717
            df['backRatio'] = df['Back/FrontRatio']
2✔
4718
            df['rearX'] = reardata['x']
2✔
4719
            df['rearY'] = reardata['y']
2✔
4720
            if RGB:
2✔
4721
                df['rearR'] = reardata['r']
×
4722
                df['rearG'] = reardata['g']
×
4723
                df['rearB'] = reardata['b']
×
4724
                
4725
        # rename columns if only rear data was originally passed
4726
        if rearswapflag:
2✔
4727
            df = df.rename(columns={'Wm2Front':'Wm2Back','mattype':'rearMat'})
2✔
4728
        # set attributes of analysis to equal columns of df
4729
        for col in df.columns:
2✔
4730
            setattr(self, col, np.array(df[col])) #cdeline: changed from list to np.array on 3/16/24   
2✔
4731
        # only save a subset
4732
        df = df.drop(columns=['backRatio'], errors='ignore')
2✔
4733
        df.to_csv(os.path.join("results", savefile), sep=',',
2✔
4734
                           index=False, float_format='%0.3f')
4735

4736

4737
        print('Saved: %s'%(os.path.join("results", savefile)))
2✔
4738
        return os.path.join("results", savefile)
2✔
4739

4740
    def _saveResultsCumulative(self, data, reardata=None, savefile=None):
2✔
4741
        """
4742
        TEMPORARY FUNCTION -- this is a fix to save ONE cumulative results csv
4743
        in the main working folder for when doing multiple entries in a 
4744
        tracker dict.
4745
        
4746
        Returns
4747
        --------
4748
        savefile : str
4749
            If set to None, will write to default .csv filename in results folder.
4750
        """
4751

4752
        if savefile is None:
×
4753
            savefile = data['title'] + '.csv'
×
4754
        # make dataframe from results
4755
        data_sub = {key:data[key] for key in ['x', 'y', 'z', 'Wm2', 'mattype']}
×
4756
        self.x = data['x']
×
4757
        self.y = data['y']
×
4758
        self.z = data['z']
×
4759
        self.mattype = data['mattype']
×
4760
        #TODO: data_sub front values don't seem to be saved to self.
4761
        if reardata is not None:
×
4762
            self.rearX = reardata['x']
×
4763
            self.rearY = reardata['y']
×
4764
            self.rearMat = reardata['mattype']
×
4765
            data_sub['rearMat'] = self.rearMat
×
4766
            self.rearZ = reardata['z']
×
4767
            data_sub['rearZ'] = self.rearZ
×
4768
            self.Wm2Front = data_sub.pop('Wm2')
×
4769
            data_sub['Wm2Front'] = self.Wm2Front
×
4770
            self.Wm2Back = reardata['Wm2']
×
4771
            data_sub['Wm2Back'] = self.Wm2Back
×
4772
            self.backRatio = [x/(y+.001) for x,y in zip(reardata['Wm2'],data['Wm2'])] # add 1mW/m2 to avoid dividebyzero
×
4773
            data_sub['Back/FrontRatio'] = self.backRatio
×
4774
            df = pd.DataFrame.from_dict(data_sub)
×
4775
            df.to_csv(savefile, sep = ',',
×
4776
                      columns = ['x','y','z','rearZ','mattype','rearMat',
4777
                                 'Wm2Front','Wm2Back','Back/FrontRatio'],
4778
                                 index=False, float_format='%0.3f') # new in 0.2.3
4779

4780
        else:
4781
            df = pd.DataFrame.from_dict(data_sub)
×
4782
            df.to_csv(savefile, sep=',', float_format='%0.3f',
×
4783
                      columns=['x','y','z', 'mattype','Wm2'], index=False)
4784

4785
        print('Saved: %s'%(savefile))
×
4786
        return (savefile)   
×
4787

4788
    def moduleAnalysis(self, scene, modWanted=None, rowWanted=None,
2✔
4789
                       sensorsy=9, sensorsx=1, 
4790
                       frontsurfaceoffset=0.001, backsurfaceoffset=0.001, 
4791
                       modscanfront=None, modscanback=None, relative=False, 
4792
                       debug=False):
4793
        
4794
        """
4795
        Handler function that decides how to handle different number of front
4796
        and back sensors. If number for front sensors is not provided or is 
4797
        the same as for the back, _moduleAnalysis
4798
        is called only once. Else it is called twice to get the different front
4799
        and back dictionary. 
4800
                  
4801
        This function defines the scan points to be used in the 
4802
        :py:class:`~bifacial_radiance.AnalysisObj.analysis` function,
4803
        to perform the raytrace through Radiance function `rtrace`
4804

4805
        Parameters
4806
        ------------
4807
        scene : ``SceneObj``
4808
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
4809
        modWanted : int
4810
            Module wanted to sample. If none, defaults to center module (rounding down)
4811
        rowWanted : int
4812
            Row wanted to sample. If none, defaults to center row (rounding down)
4813
        sensorsy : int or list 
4814
            Number of 'sensors' or scanning points along the collector width 
4815
            (CW) of the module(s). If multiple values are passed, first value
4816
            represents number of front sensors, second value is number of back sensors
4817
        sensorsx : int or list 
4818
            Number of 'sensors' or scanning points along the length, the side perpendicular 
4819
            to the collector width (CW) of the module(s) for the back side of the module. 
4820
            If multiple values are passed, first value represents number of 
4821
            front sensors, second value is number of back sensors.
4822
        debug : bool
4823
            Activates various print statemetns for debugging this function.
4824
        modscanfront : dict
4825
            Dictionary to modify the fronstcan values established by this routine 
4826
            and set a specific value. Keys possible are 'xstart', 'ystart', 'zstart',
4827
            'xinc', 'yinc', 'zinc', 'Nx', 'Ny', 'Nz', and 'orient'. 
4828
            All of these keys are ints or floats except for 'orient' which 
4829
            takes x y z values as string 'x y z', for example '0 0 -1'. 
4830
            These values will overwrite the internally caculated frontscan
4831
            dictionary for the module & row selected.
4832
        modscanback: dict
4833
            Dictionary to modify the backscan values established by this routine 
4834
            and set a specific value. Keys possible are 'xstart', 'ystart', 'zstart',
4835
            'xinc', 'yinc', 'zinc', 'Nx', 'Ny', 'Nz', and 'orient'.
4836
            All of these keys are ints or floats except for 'orient' which 
4837
            takes x y z values as string 'x y z', for example '0 0 -1'. 
4838
            These values will overwrite the internally caculated frontscan
4839
            dictionary for the module & row selected.    
4840
        relative : Bool
4841
            if passing modscanfront and modscanback to modify dictionarie of positions,
4842
            this sets if the values passed to be updated are relative or absolute. 
4843
            Default is absolute value (relative=False)
4844
   
4845
        
4846
        Returns
4847
        -------
4848
        frontscan : dictionary
4849
            Scan dictionary for module's front side. Used to pass into 
4850
            :py:class:`~bifacial_radiance.AnalysisObj.analysis` function
4851
        backscan : dictionary 
4852
            Scan dictionary for module's back side. Used to pass into 
4853
            :py:class:`~bifacial_radiance.AnalysisObj.analysis` function
4854
                
4855
        """
4856

4857
        # Height:  clearance height for fixed tilt systems, or torque tube
4858
        #           height for single-axis tracked systems.
4859
        #   Single axis tracked systems will consider the offset to calculate the final height.
4860
        
4861
        def _checkSensors(sensors):
2✔
4862
            # Checking Sensors input data for list or tuple
4863
            if (type(sensors)==tuple or type(sensors)==list):
2✔
4864
                try:
2✔
4865
                    sensors_back = sensors[1]
2✔
4866
                    sensors_front = sensors[0]
2✔
4867
                except IndexError: # only 1 value passed??
×
4868
                    sensors_back = sensors_front = sensors[0]
×
4869
            elif (type(sensors)==int or type(sensors)==float):
2✔
4870
                # Ensure sensors are positive int values.
4871
                if int(sensors) < 1:
2✔
4872
                    raise Exception('input sensorsy must be numeric >0')
×
4873
                sensors_back = sensors_front = int(sensors)
2✔
4874
            else:
4875
                print('Warning: invalid value passed for sensors. Setting = 1')
2✔
4876
                sensors_back = sensors_front = 1
2✔
4877
            return sensors_front, sensors_back
2✔
4878
            
4879
        sensorsy_front, sensorsy_back = _checkSensors(sensorsy)
2✔
4880
        sensorsx_front, sensorsx_back = _checkSensors(sensorsx)
2✔
4881
        
4882
        if (sensorsx_back != sensorsx_front) or (sensorsy_back != sensorsy_front):
2✔
4883
            sensors_diff = True
2✔
4884
        else:
4885
            sensors_diff = False
2✔
4886
          
4887
        dtor = np.pi/180.0
2✔
4888

4889
        # Internal scene parameters are stored in scene.sceneDict. Load these into local variables
4890
        sceneDict = scene.sceneDict      
2✔
4891
        
4892
        azimuth = round(sceneDict['azimuth'], 2)
2✔
4893
        tilt = round(sceneDict['tilt'], 2)
2✔
4894
        nMods = sceneDict['nMods']
2✔
4895
        nRows = sceneDict['nRows']
2✔
4896
        originx = sceneDict['originx']
2✔
4897
        originy = sceneDict['originy']
2✔
4898

4899
       # offset = moduleDict['offsetfromaxis']
4900
        offset = scene.module.offsetfromaxis
2✔
4901
        sceney = scene.module.sceney
2✔
4902
        scenex = scene.module.scenex
2✔
4903

4904
        # x needed for sensorsx>1 case
4905
        x = scene.module.x
2✔
4906
        
4907
        ## Check for proper input variables in sceneDict
4908
        if 'pitch' in sceneDict:
2✔
4909
            pitch = sceneDict['pitch']
2✔
4910
        elif 'gcr' in sceneDict:
2✔
4911
            pitch = sceney / sceneDict['gcr']
2✔
4912
        else:
4913
            raise Exception("Error: no 'pitch' or 'gcr' passed in sceneDict" )
×
4914
        
4915
        if 'axis_tilt' in sceneDict:
2✔
4916
            axis_tilt = sceneDict['axis_tilt']
2✔
4917
        else:
4918
            axis_tilt = 0
×
4919

4920
        if hasattr(scene.module,'z'):
2✔
4921
            modulez = scene.module.z
2✔
4922
        else:
4923
            print ("Module's z not set on sceneDict internal dictionary. Setting to default")
×
4924
            modulez = 0.02
×
4925
            
4926
        if frontsurfaceoffset is None:
2✔
4927
            frontsurfaceoffset = 0.001
2✔
4928
        if backsurfaceoffset is None:
2✔
4929
            backsurfaceoffset = 0.001
2✔
4930
        
4931
        # The Sensor routine below needs a "hub-height", not a clearance height.
4932
        # The below complicated check checks to see if height (deprecated) is passed,
4933
        # and if clearance_height or hub_height is passed as well.
4934

4935
        sceneDict, use_clearanceheight  = _heightCasesSwitcher(sceneDict, 
2✔
4936
                                                               preferred = 'hub_height',
4937
                                                               nonpreferred = 'clearance_height',
4938
                                                               suppress_warning=True)
4939
        
4940
        if use_clearanceheight :
2✔
4941
            height = sceneDict['clearance_height'] + 0.5* \
2✔
4942
                np.sin(abs(tilt) * np.pi / 180) * \
4943
                sceney - offset*np.sin(abs(tilt)*np.pi/180)
4944
        else:
4945
            height = sceneDict['hub_height']
2✔
4946

4947

4948
        if debug:
2✔
4949
            print("For debug:\n hub_height, Azimuth, Tilt, nMods, nRows, "
×
4950
                  "Pitch, Offset, SceneY, SceneX")
4951
            print(height, azimuth, tilt, nMods, nRows,
×
4952
                  pitch, offset, sceney, scenex)
4953

4954
        if modWanted == 0:
2✔
4955
            print( " FYI Modules and Rows start at index 1. "
×
4956
                  "Reindexing to modWanted 1"  )
4957
            modWanted = modWanted+1  # otherwise it gives results on Space.
×
4958

4959
        if rowWanted ==0:
2✔
4960
            print( " FYI Modules and Rows start at index 1. "
×
4961
                  "Reindexing to rowWanted 1"  )
4962
            rowWanted = rowWanted+1
×
4963

4964
        if modWanted is None:
2✔
4965
            modWanted = round(nMods / 1.99)
2✔
4966
        if rowWanted is None:
2✔
4967
            rowWanted = round(nRows / 1.99)
2✔
4968
        self.modWanted = modWanted
2✔
4969
        self.rowWanted = rowWanted
2✔
4970
        if debug is True:
2✔
4971
            print( f"Sampling: modWanted {modWanted}, rowWanted {rowWanted} "
×
4972
                  "out of {nMods} modules, {nRows} rows" )
4973

4974
        x0 = (modWanted-1)*scenex - (scenex*(round(nMods/1.99)*1.0-1))
2✔
4975
        y0 = (rowWanted-1)*pitch - (pitch*(round(nRows / 1.99)*1.0-1))
2✔
4976

4977
        x1 = x0 * np.cos ((180-azimuth)*dtor) - y0 * np.sin((180-azimuth)*dtor)
2✔
4978
        y1 = x0 * np.sin ((180-azimuth)*dtor) + y0 * np.cos((180-azimuth)*dtor)
2✔
4979
        z1 = 0
2✔
4980

4981
        if axis_tilt != 0 and azimuth == 90:
2✔
4982
            print ("fixing height for axis_tilt")
×
4983
            z1 = (modWanted-1)*scenex * np.sin(axis_tilt*dtor)
×
4984

4985
        # Edge of Panel
4986
        x2 = (sceney/2.0) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
2✔
4987
        y2 = (sceney/2.0) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
2✔
4988
        z2 = -(sceney/2.0) * np.sin(tilt*dtor)
2✔
4989

4990

4991
        # Axis of rotation Offset (if offset is not 0) for the front of the module
4992
        x3 = (offset + modulez + frontsurfaceoffset) * np.sin(tilt*dtor) * np.sin((azimuth)*dtor)
2✔
4993
        y3 = (offset + modulez + frontsurfaceoffset) * np.sin(tilt*dtor) * np.cos((azimuth)*dtor)
2✔
4994
        z3 = (offset + modulez + frontsurfaceoffset) * np.cos(tilt*dtor)
2✔
4995

4996
        # Axis of rotation Offset, for the back of the module 
4997
        x4 = (offset - backsurfaceoffset) * np.sin(tilt*dtor) * np.sin((azimuth)*dtor)
2✔
4998
        y4 = (offset - backsurfaceoffset) * np.sin(tilt*dtor) * np.cos((azimuth)*dtor)
2✔
4999
        z4 = (offset - backsurfaceoffset) * np.cos(tilt*dtor)
2✔
5000

5001
        xstartfront = x1 + x2 + x3 + originx
2✔
5002
        xstartback = x1 + x2 + x4 + originx
2✔
5003

5004
        ystartfront = y1 + y2 + y3 + originy
2✔
5005
        ystartback = y1 + y2 + y4 + originy
2✔
5006

5007
        zstartfront = height + z1 + z2 + z3
2✔
5008
        zstartback = height + z1 + z2 + z4
2✔
5009

5010
        #Adjust orientation of scan depending on tilt & azimuth
5011
        zdir = np.cos((tilt)*dtor)
2✔
5012
        ydir = np.sin((tilt)*dtor) * np.cos((azimuth)*dtor)
2✔
5013
        xdir = np.sin((tilt)*dtor) * np.sin((azimuth)*dtor)
2✔
5014
        front_orient = '%0.3f %0.3f %0.3f' % (-xdir, -ydir, -zdir)
2✔
5015
        back_orient = '%0.3f %0.3f %0.3f' % (xdir, ydir, zdir)
2✔
5016
    
5017
        #IF cellmodule:
5018
        #TODO: Add check for sensorsx_back
5019
        
5020
        #if (getattr(scene.module, 'cellModule', None)):  #1/2 cell x and y offset to hit the center of a cell
5021
        #    xcell = scene.module.cellModule.xcell
5022
        #    ycell = scene.module.cellModule.ycell
5023
        #    xstartfront = xstartfront - xcell/2 * np.cos((azimuth)*dtor) + ycell/2 * np.sin((azimuth)*dtor) * np.cos((tilt)*dtor)
5024
        #    xstartback = xstartback  - xcell/2 * np.cos((azimuth)*dtor) + ycell/2 * np.sin((azimuth)*dtor) * np.cos((tilt)*dtor)
5025
        #    ystartfront = ystartfront - xcell/2 * np.sin((azimuth)*dtor) + ycell/2 * np.cos((azimuth)*dtor) * np.cos((tilt)*dtor)
5026
        #    ystartback = ystartback  - xcell/2 * np.sin((azimuth)*dtor) + ycell/2 * np.cos((azimuth)*dtor) * np.cos((tilt)*dtor)
5027
        #    zstartfront = zstartfront +xcell/2*np.sin((tilt)*dtor)
5028
        #    zstartback = zstartback +xcell/2*np.sin((tilt)*dtor)
5029
            
5030
        if ((getattr(scene.module, 'cellModule', None)) and
2✔
5031
            (sensorsy_back == scene.module.cellModule.numcellsy)):
5032
            ycell = scene.module.cellModule.ycell
2✔
5033
            xinc_back = -((sceney - ycell ) / (scene.module.cellModule.numcellsy-1)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
2✔
5034
            yinc_back = -((sceney - ycell) / (scene.module.cellModule.numcellsy-1)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
2✔
5035
            zinc_back = ((sceney - ycell) / (scene.module.cellModule.numcellsy-1)) * np.sin(tilt*dtor)
2✔
5036
            firstsensorxstartfront = xstartfront - ycell/2 * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
2✔
5037
            firstsensorxstartback = xstartback  - ycell/2 * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
2✔
5038
            firstsensorystartfront = ystartfront - ycell/2 * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
2✔
5039
            firstsensorystartback = ystartback - ycell/2 * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
2✔
5040
            firstsensorzstartfront = zstartfront + ycell/2 * np.sin(tilt*dtor)
2✔
5041
            firstsensorzstartback = zstartback + ycell/2  * np.sin(tilt*dtor)
2✔
5042
            xinc_front = xinc_back
2✔
5043
            yinc_front = yinc_back
2✔
5044
            zinc_front = zinc_back
2✔
5045
            
5046
            sx_xinc_front = 0.0
2✔
5047
            sx_yinc_front = 0.0
2✔
5048
            sx_zinc_front = 0.0
2✔
5049
            sx_xinc_back = 0.0
2✔
5050
            sx_yinc_back = 0.0
2✔
5051
            sx_zinc_back = 0.0
2✔
5052
        
5053
            if (sensorsx_back != 1.0):
2✔
5054
                print("Warning: Cell-level module analysis for sensorsx > 1 not "+
×
5055
                      "fine-tuned yet. Use at own risk, some of the x positions "+
5056
                      "might fall in spacing between cells.")
5057
              
5058
        else:        
5059
            xinc_back = -(sceney/(sensorsy_back + 1.0)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
2✔
5060
            yinc_back = -(sceney/(sensorsy_back + 1.0)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
2✔
5061
            zinc_back = (sceney/(sensorsy_back + 1.0)) * np.sin(tilt*dtor)
2✔
5062
            
5063
            
5064
            if sensors_diff:
2✔
5065
                xinc_front = -(sceney/(sensorsy_front + 1.0)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
2✔
5066
                yinc_front = -(sceney/(sensorsy_front + 1.0)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
2✔
5067
                zinc_front = (sceney/(sensorsy_front + 1.0)) * np.sin(tilt*dtor)
2✔
5068
                
5069
            else:
5070
                xinc_front = xinc_back
2✔
5071
                yinc_front = yinc_back
2✔
5072
                zinc_front = zinc_back
2✔
5073
                
5074
            firstsensorxstartfront = xstartfront+xinc_front
2✔
5075
            firstsensorxstartback = xstartback+xinc_back
2✔
5076
            firstsensorystartfront = ystartfront+yinc_front
2✔
5077
            firstsensorystartback = ystartback+yinc_back
2✔
5078
            firstsensorzstartfront = zstartfront + zinc_front
2✔
5079
            firstsensorzstartback = zstartback + zinc_back
2✔
5080
        
5081
            ## Correct positions for sensorsx other than 1
5082
            # TODO: At some point, this equations can include the case where 
5083
            # sensorsx = 1, and cleanup the original position calculation to place
5084
            # firstsensorxstartback before this section on edge not on center.
5085
            # will save some multiplications and division but well, it works :)
5086
            
5087
            if sensorsx_back > 1.0:
2✔
5088
                sx_xinc_back = -(x/(sensorsx_back*1.0+1)) * np.cos((azimuth)*dtor)
×
5089
                sx_yinc_back = (x/(sensorsx_back*1.0+1)) * np.sin((azimuth)*dtor)
×
5090
                # Not needed unless axis_tilt != 0, which is not a current option
5091
                sx_zinc_back = 0.0 #       
×
5092
                
5093
                firstsensorxstartback = firstsensorxstartback + (x/2.0) * np.cos((azimuth)*dtor) + sx_xinc_back
×
5094
                firstsensorystartback = firstsensorystartback - (x/2.0) * np.sin((azimuth)*dtor) + sx_yinc_back
×
5095
                # firstsensorzstartback Not needed unless axis_tilt != 0, which is not a current option
5096
                #firstsensorxstartfront = firstsensorxstartback
5097
                #firstsensorystartfront = firstsensorystartback                
5098
            else:
5099
                sx_xinc_back = 0.0
2✔
5100
                sx_yinc_back = 0.0
2✔
5101
                sx_zinc_back = 0.0
2✔
5102
            
5103
            if sensorsx_front > 1.0:
2✔
5104
                sx_xinc_front = -(x/(sensorsx_front*1.0+1)) * np.cos((azimuth)*dtor)
×
5105
                sx_yinc_front = (x/(sensorsx_front*1.0+1)) * np.sin((azimuth)*dtor)
×
5106
                # Not needed unless axis_tilt != 0, which is not a current option
5107
                sx_zinc_front = 0.0 # 
×
5108
                
5109
                firstsensorxstartfront = firstsensorxstartfront + (x/2.0) * np.cos((azimuth)*dtor) + sx_xinc_back
×
5110
                firstsensorystartfront = firstsensorystartfront - (x/2.0) * np.sin((azimuth)*dtor) + sx_yinc_back
×
5111

5112
                # firstsensorzstartback Not needed unless axis_tilt != 0, which is not a current option
5113
            else:
5114
                sx_xinc_front = 0.0
2✔
5115
                sx_yinc_front = 0.0
2✔
5116
                sx_zinc_front = 0.0
2✔
5117
                
5118
                
5119
        if debug is True:
2✔
5120
            print("Azimuth", azimuth)
×
5121
            print("Coordinate Center Point of Desired Panel before azm rotation", x0, y0)
×
5122
            print("Coordinate Center Point of Desired Panel after azm rotation", x1, y1)
×
5123
            print("Edge of Panel", x2, y2, z2)
×
5124
            print("Offset Shift", x3, y3, z3)
×
5125
            print("Final Start Coordinate Front", xstartfront, ystartfront, zstartfront)
×
5126
            print("Increase Coordinates", xinc_front, yinc_front, zinc_front)
×
5127
        
5128
        frontscan = {'xstart': firstsensorxstartfront, 'ystart': firstsensorystartfront,
2✔
5129
                     'zstart': firstsensorzstartfront,
5130
                     'xinc':xinc_front, 'yinc': yinc_front, 'zinc':zinc_front,
5131
                     'sx_xinc':sx_xinc_front, 'sx_yinc':sx_yinc_front,
5132
                     'sx_zinc':sx_zinc_front, 
5133
                     'Nx': sensorsx_front, 'Ny':sensorsy_front, 'Nz':1, 'orient':front_orient }
5134
        backscan = {'xstart':firstsensorxstartback, 'ystart': firstsensorystartback,
2✔
5135
                     'zstart': firstsensorzstartback,
5136
                     'xinc':xinc_back, 'yinc': yinc_back, 'zinc':zinc_back,
5137
                     'sx_xinc':sx_xinc_back, 'sx_yinc':sx_yinc_back,
5138
                     'sx_zinc':sx_zinc_back, 
5139
                     'Nx': sensorsx_back, 'Ny':sensorsy_back, 'Nz':1, 'orient':back_orient }
5140

5141
        if modscanfront is not None:
2✔
5142
            frontscan2 = _modDict(originaldict=frontscan, moddict=modscanfront, relative=relative)
2✔
5143
        else:
5144
            frontscan2 = frontscan.copy()
2✔
5145
        if modscanback is not None:
2✔
5146
            backscan2 = _modDict(originaldict=backscan, moddict=modscanback, relative=relative)
×
5147
        else:
5148
            backscan2 = backscan.copy()   
2✔
5149

5150
        return frontscan2, backscan2
2✔
5151
    
5152
    def groundAnalysis(self, scene, modWanted=None, rowWanted=None, 
2✔
5153
                       sensorsground=None, sensorsgroundx=1):
5154
        """
5155
        run a single ground scan along the entire row-row pitch of the scene. 
5156

5157
        Parameters
5158
        ----------
5159
        scene : ``SceneObj``
5160
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
5161
        modWanted : int
5162
            Module wanted to sample. If none, defaults to center module (rounding down)
5163
        rowWanted : int
5164
            Row wanted to sample. If none, defaults to center row (rounding down)
5165
        sensorsground : int (default None)
5166
            Number of scan points along the scene pitch.  Default every 20cm
5167
        sensorsgroundx : int (default 1)
5168
            Number of scans in the x dimension, the side perpendicular 
5169
            to the collector width (CW) of the module(s)
5170

5171
        Returns
5172
        -------
5173
        groundscan : dictionary
5174
            Scan dictionary for the ground including beneath modules. Used to pass into 
5175
            :py:class:`~bifacial_radiance.AnalysisObj.analysis` function
5176

5177
        """
5178
              
5179
        dtor = np.pi/180.0
2✔
5180

5181
        # Internal scene parameters are stored in scene.sceneDict. Load these into local variables
5182
        sceneDict = scene.sceneDict
2✔
5183

5184
        azimuth = sceneDict['azimuth']
2✔
5185
        #tilt = sceneDict['tilt']
5186
        nMods = sceneDict['nMods']
2✔
5187
        nRows = sceneDict['nRows']
2✔
5188
        originx = sceneDict['originx']
2✔
5189
        originy = sceneDict['originy']
2✔
5190

5191
        sceney = scene.module.sceney
2✔
5192
        scenex = scene.module.scenex
2✔
5193

5194
        # x needed for sensorsx>1 case
5195
        #x = scene.module.x
5196
        
5197
        ## Check for proper input variables in sceneDict
5198
        if 'pitch' in sceneDict:
2✔
5199
            pitch = sceneDict['pitch']
×
5200
        elif 'gcr' in sceneDict:
2✔
5201
            pitch = sceney / sceneDict['gcr']
2✔
5202
        else:
5203
            raise Exception("Error: no 'pitch' or 'gcr' passed in sceneDict" )
×
5204
                     
5205
        if sensorsground is None:
2✔
5206
            sensorsground = max(1,round(pitch * 5)) # scan every 20 cm
2✔
5207
        if modWanted is None:
2✔
5208
            modWanted = round(nMods / 1.99)
2✔
5209
        if rowWanted is None:
2✔
5210
            rowWanted = round(nRows / 1.99)
2✔
5211
        self.modWanted = modWanted
2✔
5212
        self.rowWanted = rowWanted
2✔
5213

5214
        
5215
        x0 = (modWanted-1)*scenex - (scenex*(round(nMods/1.99)*1.0-1))
2✔
5216
        y0 = (rowWanted-1)*pitch - (pitch*(round(nRows / 1.99)*1.0-1))
2✔
5217
        
5218
        x1 = x0 * np.cos ((180-azimuth)*dtor) - y0 * np.sin((180-azimuth)*dtor)
2✔
5219
        y1 = x0 * np.sin ((180-azimuth)*dtor) + y0 * np.cos((180-azimuth)*dtor)
2✔
5220
        
5221
        xstart = x1 + originx
2✔
5222
        ystart = y1 + originy
2✔
5223
        zstart = 0.05
2✔
5224

5225
        ground_orient = '0 0 -1'
2✔
5226

5227
        groundsensorspacing = pitch / (sensorsground - 1)
2✔
5228
        xinc = groundsensorspacing * np.sin((azimuth)*dtor)
2✔
5229
        yinc = groundsensorspacing * np.cos((azimuth)*dtor)
2✔
5230
        zinc = 0
2✔
5231
        
5232
        groundscan = {'xstart': xstart, 'ystart': ystart,
2✔
5233
                     'zstart': zstart,
5234
                     'xinc':xinc, 'yinc': yinc, 'zinc':zinc,
5235
                     'sx_xinc':0, 'sx_yinc':0,
5236
                     'sx_zinc':0,
5237
                     'Nx': sensorsgroundx, 'Ny':sensorsground, 'Nz':1,
5238
                     'orient':ground_orient }
5239

5240
        return groundscan
2✔
5241
      
5242
    def analyzeRow(self, octfile, scene, rowWanted=None, name=None, 
2✔
5243
                   sensorsy=None, sensorsx=None ):
5244
        '''
5245
        Function to Analyze every module in the row. 
5246

5247
        Parameters
5248
        ----------
5249
        octfile : string
5250
            Filename and extension of .oct file
5251
        scene : ``SceneObj``
5252
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
5253
        rowWanted : int
5254
            Row wanted to sample. If none, defaults to center row (rounding down)
5255
        sensorsy : int or list 
5256
            Number of 'sensors' or scanning points along the collector width 
5257
            (CW) of the module(s). If multiple values are passed, first value
5258
            represents number of front sensors, second value is number of back sensors
5259
        sensorsx : int or list 
5260
            Number of 'sensors' or scanning points along the length, the side perpendicular 
5261
            to the collector width (CW) of the module(s) for the back side of the module. 
5262
            If multiple values are passed, first value represents number of 
5263
            front sensors, second value is number of back sensors.
5264

5265
        Returns
5266
        -------
5267
        df_row : dataframe
5268
            Dataframe with all values sampled for the row.
5269

5270
        '''
5271
        #allfront = []
5272
        #allback = []
5273

5274
        nMods = scene.sceneDict['nMods']
2✔
5275

5276
        if rowWanted == None:
2✔
5277
            rowWanted = round(self.nRows / 1.99)
×
5278
            
5279
        if name is None:
2✔
5280
                name = 'RowAnalysis_'+str(rowWanted)
×
5281

5282
        df_dict_row = {}
2✔
5283
        row_keys = ['x','y','z','rearZ','mattype','rearMat','Wm2Front','Wm2Back','ModNumber']
2✔
5284
        dict_row = df_dict_row.fromkeys(row_keys)
2✔
5285
        df_row = pd.DataFrame(dict_row, index = [j for j in range(nMods)])
2✔
5286
        
5287
        # Starting on 1 because moduleAnalysis does not consider "0" for row or Mod wanted.
5288
        for i in range (0, nMods):
2✔
5289
            temp_dict = {}
2✔
5290
            frontscan, backscan = self.moduleAnalysis(scene, sensorsy=sensorsy, 
2✔
5291
                                        sensorsx=sensorsx, modWanted = i+1, 
5292
                                        rowWanted = rowWanted) 
5293
            allscan = self.analysis(octfile, name, frontscan, backscan) 
2✔
5294
            front_dict = allscan[0]
2✔
5295
            back_dict = allscan[1]
2✔
5296
            temp_dict['x'] = front_dict['x']
2✔
5297
            temp_dict['y'] = front_dict['y']
2✔
5298
            temp_dict['z'] = front_dict['z']
2✔
5299
            temp_dict['rearx'] = back_dict['z']
2✔
5300
            temp_dict['reary'] = back_dict['z']
2✔
5301
            temp_dict['rearZ'] = back_dict['z']
2✔
5302
            temp_dict['mattype'] = front_dict['mattype']
2✔
5303
            temp_dict['rearMat'] = back_dict['mattype']
2✔
5304
            temp_dict['Wm2Front'] = front_dict['Wm2']
2✔
5305
            temp_dict['Wm2Back'] = back_dict['Wm2']
2✔
5306
            temp_dict['ModNumber'] = i+1
2✔
5307
            df_row.iloc[i] = temp_dict
2✔
5308
        
5309
        # check for path in the new Radiance directory:
5310
        rowpath = os.path.join("results", "CompiledResults")
2✔
5311

5312
        def _checkPath(rowpath):  # create the file structure if it doesn't exist
2✔
5313
            if not os.path.exists(rowpath):
2✔
5314
                os.makedirs(rowpath)
2✔
5315
                print('Making path for compiled results: '+rowpath)
2✔
5316
        
5317
        _checkPath(rowpath)
2✔
5318
        
5319
        savefile = 'compiledRow_{}.csv'.format(rowWanted)
2✔
5320

5321
        df_row.to_csv(os.path.join(rowpath, savefile), sep = ',',
2✔
5322
                           index=False, float_format='%0.3f')
5323

5324

5325
        return df_row
2✔
5326

5327
    def analyzeField(self, octfile, scene, name=None, 
2✔
5328
                   sensorsy=None, sensorsx=None ):
5329
        '''
5330
        Function to Analyze every module in a scene
5331

5332
        Parameters
5333
        ----------
5334
        octfile : string
5335
            Filename and extension of .oct file
5336
        scene : ``SceneObj``
5337
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
5338
        rowWanted : int
5339
            Row wanted to sample. If none, defaults to center row (rounding down)
5340
        sensorsy : int or list 
5341
            Number of 'sensors' or scanning points along the collector width 
5342
            (CW) of the module(s). If multiple values are passed, first value
5343
            represents number of front sensors, second value is number of back sensors
5344
        sensorsx : int or list 
5345
            Number of 'sensors' or scanning points along the length, the side perpendicular 
5346
            to the collector width (CW) of the module(s) for the back side of the module. 
5347
            If multiple values are passed, first value represents number of 
5348
            front sensors, second value is number of back sensors.
5349

5350
        Returns
5351
        -------
5352
        df_row : dataframe
5353
            Dataframe with all values sampled for the row.
5354

5355
        '''
5356
        #allfront = []
5357
        #allback = []
5358

5359
        nRows = scene.sceneDict['nRows']
×
5360
            
5361
        if name is None:
×
5362
                name = 'FieldAnalysis'
×
5363

5364
        frames = []
×
5365

5366

5367
        for ii in range(1, nRows+1):
×
5368
            dfrow = self.analyzeRow(octfile=octfile, scene=scene, rowWanted=ii, name=name+'_Row_'+str(ii), 
×
5369
                               sensorsy=sensorsy, sensorsx=sensorsx)
5370
            dfrow['Row'] = ii
×
5371
            frames.append(dfrow)
×
5372
        
5373
        result = pd.concat(frames)
×
5374
       
5375
        # check for path in the new Radiance directory:
5376
        fieldpath = os.path.join("results", "CompiledResults")       
×
5377
        savefile = 'compiledField_{}.csv'.format(name)
×
5378

5379
        result.to_csv(os.path.join(fieldpath, savefile), sep = ',',
×
5380
                           index=False, float_format='%0.3f')
5381

5382

5383
        return result
×
5384
    
5385
    def analysis(self, octfile, name, frontscan, backscan=None,
2✔
5386
                 plotflag=False, accuracy='low', RGB=False):
5387
        """
5388
        General analysis function, where linepts are passed in for calling the
5389
        raytrace routine :py:class:`~bifacial_radiance.AnalysisObj._irrPlot` 
5390
        and saved into results with 
5391
        :py:class:`~bifacial_radiance.AnalysisObj._saveResults`.
5392

5393
        
5394
        Parameters
5395
        ------------
5396
        octfile : string
5397
            Filename and extension of .oct file
5398
        name : string 
5399
            Name to append to output files
5400
        frontscan : scene.frontscan object
5401
            Object with the sensor location information for the 
5402
            front of the module
5403
        backscan : scene.backscan object. (optional)
5404
            Object with the sensor location information for the 
5405
            rear side of the module.
5406
        plotflag : boolean
5407
            Include plot of resulting irradiance
5408
        accuracy : string 
5409
            Either 'low' (default - faster) or 'high' (better for low light)
5410
        RGB : Bool
5411
            If the raytrace is a spectral raytrace and information for the three channe
5412
            wants to be saved, set RGB to True.
5413

5414
            
5415
        Returns
5416
        -------
5417
         File saved in `\\results\\irr_name.csv`
5418

5419
        """
5420

5421
        if octfile is None:
2✔
5422
            print('Analysis aborted - no octfile \n')
×
5423
            return None, None
×
5424
        linepts = self._linePtsMakeDict(frontscan)
2✔
5425
        if self.rowWanted:
2✔
5426
            name = name + f'_Row{self.rowWanted}'
2✔
5427
        if self.modWanted:
2✔
5428
            name = name + f'_Module{self.modWanted}'
2✔
5429
        frontDict = self._irrPlot(octfile, linepts, name+'_Front',
2✔
5430
                                    plotflag=plotflag, accuracy=accuracy)
5431

5432
        if backscan is None:  #only one scan
2✔
5433
            if frontDict is not None:
2✔
5434
                self.Wm2Front = np.mean(frontDict['Wm2'])
2✔
5435
                self._saveResults(frontDict, reardata=None, savefile='irr_%s.csv'%(name), RGB=RGB)
2✔
5436
            return frontDict
2✔
5437
        #bottom view.
5438
        linepts = self._linePtsMakeDict(backscan)
2✔
5439
        backDict = self._irrPlot(octfile, linepts, name+'_Back',
2✔
5440
                                   plotflag=plotflag, accuracy=accuracy)
5441

5442
        # don't save if _irrPlot returns an empty file.
5443
        if frontDict is not None:
2✔
5444
            if len(frontDict['Wm2']) != len(backDict['Wm2']):
2✔
5445
                self.Wm2Front = np.mean(frontDict['Wm2'])
2✔
5446
                self.Wm2Back = np.mean(backDict['Wm2'])
2✔
5447
                self.backRatio = self.Wm2Back / (self.Wm2Front + .001)
2✔
5448
                self._saveResults(frontDict, reardata=None, savefile='irr_%s.csv'%(name+'_Front'), RGB=RGB)
2✔
5449
                self._saveResults(data=None, reardata=backDict, savefile='irr_%s.csv'%(name+'_Back'), RGB=RGB)
2✔
5450
            else:
5451
                self._saveResults(frontDict, backDict,'irr_%s.csv'%(name), RGB=RGB)
2✔
5452

5453
        return frontDict, backDict
2✔
5454

5455

5456
    def calculatePerformance(self, meteo_data, cumulativesky, module,
2✔
5457
                         CECMod2=None, agriPV=False):
5458
        """
5459
        For a given AnalysisObj, use performance.calculatePerformance to calculate performance, 
5460
        considering electrical mismatch, using PVLib. Cell temperature is calculated 
5461
    
5462
        Parameters
5463
         ----------
5464
        meteo_data : Dict
5465
            Dictionary with meteorological data needed to run CEC model.  Keys:
5466
            'temp_air', 'wind_speed', 'dni', 'dhi', 'ghi'
5467
        module: ModuleObj from scene.module
5468
            Requires CEC Module parameters to be set. If None, default to Prism Solar.
5469
        CECMod2 : Dict
5470
            Dictionary with CEC Module Parameters for a Monofacial module. If None,
5471
            same module as CECMod is used for the BGE calculations, but just 
5472
            using the front irradiance (Gfront). 
5473
    
5474
        Returns
5475
        -------
5476
        performance : dictionary with performance results for that simulation.
5477
            Keys:
5478
            'POA_eff': mean of [(mean of clean Gfront) + clean Grear * bifaciality factor]
5479
            'Gfront_mean': mean of clean Gfront
5480
            'Grear_mean': mean of clean Grear
5481
            'Mismatch': mismatch calculated from the MAD distribution of POA_total
5482
            'Pout_raw': power output calculated from POA_total, considers wind speed and temp_amb if in trackerdict.
5483
            'Pout': power output considering electrical mismatch
5484
            
5485
        """  
5486

5487
        from bifacial_radiance import performance
2✔
5488
        
5489
        #TODO: make this operate on the MetObj class, not special dictionary!
5490
        #TODO: Check that meteo_data only includes correct kwargs
5491
        # 'dni', 'ghi', 'dhi', 'temp_air', 'wind_speed'
5492
        
5493
        if cumulativesky is False:
2✔
5494
            
5495
            # If CECMod details aren't passed, use a default Prism Solar value.
5496
            #if type(module) is not ModuleObj:  # not working for some reason..
5497
            if str(type(module)) != "<class 'bifacial_radiance.module.ModuleObj'>":
2✔
NEW
5498
                raise TypeError('ModuleObj input required for AnalysisObj.calculatePerformance. '+\
×
5499
                                f'type passed: {type(module)}')           
5500
    
5501
            self.power_data = performance.calculatePerformance(module=module, results=self.results,
2✔
5502
                                               CECMod2=CECMod2, agriPV=agriPV,
5503
                                               **meteo_data)
5504

5505
        else:
5506
            # TODO HERE: SUM all keys for rows that have the same rowWanted/modWanted
5507
    
NEW
5508
            self.power_data = performance.calculatePerformanceGencumsky(results=self.results,
×
5509
                                                                 agriPV=agriPV)
5510
            #results.to_csv(os.path.join('results', 'Cumulative_Results.csv'))
5511
    
5512
        #compiledResults = results         
5513
        #trackerdict = trackerdict
5514

5515
def quickExample(testfolder=None):
2✔
5516
    """
5517
    Example of how to run a Radiance routine for a simple rooftop bifacial system
5518

5519
    """
5520

5521
    import bifacial_radiance
2✔
5522
    
5523
    if testfolder == None:
2✔
5524
        testfolder = bifacial_radiance.main._interactive_directory(
×
5525
            title = 'Select or create an empty directory for the Radiance tree')
5526

5527
    demo = bifacial_radiance.RadianceObj('simple_panel', path=testfolder)  # Create a RadianceObj 'object'
2✔
5528

5529
    # input albedo number or material name like 'concrete'.
5530
    # To see options, run setGround without any input.
5531
    demo.setGround(0.62)
2✔
5532
    try:
2✔
5533
        epwfile = demo.getEPW(lat=40.01667, lon=-105.25) # pull TMY data for any global lat/lon
2✔
5534
    except ConnectionError: # no connection to automatically pull data
×
5535
        pass
×
5536

5537
    metdata = demo.readWeatherFile(epwfile, coerce_year=2001) # read in the EPW weather data from above
2✔
5538
    #metdata = demo.readTMY() # select a TMY file using graphical picker
5539
    # Now we either choose a single time point, or use cumulativesky for the entire year.
5540
    cumulativeSky = False
2✔
5541
    if cumulativeSky:
2✔
5542
        demo.genCumSky() # entire year.
×
5543
    else:
5544
        timeindex = metdata.datetime.index(pd.to_datetime('2001-06-17 12:0:0 -7'))
2✔
5545
        demo.gendaylit(metdata=metdata, timeindex=timeindex)  # Noon, June 17th
2✔
5546

5547

5548
    # create a scene using panels in landscape at 10 deg tilt, 1.5m pitch. 0.2 m ground clearance
5549
    moduletype = 'test-module'
2✔
5550
    module = demo.makeModule(name=moduletype, x=1.59, y=0.95)
2✔
5551
    sceneDict = {'tilt':10,'pitch':1.5,'clearance_height':0.2,
2✔
5552
                 'azimuth':180, 'nMods': 10, 'nRows': 3}
5553
    #makeScene creates a .rad file with 10 modules per row, 3 rows.
5554
    scene = demo.makeScene(module=module, sceneDict=sceneDict)
2✔
5555
    # makeOct combines all of the ground, sky and object files into .oct file.
5556
    octfile = demo.makeOct(demo.getfilelist())
2✔
5557

5558
    # return an analysis object including the scan dimensions for back irradiance
5559
    analysis = bifacial_radiance.AnalysisObj(octfile, demo.name)
2✔
5560
    frontscan, backscan = analysis.moduleAnalysis(scene, sensorsy=9)
2✔
5561
    analysis.analysis(octfile, demo.name, frontscan, backscan, accuracy='low')
2✔
5562
    # bifacial ratio should be 11.6% +/- 0.1% (+/- 1% absolute with glass-glass module)
5563
    print('Annual bifacial ratio average:  %0.3f' %(
2✔
5564
            sum(analysis.Wm2Back) / sum(analysis.Wm2Front) ) )
5565

5566
    return analysis
2✔
5567

5568

STATUS · Troubleshooting · Open an Issue · Sales · Support · CAREERS · ENTERPRISE · START FREE · SCHEDULE DEMO
ANNOUNCEMENTS · TWITTER · TOS & SLA · Supported CI Services · What's a CI service? · Automated Testing

© 2025 Coveralls, Inc