
NREL / bifacial_radiance, build 9194573527 (github push)
22 May 2024 03:56PM UTC. Coverage: 70.326% (+0.06%) from 70.268%
cdeline: Fix pytests
3728 of 5301 relevant lines covered (70.33%); 1.41 hits per line
Source file: /bifacial_radiance/main.py (76.32% file coverage)
#!/usr/bin/env python

"""
@author: cdeline

bifacial_radiance.py - module to develop radiance bifacial scenes, including gendaylit and gencumulativesky
7/5/2016 - test script based on G173_journal_height
5/1/2017 - standalone module

Pre-requisites:
    This software is written for Python >3.6 leveraging many Anaconda tools (e.g. pandas, numpy, etc)

    *RADIANCE software should be installed from https://github.com/NREL/Radiance/releases

    *If you want to use gencumulativesky, move 'gencumulativesky.exe' from
    'bifacial_radiance\data' into your RADIANCE source directory.

    *If using a Windows machine you should download the Jaloxa executables at
    http://www.jaloxa.eu/resources/radiance/radwinexe.shtml#Download

    * Installation of bifacial_radiance from the repo:
    1. Clone the repo
    2. Navigate to the directory using the command prompt
    3. run `pip install -e .`

Overview:
    Bifacial_radiance includes several helper functions to make it easier to evaluate
    different PV system orientations for rear bifacial irradiance.
    Note that this is simply an optical model - identifying available rear irradiance under different conditions.

    For a detailed demonstration example, look at the .ipynb notebook in \docs\

    There are two solar resource modes in bifacial_radiance: `gendaylit` uses hour-by-hour solar
    resource descriptions using the Perez diffuse tilted plane model.
    `gencumulativesky` is an annual average solar resource that combines hourly
    Perez skies into one single solar source, and computes an annual average.

    bifacial_radiance includes five object-oriented classes:

    RadianceObj:  top level class to work on radiance objects, keep track of filenames,
    sky values, PV module type etc.

    GroundObj:    details for the ground surface and reflectance

    SceneObj:     scene information including array configuration (row spacing, clearance or hub height)

    MetObj:       meteorological data from EPW (EnergyPlus) file.
        Future work: include other file support including TMY files

    AnalysisObj:  analysis class for plotting and reporting

"""
import logging
logging.basicConfig()
LOGGER = logging.getLogger(__name__)
LOGGER.setLevel(logging.DEBUG)

import os, datetime
from subprocess import Popen, PIPE  # replacement for os.system()
import pandas as pd
import numpy as np
import warnings


global DATA_PATH  # path to data files including module.json.  Global context
DATA_PATH = os.path.abspath(os.path.join(os.path.dirname(__file__), 'data'))

def _findme(lst, a):  # find string match in a list. script from stackexchange
    return [i for i, x in enumerate(lst) if x == a]

def _firstlist(l):  # find first not-None value in a list. useful for checking multiple keys in dict
    try:
        return next(item for item in l if item is not None)
    except StopIteration:
        return None

def _missingKeyWarning(dictype, missingkey, newvalue):  # prints warnings
    if type(newvalue) is bool:
        valueunit = ''
    else:
        valueunit = 'm'
    print("Warning: {} Dictionary Parameters passed, but {} is missing. ".format(dictype, missingkey))
    print("Setting it to default value of {} {} to continue\n".format(newvalue, valueunit))

def _normRGB(r, g, b):  # normalize by each color for human vision sensitivity
    return r*0.2126 + g*0.7152 + b*0.0722  # Rec. 709 luma coefficients

def _popen(cmd, data_in, data_out=PIPE):
    """
    Helper function subprocess.popen replaces os.system
    - gives better input/output process control
    usage: pass <data_in> to process <cmd> and return results
    based on rgbeimage.py (Thomas Bleicher 2010)
    """
    if type(cmd) == str:
        cmd = str(cmd)  # gets rid of unicode oddities
        shell = True
    else:
        shell = False

    p = Popen(cmd, bufsize=-1, stdin=PIPE, stdout=data_out, stderr=PIPE, shell=shell)  # shell=True required for Linux? quick fix, but may be security concern
    data, err = p.communicate(data_in)

    if err:
        if data:
            returntuple = (data.decode('latin1'), 'message: ' + err.decode('latin1').strip())
        else:
            returntuple = (None, 'message: ' + err.decode('latin1').strip())
    else:
        if data:
            returntuple = (data.decode('latin1'), None)  # Py3 requires decoding
        else:
            returntuple = (None, None)

    return returntuple

def _interactive_load(title=None):
    # Tkinter file picker
    import tkinter
    from tkinter import filedialog
    root = tkinter.Tk()
    root.withdraw()  # Start interactive file input
    root.attributes("-topmost", True)  # Bring window into foreground
    return filedialog.askopenfilename(parent=root, title=title)  # initialdir = data_dir

def _interactive_directory(title=None):
    # Tkinter directory picker.  Now Py3.6 compliant!
    import tkinter
    from tkinter import filedialog
    root = tkinter.Tk()
    root.withdraw()  # Start interactive file input
    root.attributes("-topmost", True)  # Bring to front
    return filedialog.askdirectory(parent=root, title=title)

def _modDict(originaldict, moddict, relative=False):
    '''
    Compares keys in originaldict with moddict and updates values of
    originaldict to moddict if existing.

    Parameters
    ----------
    originaldict : dictionary
        Original dictionary calculated, for example frontscan or backscan dictionaries.
    moddict : dictionary
        Modified dictionary, for example modscan['xstart'] = 0 to change position of x.
    relative : Bool
        If passing modscanfront and modscanback to modify dictionaries of positions,
        this sets whether the values passed to be updated are relative or absolute.
        Default is absolute value (relative=False)

    Returns
    -------
    originaldict : dictionary
        Updated original dictionary with values from moddict.
    '''
    newdict = originaldict.copy()

    for key in moddict:
        try:
            if relative:
                newdict[key] = moddict[key] + newdict[key]
            else:
                newdict[key] = moddict[key]
        except KeyError:
            print("Wrong key in modified dictionary")

    return newdict

def _heightCasesSwitcher(sceneDict, preferred='hub_height', nonpreferred='clearance_height'):
        """
        Parameters
        ----------
        sceneDict : dictionary
            Dictionary that might contain more than one way of defining height for
            the array: `clearance_height`, `hub_height`, `height`*
            * height is deprecated from sceneDict. This function helps choose
            which definition to use.
        preferred : str, optional
            When sceneDict has hub_height and clearance_height, or it only has height,
            it will leave only the preferred option. The default is 'hub_height'.
        nonpreferred : str, optional
            When sceneDict has hub_height and clearance_height,
            it will delete this nonpreferred option. The default is 'clearance_height'.

        Returns
        -------
        sceneDict : dictionary
            Dictionary now containing the appropriate definition for system height.
        use_clearanceheight : Bool
            Helper variable to specify if dictionary has only clearance_height for
            use inside `makeScene1axis`. Will get deprecated once that internal
            function is streamlined.

        """
        # TODO: When we update to Python 3.10, this could use match/case (Structural Pattern Matching):

        heightCases = '_'
        if 'height' in sceneDict:
            heightCases = heightCases + 'height__'
        if 'clearance_height' in sceneDict:
            heightCases = heightCases + 'clearance_height__'
        if 'hub_height' in sceneDict:
            heightCases = heightCases + 'hub_height__'

        use_clearanceheight = False
        # CASES:
        if heightCases == '_height__':
            print("sceneDict Warning: 'height' is being deprecated. " +
                  "Renaming as " + preferred)
            sceneDict[preferred] = sceneDict['height']
            del sceneDict['height']

        elif heightCases == '_clearance_height__':
            #print("Using clearance_height.")
            use_clearanceheight = True

        elif heightCases == '_hub_height__':
            #print("Using hub_height.")
            pass

        elif heightCases == '_height__clearance_height__':
            print("sceneDict Warning: 'clearance_height' and 'height' " +
                  "(deprecated) are being passed. Removing 'height' " +
                  "from sceneDict for this tracking routine")
            del sceneDict['height']
            use_clearanceheight = True

        elif heightCases == '_height__hub_height__':
            print("sceneDict Warning: 'height' is being deprecated. Using 'hub_height'")
            del sceneDict['height']

        elif heightCases == '_height__clearance_height__hub_height__':
            print("sceneDict Warning: 'hub_height', 'clearance_height'" +
                  ", and 'height' are being passed. Removing 'height'" +
                  " (deprecated) and " + nonpreferred + ", using " + preferred)
            del sceneDict['height']
            del sceneDict[nonpreferred]

        elif heightCases == '_clearance_height__hub_height__':
            print("sceneDict Warning: 'hub_height' and 'clearance_height'" +
                  " are being passed. Using " + preferred +
                  " and removing " + nonpreferred)
            del sceneDict[nonpreferred]

        else:
            print("sceneDict Error! no argument in sceneDict found " +
                  "for 'hub_height', 'height' nor 'clearance_height'. " +
                  "Exiting routine.")

        return sceneDict, use_clearanceheight
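
A minimal sketch of the case-string dispatch used in `_heightCasesSwitcher` above: concatenating the keys present into one string gives a single value to branch on, a stand-in for structural pattern matching (`height_case` is an illustrative name, not a function in this module):

```python
# Illustrative sketch of the heightCases string construction above.
def height_case(sceneDict):
    # build one branchable string from whichever height keys are present,
    # checked in the same order as _heightCasesSwitcher
    cases = '_'
    for key in ('height', 'clearance_height', 'hub_height'):
        if key in sceneDict:
            cases += key + '__'
    return cases

print(height_case({'hub_height': 2.0}))                     # → _hub_height__
print(height_case({'height': 3, 'clearance_height': 0.8}))  # → _height__clearance_height__
```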

def _is_leap_and_29Feb(s):  # boolean mask: True for Feb. 29 entries in leap years
    return (s.index.year % 4 == 0) & \
           ((s.index.year % 100 != 0) | (s.index.year % 400 == 0)) & \
           (s.index.month == 2) & (s.index.day == 29)

def _subhourlydatatoGencumskyformat(gencumskydata, label='right'):
    # Subroutine to resample, pad, remove leap year and get data in the
    # 8760 hourly format
    # for saving the temporary files for gencumsky in _saveTempTMY and
    # _makeTrackerCSV

    # Resample to hourly. Gencumsky wants right-labeled data.
    try:
        gencumskydata = gencumskydata.resample('60min', closed='right', label='right').mean()
    except TypeError:  # Pandas 2.0 error
        gencumskydata = gencumskydata.resample('60min', closed='right', label='right').mean(numeric_only=True)

    if label == 'left':  # switch from left to right labeled by adding an hour
        gencumskydata.index = gencumskydata.index + pd.to_timedelta('1H')

    # Padding
    tzinfo = gencumskydata.index.tzinfo
    padstart = pd.to_datetime('%s-%s-%s %s:%s' % (gencumskydata.index.year[0], 1, 1, 1, 0)).tz_localize(tzinfo)
    padend = pd.to_datetime('%s-%s-%s %s:%s' % (gencumskydata.index.year[0]+1, 1, 1, 0, 0)).tz_localize(tzinfo)
    gencumskydata.iloc[0] = 0  # set first datapt to zero to forward fill w zeros
    gencumskydata.iloc[-1] = 0  # set last datapt to zero to forward fill w zeros
    # check if index exists. I'm sure there is a way to do this backwards.
    if any(gencumskydata.index.isin([padstart])):
        print("Data starts on Jan. 01")
    else:
        gencumskydata = pd.concat([gencumskydata, pd.DataFrame(index=[padstart])])
    if any(gencumskydata.index.isin([padend])):
        print("Data ends on Dec. 31st")
    else:
        gencumskydata = pd.concat([gencumskydata, pd.DataFrame(index=[padend])])
    gencumskydata.loc[padstart] = 0
    gencumskydata.loc[padend] = 0
    gencumskydata = gencumskydata.sort_index()
    # Fill empty timestamps with zeros
    gencumskydata = gencumskydata.resample('60min').asfreq().fillna(0)
    # Mask leap year
    leapmask = ~(_is_leap_and_29Feb(gencumskydata))
    gencumskydata = gencumskydata[leapmask]

    if (gencumskydata.index.year[-1] == gencumskydata.index.year[-2]+1) and len(gencumskydata) > 8760:
        gencumskydata = gencumskydata[:-1]
    return gencumskydata
    # end _subhourlydatatoGencumskyformat

class RadianceObj:
    """
    The RadianceObj top level class is used to work on radiance objects,
    keep track of filenames, sky values, PV module configuration, etc.

    Parameters
    ----------
    name : text to append to output files
    filelist : list of Radiance files to create oconv
    nowstr : current date/time string
    path : working directory with Radiance materials and objects

    Methods
    -------
    __init__ : initialize the object
    _setPath : change the working directory

    """
    def __repr__(self):
        return str(self.__dict__)

    def __init__(self, name=None, path=None, hpc=False):
        '''
        Initialize RadianceObj with path of Radiance materials and objects,
        as well as a basename to append to.

        Parameters
        ----------
        name: string, append temporary and output files with this value
        path: location of Radiance materials and objects
        hpc:  Keeps track of whether the user is running the simulation on HPC,
              so some file reading routines try reading a bit longer and some
              writing routines (makeModule) that overwrite themselves are inactivated.

        Returns
        -------
        none
        '''

        self.metdata = {}        # data from epw met file
        self.data = {}           # data stored at each timestep
        self.path = ""           # path of working directory
        self.name = ""           # basename to append
        #self.filelist = []      # list of files to include in the oconv
        self.materialfiles = []  # material files for oconv
        self.skyfiles = []       # skyfiles for oconv
        #self.radfiles = []      # scene rad files for oconv, compiled from self.scenes
        self.scenes = []         # array of scenefiles to be compiled
        self.octfile = []        # octfile name for analysis
        self.Wm2Front = 0        # cumulative tabulation of front W/m2
        self.Wm2Back = 0         # cumulative tabulation of rear W/m2
        self.backRatio = 0       # ratio of rear / front Wm2
        #self.nMods = None       # number of modules per row
        #self.nRows = None       # number of rows per scene
        self.hpc = hpc           # HPC simulation is being run. Some read/write functions are modified
        self.CompiledResults = None  # DataFrame of cumulative results, output from self.calculateResults()

        now = datetime.datetime.now()
        self.nowstr = str(now.date()) + '_' + str(now.hour) + str(now.minute) + str(now.second)

        # DEFAULTS

        if name is None:
            self.name = self.nowstr  # set default filename for output files
        else:
            self.name = name
        self.basename = name  # add backwards compatibility for prior versions
        #self.__name__ = self.name  # optional info
        #self.__str__ = self.__name__  # optional info
        if path is None:
            self._setPath(os.getcwd())
        else:
            self._setPath(path)
        # load files in the /materials/ directory
        self.materialfiles = self.returnMaterialFiles('materials')

        # store list of columns and methods for convenience / introspection
        # TODO: abstract this by making a super class that this inherits
        self.columns = [attr for attr in dir(self) if not (attr.startswith('_') or callable(getattr(self, attr)))]
        self.methods = [attr for attr in dir(self) if (not attr.startswith('_') and callable(getattr(self, attr)))]

    def _setPath(self, path):
        """
        setPath - move path and working directory

        """
        self.path = os.path.abspath(path)

        print('path = ' + path)
        try:
            os.chdir(self.path)
        except OSError as exc:
            LOGGER.error("Path doesn't exist: %s" % (path))
            LOGGER.exception(exc)
            raise(exc)

        # check for path in the new Radiance directory:
        def _checkPath(path):  # create the file structure if it doesn't exist
            if not os.path.exists(path):
                os.makedirs(path)
                print('Making path: ' + path)

        _checkPath('images'); _checkPath('objects')
        _checkPath('results'); _checkPath('skies'); _checkPath('EPWs')
        # if materials directory doesn't exist, populate it with ground.rad
        # figure out where pip installed support files.
        from shutil import copy2

        if not os.path.exists('materials'):  # copy ground.rad to /materials
            os.makedirs('materials')
            print('Making path: materials')

            copy2(os.path.join(DATA_PATH, 'ground.rad'), 'materials')
        # if views directory doesn't exist, create it with default views - side.vp, front.vp and module.vp
        if not os.path.exists('views'):
            os.makedirs('views')
            with open(os.path.join('views', 'side.vp'), 'w') as f:
                f.write('rvu -vtv -vp -10 1.5 3 -vd 1.581 0 -0.519234 ' +
                        '-vu 0 0 1 -vh 45 -vv 45 -vo 0 -va 0 -vs 0 -vl 0')
            with open(os.path.join('views', 'front.vp'), 'w') as f:
                f.write('rvu -vtv -vp 0 -3 5 -vd 0 0.894427 -0.894427 ' +
                        '-vu 0 0 1 -vh 45 -vv 45 -vo 0 -va 0 -vs 0 -vl 0')
            with open(os.path.join('views', 'module.vp'), 'w') as f:
                f.write('rvu -vtv -vp -3 -3 0.3 -vd 0.8139 0.5810 0.0 ' +
                        '-vu 0 0 1 -vh 45 -vv 45 -vo 0 -va 0 -vs 0 -vl 0')

    def getfilelist(self):
        """
        Return concat of matfiles, radfiles and skyfiles
        """

        return self.materialfiles + self.skyfiles + self._getradfiles()

    def _getradfiles(self, scenelist=None):
        """
        Iterate over scenelist (default self.scenes) to collect the radfiles.

        Returns
        -------
        a : list
            radfile paths gathered from each scene.

        """
        if scenelist is None:
            scenelist = self.scenes
        a = []
        for scene in scenelist:
            if type(scene.radfiles) == list:
                for f in scene.radfiles:
                    a.append(f)
            else:
                a.append(scene.radfiles)
        return a

    def save(self, savefile=None):
        """
        Pickle the radiance object for further use.
        Very basic operation - not much use right now.

        Parameters
        ----------
        savefile : str
            Optional savefile name, with .pickle extension.
            Otherwise default to save.pickle

        """

        import pickle

        if savefile is None:
            savefile = 'save.pickle'

        with open(savefile, 'wb') as f:
            pickle.dump(self, f)
        print('Saved to file {}'.format(savefile))

    #def setHPC(self, hpc=True):
    #    self.hpc = hpc

    def addMaterial(self, material, Rrefl, Grefl, Brefl, materialtype='plastic',
                    specularity=0, roughness=0, material_file=None, comment=None, rewrite=True):
        """
        Function to add a material in Radiance format.

        Parameters
        ----------
        material : str
            Name of the material to add.
        Rrefl : str
            Reflectivity for first wavelength, or 'R' bin.
        Grefl : str
            Reflectivity for second wavelength, or 'G' bin.
        Brefl : str
            Reflectivity for third wavelength, or 'B' bin.
        materialtype : str, optional
            Type of material. The default is 'plastic'. Others can be mirror,
            trans, etc. See RADIANCE documentation.
        specularity : str, optional
            Ratio of reflection that is specular and not diffuse. The default is 0.
        roughness : str, optional
            This is the microscopic surface roughness: the more jagged the
            facets are, the rougher it is and more blurry reflections will appear.
        material_file : str, optional
            Material file to write to. The default is None, which writes to
            ``materials\ground.rad``.
        comment : str, optional
            Comment line written above the material definition. The default is None.
        rewrite : bool, optional
            Overwrite the material if it already exists in the file.
            The default is True.

        Returns
        -------
        None. Just adds the material to the material_file specified or the
        default in ``materials\ground.rad``.

        References:
            See examples of documentation for more materialtype details.
            http://www.jaloxa.eu/resources/radiance/documentation/docs/radiance_tutorial.pdf page 10

            Also, you can use https://www.jaloxa.eu/resources/radiance/colour_picker.shtml
            to have a sense of how the material would look with the RGB values as
            well as specularity and roughness.

            To understand more on reflectivity, specularity and roughness values
            https://thinkmoult.com/radiance-specularity-and-roughness-value-examples.html

        """
        if material_file is None:
            material_file = 'ground.rad'

        matfile = os.path.join('materials', material_file)

        with open(matfile, 'r') as fp:
            buffer = fp.readlines()

        # search buffer for material matching requested addition
        found = False
        for i in buffer:
            if materialtype and material in i:
                loc = buffer.index(i)
                found = True
                break
        if found:
            if rewrite:
                print('Material exists, overwriting...\n')
                if comment is None:
                    pre = loc - 1
                else:
                    pre = loc - 2
                # commit buffer without material match
                with open(matfile, 'w') as fp:
                    for i in buffer[0:pre]:
                        fp.write(i)
                    for i in buffer[loc+4:]:
                        fp.write(i)
        if (found and rewrite) or (not found):
            # append -- this will create the file if it doesn't exist
            file_object = open(matfile, 'a')
            file_object.write("\n\n")
            if comment is not None:
                file_object.write("#{}".format(comment))
            file_object.write("\nvoid {} {}".format(materialtype, material))
            if materialtype == 'glass' or materialtype == 'mirror':
                file_object.write("\n0\n0\n3 {} {} {}".format(Rrefl, Grefl, Brefl))
            else:
                file_object.write("\n0\n0\n5 {} {} {} {} {}".format(Rrefl, Grefl, Brefl, specularity, roughness))
            file_object.close()
            print('Added material {} to file {}'.format(material, material_file))
        if (found and not rewrite):
            print('Material already exists\n')
576
    def exportTrackerDict(self, trackerdict=None,
2✔
577
                          savefile=None, reindex=None):
578
        """
579
        Use :py:func:`~bifacial_radiance.load._exportTrackerDict` to save a
580
        TrackerDict output as a csv file.
581

582
        Parameters
583
        ----------
584
            trackerdict
585
                The tracker dictionary to save
586
            savefile : str 
587
                path to .csv save file location
588
            reindex : bool
589
                True saves the trackerdict in TMY format, including rows for hours
590
                where there is no sun/irradiance results (empty)
591
                
592
        """
593
        
594
        import bifacial_radiance.load
2✔
595

596
        if trackerdict is None:
2✔
597
            trackerdict = self.trackerdict
2✔
598

599
        if savefile is None:
2✔
600
            savefile = _interactive_load(title='Select a .csv file to save to')
×
601

602
        if reindex is not None:
2✔
603
            reindex = False
2✔
604

605
        if self.cumulativesky is True and reindex is True:
2✔
606
            # don't re-index for cumulativesky,
607
            # which has angles for index
608
            print ("\n Warning: For cumulativesky simulations, exporting the "
×
609
                   "TrackerDict requires reindex = False. Setting reindex = "
610
                   "False and proceeding")
611
            reindex = False
×
612

613
        monthlyyearly = True
2✔
614
        if self.cumulativesky is True:
2✔
615
            monthlyyearly = False
2✔
616
            
617
        bifacial_radiance.load._exportTrackerDict(trackerdict, savefile,
2✔
618
                                                 cumulativesky=self.cumulativesky,
619
                                                 reindex=reindex, monthlyyearly=monthlyyearly)
620

621
    

    # loadtrackerdict not updated to match new trackerdict configuration
    def loadtrackerdict(self, trackerdict=None, fileprefix=None):
        """
        Use :py:class:`bifacial_radiance.load._loadtrackerdict`
        to browse the results directory and load back any results saved in there.

        Parameters
        ----------
        trackerdict
        fileprefix : str

        """
        from bifacial_radiance.load import loadTrackerDict
        if trackerdict is None:
            trackerdict = self.trackerdict
        (trackerdict, totaldict) = loadTrackerDict(trackerdict, fileprefix)
        self.Wm2Front = totaldict['Wm2Front']
        self.Wm2Back = totaldict['Wm2Back']

    def returnOctFiles(self):
        """
        Return files in the root directory with `.oct` extension

        Returns
        -------
        oct_files : list
            List of .oct files

        """
        oct_files = [f for f in os.listdir(self.path) if f.endswith('.oct')]
        #self.oct_files = oct_files
        return oct_files

    def returnMaterialFiles(self, material_path=None):
        """
        Return files in the materials directory with .rad extension
        and append material files to the oconv file list

        Parameters
        ----------
        material_path : str
            Optional parameter to point to a specific materials directory.
            otherwise /materials/ is default

        Returns
        -------
        material_files : list
            List of .rad files

        """

        if material_path is None:
            material_path = 'materials'

        material_files = [f for f in os.listdir(os.path.join(self.path,
                                                             material_path)) if f.endswith('.rad')]

        materialfilelist = [os.path.join(material_path, f) for f in material_files]
        self.materialfiles = materialfilelist
        return materialfilelist


    def getResults(self, trackerdict=None):
        """
        Iterate over trackerdict and return irradiance results
        following analysis1axis runs

        Parameters
        ----------
        trackerdict : dict, optional
            trackerdict, after analysis1axis has been run

        Returns
        -------
        results : Pandas.DataFrame
            dataframe containing irradiance scan results.

        """
        from bifacial_radiance.load import getResults

        if trackerdict is None:
            trackerdict = self.trackerdict

        return getResults(trackerdict, self.cumulativesky)

707
    
708
    def sceneNames(self, scenes=None):
2✔
709
        if scenes is None: scenes = self.scenes
2✔
710
        return [scene.name for scene in scenes]
2✔
711
    
    def setGround(self, material=None, material_file=None):
        """
        Use GroundObj constructor class and return a ground object

        Parameters
        ------------
        material : numeric or str
            If a number between 0 and 1 is passed, it is assumed to be the
            albedo and assigned directly. If a string is passed (e.g. 'litesoil'),
            the material properties are looked up by name in `material_file`.
            Default material names to choose from: litesoil, concrete, white_EPDM,
            beigeroof, beigeroof_lite, beigeroof_heavy, black, asphalt
        material_file : str
            Filename of the material information. Default `ground.rad`

        Returns
        -------
        self.ground : tuple
            self.ground.normval : numeric
                Normalized color value
            self.ground.ReflAvg : numeric
                Average reflectance
        """

        if material is None:
            try:
                if self.metdata.albedo is not None:
                    material = self.metdata.albedo
                    print(" Assigned Albedo from metdata.albedo")
            except AttributeError:
                pass

        self.ground = GroundObj(material, material_file)

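The numeric-or-string dispatch that `setGround` documents can be sketched in isolation. `resolve_ground` and its material list are illustrative stand-ins, not the actual `GroundObj` logic:

```python
# Minimal sketch (not the GroundObj implementation): dispatch on the type of
# `material` the way setGround's docstring describes. A float in [0, 1] is
# treated as a broadband albedo; a string is looked up by material name.
def resolve_ground(material, known_materials=('litesoil', 'concrete', 'black')):
    """Return ('albedo', value) or ('material', name). Names are illustrative."""
    if isinstance(material, (int, float)):
        if not 0 <= material <= 1:
            raise ValueError('albedo must be between 0 and 1')
        return ('albedo', float(material))
    if material in known_materials:
        return ('material', material)
    raise KeyError(f'unknown ground material: {material}')
```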
    def getEPW(self, lat=None, lon=None, GetAll=False):
        """
        Subroutine to download the EPW file nearest to the latitude and longitude
        provided, into the directory \\EPWs\\
        based on github/aahoo.

        .. warning::
            ``verify=False`` is required to operate within NREL's network, and
            ``InsecureRequestWarning`` is disabled to avoid repeated warnings.
            Currently this function is not working within NREL's network.

        Parameters
        ----------
        lat : decimal
            Used to find closest EPW file.
        lon : decimal
            Longitude value to find closest EPW file.
        GetAll : boolean
            Download all available files. Note that no epw file will be loaded into memory

        """

        import requests, re
        from requests.packages.urllib3.exceptions import InsecureRequestWarning
        requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
        hdr = {'User-Agent' : "Magic Browser",
               'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'
               }

        path_to_save = 'EPWs' # create a directory and write the name of directory here
        if not os.path.exists(path_to_save):
            os.makedirs(path_to_save)

        def _returnEPWnames():
            ''' return a dataframe with the name, lat, lon, url of available files'''
            r = requests.get('https://github.com/NREL/EnergyPlus/raw/develop/weather/master.geojson', verify=False)
            data = r.json() #metadata for available files
            #download lat/lon and url details for each .epw file into a dataframe
            df = pd.DataFrame({'url':[], 'lat':[], 'lon':[], 'name':[]})
            for location in data['features']:
                match = re.search(r'href=[\'"]?([^\'" >]+)', location['properties']['epw'])
                if match:
                    url = match.group(1)
                    name = url[url.rfind('/') + 1:]
                    lontemp = location['geometry']['coordinates'][0]
                    lattemp = location['geometry']['coordinates'][1]
                    dftemp = pd.DataFrame({'url':[url], 'lat':[lattemp], 'lon':[lontemp], 'name':[name]})
                    df = pd.concat([df, dftemp], ignore_index=True)
            return df

        def _findClosestEPW(lat, lon, df):
            #locate the record with the nearest lat/lon
            errorvec = np.sqrt(np.square(df.lat - lat) + np.square(df.lon - lon))
            index = errorvec.idxmin()
            url = df['url'][index]
            name = df['name'][index]
            return url, name

        def _downloadEPWfile(url, path_to_save, name):
            r = requests.get(url, verify=False, headers=hdr)
            if r.ok:
                filename = os.path.join(path_to_save, name)
                # py2 and 3 compatible: binary write, encode text first
                with open(filename, 'wb') as f:
                    f.write(r.text.encode('ascii', 'ignore'))
                print(' ... OK!')
            else:
                print(' connection error status code: %s' %(r.status_code))
                r.raise_for_status()

        # Get the list of EPW filenames and lat/lon
        df = _returnEPWnames()

        # find the closest EPW file to the given lat/lon
        if (lat is not None) & (lon is not None) & (GetAll is False):
            url, name = _findClosestEPW(lat, lon, df)

            # download the EPW file to the local drive.
            print('Getting weather file: ' + name)
            _downloadEPWfile(url, path_to_save, name)
            self.epwfile = os.path.join('EPWs', name)

        elif GetAll is True:
            if input('Downloading ALL EPW files available. OK? [y/n]') == 'y':
                # get all of the EPW files
                for index, row in df.iterrows():
                    print('Getting weather file: ' + row['name'])
                    _downloadEPWfile(row['url'], path_to_save, row['name'])
            self.epwfile = None
        else:
            print('Nothing returned. Proper usage: epwfile = getEPW(lat,lon)')
            self.epwfile = None

        return self.epwfile

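The nearest-station lookup inside `getEPW` (`_findClosestEPW`) reduces to minimizing a Euclidean distance in latitude/longitude degrees, a flat-earth approximation that is adequate for picking a weather file. A standalone sketch with made-up site coordinates:

```python
# Sketch of the nearest-site lookup used by getEPW: pick the row whose
# (lat, lon) is closest in degrees. The site names and coordinates below
# are invented for illustration.
import numpy as np
import pandas as pd

def find_closest(lat, lon, df):
    errorvec = np.sqrt(np.square(df.lat - lat) + np.square(df.lon - lon))
    return df['name'][errorvec.idxmin()]

df = pd.DataFrame({'name': ['siteA', 'siteB', 'siteC'],
                   'lat':  [39.7,    33.4,    47.6],
                   'lon':  [-105.2, -112.0,  -122.3]})
```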
    def readWeatherFile(self, weatherFile=None, starttime=None,
                        endtime=None, label=None, source=None,
                        coerce_year=None, tz_convert_val=None):
        """
        Read either an EPW or a TMY file, calling
        :py:class:`~bifacial_radiance.readTMY` or
        :py:class:`~bifacial_radiance.readEPW`
        according to the weather file extension, and return a
        :py:class:`~bifacial_radiance.MetObj` .

        Parameters
        ----------
        weatherFile : str
            File containing the weather information. EPW, TMY or solargis accepted.
        starttime : str
            Limited start time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
        endtime : str
            Limited end time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
        daydate : str  DEPRECATED
            For single day in 'MM/DD' or MM_DD format.  Now use starttime and
            endtime set to the same date.
        label : str
            'left', 'right', or 'center'. For data that is averaged, defines if
            the timestamp refers to the left edge, the right edge, or the
            center of the averaging interval, for purposes of calculating
            sunposition. For example, TMY3 data is right-labeled, so 11 AM data
            represents data from 10 to 11, and sun position is calculated
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval
            data and NSRDB uses centered.
        source : str
            To help identify different types of .csv files. If None, it assumes
            it is TMY3-style formatted data. Current options: 'TMY3',
            'solargis', 'EPW', 'SAM'
        coerce_year : int
            Year to coerce weather data to in YYYY format, ie 2021.
            If more than one year of data is in the weather file, year is NOT coerced.
        tz_convert_val : int
            Convert timezone to this fixed value, following ISO standard
            (negative values indicating West of UTC.)
        """

        if weatherFile is None:
            if hasattr(self,'epwfile'):
                weatherFile = self.epwfile
            else:
                try:
                    weatherFile = _interactive_load('Select EPW or TMY3 climate file')
                except:
                    raise Exception('Interactive load failed. Tkinter not supported '
                                    'on this system. Try installing X-Quartz and reloading')
        if coerce_year is not None:
            coerce_year = int(coerce_year)
            if str(coerce_year).__len__() != 4:
                warnings.warn('Incorrect coerce_year. Setting to None')
                coerce_year = None


        if source is None:

            if weatherFile[-3:].lower() == 'epw':
                source = 'EPW'
            else:
                print('Warning: CSV file passed for input. Assuming it is TMY3-'
                      'style format')
                source = 'TMY3'
            if label is None:
                label = 'right' # EPW and TMY are by default right-labeled.

        if source.lower() == 'solargis':
            if label is None:
                label = 'center'
            metdata, metadata = self._readSOLARGIS(weatherFile, label=label)

        if source.lower() =='epw':
            if label is None:
                label = 'right'
            metdata, metadata = self._readEPW(weatherFile, label=label)

        if source.lower() =='tmy3':
            if label is None:
                label = 'right'
            metdata, metadata = self._readTMY(weatherFile, label=label)

        if source.lower() =='sam':
            if label is None:
                label = 'left'
            metdata, metadata = self._readSAM(weatherFile)

        self.metdata = self.readWeatherData(metadata, metdata, starttime=starttime,
                            endtime=endtime,
                            coerce_year=coerce_year, label=label,
                            tz_convert_val=tz_convert_val)

        return self.metdata

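The `label` convention in the docstring (right-labeled TMY3 data stamped 11 AM means sun position at 10:30 AM) can be sketched as a small helper. `sunpos_time` is hypothetical, not part of the bifacial_radiance API, and assumes hourly intervals:

```python
# Sketch of the interval-label convention: for an averaging interval whose
# timestamp `ts` marks its right edge, left edge, or center, the sun position
# should be evaluated at the interval midpoint.
import datetime

def sunpos_time(ts, label, interval=datetime.timedelta(hours=1)):
    if label == 'right':      # ts marks the end of the interval
        return ts - interval / 2
    if label == 'left':       # ts marks the start of the interval
        return ts + interval / 2
    if label == 'center':
        return ts
    raise ValueError(f'unknown label: {label}')
```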
    def readWeatherData(self, metadata, metdata, starttime=None,
                        endtime=None,
                        coerce_year=None, label='center',
                        tz_convert_val=None):
        """
        Intermediate function to read in metadata and metdata objects from
        :py:class:`~bifacial_radiance.readWeatherFile` and export a
        :py:class:`~bifacial_radiance.MetObj`

        Parameters
        ----------
        metadata : dict
            Dictionary with metadata stats. keys required: 'lat', 'lon', 'altitude',
            'TZ'
        metdata : pandas DataFrame
            Dataframe with meteo timeseries. Index needs to be datetime-like and TZ-aware.
            columns required: 'DNI', 'DHI', 'GHI', 'Alb'
        starttime : str, optional
            Limited start time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
        endtime : str, optional
            Limited end time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
        label : str
            'left', 'right', or 'center'. For data that is averaged, defines if
            the timestamp refers to the left edge, the right edge, or the
            center of the averaging interval, for purposes of calculating
            sunposition. For example, TMY3 data is right-labeled, so 11 AM data
            represents data from 10 to 11, and sun position is calculated
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval
            data and NSRDB uses centered.
        coerce_year : int
            Year to coerce weather data to in YYYY format, ie 2021.
            If more than one year of data is in the weather file, year is NOT coerced.
        tz_convert_val : int
            Convert timezone to this fixed value, following ISO standard
            (negative values indicating West of UTC.)
        """

        def _parseTimes(t, hour, coerce_year):
            '''
            parse time input t, which could be a string mm_dd_HH or YYYY-mm-dd_HHMM
            or a datetime.datetime object.  Return pd.datetime object.  Define
            hour as hour input if not passed directly.
            '''
            import re

            if type(t) == str:
                try:
                    tsplit = re.split('-|_| ', t)

                    #mm_dd format
                    if tsplit.__len__() == 2 and t.__len__() == 5:
                        if coerce_year is None:
                            coerce_year = 2021 #default year.
                        tsplit.insert(0,str(coerce_year))
                        tsplit.append(str(hour).rjust(2,'0')+'00')

                    #mm_dd_hh or YYYY_mm_dd format
                    elif tsplit.__len__() == 3 :
                        if tsplit[0].__len__() == 2:
                            if coerce_year is None:
                                coerce_year = 2021 #default year.
                            tsplit.insert(0,str(coerce_year))
                        elif tsplit[0].__len__() == 4:
                            tsplit.append(str(hour).rjust(2,'0')+'00')

                    #YYYY-mm-dd_HHMM  format
                    if tsplit.__len__() == 4 and tsplit[0].__len__() == 4:
                        t_out = pd.to_datetime(''.join(tsplit).ljust(12,'0') )
                    else:
                        raise Exception(f'incorrect time string passed {t}. '
                                        'Valid options: mm_dd, mm_dd_HH, '
                                        'mm_dd_HHMM, YYYY-mm-dd_HHMM')
                except Exception as e:
                    # Error for incorrect string passed:
                    raise(e)
            else:  #datetime or timestamp
                try:
                    t_out = pd.to_datetime(t)
                except pd.errors.ParserError:
                    print('incorrect time object passed.  Valid options: '
                          'string or datetime.datetime or pd.timeIndex. You '
                          f'passed {type(t)}.')
            return t_out, coerce_year
        # end _parseTimes

        def _tz_convert(metdata, metadata, tz_convert_val):
            """
            convert metdata to a different local timezone.  Particularly for
            SolarGIS weather files which are returned in UTC by default.
            ----------
            tz_convert_val : int
                Convert timezone to this fixed value, following ISO standard
                (negative values indicating West of UTC.)
            Returns: metdata, metadata
            """
            import pytz
            if (type(tz_convert_val) == int) | (type(tz_convert_val) == float):
                metadata['TZ'] = tz_convert_val
                metdata = metdata.tz_convert(pytz.FixedOffset(tz_convert_val*60))
            return metdata, metadata
        # end _tz_convert

        def _correctMetaKeys(m):
            # put correct keys on m = metadata dict

            m['altitude'] = _firstlist([m.get('altitude'), m.get('elevation')])
            m['TZ'] = _firstlist([m.get('TZ'), m.get('Time Zone'), m.get('timezone')])

            if not m.get('city'):
                try:
                    m['city'] = (m['county'] + ',' + m['state'] +
                                        ',' + m['country'])
                except KeyError:
                    m['city'] = '-'
            m['Name'] = _firstlist([m.get('Name'), m.get('city'), m.get('county'),
                                    f"nsrdb_{m.get('Location ID')}"])

            return m


        metadata = _correctMetaKeys(metadata)

        metdata.rename(columns={'dni': 'DNI',
                                'dhi': 'DHI',
                                'ghi': 'GHI',
                                'air_temperature': 'DryBulb',
                                'wind_speed': 'Wspd',
                                'surface_albedo': 'Alb'
                                }, inplace=True)


        metdata, metadata = _tz_convert(metdata, metadata, tz_convert_val)
        tzinfo = metdata.index.tzinfo
        tempMetDatatitle = 'metdata_temp.csv'

        # Parse the start and endtime strings.
        if starttime is not None:
            starttime, coerce_year = _parseTimes(starttime, 1, coerce_year)
            starttime = starttime.tz_localize(tzinfo)
        if endtime is not None:
            endtime, coerce_year = _parseTimes(endtime, 23, coerce_year)
            endtime = endtime.tz_localize(tzinfo)
        '''
        #TODO: do we really need this check?
        if coerce_year is not None and starttime is not None:
            if coerce_year != starttime.year or coerce_year != endtime.year:
                print("Warning: Coerce year does not match requested sampled "+
                      "date(s)'s years. Setting Coerce year to None.")
                coerce_year = None
        '''

        tmydata_trunc = self._saveTempTMY(metdata, filename=tempMetDatatitle,
                                          starttime=starttime, endtime=endtime,
                                          coerce_year=coerce_year,
                                          label=label)

        if tmydata_trunc.__len__() > 0:
            self.metdata = MetObj(tmydata_trunc, metadata, label = label)
        else:
            self.metdata = None
            raise Exception('Weather file returned zero points for the '
                            'starttime / endtime provided')

        return self.metdata

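The time-string grammar accepted by `starttime`/`endtime` and handled by `_parseTimes` can be sketched as follows. `normalize_time` is an illustrative reduction of that logic; it returns the 12-digit 'YYYYmmddHHMM' string the method hands to `pd.to_datetime`:

```python
# Sketch of _parseTimes normalization: 'mm_dd', 'mm_dd_HH', 'mm_dd_HHMM' and
# 'YYYY-mm-dd_HHMM' are all padded out to a 12-digit YYYYmmddHHMM string.
import re

def normalize_time(t, hour=1, coerce_year=2021):
    tsplit = re.split('-|_| ', t)
    if len(tsplit) == 2 and len(t) == 5:       # mm_dd
        tsplit.insert(0, str(coerce_year))
        tsplit.append(str(hour).rjust(2, '0') + '00')
    elif len(tsplit) == 3:
        if len(tsplit[0]) == 2:                # mm_dd_HH or mm_dd_HHMM
            tsplit.insert(0, str(coerce_year))
        elif len(tsplit[0]) == 4:              # YYYY_mm_dd
            tsplit.append(str(hour).rjust(2, '0') + '00')
    return ''.join(tsplit).ljust(12, '0')
```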
    def _saveTempTMY(self, tmydata, filename=None, starttime=None, endtime=None,
                     coerce_year=None, label=None):
        '''
        private function to save part or all of tmydata into /EPWs/ for use
        in gencumsky -G mode and return truncated tmydata. Gencumsky 8760
        starts with Jan 1, 1AM and ends Dec 31, 2400

        starttime:  tz-localized pd.TimeIndex
        endtime:    tz-localized pd.TimeIndex

        returns: tmydata_truncated  : subset of tmydata based on start & end
        '''

        if filename is None:
            filename = 'temp.csv'

        gencumskydata = None
        gencumdict = None
        if len(tmydata) == 8760:
            print("8760 lines in WeatherFile. Assuming this is a standard hourly"+
                  " WeatherFile for the year for purposes of saving Gencumulativesky"+
                  " temporary weather files in EPW folder.")
            if coerce_year is None and starttime is not None:
                coerce_year = starttime.year
            # SILVANA:  If user doesn't pass starttime, and doesn't select
            # coerce_year, then do we really need to coerce it?
            elif coerce_year is None:
                coerce_year = 2021
            print(f"Coercing year to {coerce_year}")
            with warnings.catch_warnings():
                warnings.simplefilter("ignore")
                tmydata.index.values[:] = tmydata.index[:] + pd.DateOffset(year=(coerce_year))
                # Correcting last index to next year.
                tmydata.index.values[-1] = tmydata.index[-1] + pd.DateOffset(year=(coerce_year+1))

            # FilterDates
            filterdates = None
            if starttime is not None and endtime is not None:
                filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
            else:
                if starttime is not None:
                    filterdates = (tmydata.index >= starttime)
                if endtime is not None:
                    filterdates = (tmydata.index <= endtime)

            if filterdates is not None:
                print("Filtering dates")
                tmydata[~filterdates] = 0

            gencumskydata = tmydata.copy()

        else:
            if len(tmydata.index.year.unique()) == 1:
                if coerce_year:
                    # TODO: check why subhourly data still has 0 entries on the next day on _readTMY3
                    # in the meantime, let's make Silvana's life easy by just deleting 0 entries
                    tmydata = tmydata[~(tmydata.index.hour == 0)]
                    print(f"Coercing year to {coerce_year}")
                    # TODO: this coercing shows a python warning. Turn it off or find another method? bleh.
                    tmydata.index.values[:] = tmydata.index[:] + pd.DateOffset(year=(coerce_year))

                # FilterDates
                filterdates = None
                if starttime is not None and endtime is not None:
                    filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
                else:
                    if starttime is not None:
                        filterdates = (tmydata.index >= starttime)
                    if endtime is not None:
                        filterdates = (tmydata.index <= endtime)

                if filterdates is not None:
                    print("Filtering dates")
                    tmydata[~filterdates] = 0

                gencumskydata = tmydata.copy()
                gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata,
                                                                label=label)

            else:
                if coerce_year:
                    print("More than 1 year of data identified. Can't do coercing")

                # Check if years are consecutive
                l = list(tmydata.index.year.unique())
                if l != list(range(min(l), max(l)+1)):
                    print("Years are not consecutive. Won't be able to use Gencumsky"+
                          " because who knows what's going on with this data.")
                else:
                    print("Years are consecutive. For Gencumsky, make sure to select"+
                          " which yearly temporary weather file you want to use,"+
                          " else they will all get accumulated to the same hour/day")

                    # FilterDates
                    filterdates = None
                    if starttime is not None and endtime is not None:
                        filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
                    else:
                        if starttime is not None:
                            filterdates = (tmydata.index >= starttime)
                        if endtime is not None:
                            filterdates = (tmydata.index <= endtime)

                    if filterdates is not None:
                        print("Filtering dates")
                        tmydata = tmydata[filterdates] # Reducing years potentially

                    # Checking if filtering reduced the data to just 1 year, to do the usual saving.
                    if len(tmydata.index.year.unique()) == 1:
                        gencumskydata = tmydata.copy()
                        gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata,
                                                                        label=label)

                    else:
                        gencumdict = [g for n, g in tmydata.groupby(pd.Grouper(freq='Y'))]

                        for ii in range(0, len(gencumdict)):
                            gencumskydata = gencumdict[ii]
                            gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata,
                                                                            label=label)
                            gencumdict[ii] = gencumskydata

                        gencumskydata = None # clearing so that the dictionary style can be activated.


        # Let's save files in EPWs folder for Gencumsky
        if gencumskydata is not None:
            csvfile = os.path.join('EPWs', filename)
            print('Saving file {}, # points: {}'.format(csvfile, gencumskydata.__len__()))
            gencumskydata.to_csv(csvfile, index=False, header=False, sep=' ', columns=['GHI','DHI'])
            self.gencumsky_metfile = csvfile

        if gencumdict is not None:
            self.gencumsky_metfile = []
            for ii in range(0, len(gencumdict)):
                gencumskydata = gencumdict[ii]
                newfilename = filename.split('.')[0]+'_year_'+str(ii)+'.csv'
                csvfile = os.path.join('EPWs', newfilename)
                print('Saving file {}, # points: {}'.format(csvfile, gencumskydata.__len__()))
                gencumskydata.to_csv(csvfile, index=False, header=False, sep=' ', columns=['GHI','DHI'])
                self.gencumsky_metfile.append(csvfile)

        return tmydata

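The date filtering inside `_saveTempTMY` zeroes rows outside the requested window rather than dropping them, so an 8760-line gencumsky file keeps its shape. A minimal pandas sketch of that masking step, on a toy 6-row frame:

```python
# Sketch of the gencumsky date filter: rows outside [starttime, endtime]
# are set to 0 in place, preserving the row count.
import pandas as pd

idx = pd.date_range('2021-01-01 01:00', periods=6, freq='h')
tmydata = pd.DataFrame({'GHI': [10, 20, 30, 40, 50, 60]}, index=idx)

start, end = idx[2], idx[4]                     # keep 03:00 through 05:00
mask = (tmydata.index >= start) & (tmydata.index <= end)
tmydata[~mask] = 0                              # zero rows outside the window
```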
    def _readTMY(self, tmyfile=None, label = 'right', coerce_year=None):
2✔
1262
        '''
1263
        use pvlib to read in a tmy3 file.
1264
        Note: pvlib 0.7 does not currently support sub-hourly files. Until
1265
        then, use _readTMYdate() to create the index
1266

1267
        Parameters
1268
        ------------
1269
        tmyfile : str
1270
            Filename of tmy3 to be read with pvlib.tmy.readtmy3
1271
        label : str
1272
            'left', 'right', or 'center'. For data that is averaged, defines if
1273
            the timestamp refers to the left edge, the right edge, or the 
1274
            center of the averaging interval, for purposes of calculating 
1275
            sunposition. For example, TMY3 data is right-labeled, so 11 AM data 
1276
            represents data from 10 to 11, and sun position is calculated 
1277
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval 
1278
            data and NSRDB uses centered.
1279
        coerce_year : int
1280
            Year to coerce to. Default is 2021. 
1281
        
1282
        Returns
1283
        -------
1284
        metdata - MetObj collected from TMY3 file
1285
        '''
1286
        def _convertTMYdate(data, meta):
2✔
1287
            ''' requires pvlib 0.8, updated to handle subhourly timestamps '''
1288
            # get the date column as a pd.Series of numpy datetime64
1289
            data_ymd = pd.to_datetime(data['Date (MM/DD/YYYY)'])
2✔
1290
            # shift the time column so that midnite is 00:00 instead of 24:00
1291
            shifted_hour = data['Time (HH:MM)'].str[:2].astype(int) % 24
2✔
1292
            minute = data['Time (HH:MM)'].str[3:].astype(int) 
2✔
1293
            # shift the dates at midnite so they correspond to the next day
1294
            data_ymd[shifted_hour == 0] += datetime.timedelta(days=1)
2✔
1295
            # NOTE: as of pandas>=0.24 the pd.Series.array has a month attribute, but
1296
            # in pandas-0.18.1, only DatetimeIndex has month, but indices are immutable
1297
            # so we need to continue to work with the panda series of dates `data_ymd`
1298
            data_index = pd.DatetimeIndex(data_ymd)
2✔
1299
            # use indices to check for a leap day and advance it to March 1st
1300
            leapday = (data_index.month == 2) & (data_index.day == 29)
2✔
1301
            data_ymd[leapday] += datetime.timedelta(days=1)
2✔
1302
            # shifted_hour is a pd.Series, so use pd.to_timedelta to get a pd.Series of
1303
            # timedeltas
1304
            # NOTE: as of pvlib-0.6.3, min req is pandas-0.18.1, so pd.to_timedelta
1305
            # unit must be in (D,h,m,s,ms,us,ns), but pandas>=0.24 allows unit='hour'
1306
            data.index = (data_ymd + pd.to_timedelta(shifted_hour, unit='h') +
2✔
1307
                         pd.to_timedelta(minute, unit='min') )
1308

1309
            data = data.tz_localize(int(meta['TZ'] * 3600))
2✔
1310
            
1311
            return data
2✔
1312
        
1313
        
1314
        import pvlib
2✔
1315

1316
        #(tmydata, metadata) = pvlib.tmy.readtmy3(filename=tmyfile) #pvlib<=0.6
1317
        (tmydata, metadata) = pvlib.iotools.tmy.read_tmy3(filename=tmyfile,
2✔
1318
                                                          coerce_year=coerce_year) 
1319
        
1320
        try:
2✔
1321
            tmydata = _convertTMYdate(tmydata, metadata) 
2✔
1322
        except KeyError:
×
1323
            print('PVLib >= 0.8.0 is required for sub-hourly data input')
×
1324

1325

1326
        return tmydata, metadata
2✔
1327

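The TMY3 midnight convention that `_convertTMYdate` handles (hour 24 of one day is 00:00 of the next) can be sketched on a single timestamp. `tmy3_timestamp` is an illustrative helper, not the pvlib reader:

```python
# Sketch of the TMY3 time convention: '24:00' wraps to 00:00 of the
# following day before a DatetimeIndex is built.
import datetime
import pandas as pd

def tmy3_timestamp(date_str, time_str):
    hour = int(time_str[:2]) % 24          # '24' becomes 0
    minute = int(time_str[3:])
    ymd = pd.to_datetime(date_str)         # MM/DD/YYYY as in TMY3 files
    if int(time_str[:2]) == 24:            # roll midnight into the next day
        ymd += datetime.timedelta(days=1)
    return ymd + pd.to_timedelta(hour, unit='h') + pd.to_timedelta(minute, unit='min')
```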
1328
    def _readSAM(self, SAMfile=None):
        '''
        Use pvlib-style parsing to read in a SAM-format weather csv file.
        Note: pvlib 0.7 does not currently support sub-hourly files. Until
        then, use _readTMYdate() to create the index

        Parameters
        ------------
        SAMfile : str
            Filename of the SAM-format csv file to be read

        Returns
        -------
        tmydata - Weather dataframe
        metadata - Metadata dict collected from the SAM file header
        '''
        
        # Will only work with the latest pvlib release once they accept my push.
        # Note Oct. 10
        # import pvlib
        #(tmydata, metadata) = pvlib.iotools.tmy.read_psm3(filename=SAMfile,
        #                                                  map_variables=True)
        with open(SAMfile) as myfile:
            head = next(myfile)
            meta = next(myfile)
        
        meta2 = meta.split(',')
        meta2[-1] = meta2[-1][:-1] # Remove the trailing newline character
        
        head2 = head.split(',')
        head2[-1] = head2[-1][:-1]
        
        res = {head2[i]: meta2[i] for i in range(len(head2))}
        

        data = pd.read_csv(SAMfile, skiprows=2)
        
        metadata = {}
        metadata['TZ'] = float(res['Time Zone'])
        metadata['latitude'] = float(res['Latitude'])
        metadata['longitude'] = float(res['Longitude'])
        metadata['altitude'] = float(res['Elevation'])
        metadata['city'] = res['Source']
        
        allcaps = False
        if 'Year' in data.columns:
            allcaps = True
            
        if allcaps:
            if 'Minute' in data.columns:
                dtidx = pd.to_datetime(
                    data[['Year', 'Month', 'Day', 'Hour', 'Minute']])
            else: 
                dtidx = pd.to_datetime(
                    data[['Year', 'Month', 'Day', 'Hour']])
        else:
            if 'minute' in data.columns:
                dtidx = pd.to_datetime(
                    data[['year', 'month', 'day', 'hour', 'minute']])
            else: 
                dtidx = pd.to_datetime(
                    data[['year', 'month', 'day', 'hour']])

        # in USA all timezones are integers
        tz = 'Etc/GMT%+d' % -metadata['TZ']
        data.index = pd.DatetimeIndex(dtidx).tz_localize(tz)

        # rename different field parameters to match the internal
        # bifacial_radiance column conventions
        data.rename(columns={'Temperature':'temp_air',
                             'Surface Albedo':'Alb',
                             'wspd':'wind_speed',
                             'Wind Speed':'wind_speed',
                             'Pressure':'pressure',
                             'Dew Point':'dewpoint',
                             'tdry':'DryBulb',
                             'Tdry':'DryBulb',
                             'dni':'DNI',
                             'dhi':'DHI',
                             'ghi':'GHI',
                             'pres':'atmospheric_pressure',
                             'Tdew':'temp_dew',
                             'albedo':'Alb'}, inplace=True)
        
        print("Column data:", data.keys())

        tmydata = data
        
        return tmydata, metadata

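# _readSAM builds the pandas timezone string by negating the numeric UTC
# offset, because POSIX-style 'Etc/GMT' zone names carry inverted signs
# (e.g. 'Etc/GMT+5' means UTC-5).  A minimal standalone sketch of that
# conversion, independent of the SAM file format:

```python
# POSIX "Etc/GMT" zone names carry an inverted sign: a site at UTC-7
# (metadata['TZ'] == -7.0) maps to the zone name 'Etc/GMT+7'.
def tz_name_from_offset(tz_offset_hours):
    """Return the Etc/GMT zone string for an integer UTC offset in hours."""
    return 'Etc/GMT%+d' % -tz_offset_hours

print(tz_name_from_offset(-7))   # 'Etc/GMT+7' (US Mountain time)
print(tz_name_from_offset(1))    # 'Etc/GMT-1' (Central Europe)
```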
    def _readEPW(self, epwfile=None, label = 'right', coerce_year=None):
        """
        Uses readepw from pvlib>0.6.1, undoes the -1 hr offset, and
        renames columns to match TMY3: DNI, DHI, GHI, DryBulb, Wspd
    
        Parameters
        ------------
        epwfile : str
            Directory and filename of the epwfile. If None, opens an interactive
            loading window.
        label : str
            'left', 'right', or 'center'. For data that is averaged, defines if
            the timestamp refers to the left edge, the right edge, or the 
            center of the averaging interval, for purposes of calculating 
            sunposition. For example, TMY3 data is right-labeled, so 11 AM data 
            represents data from 10 to 11, and sun position is calculated 
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval 
            data and NSRDB uses centered.
        coerce_year : int
            Year to coerce data to.
        
        """
        
        import pvlib
        #import re
        
        '''
        NOTE: In PVLib > 0.6.1 the new epw.read_epw() function reads in time 
        with a default -1 hour offset.  This is reflected in our existing
        workflow. 
        '''
        #(tmydata, metadata) = readepw(epwfile) #
        (tmydata, metadata) = pvlib.iotools.epw.read_epw(epwfile, 
                                                         coerce_year=coerce_year) #pvlib>0.6.1
        # pvlib uses a -1hr offset that needs to be un-done.
        tmydata.index = tmydata.index+pd.Timedelta(hours=1)

        # rename different field parameters to match output from 
        # pvlib.tmy.readtmy: DNI, DHI, DryBulb, Wspd
        tmydata.rename(columns={'dni':'DNI',
                                'dhi':'DHI',
                                'temp_air':'DryBulb',
                                'wind_speed':'Wspd',
                                'ghi':'GHI',
                                'albedo':'Alb'
                                }, inplace=True)    

        return tmydata, metadata


    def _readSOLARGIS(self, filename=None, label='center'):
        """
        Read a SolarGIS data file, which is timestamped in UTC, and
        rename columns to match TMY3: DNI, DHI, GHI, DryBulb, Wspd.
        Timezone is always returned as UTC. Use tz_convert in readWeatherFile
        to manually convert to local time.
    
        Parameters
        ------------
        filename : str
            Filename of the SolarGIS file. 
        label : str
            'left', 'right', or 'center'. For data that is averaged, defines if
            the timestamp refers to the left edge, the right edge, or the 
            center of the averaging interval. SolarGIS default style is center,
            unless the user requests a right label. 
       
        """
        # file format: any line preceded by # is part of the header
        header = []; lat = None; lon = None; elev = None; name = None
        with open(filename, 'r') as result:
            for line in result:
                if line.startswith('#'):
                    header.append(line)
                    if line.startswith('#Latitude:'):
                        lat = line[11:]
                    if line.startswith('#Longitude:'):
                        lon = line[12:]
                    if line.startswith('#Elevation:'):
                        elev = line[12:17]
                    if line.startswith('#Site name:'):
                        name = line[12:-1]
                else:
                    break
        metadata = {'latitude':float(lat),
                    'longitude':float(lon),
                    'altitude':float(elev),
                    'Name':name,
                    'TZ':0.0}
        # read in remainder of data
        data = pd.read_csv(filename,skiprows=header.__len__(), delimiter=';')

        # rename different field parameters to match output from 
        # pvlib.tmy.readtmy: DNI, DHI, DryBulb, Wspd
        data.rename(columns={'DIF':'DHI',
                             'TEMP':'DryBulb',
                             'WS':'Wspd',
                             }, inplace=True)    

        # Generate index from Date (DD.MM.YYYY) and Time
        data.index = pd.to_datetime(data.Date + ' ' +  data.Time, 
                                    dayfirst=True, utc=True,
                                    infer_datetime_format = True)

        return data, metadata


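# The header loop in _readSOLARGIS relies on fixed character offsets after
# each '#Key:' prefix.  A small self-contained sketch of the same idea on a
# hypothetical two-column sample, splitting on ':' instead of hard-coded
# slices (an assumption -- the real SolarGIS layout may need the fixed offsets):

```python
# Parse '#Key: value' header lines, stopping at the first non-'#' line.
# The sample text is hypothetical and only illustrates the file layout.
sample = """#Latitude: 35.05
#Longitude: -106.54
#Elevation: 1600
Date;Time;GHI
"""

header = {}
for line in sample.splitlines():
    if not line.startswith('#'):
        break                      # header ends at the first data row
    key, _, value = line[1:].partition(':')
    header[key.strip()] = value.strip()

print(header['Latitude'])   # '35.05'
```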
    def getSingleTimestampTrackerAngle(self, timeindex, metdata=None, gcr=None, 
                                       azimuth=180, axis_tilt=0, 
                                       limit_angle=45, backtrack=True):
        """
        Helper function to calculate a tracker's angle for use with the 
        fixed tilt routines of bifacial_radiance. It calculates the tracker angle
        for the sun position at the timeindex passed (no left or right time offset, 
        label = 'center')
        
        Parameters
        ----------
        timeindex : int
            Index between 0 to ~4000 indicating hour to simulate.
        metdata : :py:class:`~bifacial_radiance.MetObj` 
            Meteorological object to set up geometry. Usually set automatically by
            `bifacial_radiance` after running :py:class:`bifacial_radiance.readepw`. 
            Default = self.metdata
        gcr : float
            Ground coverage ratio for calculating backtracking. Default [1.0/3.0] 
        azimuth : float or int
            Orientation axis of tracker torque tube. Default North-South (180 deg)
        axis_tilt : float or int
            Default 0. Axis tilt -- not implemented in sensor locations, so
            changing it has no effect in this release.
        limit_angle : float or int
            Limit angle (+/-) of the 1-axis tracker in degrees. Default 45
        backtrack : boolean
            Whether backtracking is enabled (default = True)
        
        """
        '''
        elev = metdata.elevation
        lat = metdata.latitude
        lon = metdata.longitude
        timestamp = metdata.datetime[timeindex]
        '''
        
        import pvlib
        
        if not metdata:
            metdata = self.metdata        
        solpos = metdata.solpos.iloc[timeindex]
        sunzen = float(solpos.apparent_zenith)
        sunaz = float(solpos.azimuth) # not subtracting the 180 
        
        trackingdata = pvlib.tracking.singleaxis(sunzen, sunaz,
                                             axis_tilt, azimuth,
                                             limit_angle, backtrack, gcr)
        
        tracker_theta = float(np.round(trackingdata['tracker_theta'],2))
        tracker_theta = tracker_theta*-1 # bifacial_radiance uses East (morning) theta as positive
            
        return tracker_theta


    def gendaylit(self, timeindex, metdata=None, debug=False):
        """
        Sets and returns sky information using gendaylit.
        Uses PVLib for calculating the sun position angles instead of
        using Radiance's internal sun position calculation (for that, use the gendaylit function)
        
        Parameters
        ----------
        timeindex : int
            Index from 0 to ~4000 of the MetObj (daylight hours only)
        metdata : ``MetObj``
            MetObj object with list of dni, dhi, ghi and location
        debug : bool
            Flag to print output of sky DHI and DNI

        Returns
        -------
        skyname : str
            Sets as self.skyname and returns filename of sky in /skies/ directory. 
            If errors exist, such as DNI = 0 or sun below horizon, this skyname is None

        """
        #import warnings
 
        if metdata is None:
            try:
                metdata = self.metdata
            except:
                print('usage: pass metdata, or run after running ' +
                      'readWeatherfile() ') 
                return

        ground = self.ground
        
        locName = metdata.city
        dni = metdata.dni[timeindex]
        dhi = metdata.dhi[timeindex]
        ghi = metdata.ghi[timeindex]
        elev = metdata.elevation
        lat = metdata.latitude
        lon = metdata.longitude

        # Assign Albedos
        try:
            if ground.ReflAvg.shape == metdata.dni.shape:
                groundindex = timeindex  
            elif self.ground.ReflAvg.shape[0] == 1: # just 1 entry
                groundindex = 0
            else:
                warnings.warn("Shape of ground Albedos and TMY data do not match.")
                return
        except:
            print('usage: make sure to run setGround() before gendaylit()')
            return

        if debug is True:
            print('Sky generated with Gendaylit, with DNI: %0.1f, DHI: %0.1f' % (dni, dhi))
            print("Datetime TimeIndex", metdata.datetime[timeindex])



        #Time conversion to correct format and offset.
        #datetime = metdata.sunrisesetdata['corrected_timestamp'][timeindex]
        #Don't need any of this any more. Already sunrise/sunset corrected and offset by appropriate interval

        # get solar position zenith and azimuth based on site metadata
        #solpos = pvlib.irradiance.solarposition.get_solarposition(datetimetz,lat,lon,elev)
        solpos = metdata.solpos.iloc[timeindex]
        sunalt = float(solpos.elevation)
        # Radiance expects azimuth South = 0, PVLib gives South = 180. Must subtract 180 to match.
        sunaz = float(solpos.azimuth)-180.0

        sky_path = 'skies'

        if dhi <= 0:
            self.skyfiles = [None]
            return None
        # We should already be filtering for elevation > 0. But just in case...
        if sunalt <= 0:
            sunalt = np.arcsin((ghi-dhi)/(dni+.001))*180/np.pi # reverse engineer elevation from ghi, dhi, dni
            print('Warning: negative sun elevation at '+
                  '{}.  '.format(metdata.datetime[timeindex])+
                  'Re-calculated elevation: {:0.2}'.format(sunalt))

        # Note - the -W and -O 1 options are used to create a full spectrum analysis in units of Wm-2
         #" -L %s %s -g %s \n" %(dni/.0079, dhi/.0079, self.ground.ReflAvg) + \
        skyStr = ("# start of sky definition for daylighting studies\n" + \
            "# location name: " + str(locName) + " LAT: " + str(lat)
            +" LON: " + str(lon) + " Elev: " + str(elev) + "\n"
            "# Sun position calculated w. PVLib\n" + \
            "!gendaylit -ang %s %s" %(sunalt, sunaz)) + \
            " -W %s %s -g %s -O 1 \n" %(dni, dhi, ground.ReflAvg[groundindex]) + \
            "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n" + \
            "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" + \
            ground._makeGroundString(index=groundindex, cumulativesky=False)

        time = metdata.datetime[timeindex]
        #filename = str(time)[2:-9].replace('-','_').replace(' ','_').replace(':','_')
        filename = time.strftime('%Y-%m-%d_%H%M')
        skyname = os.path.join(sky_path,"sky2_%s_%s_%s.rad" %(lat, lon, filename))

        skyFile = open(skyname, 'w')
        skyFile.write(skyStr)
        skyFile.close()

        self.skyfiles = [skyname]

        return skyname

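# gendaylit feeds Radiance's `-ang` option sun angles in Radiance's azimuth
# convention (0 = South), while pvlib reports azimuth with 0 = North.  The
# conversion is the single subtraction used in gendaylit above; as a
# standalone sketch:

```python
def radiance_azimuth(pvlib_azimuth):
    """Convert a pvlib solar azimuth (0 = North, clockwise) to the
    Radiance convention used by gendaylit -ang (0 = South)."""
    return pvlib_azimuth - 180.0

# A sun due South in pvlib (180 deg) is 0 in Radiance;
# a late-afternoon sun at 270 deg (due West) becomes +90.
print(radiance_azimuth(180.0))  # 0.0
print(radiance_azimuth(270.0))  # 90.0
```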
    def gendaylit2manual(self, dni, dhi, sunalt, sunaz):
        """
        Sets and returns sky information using gendaylit.
        Uses user-provided data for sun position and irradiance.
        
        .. warning::
            This generates the sky at the sun altitude & azimuth provided; make 
            sure it is the right position relative to how the weather data was
            created and read (i.e. label right, left or center).
            
     
        Parameters
        ------------
        dni: int or float
           Direct Normal Irradiance (DNI) value, in W/m^2
        dhi : int or float
           Diffuse Horizontal Irradiance (DHI) value, in W/m^2 
        sunalt : int or float
           Sun altitude (degrees) 
        sunaz : int or float
           Sun azimuth (degrees) 

        Returns
        -------
        skyname : string
           Filename of sky in /skies/ directory
        """

        
        print('Sky generated with Gendaylit 2 MANUAL, with DNI: %0.1f, DHI: %0.1f' % (dni, dhi))

        sky_path = 'skies'

        if sunalt <= 0 or dhi <= 0:
            self.skyfiles = [None]
            return None
        
        # Assign Albedos
        try:
            if self.ground.ReflAvg.shape[0] == 1: # just 1 entry
                groundindex = 0
            else:
                print("Ambiguous albedo entry; set albedo to a single value "
                      "in setGround()")
                return
        except:
            print('usage: make sure to run setGround() before gendaylit()')
            return
        
        
        # Note: -W and -O 1 are used to create a full spectrum analysis in units of Wm-2       
         #" -L %s %s -g %s \n" %(dni/.0079, dhi/.0079, self.ground.ReflAvg) + \
        skyStr =   ("# start of sky definition for daylighting studies\n" + \
            "# Manual inputs of DNI, DHI, SunAlt and SunAZ into Gendaylit used \n" + \
            "!gendaylit -ang %s %s" %(sunalt, sunaz)) + \
            " -W %s %s -g %s -O 1 \n" %(dni, dhi, self.ground.ReflAvg[groundindex]) + \
            "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n" + \
            "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" + \
            self.ground._makeGroundString(index=groundindex, cumulativesky=False)

        skyname = os.path.join(sky_path, "sky2_%s.rad" %(self.name))

        skyFile = open(skyname, 'w')
        skyFile.write(skyStr)
        skyFile.close()

        self.skyfiles = [skyname]

        return skyname

    def genCumSky(self, gencumsky_metfile=None, savefile=None):
        """ 
        Generate Skydome using gencumsky. 
        
        .. warning::
            gencumulativesky.exe is required to be installed,
            which is not a standard radiance distribution.
            You can find the program in the bifacial_radiance distribution directory
            in \Lib\site-packages\bifacial_radiance\data
            
 
        Use :func:`readWeatherFile(filename, starttime='YYYY-mm-dd_HHMM', endtime='YYYY-mm-dd_HHMM')` 
        to limit gencumsky simulations instead.

        Parameters
        ------------
        gencumsky_metfile : str
            Filename with path to the temporary meteorological file, usually created
            in the EPWs folder. This csv file has no headers, no index, and two
            space-separated columns with values for GHI and DNI for each hour 
            in the year, and must be 8760 entries long; otherwise gencumulativesky.exe fails. 
        savefile : string
            If savefile is None, defaults to "cumulative"
            
        Returns
        --------
        skyname : str
            Filename of the .rad file containing cumulativesky info
            
        """
        
        # TODO:  error checking and auto-install of gencumulativesky.exe
        # TODO: add check if readWeatherfile has not been run
        # TODO: check if it fails if gcc module has been loaded? (common hpc issue)
        
        #import datetime
        
        if gencumsky_metfile is None:
            gencumsky_metfile = self.gencumsky_metfile
            if isinstance(gencumsky_metfile, str):
                print("Loaded ", gencumsky_metfile)
                
        if isinstance(gencumsky_metfile, list):
            print("There is more than 1 year of gencumsky temporal weather files saved."+
                  " You can pass the file you want with the gencumsky_metfile input. Since"+
                  " no year was selected, defaulting to the first year of the list")
            gencumsky_metfile = gencumsky_metfile[0]
            print("Loaded ", gencumsky_metfile)


        if savefile is None:
            savefile = "cumulative"
        sky_path = 'skies'
        lat = self.metdata.latitude
        lon = self.metdata.longitude
        timeZone = self.metdata.timezone
        '''
        cmd = "gencumulativesky +s1 -h 0 -a %s -o %s -m %s %s " %(lat, lon, float(timeZone)*15, filetype) +\
            "-time %s %s -date %s %s %s %s %s" % (startdt.hour, enddt.hour+1,
                                                  startdt.month, startdt.day,
                                                  enddt.month, enddt.day,
                                                  gencumsky_metfile)
        '''
        cmd = (f"gencumulativesky +s1 -h 0 -a {lat} -o {lon} -m "
               f"{float(timeZone)*15} -G {gencumsky_metfile}" )
               
        with open(savefile+".cal","w") as f:
            _,err = _popen(cmd, None, f)
            if err is not None:
                print(err)

        # Assign Albedos
        try:
            groundstring = self.ground._makeGroundString(cumulativesky=True)
        except:
            raise Exception('Error: ground reflection not defined.  '
                            'Run RadianceObj.setGround() first')
        


        skyStr = "#Cumulative Sky Definition\n" +\
            "void brightfunc skyfunc\n" + \
            "2 skybright " + "%s.cal\n" % (savefile) + \
            "0\n" + \
            "0\n" + \
            "\nskyfunc glow sky_glow\n" + \
            "0\n" + \
            "0\n" + \
            "4 1 1 1 0\n" + \
            "\nsky_glow source sky\n" + \
            "0\n" + \
            "0\n" + \
            "4 0 0 1 180\n" + \
            groundstring
            
        skyname = os.path.join(sky_path, savefile+".rad")

        skyFile = open(skyname, 'w')
        skyFile.write(skyStr)
        skyFile.close()

        self.skyfiles = [skyname]#, 'SunFile.rad' ]

        return skyname

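# genCumSky expects `gencumsky_metfile` to be a headerless, index-free file
# with two space-separated columns (GHI then DNI, per the docstring above)
# and exactly 8760 rows.  A hedged sketch of building that content from two
# lists of hourly values (the irradiance numbers here are placeholders):

```python
# Build gencumulativesky-style met file content: 8760 rows, two
# space-separated columns (GHI, DNI), no header, no index.
ghi = [0.0] * 8760
dni = [0.0] * 8760
ghi[12] = 750.2     # hypothetical noon value on Jan 1
dni[12] = 900.1

lines = ['%s %s' % (g, d) for g, d in zip(ghi, dni)]
content = '\n'.join(lines) + '\n'   # ready to write to the csv file

print(len(lines))   # 8760 -- gencumulativesky requires a full year
print(lines[12])    # '750.2 900.1'
```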
    def set1axis(self, metdata=None, azimuth=180, limit_angle=45,
                 angledelta=5, backtrack=True, gcr=1.0 / 3, cumulativesky=True,
                 fixed_tilt_angle=None, useMeasuredTrackerAngle=False,
                 axis_azimuth=None):
        """
        Set up geometry for 1-axis tracking.  Pull in tracking angle details from
        pvlib, create multiple 8760 metdata sub-files where datetime of met data
        matches the tracking angle.  Returns 'trackerdict' which has keys equal to
        either the tracker angles (gencumsky workflow) or timestamps (gendaylit hourly
        workflow)

        Parameters
        ------------
         metdata : :py:class:`~bifacial_radiance.MetObj` 
            Meteorological object to set up geometry. Usually set automatically by
            `bifacial_radiance` after running :py:class:`bifacial_radiance.readepw`. 
            Default = self.metdata
        azimuth : numeric
            Orientation axis of tracker torque tube. Default North-South (180 deg).
            For fixed-tilt configuration, input is fixed azimuth (180 is south)
        limit_angle : numeric
            Limit angle (+/-) of the 1-axis tracker in degrees. Default 45
        angledelta : numeric
            Degree of rotation increment to parse irradiance bins. Default 5 degrees.
            (0.4 % error for DNI).  Other options: 4 (.25%), 2.5 (0.1%).
            Note: the smaller the angledelta, the more simulations must be run.
        backtrack : bool
            Whether backtracking is enabled (default = True)
        gcr : float
            Ground coverage ratio for calculating backtracking. Default [1.0/3.0] 
        cumulativesky : bool
            [True] Whether individual csv files are
            created with constant tilt angle for the cumulativesky approach.
            If False, the gendaylit tracking approach must be used.
        fixed_tilt_angle : numeric
            If passed, this changes to a fixed tilt simulation where each hour 
            uses fixed_tilt_angle and axis_azimuth as the tilt and azimuth
        useMeasuredTrackerAngle: Bool
            If True, and data for tracker angles has been passed by being included
            in the WeatherFile object (column name 'Tracker Angle (degrees)'),
            then tracker angles will be set to these values instead of being calculated.
            NOTE that the value for azimuth passed to set1axis must be surface 
            azimuth in the morning and not the axis_azimuth 
            (i.e. for a N-S HSAT, azimuth = 90).
        axis_azimuth : numeric
            DEPRECATED.  Returns a deprecation warning. Pass the tracker 
            axis_azimuth through to the azimuth input instead.


        Returns
        -------
        trackerdict : dictionary 
            Keys represent tracker tilt angles (gencumsky) or timestamps (gendaylit)
            and list of csv metfile, and datetimes at that angle
            trackerdict[angle]['csvfile';'surf_azm';'surf_tilt';'UTCtime']
            - or -
            trackerdict[time]['tracker_theta';'surf_azm';'surf_tilt']
        """

        # Documentation check:
        # Removed         Internal variables
        # -------
        # metdata.solpos          dataframe with solar position data
        # metdata.surface_azimuth list of tracker azimuth data
        # metdata.surface_tilt    list of tracker surface tilt data
        # metdata.tracker_theta   list of tracker tilt angle
        #import warnings
        
        if metdata == None:
            metdata = self.metdata

        if metdata == {}:
            raise Exception("metdata doesn't exist yet.  "+
                            "Run RadianceObj.readWeatherFile() ")

        if axis_azimuth:
            azimuth = axis_azimuth
            warnings.warn("axis_azimuth is deprecated in set1axis; use azimuth "
                          "input instead.", DeprecationWarning)
            
        #backtrack = True   # include backtracking support in later version
        #gcr = 1.0/3.0       # default value - not used if backtrack = False.


        # get 1-axis tracker angles for this location, rounded to nearest 'angledelta'
        trackerdict = metdata._set1axis(cumulativesky=cumulativesky,
                                       azimuth=azimuth,
                                       limit_angle=limit_angle,
                                       angledelta=angledelta,
                                       backtrack=backtrack,
                                       gcr=gcr,
                                       fixed_tilt_angle=fixed_tilt_angle,
                                       useMeasuredTrackerAngle=useMeasuredTrackerAngle
                                       )
        self.trackerdict = trackerdict
        self.cumulativesky = cumulativesky

        return trackerdict

1967
    def gendaylit1axis(self, metdata=None, trackerdict=None, startdate=None,
2✔
1968
                       enddate=None, debug=False):
1969
        """
1970
        1-axis tracking implementation of gendaylit.
1971
        Creates multiple sky files, one for each time of day.
1972

1973
        Parameters
1974
        ------------
1975
        metdata
1976
            MetObj output from readWeatherFile.  Needs to have 
1977
            RadianceObj.set1axis() run on it first.
1978
        startdate : str 
1979
            DEPRECATED, does not do anything now.
1980
            Recommended to downselect metdata when reading Weather File.
1981
        enddate : str
1982
            DEPRECATED, does not do anything now.
1983
            Recommended to downselect metdata when reading Weather File.
1984
        trackerdict : dictionary
1985
            Dictionary with keys for tracker tilt angles (gencumsky) or timestamps (gendaylit)
1986
        
1987
        Returns
1988
        -------
1989
        Updated trackerdict dictionary 
1990
            Dictionary with keys for tracker tilt angles (gencumsky) or timestamps (gendaylit)
1991
            with the additional dictionary value ['skyfile'] added
1992

1993
        """
1994
        
1995
        if metdata is None:
            metdata = self.metdata
        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')

        if startdate is not None or enddate is not None:
            print("Deprecation Warning: gendaylit1axis no longer downselects"
                  " entries by startdate and enddate. Downselect your data"
                  " when loading with readWeatherFile")
            return

        try:
            metdata.tracker_theta  # this may not exist
        except AttributeError:
            print("metdata.tracker_theta doesn't exist. Run RadianceObj.set1axis() first")

        if debug is False:
            print('Creating ~%d skyfiles. ' % (len(trackerdict.keys())))
        count = 0  # counter for the number of skyfiles created

        trackerdict2 = {}
        for i in range(0, len(trackerdict.keys())):
            try:
                time = metdata.datetime[i]
            except IndexError:  # out of range error
                break
            filename = time.strftime('%Y-%m-%d_%H%M')
            self.name = filename

            # only render skies when the sun is up and a tracker angle exists
            if (metdata.ghi[i] > 0) & (~np.isnan(metdata.tracker_theta[i])):
                skyfile = self.gendaylit(metdata=metdata, timeindex=i, debug=debug)
                # trackerdict2 reduces the dict to only the range specified.
                trackerdict2[filename] = trackerdict[filename]
                trackerdict2[filename]['skyfile'] = skyfile
                count += 1

        print('Created {} skyfiles in /skies/'.format(count))
        self.trackerdict = trackerdict2
        return trackerdict2

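The sun-up filter above (GHI > 0 and a finite tracker angle) can be exercised standalone. A minimal sketch — `ghi` and `tracker_theta` here are made-up sample arrays, not values from any weather file:

```python
import numpy as np

# hypothetical hourly GHI (W/m^2) and tracker rotation angles (degrees);
# NaN marks hours where no tracker angle was computed (e.g. sun below horizon)
ghi = np.array([0.0, 150.0, 600.0, 80.0, 0.0])
tracker_theta = np.array([np.nan, -45.0, 5.0, np.nan, np.nan])

# keep only timestamps with daylight AND a valid tracker angle,
# mirroring (ghi > 0) & (~np.isnan(tracker_theta)) in gendaylit1axis
mask = (ghi > 0) & (~np.isnan(tracker_theta))
kept = np.flatnonzero(mask)

print(kept.tolist())  # indices that would get a gendaylit skyfile
```

Note the elementwise `&` with parenthesized comparisons; Python's `and` would raise on arrays.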
    def genCumSky1axis(self, trackerdict=None):
        """
        1-axis tracking implementation of gencumulativesky.
        Creates multiple .cal files and .rad files, one for each tracker angle.

        Use :func:`readWeatherFile` to limit gencumsky simulations

        Parameters
        ------------
        trackerdict : dictionary
            Trackerdict generated as output by RadianceObj.set1axis()

        Returns
        -------
        trackerdict : dictionary
            Trackerdict with new entry 'skyfile' appended to the 1-axis dict,
            holding the location of the sky .rad file

        """

        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')

        for theta in sorted(trackerdict):
            # call gencumulativesky with a new .cal and .rad name
            csvfile = trackerdict[theta]['csvfile']
            savefile = '1axis_%s' % (theta)  # prefix for .cal file and skies\*.rad file
            skyfile = self.genCumSky(gencumsky_metfile=csvfile, savefile=savefile)
            trackerdict[theta]['skyfile'] = skyfile
            print('Created skyfile %s' % (skyfile))
        # delete default skyfile (not strictly necessary)
        self.skyfiles = None
        self.trackerdict = trackerdict
        return trackerdict

    def makeOct(self, filelist=None, octname=None):
        """
        Combine everything together into a .oct file

        Parameters
        ----------
        filelist : list
            Files to include.  Otherwise takes self.getfilelist()
        octname : str
            filename (without .oct extension)

        Returns
        -------
        octname : str
            filename of .oct file in root directory including extension.
            Raises an Exception or issues a warning if oconv reports an error.
        """

        if filelist is None:
            filelist = self.getfilelist()
        if octname is None:
            octname = self.name

        debug = False
        if self.hpc:
            # On HPC runs, wait up to `time_to_wait` seconds (total) for files
            # on a shared filesystem to appear before giving up.
            import time
            time_to_wait = 10
            time_counter = 0
            if None in filelist:  # are we missing any files? abort!
                print('Missing files, skipping...')
                self.octfile = None
                return None
            for file in filelist:
                if debug:
                    print("HPC Checking for file %s" % (file))
                while not os.path.exists(file):
                    if time_counter > time_to_wait:
                        print('Timed out waiting for file %s' % (file))
                        break
                    time.sleep(1)
                    time_counter += 1

        if None in filelist:  # are we missing any files? abort!
            print('Missing files, skipping...')
            self.octfile = None
            return None

        filelist.insert(0, 'oconv')
        with open('%s.oct' % (octname), "w") as f:
            _, err = _popen(filelist, None, f)
            # TODO:  exception handling for no sun up
            if err is not None:
                if err[0:5] == 'error':
                    raise Exception(err[7:])
                if err[0:7] == 'message':
                    warnings.warn(err[9:], Warning)

        # use rvu to see if everything looks good.
        # use cmd for this since it locks out the terminal.
        # 'rvu -vf views\side.vp -e .01 monopanel_test.oct'
        print("Created %s.oct" % (octname))
        self.octfile = '%s.oct' % (octname)
        return '%s.oct' % (octname)

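`_popen` above pipes the assembled `oconv` argument list into the open `.oct` file handle; the underlying pattern is plain `subprocess` redirection. A standalone sketch, substituting a portable stand-in command for `oconv` (which requires RADIANCE to be installed):

```python
import os
import subprocess
import sys
import tempfile

# stand-in for ['oconv', 'materials.rad', 'sky.rad', 'scene.rad']:
# any command whose stdout we want captured into a file works the same way
cmd = [sys.executable, '-c', "print('oconv output placeholder')"]

octpath = os.path.join(tempfile.mkdtemp(), 'demo.oct')
with open(octpath, 'w') as f:
    # stdout streams straight into the .oct file; stderr is captured
    # separately so it can be checked for 'error'/'message' prefixes
    result = subprocess.run(cmd, stdout=f, stderr=subprocess.PIPE, text=True)

err = result.stderr
with open(octpath) as f:
    contents = f.read()
```

Streaming stdout directly to the file matters: Radiance octrees can be large, so buffering the whole output in memory is avoided.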
    def makeOct1axis(self, trackerdict=None, singleindex=None, customname=None):
        """
        Combine files listed in trackerdict into multiple .oct files

        Parameters
        ------------
        trackerdict
            Output from :py:class:`~bifacial_radiance.RadianceObj.makeScene1axis`
        singleindex : str
            Single index for trackerdict to run makeOct1axis in single-value mode,
            format 'YYYY-MM-DD_HHMM'.
        customname : str
            Custom text string added to the end of the OCT file name.

        Returns
        -------
        trackerdict
            Appends 'octfile' to the 1-axis dict with the location of the scene .oct file
        """

        if customname is None:
            customname = ''

        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')
        if singleindex is None:   # loop through all values in the tracker dictionary
            indexlist = trackerdict.keys()
        else:  # just loop through one single index in tracker dictionary
            indexlist = [singleindex]

        print('\nMaking {} octfiles in root directory.'.format(len(indexlist)))
        for index in sorted(indexlist):  # either the entire key list of trackerdict, or a single value
            try:  # TODO: check if this works
                filelist = self.materialfiles + [trackerdict[index]['skyfile']] + self._getradfiles(trackerdict[index]['scenes'])
                octname = '1axis_%s%s' % (index, customname)
                trackerdict[index]['octfile'] = self.makeOct(filelist, octname)
            except KeyError as e:
                print('Trackerdict key error: {}'.format(e))

        self.trackerdict = trackerdict
        return trackerdict

    def makeModule(self, name=None, x=None, y=None, z=None, modulefile=None,
                   text=None, customtext='', xgap=0.01, ygap=0.0,
                   zgap=0.1, numpanels=1, rewriteModulefile=True,
                   glass=False, modulematerial=None, bifi=1, **kwargs):
        """
        Pass module generation details into ModuleObj(). See the ModuleObj()
        docstring for more details.
        """
        from bifacial_radiance import ModuleObj

        if name is None:
            print("usage:  makeModule(name, x, y, z, modulefile='\objects\*.rad', "
                  "zgap=0.1 (module offset), "
                  "numpanels=1 (# of panels in portrait), ygap=0.05 "
                  "(slope distance between panels when arrayed), "
                  "rewriteModulefile=True (or False), bifi=1)")
            print("You can also override module_type info by passing the 'text' "
                  "variable, or add on at the end for racking details with "
                  "'customtext'. See function definition for more details")
            print("Optional: tubeParams={} (torque tube details including "
                  "diameter (torque tube dia. in meters), tubetype='Round' "
                  "(or 'square', 'hex'), material='Metal_Grey' (or 'black')"
                  ", axisofrotation=True (does scene rotate around tube))")
            print("Optional: cellModule={} (create cell-level module by "
                  "passing in a dictionary with keys 'numcellsx' (# cells in "
                  "X-dir.), 'numcellsy', 'xcell' (cell size in X-dir. in meters), "
                  "'ycell', 'xcellgap' (spacing between cells in X-dir.), 'ycellgap')")
            print("Optional: omegaParams={} (create the omega support structure by "
                  "passing in a dictionary with keys 'omega_material' (the material of "
                  "the omega), 'mod_overlap' (the length of the module-adjacent piece of "
                  "omega that overlaps with the module), 'x_omega1', 'y_omega' (ideally the same "
                  "for all the parts of omega), 'z_omega1', 'x_omega2' (X-dir length of the "
                  "vertical piece), 'x_omega3', 'z_omega3')")

            return

        # TODO: check for deprecated torquetube and axisofrotationTorqueTube in
        # kwargs.
        if 'tubeParams' in kwargs:
            tubeParams = kwargs.pop('tubeParams')
        else:
            tubeParams = None
        if 'torquetube' in kwargs:
            torquetube = kwargs.pop('torquetube')
            print("\nWarning: boolean input `torquetube` passed into makeModule"
                  ". Starting in v0.4.0 this boolean parameter is deprecated."
                  " Use module.addTorquetube() with `visible` parameter instead.")
            if tubeParams:
                tubeParams['visible'] = torquetube
            elif (tubeParams is None) & (torquetube is True):
                tubeParams = {'visible': True}  # create default torque tube

        if 'axisofrotationTorqueTube' in kwargs:
            axisofrotation = kwargs.pop('axisofrotationTorqueTube')
            print("\nWarning: boolean input `axisofrotationTorqueTube` passed "
                  "into makeModule. Starting in v0.4.0 this boolean parameter is"
                  " deprecated. Use module.addTorquetube() with `axisofrotation` "
                  "parameter instead.")
            if tubeParams:  # this kwarg only does something if there's a torque tube.
                tubeParams['axisofrotation'] = axisofrotation

        if self.hpc:  # trigger HPC simulation in ModuleObj
            kwargs['hpc'] = True

        self.module = ModuleObj(name=name, x=x, y=y, z=z, bifi=bifi, modulefile=modulefile,
                                text=text, customtext=customtext, xgap=xgap, ygap=ygap,
                                zgap=zgap, numpanels=numpanels,
                                rewriteModulefile=rewriteModulefile, glass=glass,
                                modulematerial=modulematerial, tubeParams=tubeParams,
                                **kwargs)
        return self.module

    def makeCustomObject(self, name=None, text=None):
        """
        Function for development and experimenting with extraneous objects in the scene.
        This function creates a `name.rad` textfile in the objects folder
        with whatever text is passed to it.
        It is up to the user to pass correct Radiance format.

        For example, to create a box at coordinates 0,0 (with its bottom surface
        on the plane z=0):

        .. code-block:

            name = 'box'
            text='! genbox black PVmodule 0.5 0.5 0.5 | xform -t -0.25 -0.25 0'

        Parameters
        ----------
        name : str
            String input to name the module type
        text : str
            Text used in the radfile to generate the module

        """

        customradfile = os.path.join('objects', '%s.rad' % (name))  # update in 0.2.3 to shorten radnames
        # py2 and 3 compatible: binary write, encode text first
        with open(customradfile, 'wb') as f:
            f.write(text.encode('ascii'))

        print("\nCustom Object Name", customradfile)
        return customradfile

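The write performed by makeCustomObject can be sketched standalone; this uses a temporary directory in place of the simulation folder's `objects/` directory, with the docstring's genbox example as the payload:

```python
import os
import tempfile

# the docstring's genbox example: a 0.5 m cube centered at the origin
name = 'box'
text = '! genbox black PVmodule 0.5 0.5 0.5 | xform -t -0.25 -0.25 0'

objdir = tempfile.mkdtemp()  # stand-in for the sim folder's objects/ directory
customradfile = os.path.join(objdir, '%s.rad' % name)
with open(customradfile, 'wb') as f:   # binary write, ascii-encoded,
    f.write(text.encode('ascii'))      # matching makeCustomObject()

with open(customradfile) as f:
    saved = f.read()
print(saved)
```

The returned path is what gets handed to appendtoScene() (or a `customtext` kwarg) together with any `!xform` transformations.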
    def printModules(self):
        # print available module types from ModuleObj
        from bifacial_radiance import ModuleObj
        modulenames = ModuleObj().readModule()
        print('Available module names: {}'.format([str(x) for x in modulenames]))
        return modulenames

    def makeScene(self, module=None, sceneDict=None, radname=None,
                  customtext=None, append=False,
                  moduletype=None, appendtoScene=None):
        """
        Create a SceneObj which contains details of the PV system configuration including
        tilt, row pitch, height, nMods per row, and nRows in the system. Append to
        the self.scenes list.

        Parameters
        ----------
        module : str or ModuleObj
            String name of module created with makeModule()
        sceneDict : dictionary
            Dictionary with keys: `tilt`, `clearance_height`*, `pitch`,
            `azimuth`, `nMods`, `nRows`, `hub_height`*, `height`*
            * `height` is deprecated from sceneDict. For makeScene (fixed systems),
            if passed it is assumed to refer to clearance_height.
            `clearance_height` is recommended for fixed-tilt systems;
            `hub_height` can also be passed.
        radname : str
            Gives a custom name to the scene file. Useful when parallelizing.
        customtext : str
            Appends to the scene a custom text pointing to a custom object
            created by the user; the text should start with the rad
            file path and name, followed by any other geometry transformations
            native to Radiance that are necessary.
        append : bool, default False
            If multiple scenes exist (makeScene called multiple times), either
            overwrite the existing scene (default) or append a new SceneObj to
            self.scenes
        moduletype : DEPRECATED. Use the `module` kwarg instead.
        appendtoScene : DEPRECATED. Use the `customtext` kwarg instead.


        Returns
        -------
        SceneObj
            'scene' with configuration details

        """
        if appendtoScene is not None:
            customtext = appendtoScene
            print("Warning:  input `appendtoScene` is deprecated. Use kwarg "
                  "`customtext` instead")
        if moduletype is not None:
            module = moduletype
            print("Warning:  input `moduletype` is deprecated. Use kwarg "
                  "`module` instead")
        if module is None:
            try:
                module = self.module
                print(f'Using last saved module, name: {module.name}')
            except AttributeError:
                print('makeScene(module, sceneDict, nMods, nRows).  ' +
                      'Available moduletypes: ')
                self.printModules()  # print available module types
                return
        scene = SceneObj(module, hpc=self.hpc, name=f'Scene{len(self.scenes)}')
        if len(self.scenes) >= 1:
            print(f"Additional scene {scene.name} created! See list of names with RadianceObj.scenes and sceneNames")

        if sceneDict is None:
            print('makeScene(moduletype, sceneDict, nMods, nRows).  ' +
                  'sceneDict inputs: .tilt .clearance_height .pitch .azimuth')
            self.scenes.append(scene)
            return scene

        if 'azimuth' not in sceneDict:
            sceneDict['azimuth'] = 180

        if 'nRows' not in sceneDict:
            sceneDict['nRows'] = 7

        if 'nMods' not in sceneDict:
            sceneDict['nMods'] = 20

        # Fixed tilt routine.
        # Preferred: clearance_height.
        # If only height is passed, it is assumed to be clearance_height.
        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict,
                                                              preferred='clearance_height',
                                                              nonpreferred='hub_height')

        sceneRAD = scene._makeSceneNxR(sceneDict=sceneDict,
                                       radname=radname)

        # TODO: deprecate this section in favor of multiple sceneObjs?
        # This functionality allows additional radfiles to be added to the same
        # sceneObj, so it's somewhat distinct from making new sceneObjs...
        if 'appendRadfile' not in sceneDict:
            appendRadfile = False
        else:
            appendRadfile = sceneDict['appendRadfile']

        if appendRadfile:
            debug = False
            try:
                scene.radfiles.append(sceneRAD)
                if debug:
                    print("Radfile APPENDED!")
            except AttributeError:
                # TODO: Manage situation where radfile was created with
                # appendRadfile set to False first..
                scene.radfiles = []
                scene.radfiles.append(sceneRAD)
                if debug:
                    print("Radfile APPENDAGE created!")
        else:
            scene.radfiles = [sceneRAD]

        if customtext is not None:
            self.appendtoScene(radfile=scene.radfiles[0], customObject=customtext)

        # default behavior: overwrite (backwards-compatible behavior).
        if append:
            self.scenes.append(scene)
        else:
            self.scenes = [scene]
        return scene

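`_heightCasesSwitcher` normalizes which height key ends up in the sceneDict (fixed-tilt prefers `clearance_height`; 1-axis prefers `hub_height`; a deprecated bare `height` is folded into the preferred key). Its actual implementation is not shown in this file, so the following is a hypothetical reimplementation of that preference logic, for illustration only:

```python
# Hypothetical sketch of the height-preference logic; the real
# _heightCasesSwitcher() in bifacial_radiance may behave differently.
def pick_height(sceneDict, preferred, nonpreferred):
    d = dict(sceneDict)   # don't mutate the caller's dict
    if 'height' in d:     # deprecated key: treat it as the preferred height
        d[preferred] = d.pop('height')
    if preferred in d:
        return d, preferred == 'clearance_height'
    if nonpreferred in d:
        return d, nonpreferred == 'clearance_height'
    raise KeyError('sceneDict needs clearance_height, hub_height or height')

# fixed-tilt call site: clearance_height preferred, so a deprecated
# 'height' entry is reinterpreted as clearance_height
d, use_clearance = pick_height({'tilt': 20, 'height': 0.8, 'pitch': 5},
                               preferred='clearance_height',
                               nonpreferred='hub_height')
```

The boolean mirrors `use_clearanceheight` above: it tells the caller whether the cosine correction from hub height to ground clearance still needs to be applied.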
    def appendtoScene(self, radfile=None, customObject=None, text=''):
        """
        Appends to the scene radfile in the `\objects` folder a text command in
        Radiance lingo created by the user.
        Useful when adding a makeCustomObject() result to the scene.

        DEPRECATED: use the identical version in SceneObj instead.

        Parameters
        ----------
        radfile : str
            Directory and name of where the .rad scene file is stored
        customObject : str
            Directory and name of where the custom object .rad file is stored,
            plus any geometry modifications needed for it.
        text : str
            Command to be appended to the radfile which specifies its position
            in the scene. Do not leave empty spaces at the end.

        Returns
        -------
        Nothing; the radfile must already be created and assigned when running this.

        """
        warnings.warn('RadObj.appendtoScene is deprecated.  Use the equivalent'
                      ' functionality in SceneObj.appendtoScene.', DeprecationWarning)
        text2 = '\n!xform -rx 0 ' + text + ' ' + customObject

        debug = False
        if debug:
            print(text2)

        with open(radfile, 'a+') as f:
            f.write(text2)

    def makeScene1axis(self, trackerdict=None, module=None, sceneDict=None,
2✔
2479
                       cumulativesky=None, customtext=None, append=False, 
2480
                       moduletype=None, appendtoScene=None):
2481
        """
2482
        Creates a SceneObj for each tracking angle which contains details of the PV
2483
        system configuration including row pitch, hub_height, nMods per row, nRows in the system...
2484

2485
        Parameters
2486
        ------------
2487
        trackerdict
2488
            Output from GenCumSky1axis
2489
        module : str or ModuleObj
2490
            Name or ModuleObj created with makeModule()
2491
        sceneDict : 
2492
            Dictionary with keys:`tilt`, `hub_height`, `pitch`, `azimuth`
2493
        cumulativesky : bool
2494
            Defines if sky will be generated with cumulativesky or gendaylit.
2495
        customtext : str
2496
            Appends to each scene a custom text pointing to a custom object
2497
            created by the user; format of the text should start with the rad 
2498
            file path and name, and then any other geometry transformations 
2499
            native to Radiance necessary. e.g '!xform -rz 90 '+self.makeCustomObject()
2500
        append : bool, default False
2501
            If multiple scenes exist (makeScene called multiple times), either 
2502
            overwrite the existing scene (default) or append a new SceneObj to
2503
            self.scenes
2504
        moduletype: DEPRECATED. use the `module` kwarg instead.
2505
        appendtoScene : DEPRECATED. use the `customtext` kwarg instead
2506
            
2507
        Returns
2508
        --------
2509
        trackerdict 
2510
            Append the following keys
2511
                'scene'
2512
                    SceneObj for each tracker theta
2513
                'clearance_height'
2514
                    Calculated ground clearance based on
2515
                    `hub height`, `tilt` angle and overall collector width `sceney`
2516
                
2517
        """
2518
        
2519
        import math, copy
2✔
2520

2521
        if sceneDict is None:
2✔
2522
            print('usage: makeScene1axis(module, sceneDict, nMods, nRows).'+
×
2523
                  'sceneDict inputs: .hub_height .azimuth .nMods .nRows'+
2524
                  'and .pitch or .gcr')
2525
            return
×
2526
        
2527
        if appendtoScene is not None: #kwarg is deprecated.
2✔
2528
            customtext = appendtoScene
×
2529
            warnings.warn("Warning:  input `appendtoScene` is deprecated. Use kwarg "
×
2530
                  "`customtext` instead", DeprecationWarning)
2531
        # If no nRows or nMods assigned on deprecated variable or dictionary,
2532
        # assign default.
2533
        if 'nRows' not in sceneDict:
2✔
2534
            sceneDict['nRows'] = 7
2✔
2535
        if 'nMods' not in sceneDict:
2✔
2536
            sceneDict['nMods'] = 20
2✔
2537

2538
        if trackerdict is None:
2✔
2539
            try:
2✔
2540
                trackerdict = self.trackerdict
2✔
2541
            except AttributeError:
×
2542
                print('No trackerdict value passed or available in self')
×
2543

2544
        if cumulativesky is None:
2✔
2545
            try:
2✔
2546
                # see if cumulativesky = False was set earlier,
2547
                # e.g. in RadianceObj.set1axis
2548
                cumulativesky = self.cumulativesky
2✔
2549
            except AttributeError:
×
2550
                # default cumulativesky = true to maintain backward compatibility.
2551
                cumulativesky = True
×
2552

2553

2554
        if moduletype is not None:
2✔
2555
            module = moduletype
×
2556
            print("Warning:  input `moduletype` is deprecated. Use kwarg "
×
2557
                  "`module` instead")
2558
        if module is None:
2✔
2559
            try:
2✔
2560
                module = self.module
2✔
2561
                print(f'Using last saved module, name: {module.name}')
2✔
2562
            except AttributeError:
×
2563
                print('usage:  makeScene1axis(trackerdict, module, '+
×
2564
                      'sceneDict, nMods, nRows). ')
2565
                self.printModules() #print available module types
×
2566
                return
×
2567

2568
        if 'orientation' in sceneDict:
2✔
2569
            raise Exception('\n\n ERROR: Orientation format has been '
×
2570
                'deprecated since version 0.2.4. If you want to flip your '
2571
                'modules, on makeModule switch the x and y values.\n\n')
2572
       
2573
        # 1axis routine
2574
        # Preferred hub_height
2575
        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict, 
2✔
2576
                                                        preferred='hub_height', 
2577
                                                        nonpreferred='clearance_height')
2578

2579
        if use_clearanceheight:
2✔
2580
            simplefix = 0
2✔
2581
            hubheight = sceneDict['clearance_height'] # Not really, but this is the fastest 
2✔
2582
            # to make it work with the simplefix as below the actual clearnace height
2583
            # gets calculated and the 0 sets the cosine correction to 0. 
2584
            # TODO CLEAN THIS UP.
2585
            
2586
        else:
2587
            #the hub height is the tracker height at center of rotation.
2588
            hubheight = sceneDict['hub_height']
2✔
2589
            simplefix = 1
2✔
2590

2591
        # we no longer need sceneDict['hub_height'] - it'll be replaced by 'clearance_height' below
2592
        sceneDict.pop('hub_height',None)
2✔
2593
        if cumulativesky is True:        # cumulativesky workflow
2✔
2594
            print('\nMaking .rad files for cumulativesky 1-axis workflow')
2✔
2595
            for theta in trackerdict:
2✔
2596
                scene = SceneObj(module, hpc=self.hpc)
2✔
2597
                if trackerdict[theta]['surf_azm'] >= 180:
2✔
2598
                    trackerdict[theta]['surf_azm'] = trackerdict[theta]['surf_azm']-180
2✔
2599
                    trackerdict[theta]['surf_tilt'] = trackerdict[theta]['surf_tilt']*-1
2✔
2600
                radname = '1axis%s_'%(theta,)
2✔
2601

2602
                # Calculating clearance height for this theta.
2603
                height = hubheight - simplefix*0.5* math.sin(abs(theta) * math.pi / 180) \
2✔
2604
                        * scene.module.sceney + scene.module.offsetfromaxis \
2605
                        * math.sin(abs(theta)*math.pi/180)
2606
                # Calculate the ground clearance height based on the hub height. Add abs(theta) to avoid negative tilt angle errors
2607
                #trackerdict[theta]['clearance_height'] = height
2608

2609

2610
                try:
2✔
2611
                    sceneDict.update({'tilt' : trackerdict[theta]['surf_tilt'],
2✔
2612
                                     'clearance_height' :  height,
2613
                                     'azimuth' : trackerdict[theta]['surf_azm'],
2614
                                     'modulez' :  scene.module.z})
2615
                    
2616
                    # sceneDict2 = {'tilt':trackerdict[theta]['surf_tilt'],
2617
                    #                'pitch':sceneDict['pitch'],
2618
                    #                'clearance_height':height,
2619
                    #                'azimuth':trackerdict[theta]['surf_azm'],
2620
                    #                'nMods': sceneDict['nMods'],
2621
                    #                'nRows': sceneDict['nRows'],
2622
                    #                'modulez': scene.module.z}
2623
                except KeyError as err:
×
2624
                    #maybe gcr is passed, not pitch
2625
                    # sceneDict2 = {'tilt':trackerdict[theta]['surf_tilt'],
2626
                    #               'gcr':sceneDict['gcr'],
2627
                    #               'clearance_height':height,
2628
                    #               'azimuth':trackerdict[theta]['surf_azm'],
2629
                    #               'nMods': sceneDict['nMods'],
2630
                    #               'nRows': sceneDict['nRows'],
2631
                    #               'modulez': scene.module.z}
2632
                    raise err
×
2633

2634
                # if sceneDict isn't copied, it will change inside the SceneObj since dicts are mutable!
2635
                radfile = scene._makeSceneNxR(sceneDict=(sceneDict),
2✔
2636
                                             radname=radname)
2637
                #trackerdict[theta]['radfile'] = radfile
2638
                # TODO: determine radfiles dynamically from scenes
2639
                try:
2✔
2640
                    name=f"Scene{trackerdict[theta]['scenes'].__len__()}"
2✔
2641
                    scene.name = name
2✔
2642
                    if customtext is not None:
2✔
2643
                        scene.appendtoScene(customObject = customtext)
2✔
2644

2645
                    if append:
2✔
2646
                        trackerdict[theta]['scenes'].append(scene)
2✔
2647
                    else:
2648
                        trackerdict[theta]['scenes'] = [scene]
2✔
2649
                except KeyError: #either KeyError or maybe IndexError?  
2✔
2650
                    trackerdict[theta]['scenes'] = [scene]
2✔
2651

2652
            print('{} Radfiles created in /objects/'.format(trackerdict.__len__()))
2✔
2653

2654
        else:  #gendaylit workflow
2655
            print('\nMaking ~%s .rad files for gendaylit 1-axis workflow (this takes a minute..)' % (len(trackerdict)))
2✔
2656
            count = 0
2✔
2657
            for time in trackerdict:
2✔
2658
                scene = SceneObj(module, hpc=self.hpc)
2✔
2659

2660
                if trackerdict[time]['surf_azm'] >= 180:
2✔
2661
                    trackerdict[time]['surf_azm'] = trackerdict[time]['surf_azm']-180
×
2662
                    trackerdict[time]['surf_tilt'] = trackerdict[time]['surf_tilt']*-1
×
2663
                theta = trackerdict[time]['theta']
2✔
2664
                radname = '1axis%s_'%(time,)
2✔
2665

2666
                # Calculating clearance height for this time.
2667
                height = hubheight - simplefix*0.5* math.sin(abs(theta) * math.pi / 180) \
2✔
2668
                        * scene.module.sceney + scene.module.offsetfromaxis \
2669
                        * math.sin(abs(theta)*math.pi/180)
2670

                if trackerdict[time]['ghi'] > 0:

                    try:
                        sceneDict.update({'tilt': trackerdict[time]['surf_tilt'],
                                          'clearance_height': height,
                                          'azimuth': trackerdict[time]['surf_azm'],
                                          'modulez': scene.module.z})
                    except KeyError as err:
                        # maybe gcr was passed instead of pitch
                        raise err
                    # if sceneDict isn't copied, it will change inside the SceneObj since dicts are mutable!
                    radfile = scene._makeSceneNxR(sceneDict=sceneDict,
                                                  radname=radname)

                    if customtext is not None:
                        scene.appendtoScene(customObject=customtext)

                    if ('scenes' in trackerdict[time]) and append:
                        scene.name = f"Scene{len(trackerdict[time]['scenes'])}"
                        trackerdict[time]['scenes'].append(scene)
                    else:
                        scene.name = "Scene0"
                        trackerdict[time]['scenes'] = [scene]

                    count += 1
            print('{} Radfiles created in /objects/'.format(count))

        self.trackerdict = trackerdict
        self.hub_height = hubheight

        return trackerdict
    def analysis1axis(self, trackerdict=None, singleindex=None, accuracy='low',
                      customname=None, modWanted=None, rowWanted=None,
                      sensorsy=9, sensorsx=1,
                      modscanfront=None, modscanback=None, relative=False,
                      debug=False, sceneNum=0, append=True):
        """
        Loop through trackerdict and run linescans for each scene and scan in it.
        If multiple scenes exist in the trackerdict, only ONE scene can be
        analyzed at a time.
        Todo: how to run calculateResults with an array of multiple results

        Parameters
        ----------------
        trackerdict
        singleindex : str
            For single-index mode, just the one index we want to run (new in 0.2.3).
            Example format '21_06_14_12_30' for 2021 June 14th 12:30 pm
        accuracy : str
            'low' or 'high', resolution option used during _irrPlot and rtrace
        customname : str
            Custom text string to be added to the file name for the results .CSV files
        modWanted : int or list
            Module to be sampled. Index starts at 1.
        rowWanted : int or list
            Row to be sampled. Index starts at 1. (row 1)
        sensorsy : int or list
            Number of 'sensors' or scanning points along the collector width
            (CW) of the module(s). If two values are passed, the first is the
            number of front sensors and the second the number of back sensors.
        sensorsx : int or list
            Number of 'sensors' or scanning points along the module length, the
            side perpendicular to the collector width (CW). If two values are
            passed, the first is the number of front sensors and the second the
            number of back sensors.
        modscanfront : dict
            Dictionary with one or more of the following keys: xstart, ystart,
            zstart, xinc, yinc, zinc, Nx, Ny, Nz, orient. All of these keys are
            ints or floats except for 'orient', which takes x y z values as a
            string 'x y z', for example '0 0 -1'. These values overwrite the
            internally calculated frontscan dictionary for the module & row
            selected. If modifying Nx, Ny or Nz, make sure to modify them in
            modscanback as well to avoid issues at the results-writing stage.
        modscanback : dict
            Dictionary with one or more of the following keys: xstart, ystart,
            zstart, xinc, yinc, zinc, Nx, Ny, Nz, orient. All of these keys are
            ints or floats except for 'orient', which takes x y z values as a
            string 'x y z', for example '0 0 -1'. These values overwrite the
            internally calculated backscan dictionary for the module & row
            selected. If modifying Nx, Ny or Nz, make sure to modify them in
            modscanfront as well to avoid issues at the results-writing stage.
        relative : Bool
            If passing modscanfront and modscanback to modify the dictionaries
            of positions, this sets whether the values passed are relative or
            absolute. Default is absolute (relative=False).
        debug : Bool
            Activates internal printing of the function to help debugging.
        sceneNum : int
            Index of the scene number in the list of scenes per trackerdict. Default 0
        append : Bool (default True)
            Append trackerdict['AnalysisObj'] to the list. Otherwise over-write
            any AnalysisObj's and start 1axis analysis from scratch.

        Returns
        -------
        trackerdict is returned with :py:class:`bifacial_radiance.AnalysisObj`
            for each timestamp:

        trackerdict.key.'AnalysisObj'  : analysis object for this tracker theta
            to get a dictionary of results, run :py:class:`bifacial_radiance.AnalysisObj`.getResults
        :py:class:`bifacial_radiance.AnalysisObj`.getResults returns the following keys:
            'Wm2Front'     : np.array of front Wm-2 irradiances, len=sensorsy_front
            'Wm2Back'      : np.array of rear Wm-2 irradiances, len=sensorsy_back
            'backRatio'    : np.array of rear irradiance ratios, len=sensorsy_back

        """
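The `singleindex` strings documented above follow a fixed timestamp pattern. A quick sketch of building one with the standard library — the strftime pattern is inferred from the '21_06_14_12_30' example in the docstring, not taken from the library's API:

```python
from datetime import datetime

# '21_06_14_12_30' corresponds to 2021 June 14th, 12:30 pm
idx = datetime(2021, 6, 14, 12, 30).strftime('%y_%m_%d_%H_%M')
print(idx)  # 21_06_14_12_30
```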

        import warnings, itertools

        if customname is None:
            customname = ''

        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')

        if not append:
            warnings.warn('Append=False. Over-writing any existing `AnalysisObj` in trackerdict.')
            for key in trackerdict:
                trackerdict[key]['AnalysisObj'] = []

        if singleindex is None:  # run over all values in trackerdict
            trackerkeys = sorted(trackerdict.keys())
        else:                    # run in single-index mode
            trackerkeys = [singleindex]

        if modWanted is None:
            modWanted = round(trackerdict[trackerkeys[0]]['scenes'][sceneNum].sceneDict['nMods'] / 1.99)
        if rowWanted is None:
            rowWanted = round(trackerdict[trackerkeys[0]]['scenes'][sceneNum].sceneDict['nRows'] / 1.99)
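The `round(n / 1.99)` defaults above pick the 1-indexed module and row closest to the array center, rounding toward the upper half for odd counts. A standalone check of that behavior (the helper name is illustrative):

```python
def center_index(n):
    # Default modWanted/rowWanted: 1-indexed element nearest the array center.
    return round(n / 1.99)

print(center_index(20))  # 10
print(center_index(7))   # 4
print(center_index(1))   # 1
```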

        for index in trackerkeys:   # either full list of trackerdict keys, or single index
            octfile = trackerdict[index]['octfile']
            scene = trackerdict[index]['scenes'][sceneNum]
            name = '1axis_%s%s_%s' % (index, customname, scene.name)
            if not trackerdict[index].get('AnalysisObj'):
                trackerdict[index]['AnalysisObj'] = []
            if octfile is None:
                continue  # don't run analysis if the octfile is None
            # loop over rowWanted and modWanted; listify them first
            if not isinstance(rowWanted, list):
                rowWanted = [rowWanted]
            if not isinstance(modWanted, list):
                modWanted = [modWanted]

            row_mod_pairs = list(itertools.product(rowWanted, modWanted))
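`itertools.product` above expands the requested rows and modules into every (row, module) combination, so each gets its own linescan. For example:

```python
import itertools

rowWanted, modWanted = [2, 3], [5, 6]
row_mod_pairs = list(itertools.product(rowWanted, modWanted))
print(row_mod_pairs)  # [(2, 5), (2, 6), (3, 5), (3, 6)]
```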
            for (r, m) in row_mod_pairs:
                try:  # look for missing data
                    analysis = AnalysisObj(octfile, name)
                    analysis.sceneNum = sceneNum
                    frontscanind, backscanind = analysis.moduleAnalysis(scene=scene, modWanted=m,
                                                    rowWanted=r,
                                                    sensorsy=sensorsy,
                                                    sensorsx=sensorsx,
                                                    modscanfront=modscanfront, modscanback=modscanback,
                                                    relative=relative, debug=debug)
                    analysis.analysis(octfile=octfile, name=name, frontscan=frontscanind,
                                      backscan=backscanind, accuracy=accuracy)
                    trackerdict[index]['AnalysisObj'].append(analysis)
                except Exception as e:  # problem with file. TODO: only catch specific error types here.
                    warnings.warn('Index: {}. Problem with file. Error: {}. Skipping'.format(index, e), Warning)
                    return

                try:
                    print('Index: {}. Wm2Front: {}. Wm2Back: {}'.format(index,
                        np.mean(analysis.Wm2Front), np.mean(analysis.Wm2Back)))
                except AttributeError:  # no Wm2Front
                    warnings.warn('AnalysisObj not successful.')

        self.trackerdict = trackerdict
        return trackerdict

    def analysis1axisground(self, trackerdict=None, singleindex=None, accuracy='low',
                      customname=None, modWanted=None, rowWanted=None, sensorsground=None,
                      sensorsgroundx=1, sceneNum=0, append=True):
        """
        Uses :py:class:`bifacial_radiance.AnalysisObj`.groundAnalysis to run a
        single ground scan along the entire row-to-row pitch.

        Parameters
        ----------
        trackerdict : optional
        singleindex : str
            For single-index mode, just the one index we want to run (new in 0.2.3).
            Example format '21_06_14_12_30' for 2021 June 14th 12:30 pm
        accuracy : str
            'low' (default) or 'high', resolution option used during _irrPlot and rtrace
        customname : str
            Custom text string to be added to the file name for the results .CSV files
        modWanted : int
            Module to be sampled. Index starts at 1.
        rowWanted : int
            Row to be sampled. Index starts at 1. (row 1)
        sensorsground : int (default None)
            Number of scan points along the scene pitch. Default is one point every 20 cm.
        sensorsgroundx : int (default 1)
            Number of scans in the x dimension
        sceneNum : int
            Index of the scene number in the list of scenes per trackerdict. Default 0
        append : Bool (default True)
            Append trackerdict['AnalysisObj'] to the list. Otherwise over-write
            any AnalysisObj's and start 1axis analysis from scratch.

        Returns
        -------
        trackerdict is returned with :py:class:`bifacial_radiance.AnalysisObj`
            for each timestamp:

        trackerdict.key.'AnalysisObj'  : analysis object for this tracker theta
            to get a dictionary of results, run :py:class:`bifacial_radiance.AnalysisObj`.getResults
        :py:class:`bifacial_radiance.AnalysisObj`.getResults returns the following keys:
            'Wm2Ground'      : np.array of Wm-2 irradiances along the ground, len=sensorsground
            'sensorsground'  : int, number of ground scan points

        """
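A sketch of the default ground-scan density described above: one point roughly every 20 cm across the row-to-row pitch. The helper name and exact rounding are assumptions for illustration, not the library's internal code:

```python
def default_sensorsground(pitch_m, spacing_m=0.20):
    # Illustrative assumption: ~20 cm scan points across one row-to-row pitch,
    # with at least one sensor for very small pitches.
    return max(1, round(pitch_m / spacing_m))

print(default_sensorsground(10.0))  # 50
print(default_sensorsground(0.1))   # 1
```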

        import warnings

        if customname is None:
            customname = ''

        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')

        if not append:
            warnings.warn('Append=False. Over-writing any existing `AnalysisObj` in trackerdict.')
            for key in trackerdict:
                trackerdict[key]['AnalysisObj'] = []

        if singleindex is None:  # run over all values in trackerdict
            trackerkeys = sorted(trackerdict.keys())
        else:                    # run in single-index mode
            trackerkeys = [singleindex]

        for index in trackerkeys:   # either full list of trackerdict keys, or single index
            octfile = trackerdict[index]['octfile']
            scene = trackerdict[index]['scenes'][sceneNum]
            name = '1axis_groundscan_%s%s' % (index, customname)
            trackerdict[index]['Results'] = []
            if octfile is None:
                continue  # don't run analysis if the octfile is None

            try:  # look for missing data
                analysis = AnalysisObj(octfile, name)
                analysis.sceneNum = sceneNum
                groundscanid = analysis.groundAnalysis(scene=scene, modWanted=modWanted,
                                                       rowWanted=rowWanted,
                                                       sensorsground=sensorsground)
                analysis.analysis(octfile=octfile, name=name,
                                  frontscan=groundscanid, accuracy=accuracy)
                # push Wm2Ground and sensorsground into the AnalysisObj
                analysis.Wm2Ground = analysis.Wm2Front
                del analysis.Wm2Front
                analysis.sensorsground = len(analysis.Wm2Ground)
                trackerdict[index]['AnalysisObj'].append(analysis)
            except Exception as e:  # problem with file. TODO: only catch specific error types here.
                warnings.warn('Index: {}. Problem with file. Error: {}. Skipping'.format(index, e), Warning)
                return

            try:
                print('Index: {}. Wm2Ground: {}. sensorsground: {}'.format(index,
                    np.mean(analysis.Wm2Ground), sensorsground))
            except AttributeError:  # no Wm2Ground
                warnings.warn('AnalysisObj not successful.')
        return trackerdict

    def calculateResults1axis(self, trackerdict=None, module=None,
                             CECMod2=None, agriPV=False):
            '''
            Loops through all results in trackerdict and calculates performance,
            considering electrical mismatch, using PVLib. Cell temperature is
            calculated from the meteorological data (air temperature, wind
            speed) stored in the trackerdict.

            Parameters
            ----------
            module : ModuleObj from scene.module
                It's best to set this in advance in the ModuleObj.
                If passed in here, it overrides the value that may be set in the
                trackerdict already.
            CECMod2 : dict
                Dictionary with CEC Module Parameters for a monofacial module. If None,
                the same module as CECMod is used for the BGE calculations, but just
                using the front irradiance (Gfront).

            Returns
            -------
            trackerdict
                Trackerdict with new entries for each key of irradiance and power
                output for the module.
                POA_eff: mean of [(mean of clean Gfront) + clean Grear * bifaciality factor]
                Gfront_mean: mean of clean Gfront
                Grear_mean: mean of clean Grear
                Mismatch: mismatch calculated from the MAD distribution of
                          POA_total
                Pout_raw: power output calculated from POA_total; considers
                      wind speed and temp_amb if in trackerdict.
                Pout: power output considering electrical mismatch
            '''
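The POA_eff entry documented above combines front and rear irradiance through the module's bifaciality factor. A worked sketch with made-up irradiance values (all numbers hypothetical):

```python
Gfront_mean = 900.0      # W/m2, mean of cleaned front scan (hypothetical)
Grear_mean = 120.0       # W/m2, mean of cleaned rear scan (hypothetical)
bifacialityfactor = 0.7  # rear-to-front efficiency ratio (hypothetical)

# Effective plane-of-array irradiance: front plus bifaciality-weighted rear.
POA_eff = round(Gfront_mean + Grear_mean * bifacialityfactor, 3)
print(POA_eff)  # 984.0
```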
            
3023
            from bifacial_radiance import performance
2✔
3024
            import pandas as pd
2✔
3025
            
3026
            if trackerdict is None:
2✔
3027
                trackerdict = self.trackerdict
2✔
3028

3029
            keys = list(trackerdict.keys())
2✔
3030
            
3031
            def _trackerMeteo(tracker_item):
2✔
3032
                keylist = ['dni', 'ghi', 'dhi', 'temp_air', 'wind_speed' ]
2✔
3033
                return {k: v for k, v in tracker_item.items() if k in keylist}
2✔
3034
                
3035
            def _printRow(analysisobj, key):
2✔
3036
                if self.cumulativesky:
2✔
3037
                    keyname = 'theta'
2✔
3038
                else:
3039
                    keyname = 'timestamp'
2✔
3040
                return pd.concat([pd.DataFrame({keyname:key},index=[0]),
2✔
3041
                                 analysisobj.getResults(),
3042
                                 analysisobj.power_data 
3043
                                 ], axis=1)
3044
        
3045
                
3046

3047
            # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!1``34
3048
            # TODO IMPORTANT: ADD CUMULATIVE CHEck AND WHOLE OTHER PROCESSING OPTION
3049
            # TO EMULATE WHAT HAPPENED BEFORE WITH GENCUMSKY1AXIS when trackerdict = cumulative = True
3050
            # if cumulative:
3051
            #    print("Add HERE gencusky1axis results for each tracekr angle")
3052

3053
            #else:
3054
            # loop over module and row values in 'Results'
3055
            keys_all = []
2✔
3056
            self.CompiledResults = pd.DataFrame(None)
2✔
3057

3058
            if not self.cumulativesky:
2✔
3059
                                
3060
                for key in keys:
2✔
3061
            
3062
                    meteo_data = _trackerMeteo(trackerdict[key])
2✔
3063
                    
3064

3065
                    
3066
                    
3067
                    try:
2✔
3068
                        for analysis in trackerdict[key]['AnalysisObj']: # loop over multiple row & module in trackerDict['AnalysisObj']
2✔
3069
                            keys_all.append(key)
2✔
3070
                            # Search for module object 
3071
                            if module is None:
2✔
3072
                                module_local = trackerdict[key]['scenes'][analysis.sceneNum].module
2✔
3073
                            else:
3074
                                module_local = None
×
3075
                            power_data = analysis.calc_performance(meteo_data=meteo_data, 
2✔
3076
                                                          module=module_local,
3077
                                                          cumulativesky=self.cumulativesky,   
3078
                                                           CECMod2=CECMod2, 
3079
                                                          agriPV=agriPV)
3080
                            self.CompiledResults = pd.concat([self.CompiledResults, 
2✔
3081
                                                              _printRow(analysis, key)], ignore_index=True)
3082
                    except KeyError:
×
3083
                        pass
×
3084
                    
3085
                            
3086
                        
3087
                
3088
            else:
3089
                # TODO HERE: SUM all keys for rows that have the same rowWanted/modWanted
3090
                for key in keys:
2✔
3091
                    try:
2✔
3092
                        for analysis in trackerdict[key]['AnalysisObj']: # loop over multiple row & module in trackerDict['AnalysisObj']
2✔
3093
                            keys_all.append(key)
2✔
3094
                            self.CompiledResults = pd.concat([self.CompiledResults, 
2✔
3095
                                      _printRow(analysis, key)], ignore_index=True)
3096
                    except KeyError:
×
3097
                        pass
×
3098
           
3099

3100
                self.CompiledResults = performance.calculateResultsGencumsky1axis(results=self.CompiledResults,
2✔
3101
                                           bifacialityfactor=1.0,
3102
                                           fillcleanedSensors=True, agriPV=False)
3103
                
3104
                self.CompiledResults.to_csv(os.path.join('results', 'Cumulative_Results.csv'))
2✔
3105
                
3106
            self.trackerdict = trackerdict    
2✔
3107
            return self.CompiledResults
2✔
3108
        
3109
            
3110
    def generate_spectra(self, metdata=None, simulation_path=None, ground_material=None, scale_spectra=False,
                         scale_albedo=False, scale_albedo_nonspectral_sim=False, scale_upper_bound=2500):
        '''
        Generate spectral irradiance files for spectral simulations using pySMARTS,
        or generate an hourly albedo weighted by pySMARTS spectral irradiances.

        Parameters
        ----------
        metdata : radianceObject.metdata, optional
            Meteorological data object used to generate the spectra.
            Defaults to self.metdata.
        simulation_path : path object or string, optional
            path of the current simulation directory
        ground_material : str or (R,G,B), optional
            ground material string from the pySMARTS glossary, or a compatible
            (R,G,B) tuple.
        scale_spectra : boolean, default=False
            Apply a simple scaling to the generated spectra. Scales by the
            integrated irradiance below the specified upper wavelength bound.
        scale_albedo : boolean, default=False
            Apply a scaling factor to the generated spectral albedo.
            Scales by the mean value below the specified upper wavelength bound.
        scale_albedo_nonspectral_sim : boolean, default=False
            You intend to run a non-spectral simulation. This will scale
            the albedo read from the weather file by a calculation
            on measured and generated spectra and the spectral responsivity
            of the device (spectral responsivity currently not implemented).
        scale_upper_bound : integer, optional
            Sets an upper bound for the wavelength in all scaling
            calculations. Limits the bounds of integration for spectral DNI,
            DHI, and GHI. Limits the domain over which spectral albedo
            is averaged.

        Returns
        -------
        spectral_alb : spectral_property class
            spectral_alb.data:  dataframe with frequency and magnitude data.
            Returns None when scale_albedo_nonspectral_sim == True.
        spectral_dni : spectral_property class
            spectral_dni.data:  dataframe with frequency and magnitude data.
        spectral_dhi : spectral_property class
            spectral_dhi.data:  dataframe with frequency and magnitude data.
        weighted_alb : pd.Series
            datetime-indexed series of weighted albedo values.
            Returns None when scale_albedo_nonspectral_sim == False.
        '''
        if metdata is None:
            metdata = self.metdata
        if simulation_path is None:
            simulation_path = self.path

        from bifacial_radiance import spectral_utils as su

        spectra_path = 'spectra'
        if not os.path.exists(spectra_path):
            os.mkdir(spectra_path)

        (spectral_alb, spectral_dni, spectral_dhi, weighted_alb) = su.generate_spectra(metdata=metdata,
                            simulation_path=simulation_path,
                            ground_material=ground_material,
                            spectra_folder=spectra_path,
                            scale_spectra=scale_spectra,
                            scale_albedo=scale_albedo,
                            scale_albedo_nonspectral_sim=scale_albedo_nonspectral_sim,
                            scale_upper_bound=scale_upper_bound)

        if scale_albedo_nonspectral_sim:
            self.metdata.albedo = weighted_alb.values
        return (spectral_alb, spectral_dni, spectral_dhi, weighted_alb)

    def generate_spectral_tmys(self, wavelengths, weather_file, location_name, spectra_folder=None,
                               output_folder=None):
        """
        Generate a series of TMY-like files with per-wavelength irradiance, one file
        per wavelength. These are necessary to run a spectral simulation with gencumsky.

        Parameters
        ----------
        wavelengths : np.array or list
            array or list of integer wavelengths to simulate, in units [nm]. Example: [300, 325, 350]
        weather_file : path or str
            File path or path-like string pointing to the weather file used for spectra generation.
            The structure of this file, and its metadata, will be copied into the new files.
        location_name :
            _description_
        spectra_folder : path or str
            File path or path-like string pointing to the folder containing the SMARTS-generated spectra
        output_folder : path or str
            File path or path-like string pointing to the destination folder for spectral TMYs
        """
        from bifacial_radiance import spectral_utils as su

        if spectra_folder is None:
            spectra_folder = 'spectra'

        if output_folder is None:
            output_folder = os.path.join('data', 'spectral_tmys')
        if not os.path.exists(output_folder):
            os.makedirs(output_folder, exist_ok=True)

        su.generate_spectral_tmys(wavelengths=wavelengths, spectra_folder=spectra_folder,
                                  weather_file=weather_file, location_name=location_name,
                                  output_folder=output_folder)

# End RadianceObj definition

class GroundObj:
    """
    Class to set and return details for the ground surface materials and reflectance.
    If 1 albedo value is passed, it is used as the default.
    If 3 albedo values are passed, they are assigned to each of the three
    wavelength placeholders (RGB).

    If the material type is known, it is used to get reflectance info.
    If the material type isn't known, material_info.list is returned.

    Parameters
    ------------
    materialOrAlbedo : numeric or str
        If a number between 0 and 1 is passed, albedo input is assumed and assigned.
        If a string is passed with the name of the material desired, e.g. 'litesoil',
        properties are searched in `material_file`.
        Default material names to choose from: litesoil, concrete, white_EPDM,
        beigeroof, beigeroof_lite, beigeroof_heavy, black, asphalt
    material_file : str
        Filename of the material information. Default `ground.rad`
    silent : bool
        suppress print statements. Default False

    Returns
    -------

    """
    def __repr__(self):
        return str(self.__dict__)

    def __init__(self, materialOrAlbedo=None, material_file=None, silent=False):
        from numbers import Number

        self.normval = None
        self.ReflAvg = None
        self.Rrefl = None
        self.Grefl = None
        self.Brefl = None

        self.ground_type = 'custom'

        if material_file is None:
            material_file = 'ground.rad'

        self.material_file = material_file
        if materialOrAlbedo is None:
            print('\nInput albedo 0-1, or string from ground.printGroundMaterials().'
            '\nAlternatively, run setGround after readWeatherData()'
            ' and setGround will read metdata.albedo if available')
            return

        if isinstance(materialOrAlbedo, str):
            self.ground_type = materialOrAlbedo
            # Return the RGB albedo for material ground_type
            materialOrAlbedo = self.printGroundMaterials(self.ground_type)

        # Check for float and int
        if isinstance(materialOrAlbedo, Number):
            materialOrAlbedo = np.array([[materialOrAlbedo,
                                          materialOrAlbedo, materialOrAlbedo]])

        if isinstance(materialOrAlbedo, list):
            materialOrAlbedo = np.asarray(materialOrAlbedo)

        # By this point, materialOrAlbedo should be a np.ndarray:
        if isinstance(materialOrAlbedo, np.ndarray):

            if materialOrAlbedo.ndim == 0:
                # numpy array of one single value, i.e. np.array(0.62);
                # after this if, np.array([0.62])
                materialOrAlbedo = materialOrAlbedo.reshape([1])

            if materialOrAlbedo.ndim == 1:
                # If np.array is ([0.62]), repeat it so at the end it's
                # np.array([[0.62, 0.62, 0.62]])
                materialOrAlbedo = np.repeat(np.array([materialOrAlbedo]),
                                             3, axis=1).reshape(
                                                     len(materialOrAlbedo), 3)
            
3294
            if (materialOrAlbedo.ndim == 2) & (materialOrAlbedo.shape[1] > 3): 
2✔
3295
                    warnings.warn("Radiance only raytraces 3 wavelengths at "
2✔
3296
                                  "a time. Trimming albedo np.array input to "
3297
                                  "3 wavelengths.")
3298
                    materialOrAlbedo = materialOrAlbedo[:,0:3]
2✔
3299
        # By this point we should have np.array of dim=2 and shape[1] = 3.        
3300
        # Check for invalid values
3301
        if (materialOrAlbedo > 1).any() or (materialOrAlbedo < 0).any():
2✔
3302
            if not silent:
2✔
3303
                print('Warning: albedo values greater than 1 or less than 0. '
2✔
3304
                      'Constraining to [0..1]')
3305
            materialOrAlbedo = materialOrAlbedo.clip(min=0, max=1)
2✔
3306
        try:
2✔
3307
            self.Rrefl = materialOrAlbedo[:,0]
2✔
3308
            self.Grefl = materialOrAlbedo[:,1]
2✔
3309
            self.Brefl = materialOrAlbedo[:,2]
2✔
3310
            self.normval = _normRGB(materialOrAlbedo[:,0],materialOrAlbedo[:,1],
2✔
3311
                                    materialOrAlbedo[:,2])
3312
            self.ReflAvg = np.round(np.mean(materialOrAlbedo, axis=1),4)
2✔
3313
            if not silent:
2✔
3314
                print(f'Loading albedo, {self.ReflAvg.__len__()} value(s), '
2✔
3315
                      f'{self._nonzeromean(self.ReflAvg):0.3f} avg\n'
3316
                      f'{self.ReflAvg[self.ReflAvg != 0].__len__()} nonzero albedo values.')
3317
        except IndexError as e:
×
3318
            print('albedo.shape should be 3 column (N x 3)')
×
3319
            raise e
×
3320
            
3321
        # store list of columns and methods for convenience / introspection
3322
        # TODO: abstract this by making a super class that this inherits
3323
        self.columns =  [attr for attr in dir(self) if not (attr.startswith('_') or callable(getattr(self,attr)))]
2✔
3324
        self.methods = [attr for attr in dir(self) if (not attr.startswith('_') and callable(getattr(self,attr)))]
2✔
3325
    
3326
    def printGroundMaterials(self, materialString=None):
        """
        printGroundMaterials(materialString=None)

        input: None or materialString.  If None, return list of acceptable
        material types from ground.rad.  If valid string, return RGB albedo
        of the material type selected.
        """

        #import warnings
        material_path = 'materials'

        f = open(os.path.join(material_path, self.material_file))
        keys = [] #list of material key names
        Rreflall = []; Greflall=[]; Breflall=[] #RGB material reflectance
        temp = f.read().split()
        f.close()
        #return indices for 'plastic' definition
        index = _findme(temp,'plastic')
        for i in index:
            keys.append(temp[i+1])# after plastic comes the material name
            Rreflall.append(float(temp[i+5]))#RGB reflectance comes a few more down the list
            Greflall.append(float(temp[i+6]))
            Breflall.append(float(temp[i+7]))

        if materialString is not None:
            try:
                index = _findme(keys,materialString)[0]
            except IndexError:
                warnings.warn('Error - materialString not in '
                              f'{self.material_file}: {materialString}')
                return None
            return(np.array([[Rreflall[index], Greflall[index], Breflall[index]]]))
        else:
            return(keys)

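The token-offset parsing above (material name one token after `plastic`, RGB reflectance five to seven tokens after it) can be sketched stand-alone. `parse_plastics` is a hypothetical helper written for illustration, not part of the library, and the sample material values are made up:

```python
# Sketch of how printGroundMaterials scans a Radiance material file:
# tokenize the whole file, find each 'plastic' keyword, then read the
# material name at offset +1 and R, G, B reflectance at offsets +5..+7.
def parse_plastics(rad_text):
    toks = rad_text.split()
    materials = {}
    for i, t in enumerate(toks):
        if t == 'plastic':
            materials[toks[i + 1]] = tuple(float(x) for x in toks[i + 5:i + 8])
    return materials

# A minimal 'void plastic <name> 0 0 5 R G B spec rough' entry (sample values):
sample = "void plastic litesoil\n0\n0\n5 0.29 0.187 0.163 0 0"
```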
    def _nonzeromean(self, val):
        '''  array mean excluding zero. return zero if everything's zero'''
        tempmean = np.nanmean(val)
        if tempmean > 0:
            tempmean = np.nanmean(val[val !=0])
        return tempmean

    def _makeGroundString(self, index=0, cumulativesky=False):
        '''
        create string with ground reflectance parameters for use in
        gendaylit and gencumsky.

        Parameters
        -----------
        index : integer
            Index of time for time-series albedo. Default 0
        cumulativesky : Boolean
            If true, set albedo to average of time series values.

        Returns
        -------
        groundstring : text with albedo details to append to sky.rad in
                       gendaylit
        '''

        try:
            if cumulativesky is True:
                Rrefl = self._nonzeromean(self.Rrefl)
                Grefl = self._nonzeromean(self.Grefl)
                Brefl = self._nonzeromean(self.Brefl)
                normval = _normRGB(Rrefl, Grefl, Brefl)
            else:
                Rrefl = self.Rrefl[index]
                Grefl = self.Grefl[index]
                Brefl = self.Brefl[index]
                normval = _normRGB(Rrefl, Grefl, Brefl)

            # Check for all zero albedo case
            if normval == 0:
                normval = 1

            groundstring = ( f'\nskyfunc glow ground_glow\n0\n0\n4 '
                f'{Rrefl/normval} {Grefl/normval} {Brefl/normval} 0\n'
                '\nground_glow source ground\n0\n0\n4 0 0 -1 180\n'
                f'\nvoid plastic {self.ground_type}\n0\n0\n5 '
                f'{Rrefl:0.3f} {Grefl:0.3f} {Brefl:0.3f} 0 0\n'
                f"\n{self.ground_type} ring groundplane\n"
                '0\n0\n8\n0 0 -.01\n0 0 1\n0 100' )
        except IndexError as err:
            print(f'Index {index} passed to albedo with only '
                  f'{self.Rrefl.__len__()} values.')
            raise err
        return groundstring

class SceneObj:
    '''
    Scene information including PV module type, bifaciality, array info.
    PV module orientation defaults: Azimuth = 180 (south).
    PV module origin: z = 0 bottom of frame. y = 0 lower edge of frame.
    x = 0 vertical centerline of module.

    scene includes module details (x, y, bifi, sceney (collector_width), scenex)

    Parameters
    ------------
    module : str or ModuleObj
            String name of module created with makeModule()
    name : str
           Identifier of scene in case of multiple scenes. Default `Scene0`.
           Automatically increments if makeScene is run multiple times.

    '''
    def __repr__(self):
        return str(self.__dict__)
    def __init__(self, module=None, name=None, hpc=False):
        ''' initialize SceneObj
        '''
        from bifacial_radiance import ModuleObj
        # should sceneDict be initialized here? This is set in _makeSceneNxR
        if module is None:
            return
        elif type(module) == str:
            self.module = ModuleObj(name=module)

        elif type(module) == ModuleObj: # try moduleObj
            self.module = module

        #self.moduleDict = self.module.getDataDict()
        #self.scenex = self.module.scenex
        #self.sceney = self.module.sceney
        #self.offsetfromaxis = self.moduleDict['offsetfromaxis']

        self.modulefile = self.module.modulefile
        self.hpc = hpc  #default False.  Set True by makeScene after sceneobj created.
        if name is None:
            self.name = 'Scene0'
        else:
            self.name = name

    def _makeSceneNxR(self, modulename=None, sceneDict=None, radname=None):
        """
        Arrange module defined in :py:class:`bifacial_radiance.SceneObj` into an N x R array.
        Returns a :py:class:`bifacial_radiance.SceneObj` which contains details
        of the PV system configuration including `tilt`, `row pitch`, `hub_height`
        or `clearance_height`, `nMods` per row, and `nRows` in the system.

        The returned scene has (0,0) coordinates centered at the module at the
        center of the array. For 5 rows, that is row 3; for 4 rows, that is
        row 2 also (rounds down). For 5 modules in the row, that is module 3;
        for 4 modules in the row, that is module 2 also (rounds down).

        Parameters
        ------------
        modulename : str
            Name of module created with :py:class:`~bifacial_radiance.RadianceObj.makeModule`.
        sceneDict : dictionary
            Dictionary of scene parameters.
                clearance_height : numeric
                    (meters).
                pitch : numeric
                    Separation between rows
                tilt : numeric
                    Valid input ranges -90 to 90 degrees
                azimuth : numeric
                    A value denoting the compass direction along which the
                    axis of rotation lies. Measured in decimal degrees East
                    of North. [0 to 180) possible.
                nMods : int
                    Number of modules per row (default = 20)
                nRows : int
                    Number of rows in system (default = 7)
        radname : str
            String for name for radfile.


        Returns
        -------
        radfile : str
             Filename of .RAD scene in /objects/
        scene : :py:class:`~bifacial_radiance.SceneObj`
             Returns a `SceneObj` 'scene' with configuration details

        """
        import copy

        if modulename is None:
            modulename = self.module.name

        if sceneDict is None:
            print('makeScene(modulename, sceneDict, nMods, nRows).  sceneDict'
                  ' inputs: .tilt .azimuth .nMods .nRows'
                  ' AND .tilt or .gcr ; AND .hub_height or .clearance_height')
        else: sceneDict = copy.deepcopy(sceneDict)

        if 'orientation' in sceneDict:
            raise Exception('\n\n ERROR: Orientation format has been '
                'deprecated since version 0.2.4. If you want to flip your '
                'modules, on makeModule switch the x and y values.\n\n')

        if 'azimuth' not in sceneDict:
            sceneDict['azimuth'] = 180

        if 'axis_tilt' not in sceneDict:
            sceneDict['axis_tilt'] = 0

        if 'originx' not in sceneDict:
            sceneDict['originx'] = 0

        if 'originy' not in sceneDict:
            sceneDict['originy'] = 0

        if radname is None:
            radname = str(self.module.name).strip().replace(' ', '_')

        # loading variables
        tilt = round(sceneDict['tilt'], 2)
        azimuth = round(sceneDict['azimuth'], 2)
        nMods = sceneDict['nMods']
        nRows = sceneDict['nRows']
        axis_tilt = sceneDict['axis_tilt']
        originx = sceneDict['originx']
        originy = sceneDict['originy']

        # hub_height, clearance_height and height logic.
        # this routine uses hub_height to move the panels up so it's important
        # to have a value for that, either obtaining it from clearance_height
        # (if coming from makeScene) or from hub_height itself.
        # it is assumed that if no clearance_height or hub_height is passed,
        # hub_height = height.

        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict, preferred='hub_height',
                                                     nonpreferred='clearance_height')

        if use_clearanceheight:
            hubheight = sceneDict['clearance_height'] + 0.5* np.sin(abs(tilt) * np.pi / 180) \
            * self.module.sceney - self.module.offsetfromaxis*np.sin(abs(tilt)*np.pi/180)

            title_clearance_height = sceneDict['clearance_height']
        else:
            hubheight = sceneDict['hub_height']
            # this calculates clearance_height, used for the title
            title_clearance_height = sceneDict['hub_height'] - 0.5* np.sin(abs(tilt) * np.pi / 180) \
            * self.module.sceney + self.module.offsetfromaxis*np.sin(abs(tilt)*np.pi/180)

        try:
            if sceneDict['pitch'] > 0:
                pitch = sceneDict['pitch']
            else:
                raise Exception('default to gcr')

        except:

            if 'gcr' in sceneDict:
                pitch = np.round(self.module.sceney/sceneDict['gcr'],3)
            else:
                raise Exception('No valid `pitch` or `gcr` in sceneDict')

        ''' INITIALIZE VARIABLES '''
        text = '!xform '

        text += '-rx %s -t %s %s %s ' %(tilt, 0, 0, hubheight)

        # create nMods-element array along x, nRows along y. 1cm module gap.
        text += '-a %s -t %s 0 0 -a %s -t 0 %s 0 ' %(nMods, self.module.scenex, nRows, pitch)

        # azimuth rotation of the entire shebang. Select the row to scan here based on y-translation.
        # Modifying so center row is centered in the array. (i.e. 3 rows, row 2. 4 rows, row 2 too)
        # Since the array is already centered on row 1, module 1, we need to increment by Nrows/2-1 and Nmods/2-1

        text += (f'-i 1 -t {-self.module.scenex*(round(nMods/1.999)*1.0-1)} '
                 f'{-pitch*(round(nRows / 1.999)*1.0-1)} 0 -rz {180-azimuth} '
                 f'-t {originx} {originy} 0 ' )

        #axis tilt only working for N-S trackers
        if axis_tilt != 0 and azimuth == 90:
            print("Axis_Tilt is still under development. The scene will be "
                  "created with the proper axis tilt, and the tracking angle "
                  "will consider the axis_tilt, but the sensors for the "
                  "analysis might not fall on the correct surfaces unless you"
                  " manually position them for this version. Sorry! :D ")

            text += (f'-rx {axis_tilt} -t 0 0 %s ' %(
                self.module.scenex*(round(nMods/1.99)*1.0-1)*np.sin(
                        axis_tilt * np.pi/180) ) )

        filename = (f'{radname}_C_{title_clearance_height:0.5f}_rtr_{pitch:0.5f}_tilt_{tilt:0.5f}_'
                    f'{nMods}modsx{nRows}rows_origin{originx},{originy}.rad' )

        if self.hpc:
            text += f'"{os.path.join(os.getcwd(), self.modulefile)}"'
            radfile = os.path.join(os.getcwd(), 'objects', filename)
        else:
            text += os.path.join(self.modulefile)
            radfile = os.path.join('objects', filename)

        # py2 and 3 compatible: binary write, encode text first
        with open(radfile, 'wb') as f:
            f.write(text.encode('ascii'))

        self.gcr = self.module.sceney / pitch
        self.text = text
        self.radfiles = radfile
        self.sceneDict = sceneDict
#        self.hub_height = hubheight
        return radfile

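Two pieces of the geometry above are easy to check in isolation: the clearance-to-hub-height conversion, and the `round(n/1.999) - 1` centering trick used in the xform translation. Both helpers below are illustrative sketches (hypothetical names, stdlib-only), not library functions:

```python
import math

# Mirrors the use_clearanceheight branch above: hub_height is the clearance
# height plus half the tilted collector width's vertical rise, minus the
# offsetfromaxis contribution. tilt in degrees, sceney = collector width (m).
def clearance_to_hub(clearance_height, tilt, sceney, offsetfromaxis=0):
    s = math.sin(abs(tilt) * math.pi / 180)
    return clearance_height + 0.5 * s * sceney - offsetfromaxis * s

# The centering trick: since the xform array starts at row 1 / module 1,
# translating back by (round(n/1.999) - 1) units puts the scene origin on
# the center row/module, rounding down for even n (5 -> index 2, 4 -> 1).
def center_index(n):
    return round(n / 1.999) - 1
```

Dividing by 1.999 instead of 2 avoids Python's banker's rounding at exact halves, so even counts consistently round down to the lower-middle element.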
    def appendtoScene(self, radfile=None, customObject=None, text=''):
        """
        Appends to the Scene radfile in the `\objects` folder the text command in
        Radiance lingo created by the user.
        Useful when adding a custom object to the scene.

        Parameters
        ----------
        radfile : str, optional
            Directory and name of where the .rad scene file is stored. Default: self.radfiles
        customObject : str
            Directory and name of where the custom object .rad file is stored, plus
            any geometry modifications needed for it.
        text : str, optional
            Command to be appended to the radfile which specifies its position
            in the scene. Do not leave empty spaces at the end.


        Returns
        -------
        Nothing, the radfile must already be created and assigned when running this.

        """

        # py2 and 3 compatible: binary write, encode text first

        if not radfile: #by default, append to the first radfile in the list
            if type(self.radfiles) == list:
                radfile = self.radfiles[0]
            elif type(self.radfiles) == str:
                radfile = self.radfiles
            else:
                raise Exception('SceneObj.radfiles set improperly')

        if customObject:
            text2 = '\n!xform -rx 0 ' + text + ' ' + customObject

            debug = False
            if debug:
                print(text2)

            with open(radfile, 'a+') as f:
                f.write(text2)

    def showScene(self):
        """
        Method to call objview on the scene included in self

        """
        cmd = 'objview %s %s' % (os.path.join('materials', 'ground.rad'),
                                         self.radfiles)
        print('Rendering scene. This may take a moment...')
        _,err = _popen(cmd,None)
        if err is not None:
            print('Error: {}'.format(err))
            print('possible solution: install radwinexe binary package from '
                  'http://www.jaloxa.eu/resources/radiance/radwinexe.shtml'
                  ' into your RADIANCE binaries path')
            return

    def saveImage(self, filename=None, view=None):
        """
        Save an image of the scene to /images/. A default ground (concrete material)
        and sun (due East or West azimuth and 65 elevation) are created.

        Parameters:
            filename : string, optional. Name for image file; defaults to scene name
            view     : string, optional. Name for view file in /views. Default 'side.vp'
                       Input of 'XYZ' into view will do a zoomed-out view of the whole scene

        """
        import tempfile

        temp_dir = tempfile.TemporaryDirectory()
        pid = os.getpid()
        if filename is None:
            filename = f'{self.name}'

        if view is None:
            view = 'side.vp'

        # fake lighting temporary .radfile.  Use 65 elevation and +/- 90 azimuth
        # use a concrete ground surface
        if (self.sceneDict['azimuth'] > 100 and self.sceneDict['tilt'] >= 0) or \
            (self.sceneDict['azimuth'] <= 100 and self.sceneDict['tilt'] < 0):
            sunaz = 90
        else:
            sunaz = -90
        ground = GroundObj('concrete', silent=True)
        ltfile = os.path.join(temp_dir.name, f'lt{pid}.rad')
        with open(ltfile, 'w') as f:
            f.write("!gensky -ang %s %s +s\n" %(65, sunaz) + \
            "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n" + \
            "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" + \
            ground._makeGroundString() )

        # make .rif and run RAD
        riffile = os.path.join(temp_dir.name, f'ov{pid}.rif')
        with open(riffile, 'w') as f:
                f.write("scene= materials/ground.rad " +\
                        f"{self.radfiles} {ltfile}\n".replace("\\",'/') +\
                    f"EXPOSURE= .5\nUP= Z\nview= {view.replace('.vp','')} -vf views/{view}\n" +\
                    f"oconv= -f\nPICT= images/{filename}")
        _,err = _popen(["rad",'-s',riffile], None)
        if err:
            print(err)
        else:
            print(f"Scene image saved: images/{filename}_{view.replace('.vp','')}.hdr")

        temp_dir.cleanup()

    def addPiles(self, spacingPiles=6, pile_lenx=0.2, pile_leny=0.2, pile_height=None, debug=True):
        '''
        Function to add support piles at determined intervals throughout the rows.
        TODO: enable functionality or check for scenes using 'clearance_height' ?
        TODO: enable functionality with makeScene1axis (append radfile to each trackerdict entry)

        Parameters
        ----------
        spacingPiles : float
            Distance between support piles.
        pile_lenx : float
            Dimension of the pile in the row-x direction, in meters. Default is 0.2
        pile_leny : float
            Dimension of the pile in the row-y direction, in meters. Default is 0.2
        pile_height : float
            Dimension of the pile in the z-direction, from the ground up. If None,
            the value of hub_height is used. Default: None.

        Returns
        -------
        None

        '''

        nMods = self.sceneDict['nMods']
        nRows = self.sceneDict['nRows']
        module = self.module

        if pile_height is None:
            pile_height = self.sceneDict['hub_height']
            print("pile_height!", pile_height)

        rowlength = nMods * module.scenex
        nPiles = np.floor(rowlength/spacingPiles) + 1
        pitch = self.sceneDict['pitch']
        azimuth = self.sceneDict['azimuth']
        originx = self.sceneDict['originx']
        originy = self.sceneDict['originy']

        text = '! genbox black post {} {} {} '.format(pile_lenx, pile_leny, pile_height)
        text += '| xform -t {} {} 0 '.format(pile_lenx/2.0, pile_leny/2.0)

        if self.hpc:
            radfilePiles = os.path.join(os.getcwd(), 'objects', 'Piles.rad')
        else:
            radfilePiles = os.path.join('objects', 'post.rad')

        # py2 and 3 compatible: binary write, encode text first
        with open(radfilePiles, 'wb') as f:
            f.write(text.encode('ascii'))

        # create nPiles-element array along x, nRows along y. 1cm module gap.
        text = '!xform -rx 0 -a %s -t %s 0 0 -a %s -t 0 %s 0 ' %(nPiles, spacingPiles, nRows, pitch)

        # azimuth rotation of the entire shebang. Select the row to scan here based on y-translation.
        # Modifying so center row is centered in the array. (i.e. 3 rows, row 2. 4 rows, row 2 too)
        # Since the array is already centered on row 1, module 1, we need to increment by Nrows/2-1 and Nmods/2-1

        text += (f'-i 1 -t {-self.module.scenex*(round(nMods/1.999)*1.0-1)} '
                 f'{-pitch*(round(nRows / 1.999)*1.0-1)} 0 -rz {180-azimuth} '
                 f'-t {originx} {originy} 0 ' )

        filename = (f'Piles_{spacingPiles}_{pile_lenx}_{pile_leny}_{pile_height}.rad')

        if self.hpc:
            text += f'"{os.path.join(os.getcwd(), radfilePiles)}"'
            scenePilesRad = os.path.join(os.getcwd(), 'objects', filename)
        else:
            text += os.path.join(radfilePiles)
            scenePilesRad = os.path.join('objects', filename)

        # py2 and 3 compatible: binary write, encode text first
        with open(scenePilesRad, 'wb') as f:
            f.write(text.encode('ascii'))

        try:
            self.radfiles.append(scenePilesRad)
            if debug:
                print("Piles Radfile Appended")
        except AttributeError:
            #TODO: Manage situation where radfile was created with
            #appendRadfile to False first..
            self.radfiles = []
            self.radfiles.append(scenePilesRad)

        if debug:
            print("Piles Created and Appended Successfully.")

        return

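The pile count above, `np.floor(rowlength/spacingPiles) + 1`, places a pile every `spacingPiles` meters along the row plus one to close the far end. It can be sketched without numpy; `n_piles` is a hypothetical stand-in, not a library function:

```python
import math

# Sketch of addPiles' pile-count logic: row length is nMods modules of
# width scenex; piles go every spacing_piles meters, with one extra pile
# so both row ends are supported.
def n_piles(n_mods, scenex, spacing_piles=6):
    row_length = n_mods * scenex
    return math.floor(row_length / spacing_piles) + 1
```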
# end of SceneObj


class MetObj:
    """
    Meteorological data from EPW file.

    Initialize the MetObj from tmy data already read in.

    Parameters
    -----------
    tmydata : DataFrame
        TMY3 output from :py:class:`~bifacial_radiance.RadianceObj.readTMY` or
        from :py:class:`~bifacial_radiance.RadianceObj.readEPW`.
    metadata : Dictionary
        Metadata output from :py:class:`~bifacial_radiance.RadianceObj.readTMY`
        or from :py:class:`~bifacial_radiance.RadianceObj.readEPW`.
    label : str
        'left', 'right', or 'center'. For data that is averaged, defines if the
        timestamp refers to the left edge, the right edge, or the center of the
        averaging interval, for purposes of calculating sunposition. For
        example, TMY3 data is right-labeled, so 11 AM data represents data from
        10 to 11, and sun position should be calculated at 10:30 AM.  Currently
        SAM and PVSyst use left-labeled interval data and NSRDB uses centered.

    Once initialized, the following parameters are available in the MetObj:
        -latitude, longitude, elevation, timezone, city [scalar values]

        -datetime, ghi, dhi, dni, albedo, dewpoint, pressure, temp_air,
        wind_speed, meastracker_angle [numpy.array]

        -solpos [pandas dataframe of solar position]

    """
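The labeling convention described above implies a simple timestamp correction for sun-position calculations: shift right-labeled data back by half the interval, shift left-labeled data forward, and leave center-labeled data unchanged. A minimal stand-alone sketch (hypothetical helper, not the MetObj implementation):

```python
from datetime import datetime, timedelta

# Sun position for interval-averaged data is evaluated at the interval
# midpoint: right-labeled hourly data (e.g. TMY3) shifts -30 min,
# left-labeled (e.g. SAM, PVSyst) shifts +30 min, center-labeled (NSRDB)
# needs no shift.
def sunpos_timestamp(ts, interval_minutes=60, label='right'):
    shift = {'right': -interval_minutes / 2,
             'left': +interval_minutes / 2,
             'center': 0}[label.lower()]
    return ts + timedelta(minutes=shift)
```

So 11 AM right-labeled TMY3 data would be evaluated at 10:30 AM, matching the docstring example.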
    def __repr__(self):
        return str(self.__dict__)
    def __init__(self, tmydata, metadata, label = 'right'):

        import pytz
        import pvlib
        #import numpy as np

        #First prune all GHI = 0 timepoints.  New as of 0.4.0
        # TODO: is this a good idea?  This changes default behavior...
        tmydata = tmydata[tmydata.GHI > 0]

        #  location data.  so far needed:
        # latitude, longitude, elevation, timezone, city
        self.latitude = metadata['latitude']; lat=self.latitude
        self.longitude = metadata['longitude']; lon=self.longitude
        self.elevation = metadata['altitude']; elev=self.elevation
        self.timezone = metadata['TZ']

        try:
            self.city = metadata['Name'] # readepw version
        except KeyError:
            self.city = metadata['city'] # pvlib version
        #self.location.state_province_region = metadata['State'] # unnecessary
        self.datetime = tmydata.index.tolist() # this is tz-aware.
        self.ghi = np.array(tmydata.GHI)
        self.dhi = np.array(tmydata.DHI)
        self.dni = np.array(tmydata.DNI)
        self.albedo = np.array(_firstlist([tmydata.get('Alb'), tmydata.get('albedo'),
                                           tmydata.get('Albedo')]) )
        if pd.isnull(self.albedo).all():   self.albedo = None

        # Try and retrieve dewpoint and pressure
        try:
            self.dewpoint = np.array(tmydata['temp_dew'])
        except KeyError:
            self.dewpoint = None

        try:
            self.pressure = np.array(tmydata['atmospheric_pressure'])
        except KeyError:
            self.pressure = None

        try:
            self.temp_air = np.array(tmydata['temp_air'])
        except KeyError:
            self.temp_air = None

        if self.temp_air is None:
            try:
                self.temp_air = np.array(tmydata['DryBulb'])
            except KeyError:
                self.temp_air = None

        try:
            self.wind_speed = np.array(tmydata['wind_speed'])
        except KeyError:
            self.wind_speed = None

        if self.wind_speed is None:
            try:
                self.wind_speed = np.array(tmydata['Wspd'])
            except KeyError:
                self.wind_speed = None

        # Try and retrieve TrackerAngle
        try:
            self.meastracker_angle = np.array(tmydata['Tracker Angle (degrees)'])
        except KeyError:
            self.meastracker_angle = None


        #v0.2.5: initialize MetObj with solpos, sunrise/set and corrected time
        datetimetz = pd.DatetimeIndex(self.datetime)
        try:  # make sure the data is tz-localized.
            datetimetz = datetimetz.tz_localize(pytz.FixedOffset(self.timezone*60))#  use pytz.FixedOffset (in minutes)
        except TypeError:  # data is tz-localized already. Just put it in local time.
            datetimetz = datetimetz.tz_convert(pytz.FixedOffset(self.timezone*60))
        #check for data interval. default 1h.
        try:
            interval = datetimetz[1]-datetimetz[0]
        except IndexError:
            interval = pd.Timedelta('1h') # ISSUE: if 1 datapoint is passed, are we sure it's hourly data?
            print("WARNING: TMY interval was unable to be defined, so setting it to 1h.")
        # TODO:  Refactor this into a subfunction. first calculate minutedelta
        # based on label and interval (-30, 0, +30, +7.5 etc) then correct all.
        if label.lower() == 'center':
            print("Calculating Sun position for center-labeled data, at exact timestamp in input Weather File")
            sunup = pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon) #new for pvlib >= 0.6.1
            sunup['corrected_timestamp'] = datetimetz
        else:
            if interval == pd.Timedelta('1h'):

                if label.lower() == 'right':
                    print("Calculating Sun position for Metdata that is right-labeled ",
2✔
3970
                          "with a delta of -30 mins. i.e. 12 is 11:30 sunpos")
3971
                    sunup= pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon) #new for pvlib >= 0.6.1
2✔
3972
                    sunup['minutedelta']= int(interval.seconds/2/60) # default sun angle 30 minutes before timestamp
2✔
3973
                    # vector update of minutedelta at sunrise
3974
                    sunrisemask = sunup.index.hour-1==sunup['sunrise'].dt.hour
2✔
3975
                    sunup['minutedelta'].mask(sunrisemask,np.floor((60-(sunup['sunrise'].dt.minute))/2),inplace=True)
2✔
3976
                    # vector update of minutedelta at sunset
3977
                    sunsetmask = sunup.index.hour-1==sunup['sunset'].dt.hour
2✔
3978
                    sunup['minutedelta'].mask(sunsetmask,np.floor((60-(sunup['sunset'].dt.minute))/2),inplace=True)
2✔
3979
                    # save corrected timestamp
3980
                    sunup['corrected_timestamp'] = sunup.index-pd.to_timedelta(sunup['minutedelta'], unit='m')
2✔
3981
        
3982
                elif label.lower() == 'left':        
2✔
3983
                    print("Calculating Sun position for Metdata that is left-labeled ",
2✔
3984
                          "with a delta of +30 mins. i.e. 12 is 12:30 sunpos.")
3985
                    sunup= pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon) 
2✔
3986
                    sunup['minutedelta']= int(interval.seconds/2/60) # default sun angle 30 minutes after timestamp
2✔
3987
                    # vector update of minutedelta at sunrise
3988
                    sunrisemask = sunup.index.hour==sunup['sunrise'].dt.hour
2✔
3989
                    sunup['minutedelta'].mask(sunrisemask,np.ceil((60+sunup['sunrise'].dt.minute)/2),inplace=True)
2✔
3990
                    # vector update of minutedelta at sunset
3991
                    sunsetmask = sunup.index.hour==sunup['sunset'].dt.hour
2✔
3992
                    sunup['minutedelta'].mask(sunsetmask,np.ceil((60+sunup['sunset'].dt.minute)/2),inplace=True)
2✔
3993
                    # save corrected timestamp
3994
                    sunup['corrected_timestamp'] = sunup.index+pd.to_timedelta(sunup['minutedelta'], unit='m')
2✔
3995
                else: raise ValueError('Error: invalid weather label passed. Valid inputs: right, left or center')
×
3996
            else:
3997
                minutedelta = int(interval.seconds/2/60)
×
3998
                print("Interval in weather data is less than 1 hr, calculating"
×
3999
                      f" Sun position with a delta of -{minutedelta} minutes.")
4000
                print("If you want no delta for sunposition, use "
×
4001
                      "readWeatherFile( label='center').")
4002
                #datetimetz=datetimetz-pd.Timedelta(minutes = minutedelta)   # This doesn't check for Sunrise or Sunset
4003
                #sunup= pvlib.irradiance.solarposition.get_sun_rise_set_transit(datetimetz, lat, lon) # deprecated in pvlib 0.6.1
4004
                sunup= pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon) #new for pvlib >= 0.6.1
×
4005
                sunup['corrected_timestamp'] = sunup.index-pd.Timedelta(minutes = minutedelta)
×
4006
    
4007
        self.solpos = pvlib.irradiance.solarposition.get_solarposition(sunup['corrected_timestamp'],lat,lon,elev)
2✔
4008
        self.sunrisesetdata=sunup
2✔
4009
        self.label = label
2✔
4010
        self.columns =  [attr for attr in dir(self) if not attr.startswith('_')]
2✔
4011

4012
    def _set1axis(self, azimuth=180, limit_angle=45, angledelta=None,
                  backtrack=True, gcr=1.0/3.0, cumulativesky=True,
                  fixed_tilt_angle=None, axis_tilt=0, useMeasuredTrackerAngle=False):

        """
        Set up geometry for 1-axis tracking cumulativesky.  Solpos data
        already stored in `metdata.solpos`. Pull in tracking angle details from
        pvlib, create multiple 8760 metdata sub-files where datetime of met
        data matches the tracking angle.

        Parameters
        ------------
        cumulativesky : bool
            Whether individual csv files are created
            with constant tilt angle for the cumulativesky approach.
            If false, the gendaylit tracking approach must be used.
        azimuth : numerical
            Orientation axis of tracker torque tube. Default North-South (180 deg).
            For fixed tilt simulations this is the orientation azimuth.
        limit_angle : numerical
            +/- limit angle of the 1-axis tracker in degrees. Default 45
        angledelta : numerical
            Degree of rotation increment to parse irradiance bins.
            Default 5 degrees (0.4% error for DNI).
            Other options: 4 (0.25%), 2.5 (0.1%).
            (The smaller the angledelta, the more simulations.)
        backtrack : bool
            Whether backtracking is enabled (default = True)
        gcr : float
            Ground coverage ratio for the backtracking calculation. Default 1.0/3.0
        axis_tilt : float
            Tilt of the axis. While it can be considered for the tracking
            calculation, the scene geometry creation does not support tilted-axis
            trackers yet (but it can be done manually; see Tutorials).
        fixed_tilt_angle : numeric
            If passed, this changes to a fixed tilt simulation where each hour
            uses fixed_tilt_angle and azimuth as the tilt and azimuth

        Returns
        -------
        trackerdict : dictionary
            Keys for tracker tilt angles and
            list of csv metfile, and datetimes at that angle
            trackerdict[angle]['csvfile';'surf_azm';'surf_tilt';'UTCtime']
        metdata.solpos : dataframe
            Dataframe with output from pvlib solar position for each timestep
        metdata.sunrisesetdata :
            Pandas dataframe with sunrise, sunset and adjusted time data.
        metdata.tracker_theta : list
            Tracker tilt angle from pvlib for each timestep
        metdata.surface_tilt : list
            Tracker surface tilt angle from pvlib for each timestep
        metdata.surface_azimuth : list
            Tracker surface azimuth angle from pvlib for each timestep
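
Example (illustrative sketch, standalone): the cumulativesky workflow bins
continuous tracker rotations into discrete angles; rounding to the nearest
``angledelta`` multiple can be reproduced with plain numpy:

```python
import numpy as np

# hypothetical tracker rotations (degrees) for a few timesteps
tracker_theta = np.array([-43.2, -12.7, 1.4, 27.9, 44.8])
angledelta = 5  # bin width in degrees

# round each rotation to the nearest multiple of angledelta,
# as done for the 'theta_round' column
theta_round = angledelta * np.round(tracker_theta / angledelta)
print(theta_round.tolist())  # [-45.0, -15.0, 0.0, 30.0, 45.0]
```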
        """

        #axis_tilt = 0       # only support 0 tilt trackers for now
        self.cumulativesky = cumulativesky   # track whether we're using cumulativesky or gendaylit

        if (cumulativesky is True) & (angledelta is None):
            angledelta = 5  # round angle to 5 degrees for cumulativesky

        # get 1-axis tracker angles for this location,
        # round to nearest 'angledelta'
        if self.meastracker_angle is not None and useMeasuredTrackerAngle is True:
            print("Tracking Data: Reading from provided Tracker Angles")
        elif self.meastracker_angle is None and useMeasuredTrackerAngle is True:
            useMeasuredTrackerAngle = False
            print("Warning: Using Measured Tracker Angles was specified but data"
                  " for trackers has not yet been assigned."
                  " Assign it by making it a column of your weather data file"
                  " named 'Tracker Angle (degrees)' and run readWeatherFile again.")

        trackingdata = self._getTrackingAngles(azimuth,
                                               limit_angle,
                                               angledelta,
                                               axis_tilt=axis_tilt,
                                               backtrack=backtrack,
                                               gcr=gcr,
                                               fixed_tilt_angle=fixed_tilt_angle,
                                               useMeasuredTrackerAngle=useMeasuredTrackerAngle)

        # get list of unique rounded tracker angles
        theta_list = trackingdata.dropna()['theta_round'].unique()

        if cumulativesky is True:
            # create a separate metfile for each unique tracker theta angle.
            # return dict of filenames and details
            trackerdict = self._makeTrackerCSV(theta_list, trackingdata)
        else:
            # trackerdict uses timestamp as keys. return azimuth
            # and tilt for each timestamp
            #times = [str(i)[5:-12].replace('-','_').replace(' ','_') for i in self.datetime]
            times = [i.strftime('%Y-%m-%d_%H%M') for i in self.datetime]
            #trackerdict = dict.fromkeys(times)
            trackerdict = {}
            for i, time in enumerate(times):
                # remove NaN tracker theta from trackerdict
                if (self.ghi[i] > 0) & (~np.isnan(self.tracker_theta[i])):
                    trackerdict[time] = {
                                        'surf_azm':self.surface_azimuth[i],
                                        'surf_tilt':self.surface_tilt[i],
                                        'theta':self.tracker_theta[i],
                                        'dni':self.dni[i],
                                        'ghi':self.ghi[i],
                                        'dhi':self.dhi[i],
                                        'temp_air':self.temp_air[i],
                                        'wind_speed':self.wind_speed[i]
                                        }

        return trackerdict

    def _getTrackingAngles(self, azimuth=180, limit_angle=45,
                           angledelta=None, axis_tilt=0, backtrack=True,
                           gcr=1.0/3.0, fixed_tilt_angle=None,
                           useMeasuredTrackerAngle=False):
        '''
        Helper subroutine to return 1-axis tracker tilt and azimuth data.

        Parameters
        ----------
        Same as pvlib.tracking.singleaxis, plus:
        angledelta : degrees
            Angle to round tracker_theta to.  This is for
            cumulativesky simulations. Other input options: None (no
            rounding of tracker angle)
        fixed_tilt_angle : (Optional) degrees
            This changes to a fixed tilt simulation where each hour uses
            fixed_tilt_angle and azimuth as the tilt and azimuth

        Returns
        -------
        DataFrame with the following columns:
            * tracker_theta: The rotation angle of the tracker.
                tracker_theta = 0 is horizontal, and positive rotation angles
                are clockwise.
            * aoi: The angle-of-incidence of direct irradiance onto the
                rotated panel surface.
            * surface_tilt: The angle between the panel surface and the earth
                surface, accounting for panel rotation.
            * surface_azimuth: The azimuth of the rotated panel, determined by
                projecting the vector normal to the panel's surface to the
                earth's surface.
            * 'theta_round' : tracker_theta rounded to the nearest 'angledelta'.
            If angledelta is None, tracker_theta is copied without rounding.
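
Example (illustrative sketch of the fixed-tilt branch, hypothetical values;
the real method also computes an ``aoi`` column via ``pvlib.irradiance.aoi``):

```python
import pandas as pd

# hypothetical solar-position index for three timesteps
idx = pd.date_range('2021-06-01 10:00', periods=3, freq='h')

fixed_tilt_angle, azimuth = 20, 180  # degrees

# constant tilt/azimuth broadcast across every timestep,
# mirroring the trackingdata DataFrame built in fixed-tilt mode
trackingdata = pd.DataFrame({'tracker_theta': fixed_tilt_angle,
                             'surface_azimuth': azimuth,
                             'surface_tilt': fixed_tilt_angle},
                            index=idx)
print(trackingdata['surface_tilt'].tolist())  # [20, 20, 20]
```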
        '''
        import pvlib
        #import warnings
        from pvlib.irradiance import aoi
        #import numpy as np
        #import pandas as pd

        solpos = self.solpos

        # New as of 0.3.2: pass fixed_tilt_angle to switch to FIXED TILT mode

        if fixed_tilt_angle is not None:
            # system with fixed tilt = fixed_tilt_angle
            surface_tilt = fixed_tilt_angle
            surface_azimuth = azimuth
            # trackingdata keys: 'tracker_theta', 'aoi', 'surface_azimuth', 'surface_tilt'
            trackingdata = pd.DataFrame({'tracker_theta':fixed_tilt_angle,
                                         'aoi':aoi(surface_tilt, surface_azimuth,
                                                   solpos['zenith'],
                                                   solpos['azimuth']),
                                         'surface_azimuth':azimuth,
                                         'surface_tilt':fixed_tilt_angle})
        elif useMeasuredTrackerAngle:
            # tracked system
            surface_tilt = self.meastracker_angle
            surface_azimuth = azimuth

            trackingdata = pd.DataFrame({'tracker_theta':self.meastracker_angle,
                                         'aoi':aoi(surface_tilt, surface_azimuth,
                                                   solpos['zenith'],
                                                   solpos['azimuth']),
                                         'surface_azimuth':azimuth,
                                         'surface_tilt':abs(self.meastracker_angle)})

        else:
            # get 1-axis tracker tracker_theta, surface_tilt and surface_azimuth
            with warnings.catch_warnings():
                warnings.filterwarnings("ignore", category=RuntimeWarning)
                trackingdata = pvlib.tracking.singleaxis(solpos['zenith'],
                                                     solpos['azimuth'],
                                                     axis_tilt,
                                                     azimuth,
                                                     limit_angle,
                                                     backtrack,
                                                     gcr)

        # save tracker tilt information to metdata.tracker_theta,
        # metdata.surface_tilt and metdata.surface_azimuth
        self.tracker_theta = np.round(trackingdata['tracker_theta'], 2).tolist()
        self.surface_tilt = np.round(trackingdata['surface_tilt'], 2).tolist()
        self.surface_azimuth = np.round(trackingdata['surface_azimuth'], 2).tolist()
        # undo the timestamp offset put in by solpos.
        #trackingdata.index = trackingdata.index + pd.Timedelta(minutes = 30)
        # It may not be exactly 30 minutes any more...
        trackingdata.index = self.sunrisesetdata.index  # this has the original time data in it

        # round tracker_theta to increments of angledelta for use in cumulativesky
        def _roundArbitrary(x, base=angledelta):
            # round to nearest 'base' value.
            # mask NaN's to avoid rounding error message
            return base * (x/float(base)).round()

        if angledelta == 0:
            raise ZeroDivisionError('Angledelta = 0. Use None instead')
        elif angledelta is None: # don't round theta
            trackingdata['theta_round'] = trackingdata['tracker_theta']
        else:  # round theta
            trackingdata['theta_round'] = \
                _roundArbitrary(trackingdata['tracker_theta'], angledelta)

        return trackingdata

    def _makeTrackerCSV(self, theta_list, trackingdata):
        '''
        Create multiple new irradiance csv files with data for each unique
        rounded tracker angle. Return a dictionary with the new csv filenames
        and other details. Used for cumulativesky tracking.

        Parameters
        -----------
        theta_list : array
             Array of unique tracker angle values

        trackingdata : Pandas
             Pandas Series with hourly tracker angles from
             :pvlib.tracking.singleaxis

        Returns
        --------
        trackerdict : dictionary
              keys: *theta_round tracker angle  (default: -45 to +45 in
                                                 5 degree increments).
              sub-array keys:
                  *datetime:  array of datetime strings in this group of angles
                  *count:  number of datapoints in this group of angles
                  *surf_azm:  tracker surface azimuth during this group of angles
                  *surf_tilt:  tilt angle average during this group of angles
                  *csvfile:  name of csv met data file saved in /EPWs/
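
Example (illustrative sketch, hypothetical values): the masking step zeroes
out hours that belong to a different tracker-angle bin before the 2-column
GHI/DHI file is written for gencumulativesky:

```python
import pandas as pd

# hypothetical hourly irradiance, each hour assigned a theta bin
df = pd.DataFrame({'GHI': [100, 400, 600, 300],
                   'DHI': [50, 120, 180, 90],
                   'theta_round': [-20, -20, 5, 5]})

# zero out the hours belonging to a different tracker-angle bin
theta = -20
mask = df['theta_round'] != theta
out = df[['GHI', 'DHI']].mask(mask, 0)
print(out['GHI'].tolist())  # [100, 400, 0, 0]

# 2-column space-separated file as consumed by gencumulativesky -G:
# out.to_csv('1axis_-20.csv', index=False, header=False, sep=' ')
```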
        '''

        dt = pd.to_datetime(self.datetime)

        trackerdict = dict.fromkeys(theta_list)

        for theta in sorted(trackerdict):
            trackerdict[theta] = {}
            csvfile = os.path.join('EPWs', '1axis_{}.csv'.format(theta))
            tempdata = trackingdata[trackingdata['theta_round'] == theta]

            # Set up trackerdict output for each value of theta
            trackerdict[theta]['csvfile'] = csvfile
            trackerdict[theta]['surf_azm'] = tempdata['surface_azimuth'].median()
            trackerdict[theta]['surf_tilt'] = abs(theta)
            datetimetemp = tempdata.index.strftime('%Y-%m-%d %H:%M:%S') # local time
            trackerdict[theta]['datetime'] = datetimetemp
            trackerdict[theta]['count'] = datetimetemp.__len__()
            # Create new temp csv file with zero values for all times not equal to datetimetemp
            # write 8760 2-column csv:  GHI,DHI
            dni_temp = []
            ghi_temp = []
            dhi_temp = []
            for g, d, time in zip(self.ghi, self.dhi,
                                  dt.strftime('%Y-%m-%d %H:%M:%S')):

                # is this time included in a particular theta_round angle?
                if time in datetimetemp:
                    ghi_temp.append(g)
                    dhi_temp.append(d)
                else:
                    # mask out irradiance at this time, since it
                    # belongs to a different bin
                    ghi_temp.append(0.0)
                    dhi_temp.append(0.0)
            # save in 2-column GHI,DHI format for gencumulativesky -G
            savedata = pd.DataFrame({'GHI':ghi_temp, 'DHI':dhi_temp},
                                    index=self.datetime).tz_localize(None)
            # Fill partial year. Requires 2021 measurement year.
            savedata = _subhourlydatatoGencumskyformat(savedata,
                                                       label=self.label)
            print('Saving file {}, # points: {}'.format(
                  trackerdict[theta]['csvfile'], datetimetemp.__len__()))

            savedata.to_csv(csvfile,
                            index=False,
                            header=False,
                            sep=' ',
                            columns=['GHI','DHI'])

        return trackerdict


class AnalysisObj:
    """
    Analysis class for performing raytrace to obtain irradiance measurements
    at the array, as well as plotting and reporting results.
    """
    def __repr__(self):
        return str(self.__dict__)

    def __init__(self, octfile=None, name=None, hpc=False):
        """
        Initialize AnalysisObj by pointing to the octfile.  Scan information
        is defined separately by passing scene details into AnalysisObj.moduleAnalysis()

        Parameters
        ------------
        octfile : string
            Filename and extension of .oct file
        name    : string
        hpc     : boolean, default False. Waits for octfile for a
                  longer time if parallel processing.
        modWanted  : Module used for analysis
        rowWanted  : Row used for analysis
        sceneNum   : Which scene number (in case of multiple scenes)
        """

        self.octfile = octfile
        self.name = name
        self.hpc = hpc
        self.modWanted = None
        self.rowWanted = None
        self.sceneNum = 0 # should this be 0 or None by default??
        self.power_data = None  # results from self.calc_performance() stored here

        # store list of columns and methods for convenience / introspection
        # TODO: abstract this by making a super class that this inherits
        self.columns = [attr for attr in dir(self) if not (attr.startswith('_') or callable(getattr(self, attr)))]
        self.methods = [attr for attr in dir(self) if (not attr.startswith('_') and callable(getattr(self, attr)))]

    def getResults(self):
        """
        Go through the AnalysisObj and return a dict of irradiance result keys.
        This can be passed into CompileResults.

        Returns
        -------
        Results : dict.  irradiance scan results
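
Example (illustrative sketch, hypothetical attribute values) of how result
attributes are filtered by key and renamed into a one-row DataFrame:

```python
import pandas as pd

# hypothetical subset of AnalysisObj attributes (self.__dict__)
attrs = {'rowWanted': 2, 'modWanted': 4, 'name': '1axis',
         'octfile': 'scene.oct'}

keylist = ['rowWanted', 'modWanted', 'name']
resultdict = {k: v for k, v in attrs.items() if k in keylist}

# orient='index' puts one key per row; transposing yields one row per scan
results = pd.DataFrame.from_dict(resultdict, orient='index').T.rename(
    columns={'modWanted': 'modNum', 'rowWanted': 'rowNum'})
print(list(results.columns))  # ['rowNum', 'modNum', 'name']
```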
        """
        keylist = ['rowWanted', 'modWanted', 'sceneNum', 'name', 'x', 'y', 'z',
                   'Wm2Front', 'Wm2Back', 'Wm2Ground', 'backRatio', 'mattype', 'rearMat']
        resultdict = {k: v for k, v in self.__dict__.items() if k in keylist}
        return pd.DataFrame.from_dict(resultdict, orient='index').T.rename(
            columns={'modWanted':'modNum', 'rowWanted':'rowNum'})


    def makeImage(self, viewfile, octfile=None, name=None):
        """
        Makes a visible image (rendering) of octfile, viewfile
        """

        import time

        if octfile is None:
            octfile = self.octfile
        if name is None:
            name = self.name

        #TODO: update this for cross-platform compatibility w/ os.path.join
        if self.hpc:
            time_to_wait = 10
            time_counter = 0
            filelist = [octfile, "views/"+viewfile]
            for file in filelist:
                while not os.path.exists(file):
                    time.sleep(1)
                    time_counter += 1
                    if time_counter > time_to_wait:
                        break

        print('Generating visible render of scene')
        #TODO: update this for cross-platform compatibility w/ os.path.join
        os.system("rpict -dp 256 -ar 48 -ms 1 -ds .2 -dj .9 -dt .1 "+
                  "-dc .5 -dr 1 -ss 1 -st .1 -ab 3  -aa .1 "+
                  "-ad 1536 -as 392 -av 25 25 25 -lr 8 -lw 1e-4 -vf views/"
                  +viewfile+ " " + octfile +
                  " > images/"+name+viewfile[:-3] +".hdr")

    def makeFalseColor(self, viewfile, octfile=None, name=None):
        """
        Makes a false-color plot of octfile, viewfile

        .. note::
            For Windows requires installation of falsecolor.exe,
            which is part of radwinexe-5.0.a.8-win64.zip found at
            http://www.jaloxa.eu/resources/radiance/radwinexe.shtml
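
Example (illustrative sketch, hypothetical pextrem output) of the autoscale
step in the method body: the extreme pixel values are parsed and the
falsecolor legend maximum is chosen:

```python
# pextrem prints min/max pixel coordinates with RGB values; the largest
# float found is used as the W/m2 ceiling (hypothetical sample output)
extrm_out = "0 0 0.01 0.01 0.01\n512 384 950.2 948.7 951.0"

WM2max = max(map(float, extrm_out.split()))
# keep a fixed 0-1100 W/m2 legend unless the scene is brighter
scale = 1100 if WM2max < 1100 else WM2max
print(scale)  # 1100
```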
        """
        #TODO: error checking for installation of falsecolor.exe

        if octfile is None:
            octfile = self.octfile
        if name is None:
            name = self.name

        print('Generating scene in Wm-2. This may take some time.')
        #TODO: update and test this for cross-platform compatibility using os.path.join
        cmd = "rpict -i -dp 256 -ar 48 -ms 1 -ds .2 -dj .9 -dt .1 "+\
              "-dc .5 -dr 1 -ss 1 -st .1 -ab 3  -aa .1 -ad 1536 -as 392 " +\
              "-av 25 25 25 -lr 8 -lw 1e-4 -vf views/"+viewfile + " " + octfile

        WM2_out, err = _popen(cmd, None)
        if err is not None:
            print('Error: {}'.format(err))
            return

        # determine the extreme maximum value to help with falsecolor autoscale
        extrm_out, err = _popen("pextrem", WM2_out.encode('latin1'))
        # cast the pextrem string as a float and find the max value
        WM2max = max(map(float, extrm_out.split()))
        print('Saving scene in false color')
        # auto scale false color map
        if WM2max < 1100:
            cmd = "falsecolor -l W/m2 -m 1 -s 1100 -n 11"
        else:
            cmd = "falsecolor -l W/m2 -m 1 -s %s" % (WM2max,)
        with open(os.path.join("images", "%s%s_FC.hdr" % (name, viewfile[:-3])), "w") as f:
            data, err = _popen(cmd, WM2_out.encode('latin1'), f)
            if err is not None:
                print(err)
                print('possible solution: install radwinexe binary package from '
                      'http://www.jaloxa.eu/resources/radiance/radwinexe.shtml')

    def _linePtsArray(self, linePtsDict):
        """
        Helper function to print the x, y and z values in an array format,
        just like they will show in the .csv result files.
        """
        xstart = linePtsDict['xstart']
        ystart = linePtsDict['ystart']
        zstart = linePtsDict['zstart']
        xinc = linePtsDict['xinc']
        yinc = linePtsDict['yinc']
        zinc = linePtsDict['zinc']
        sx_xinc = linePtsDict['sx_xinc']
        sx_yinc = linePtsDict['sx_yinc']
        sx_zinc = linePtsDict['sx_zinc']
        Nx = int(linePtsDict['Nx'])
        Ny = int(linePtsDict['Ny'])
        Nz = int(linePtsDict['Nz'])

        x = []
        y = []
        z = []

        for iz in range(0, Nz):
            for ix in range(0, Nx):
                for iy in range(0, Ny):
                    x.append(xstart + iy*xinc + ix*sx_xinc)
                    y.append(ystart + iy*yinc + ix*sx_yinc)
                    z.append(zstart + iy*zinc + ix*sx_zinc)

        return x, y, z

    def _linePtsMakeDict(self, linePtsDict):
        a = linePtsDict
        linepts = self._linePtsMake3D(a['xstart'], a['ystart'], a['zstart'],
                            a['xinc'], a['yinc'], a['zinc'],
                            a['sx_xinc'], a['sx_yinc'], a['sx_zinc'],
                            a['Nx'], a['Ny'], a['Nz'], a['orient'])
        return linepts

    def _linePtsMake3D(self, xstart, ystart, zstart, xinc, yinc, zinc,
                       sx_xinc, sx_yinc, sx_zinc,
                       Nx, Ny, Nz, orient):
        # create linepts text input with variable x,y,z.
        # If you don't want to iterate over a variable, inc = 0, N = 1.

        linepts = ""
        # make sure Nx, Ny, Nz are ints.
        Nx = int(Nx)
        Ny = int(Ny)
        Nz = int(Nz)

        for iz in range(0, Nz):
            for ix in range(0, Nx):
                for iy in range(0, Ny):
                    xpos = xstart + iy*xinc + ix*sx_xinc
                    ypos = ystart + iy*yinc + ix*sx_yinc
                    zpos = zstart + iy*zinc + ix*sx_zinc
                    linepts = linepts + str(xpos) + ' ' + str(ypos) + \
                          ' ' + str(zpos) + ' ' + orient + " \r"
        return linepts

    def _irrPlot(self, octfile, linepts, mytitle=None, plotflag=None,
                 accuracy='low'):
        """
        (plotdict) = _irrPlot(linepts, title, time, plotflag, accuracy)
        Irradiance plotting using rtrace.
        Pass in the linepts structure of the view along with a title string
        for the plots.

        Parameters
        ------------
        octfile : string
            Filename and extension of .oct file
        linepts :
            Output from :py:class:`bifacial_radiance.AnalysisObj._linePtsMake3D`
        mytitle : string
            Title to append to results files
        plotflag : Boolean
            Include plot of resulting irradiance
        accuracy : string
            Either 'low' (default - faster) or 'high'
            (better for low light)

        Returns
        -------
        out : dictionary
            out.x,y,z  - coordinates of point
            .r,g,b     - r,g,b values in Wm-2
            .Wm2            - equal-weight irradiance
            .mattype        - material intersected
            .title      - title passed in
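
Example (illustrative sketch, standalone, hypothetical scan values) of the
linepts text this method pipes into rtrace: one "x y z dx dy dz" ray
origin/direction per line, as produced by `_linePtsMake3D`:

```python
# build a 3-point scan looking straight down (orient '0 0 -1'),
# stepping only in y, mirroring what _linePtsMake3D produces
xstart, ystart, zstart = 0.0, -1.0, 3.0
yinc, Ny, orient = 1.0, 3, '0 0 -1'

linepts = ""
for iy in range(Ny):
    linepts += f"{xstart} {ystart + iy*yinc} {zstart} {orient} \r"

print(linepts.split('\r')[0].strip())  # 0.0 -1.0 3.0 0 0 -1
```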
4535
        """
4536
        
4537
        if mytitle is None:
2✔
4538
            #mytitle = octfile[:-4]
4539
            mytitle = f'{octfile[:-4]}_{self.name}_Row{self.rowWanted}_Module{self.modWanted}'
×
4540

4541
        if plotflag is None:
2✔
4542
            plotflag = False
×
4543

4544
        
4545
        if self.hpc :
2✔
4546
            import time
2✔
4547
            time_to_wait = 10
2✔
4548
            time_counter = 0
2✔
4549
            while not os.path.exists(octfile):
2✔
4550
                time.sleep(1)
×
4551
                time_counter += 1
×
4552
                if time_counter > time_to_wait:
×
4553
                    print('Warning: OCTFILE NOT FOUND')
×
4554
                    break
×
4555

4556
        if octfile is None:
2✔
4557
            print('Analysis aborted. octfile = None' )
×
4558
            return None
×
4559

4560
        keys = ['Wm2','x','y','z','r','g','b','mattype']
2✔
4561
        out = {key: np.empty(0) for key in keys}
2✔
4562
        #out = dict.fromkeys(['Wm2','x','y','z','r','g','b','mattype','title'])
4563
        out['title'] = mytitle
2✔
4564
        print ('Linescan in process: %s' %(mytitle))
2✔
4565
        #rtrace ambient values set for 'very accurate':
4566
        #cmd = "rtrace -i -ab 5 -aa .08 -ar 512 -ad 2048 -as 512 -h -oovs "+ octfile
4567

4568
        if accuracy == 'low':
2✔
4569
            #rtrace optimized for faster scans: (ab2, others 96 is too coarse)
4570
            cmd = "rtrace -i -ab 2 -aa .1 -ar 256 -ad 2048 -as 256 -h -oovs "+ octfile
2✔
4571
        elif accuracy == 'high':
×
4572
            #rtrace ambient values set for 'very accurate':
4573
            cmd = "rtrace -i -ab 5 -aa .08 -ar 512 -ad 2048 -as 512 -h -oovs "+ octfile
×
4574
        else:
4575
            print('_irrPlot accuracy options: "low" or "high"')
×
4576
            return({})
×
4577

4578

4579

4580
        temp_out,err = _popen(cmd,linepts.encode())
2✔
4581
        if err is not None:
2✔
4582
            if err[0:5] == 'error':
×
4583
                raise Exception(err[7:])
×
4584
            else:
4585
                print(err)
×
4586

4587
        # when file errors occur, temp_out is None, and err message is printed.
4588
        if temp_out is not None:
2✔
4589
            for line in temp_out.splitlines():
2✔
4590
                temp = line.split('\t')
2✔
4591
                out['x'] = np.append(out['x'],float(temp[0]))
2✔
4592
                out['y'] = np.append(out['y'],float(temp[1]))
2✔
4593
                out['z'] = np.append(out['z'],float(temp[2]))
2✔
4594
                out['r'] = np.append(out['r'],float(temp[3]))
2✔
4595
                out['g'] = np.append(out['g'],float(temp[4]))
2✔
4596
                out['b'] = np.append(out['b'],float(temp[5]))
2✔
4597
                out['mattype'] = np.append(out['mattype'],temp[6])
2✔
4598
                out['Wm2'] = np.append(out['Wm2'], sum([float(i) for i in temp[3:6]])/3.0)
2✔
4599

4600

4601
            if plotflag is True:
2✔
4602
                import matplotlib.pyplot as plt
×
4603
                plt.figure()
×
4604
                plt.plot(out['Wm2'])
×
4605
                plt.ylabel('Wm2 irradiance')
×
4606
                plt.xlabel('variable')
×
4607
                plt.title(mytitle)
×
4608
                plt.show()
×
4609
        else:
4610
            out = None   # return empty if error message.
×
4611

4612
        return(out)
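`rtrace -oovs` emits one tab-delimited record per scan point: x, y, z, the R/G/B irradiance values, and the intersected material. A minimal sketch of the parsing loop above, run on a fabricated two-record string (the numbers and the material name are made up for illustration):

```python
import numpy as np

def parse_rtrace(temp_out):
    """Parse tab-delimited rtrace output into arrays of coords, RGB, Wm2, material."""
    keys = ['Wm2', 'x', 'y', 'z', 'r', 'g', 'b', 'mattype']
    out = {key: np.empty(0) for key in keys}
    for line in temp_out.splitlines():
        temp = line.split('\t')
        for i, k in enumerate(['x', 'y', 'z', 'r', 'g', 'b']):
            out[k] = np.append(out[k], float(temp[i]))
        out['mattype'] = np.append(out['mattype'], temp[6])
        # equal-weight average of R, G, B gives the broadband W/m2 value
        out['Wm2'] = np.append(out['Wm2'], sum(float(i) for i in temp[3:6]) / 3.0)
    return out

sample = ("0\t0\t1.5\t300.0\t300.0\t300.0\ta10.3.a0.PVmodule.6457\n"
          "0\t0.2\t1.5\t150.0\t150.0\t150.0\ta10.3.a0.PVmodule.6457")
res = parse_rtrace(sample)
```

Repeated `np.append` on growing arrays mirrors the method above; for very long scans, accumulating into lists and converting once would be faster.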
    def _saveResults(self, data=None, reardata=None, savefile=None, RGB=False):
        """
        Function to save output from _irrPlot
        If reardata is passed in, back ratio is saved
        If data = None then only reardata is saved.

        Returns
        --------
        savefile : str
            If set to None, will write to default .csv filename in results folder.
        """

        if data is None and reardata is not None: # only rear data is passed.
            data = reardata
            reardata = None
            # run process like normal but swap labels at the end
            rearswapflag = True
        else:
            rearswapflag = False

        # derive default savefile after the swap above, so data is never None here
        if savefile is None:
            savefile = data['title'] + '.csv'

        # make savefile dataframe and set self.attributes

        if RGB:
            data_sub = {key:data[key] for key in ['x', 'y', 'z', 'mattype', 'Wm2', 'r', 'g', 'b']}
        else:
            data_sub = {key:data[key] for key in ['x', 'y', 'z', 'mattype', 'Wm2']}

        df = pd.DataFrame(data_sub)
        df = df.rename(columns={'Wm2':'Wm2Front'})

        if reardata is not None:
            df.insert(3, 'rearZ', reardata['z'])
            df.insert(5, 'rearMat', reardata['mattype'])
            df.insert(7, 'Wm2Back',  reardata['Wm2'])
            # add 1 mW/m2 to avoid divide-by-zero
            df.insert(8, 'Back/FrontRatio',  df['Wm2Back'] / (df['Wm2Front']+.001))
            df['backRatio'] = df['Back/FrontRatio']
            df['rearX'] = reardata['x']
            df['rearY'] = reardata['y']
            if RGB:
                df['rearR'] = reardata['r']
                df['rearG'] = reardata['g']
                df['rearB'] = reardata['b']

        # rename columns if only rear data was originally passed
        if rearswapflag:
            df = df.rename(columns={'Wm2Front':'Wm2Back','mattype':'rearMat'})
        # set attributes of analysis to equal columns of df
        for col in df.columns:
            setattr(self, col, np.array(df[col])) #cdeline: changed from list to np.array on 3/16/24
        # only save a subset
        df = df.drop(columns=['backRatio'], errors='ignore')
        df.to_csv(os.path.join("results", savefile), sep=',',
                  index=False)

        print('Saved: %s' % (os.path.join("results", savefile)))
        return os.path.join("results", savefile)
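The DataFrame assembly above interleaves front and rear columns and computes `Back/FrontRatio` with a 1 mW/m2 floor on the denominator so a dark front sensor cannot divide by zero. A condensed, self-contained sketch with dummy scan values:

```python
import pandas as pd

# two fabricated scan points; the second has a fully shaded front sensor
front = {'x': [0.0, 0.0], 'y': [0.0, 0.2], 'z': [1.5, 1.5],
         'mattype': ['PVmodule', 'PVmodule'], 'Wm2': [900.0, 0.0]}
rear = {'x': [0.0, 0.0], 'y': [0.0, 0.2], 'z': [1.45, 1.45],
        'mattype': ['PVmodule', 'PVmodule'], 'Wm2': [90.0, 90.0]}

df = pd.DataFrame({k: front[k] for k in ['x', 'y', 'z', 'mattype', 'Wm2']})
df = df.rename(columns={'Wm2': 'Wm2Front'})
df.insert(3, 'rearZ', rear['z'])
df.insert(5, 'rearMat', rear['mattype'])
df.insert(7, 'Wm2Back', rear['Wm2'])
# 1 mW/m2 floor on the front reading keeps the ratio finite everywhere
df.insert(8, 'Back/FrontRatio', df['Wm2Back'] / (df['Wm2Front'] + .001))
```

The positional `insert` calls reproduce the column order the method writes to the results .csv.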
    def _saveResultsCumulative(self, data, reardata=None, savefile=None):
        """
        TEMPORARY FUNCTION -- this is a fix to save ONE cumulative results csv
        in the main working folder for when doing multiple entries in a
        tracker dict.

        Returns
        --------
        savefile : str
            If set to None, will write to default .csv filename in results folder.
        """

        if savefile is None:
            savefile = data['title'] + '.csv'
        # make dataframe from results
        data_sub = {key:data[key] for key in ['x', 'y', 'z', 'Wm2', 'mattype']}
        self.x = data['x']
        self.y = data['y']
        self.z = data['z']
        self.mattype = data['mattype']
        #TODO: data_sub front values don't seem to be saved to self.
        if reardata is not None:
            self.rearX = reardata['x']
            self.rearY = reardata['y']
            self.rearMat = reardata['mattype']
            data_sub['rearMat'] = self.rearMat
            self.rearZ = reardata['z']
            data_sub['rearZ'] = self.rearZ
            self.Wm2Front = data_sub.pop('Wm2')
            data_sub['Wm2Front'] = self.Wm2Front
            self.Wm2Back = reardata['Wm2']
            data_sub['Wm2Back'] = self.Wm2Back
            self.backRatio = [x/(y+.001) for x,y in zip(reardata['Wm2'],data['Wm2'])] # add 1 mW/m2 to avoid divide-by-zero
            data_sub['Back/FrontRatio'] = self.backRatio
            df = pd.DataFrame.from_dict(data_sub)
            df.to_csv(savefile, sep = ',',
                      columns = ['x','y','z','rearZ','mattype','rearMat',
                                 'Wm2Front','Wm2Back','Back/FrontRatio'],
                                 index = False) # new in 0.2.3
        else:
            df = pd.DataFrame.from_dict(data_sub)
            df.to_csv(savefile, sep = ',',
                      columns = ['x','y','z', 'mattype','Wm2'], index = False)

        print('Saved: %s'%(savefile))
        return (savefile)
    def moduleAnalysis(self, scene, modWanted=None, rowWanted=None,
                       sensorsy=9, sensorsx=1,
                       frontsurfaceoffset=0.001, backsurfaceoffset=0.001,
                       modscanfront=None, modscanback=None, relative=False,
                       debug=False):

        """
        Handler function that decides how to handle different numbers of front
        and back sensors. If the number of front sensors is not provided or is
        the same as for the back, _moduleAnalysis
        is called only once. Else it is called twice to get the different front
        and back dictionaries.

        This function defines the scan points to be used in the
        :py:class:`~bifacial_radiance.AnalysisObj.analysis` function,
        to perform the raytrace through Radiance function `rtrace`

        Parameters
        ------------
        scene : ``SceneObj``
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
        modWanted : int
            Module wanted to sample. If none, defaults to center module (rounding down)
        rowWanted : int
            Row wanted to sample. If none, defaults to center row (rounding down)
        sensorsy : int or list
            Number of 'sensors' or scanning points along the collector width
            (CW) of the module(s). If multiple values are passed, first value
            represents number of front sensors, second value is number of back sensors
        sensorsx : int or list
            Number of 'sensors' or scanning points along the length, the side perpendicular
            to the collector width (CW) of the module(s) for the back side of the module.
            If multiple values are passed, first value represents number of
            front sensors, second value is number of back sensors.
        debug : bool
            Activates various print statements for debugging this function.
        modscanfront : dict
            Dictionary to modify the frontscan values established by this routine
            and set a specific value. Keys possible are 'xstart', 'ystart', 'zstart',
            'xinc', 'yinc', 'zinc', 'Nx', 'Ny', 'Nz', and 'orient'.
            All of these keys are ints or floats except for 'orient', which
            takes x y z values as string 'x y z', for example '0 0 -1'.
            These values will overwrite the internally calculated frontscan
            dictionary for the module & row selected.
        modscanback : dict
            Dictionary to modify the backscan values established by this routine
            and set a specific value. Keys possible are 'xstart', 'ystart', 'zstart',
            'xinc', 'yinc', 'zinc', 'Nx', 'Ny', 'Nz', and 'orient'.
            All of these keys are ints or floats except for 'orient', which
            takes x y z values as string 'x y z', for example '0 0 -1'.
            These values will overwrite the internally calculated backscan
            dictionary for the module & row selected.
        relative : Bool
            If passing modscanfront and modscanback to modify the dictionaries of
            positions, this sets whether the values passed are relative or absolute.
            Default is absolute value (relative=False)


        Returns
        -------
        frontscan : dictionary
            Scan dictionary for module's front side. Used to pass into
            :py:class:`~bifacial_radiance.AnalysisObj.analysis` function
        backscan : dictionary
            Scan dictionary for module's back side. Used to pass into
            :py:class:`~bifacial_radiance.AnalysisObj.analysis` function

        """

        # Height:  clearance height for fixed tilt systems, or torque tube
        #           height for single-axis tracked systems.
        #   Single axis tracked systems will consider the offset to calculate the final height.

        def _checkSensors(sensors):
            # Checking Sensors input data for list or tuple
            if (type(sensors)==tuple or type(sensors)==list):
                try:
                    sensors_back = sensors[1]
                    sensors_front = sensors[0]
                except IndexError: # only 1 value passed
                    sensors_back = sensors_front = sensors[0]
            elif (type(sensors)==int or type(sensors)==float):
                # Ensure sensors are positive int values.
                if int(sensors) < 1:
                    raise Exception('input sensorsy must be numeric >0')
                sensors_back = sensors_front = int(sensors)
            else:
                print('Warning: invalid value passed for sensors. Setting = 1')
                sensors_back = sensors_front = 1
            return sensors_front, sensors_back
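`_checkSensors` above accepts an int, float, tuple, or list and always normalizes it to a (front, back) pair. A standalone sketch of that normalization (the free function `check_sensors` is illustrative, not part of the library):

```python
def check_sensors(sensors):
    """Normalize a sensor count (int/float/list/tuple) to a (front, back) pair."""
    if isinstance(sensors, (tuple, list)):
        try:
            return sensors[0], sensors[1]
        except IndexError:              # only one value passed
            return sensors[0], sensors[0]
    if isinstance(sensors, (int, float)):
        if int(sensors) < 1:
            raise Exception('input sensorsy must be numeric >0')
        return int(sensors), int(sensors)
    # invalid input: fall back to a single sensor, as the method above warns and does
    return 1, 1
```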

        sensorsy_front, sensorsy_back = _checkSensors(sensorsy)
        sensorsx_front, sensorsx_back = _checkSensors(sensorsx)

        if (sensorsx_back != sensorsx_front) or (sensorsy_back != sensorsy_front):
            sensors_diff = True
        else:
            sensors_diff = False

        dtor = np.pi/180.0

        # Internal scene parameters are stored in scene.sceneDict. Load these into local variables
        sceneDict = scene.sceneDict

        azimuth = round(sceneDict['azimuth'], 2)
        tilt = round(sceneDict['tilt'], 2)
        nMods = sceneDict['nMods']
        nRows = sceneDict['nRows']
        originx = sceneDict['originx']
        originy = sceneDict['originy']

        # offset = moduleDict['offsetfromaxis']
        offset = scene.module.offsetfromaxis
        sceney = scene.module.sceney
        scenex = scene.module.scenex

        # x needed for sensorsx>1 case
        x = scene.module.x

        ## Check for proper input variables in sceneDict
        if 'pitch' in sceneDict:
            pitch = sceneDict['pitch']
        elif 'gcr' in sceneDict:
            pitch = sceney / sceneDict['gcr']
        else:
            raise Exception("Error: no 'pitch' or 'gcr' passed in sceneDict" )

        if 'axis_tilt' in sceneDict:
            axis_tilt = sceneDict['axis_tilt']
        else:
            axis_tilt = 0

        if hasattr(scene.module,'z'):
            modulez = scene.module.z
        else:
            print ("Module's z not set on sceneDict internal dictionary. Setting to default")
            modulez = 0.02

        if frontsurfaceoffset is None:
            frontsurfaceoffset = 0.001
        if backsurfaceoffset is None:
            backsurfaceoffset = 0.001

        # The Sensor routine below needs a "hub-height", not a clearance height.
        # The check below determines whether height (deprecated) is passed,
        # and if clearance_height or hub_height is passed as well.

        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict,
                                                              preferred = 'hub_height',
                                                              nonpreferred = 'clearance_height')

        if use_clearanceheight:
            height = sceneDict['clearance_height'] + 0.5* \
                np.sin(abs(tilt) * np.pi / 180) * \
                sceney - offset*np.sin(abs(tilt)*np.pi/180)
        else:
            height = sceneDict['hub_height']


        if debug:
            print("For debug:\n hub_height, Azimuth, Tilt, nMods, nRows, "
                  "Pitch, Offset, SceneY, SceneX")
            print(height, azimuth, tilt, nMods, nRows,
                  pitch, offset, sceney, scenex)

        if modWanted == 0:
            print( " FYI Modules and Rows start at index 1. "
                  "Reindexing to modWanted 1"  )
            modWanted = modWanted+1  # otherwise it gives results on Space.

        if rowWanted == 0:
            print( " FYI Modules and Rows start at index 1. "
                  "Reindexing to rowWanted 1"  )
            rowWanted = rowWanted+1

        if modWanted is None:
            modWanted = round(nMods / 1.99)
        if rowWanted is None:
            rowWanted = round(nRows / 1.99)
        self.modWanted = modWanted
        self.rowWanted = rowWanted
        if debug is True:
            print( f"Sampling: modWanted {modWanted}, rowWanted {rowWanted} "
                  f"out of {nMods} modules, {nRows} rows" )

        x0 = (modWanted-1)*scenex - (scenex*(round(nMods/1.99)*1.0-1))
        y0 = (rowWanted-1)*pitch - (pitch*(round(nRows / 1.99)*1.0-1))

        x1 = x0 * np.cos((180-azimuth)*dtor) - y0 * np.sin((180-azimuth)*dtor)
        y1 = x0 * np.sin((180-azimuth)*dtor) + y0 * np.cos((180-azimuth)*dtor)
        z1 = 0

        if axis_tilt != 0 and azimuth == 90:
            print ("fixing height for axis_tilt")
            z1 = (modWanted-1)*scenex * np.sin(axis_tilt*dtor)

        # Edge of Panel
        x2 = (sceney/2.0) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
        y2 = (sceney/2.0) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
        z2 = -(sceney/2.0) * np.sin(tilt*dtor)


        # Axis of rotation Offset (if offset is not 0) for the front of the module
        x3 = (offset + modulez + frontsurfaceoffset) * np.sin(tilt*dtor) * np.sin((azimuth)*dtor)
        y3 = (offset + modulez + frontsurfaceoffset) * np.sin(tilt*dtor) * np.cos((azimuth)*dtor)
        z3 = (offset + modulez + frontsurfaceoffset) * np.cos(tilt*dtor)

        # Axis of rotation Offset, for the back of the module
        x4 = (offset - backsurfaceoffset) * np.sin(tilt*dtor) * np.sin((azimuth)*dtor)
        y4 = (offset - backsurfaceoffset) * np.sin(tilt*dtor) * np.cos((azimuth)*dtor)
        z4 = (offset - backsurfaceoffset) * np.cos(tilt*dtor)

        xstartfront = x1 + x2 + x3 + originx
        xstartback = x1 + x2 + x4 + originx

        ystartfront = y1 + y2 + y3 + originy
        ystartback = y1 + y2 + y4 + originy

        zstartfront = height + z1 + z2 + z3
        zstartback = height + z1 + z2 + z4

        # Adjust orientation of scan depending on tilt & azimuth
        zdir = np.cos((tilt)*dtor)
        ydir = np.sin((tilt)*dtor) * np.cos((azimuth)*dtor)
        xdir = np.sin((tilt)*dtor) * np.sin((azimuth)*dtor)
        front_orient = '%0.3f %0.3f %0.3f' % (-xdir, -ydir, -zdir)
        back_orient = '%0.3f %0.3f %0.3f' % (xdir, ydir, zdir)

        # IF cellmodule:
        # TODO: Add check for sensorsx_back

        #if (getattr(scene.module, 'cellModule', None)):  #1/2 cell x and y offset to hit the center of a cell
        #    xcell = scene.module.cellModule.xcell
        #    ycell = scene.module.cellModule.ycell
        #    xstartfront = xstartfront - xcell/2 * np.cos((azimuth)*dtor) + ycell/2 * np.sin((azimuth)*dtor) * np.cos((tilt)*dtor)
        #    xstartback = xstartback  - xcell/2 * np.cos((azimuth)*dtor) + ycell/2 * np.sin((azimuth)*dtor) * np.cos((tilt)*dtor)
        #    ystartfront = ystartfront - xcell/2 * np.sin((azimuth)*dtor) + ycell/2 * np.cos((azimuth)*dtor) * np.cos((tilt)*dtor)
        #    ystartback = ystartback  - xcell/2 * np.sin((azimuth)*dtor) + ycell/2 * np.cos((azimuth)*dtor) * np.cos((tilt)*dtor)
        #    zstartfront = zstartfront +xcell/2*np.sin((tilt)*dtor)
        #    zstartback = zstartback +xcell/2*np.sin((tilt)*dtor)

        if ((getattr(scene.module, 'cellModule', None)) and
            (sensorsy_back == scene.module.cellModule.numcellsy)):
            ycell = scene.module.cellModule.ycell
            xinc_back = -((sceney - ycell) / (scene.module.cellModule.numcellsy-1)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
            yinc_back = -((sceney - ycell) / (scene.module.cellModule.numcellsy-1)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
            zinc_back = ((sceney - ycell) / (scene.module.cellModule.numcellsy-1)) * np.sin(tilt*dtor)
            firstsensorxstartfront = xstartfront - ycell/2 * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
            firstsensorxstartback = xstartback  - ycell/2 * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
            firstsensorystartfront = ystartfront - ycell/2 * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
            firstsensorystartback = ystartback - ycell/2 * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
            firstsensorzstartfront = zstartfront + ycell/2 * np.sin(tilt*dtor)
            firstsensorzstartback = zstartback + ycell/2 * np.sin(tilt*dtor)
            xinc_front = xinc_back
            yinc_front = yinc_back
            zinc_front = zinc_back

            sx_xinc_front = 0.0
            sx_yinc_front = 0.0
            sx_zinc_front = 0.0
            sx_xinc_back = 0.0
            sx_yinc_back = 0.0
            sx_zinc_back = 0.0

            if (sensorsx_back != 1.0):
                print("Warning: Cell-level module analysis for sensorsx > 1 not "+
                      "fine-tuned yet. Use at own risk, some of the x positions "+
                      "might fall in spacing between cells.")

        else:
            xinc_back = -(sceney/(sensorsy_back + 1.0)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
            yinc_back = -(sceney/(sensorsy_back + 1.0)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
            zinc_back = (sceney/(sensorsy_back + 1.0)) * np.sin(tilt*dtor)


            if sensors_diff:
                xinc_front = -(sceney/(sensorsy_front + 1.0)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
                yinc_front = -(sceney/(sensorsy_front + 1.0)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
                zinc_front = (sceney/(sensorsy_front + 1.0)) * np.sin(tilt*dtor)

            else:
                xinc_front = xinc_back
                yinc_front = yinc_back
                zinc_front = zinc_back

            firstsensorxstartfront = xstartfront+xinc_front
            firstsensorxstartback = xstartback+xinc_back
            firstsensorystartfront = ystartfront+yinc_front
            firstsensorystartback = ystartback+yinc_back
            firstsensorzstartfront = zstartfront + zinc_front
            firstsensorzstartback = zstartback + zinc_back

            ## Correct positions for sensorsx other than 1
            # TODO: At some point, these equations can include the case where
            # sensorsx = 1, and cleanup the original position calculation to place
            # firstsensorxstartback before this section on edge not on center.
            # will save some multiplications and division but well, it works :)

            if sensorsx_back > 1.0:
                sx_xinc_back = -(x/(sensorsx_back*1.0+1)) * np.cos((azimuth)*dtor)
                sx_yinc_back = (x/(sensorsx_back*1.0+1)) * np.sin((azimuth)*dtor)
                # Not needed unless axis_tilt != 0, which is not a current option
                sx_zinc_back = 0.0

                firstsensorxstartback = firstsensorxstartback + (x/2.0) * np.cos((azimuth)*dtor) + sx_xinc_back
                firstsensorystartback = firstsensorystartback - (x/2.0) * np.sin((azimuth)*dtor) + sx_yinc_back
                # firstsensorzstartback Not needed unless axis_tilt != 0, which is not a current option
                #firstsensorxstartfront = firstsensorxstartback
                #firstsensorystartfront = firstsensorystartback
            else:
                sx_xinc_back = 0.0
                sx_yinc_back = 0.0
                sx_zinc_back = 0.0

            if sensorsx_front > 1.0:
                sx_xinc_front = -(x/(sensorsx_front*1.0+1)) * np.cos((azimuth)*dtor)
                sx_yinc_front = (x/(sensorsx_front*1.0+1)) * np.sin((azimuth)*dtor)
                # Not needed unless axis_tilt != 0, which is not a current option
                sx_zinc_front = 0.0

                firstsensorxstartfront = firstsensorxstartfront + (x/2.0) * np.cos((azimuth)*dtor) + sx_xinc_front
                firstsensorystartfront = firstsensorystartfront - (x/2.0) * np.sin((azimuth)*dtor) + sx_yinc_front

                # firstsensorzstartfront Not needed unless axis_tilt != 0, which is not a current option
            else:
                sx_xinc_front = 0.0
                sx_yinc_front = 0.0
                sx_zinc_front = 0.0


        if debug is True:
            print("Azimuth", azimuth)
            print("Coordinate Center Point of Desired Panel before azm rotation", x0, y0)
            print("Coordinate Center Point of Desired Panel after azm rotation", x1, y1)
            print("Edge of Panel", x2, y2, z2)
            print("Offset Shift", x3, y3, z3)
            print("Final Start Coordinate Front", xstartfront, ystartfront, zstartfront)
            print("Increase Coordinates", xinc_front, yinc_front, zinc_front)

        frontscan = {'xstart': firstsensorxstartfront, 'ystart': firstsensorystartfront,
                     'zstart': firstsensorzstartfront,
                     'xinc':xinc_front, 'yinc': yinc_front, 'zinc':zinc_front,
                     'sx_xinc':sx_xinc_front, 'sx_yinc':sx_yinc_front,
                     'sx_zinc':sx_zinc_front,
                     'Nx': sensorsx_front, 'Ny':sensorsy_front, 'Nz':1, 'orient':front_orient }
        backscan = {'xstart':firstsensorxstartback, 'ystart': firstsensorystartback,
                     'zstart': firstsensorzstartback,
                     'xinc':xinc_back, 'yinc': yinc_back, 'zinc':zinc_back,
                     'sx_xinc':sx_xinc_back, 'sx_yinc':sx_yinc_back,
                     'sx_zinc':sx_zinc_back,
                     'Nx': sensorsx_back, 'Ny':sensorsy_back, 'Nz':1, 'orient':back_orient }

        if modscanfront is not None:
            frontscan2 = _modDict(originaldict=frontscan, moddict=modscanfront, relative=relative)
        else:
            frontscan2 = frontscan.copy()
        if modscanback is not None:
            backscan2 = _modDict(originaldict=backscan, moddict=modscanback, relative=relative)
        else:
            backscan2 = backscan.copy()

        return frontscan2, backscan2
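The module-center math in `moduleAnalysis` first places the wanted module/row on a grid centered at the array origin, then rotates that point by (180 - azimuth) degrees into scene coordinates. A self-contained sketch of just that placement and rotation (the function `module_center` is illustrative; it omits the tilt, offset, and scan-dictionary steps):

```python
import numpy as np

def module_center(modWanted, rowWanted, nMods, nRows, scenex, pitch, azimuth):
    """Return the (x, y) scene coordinate of a module center after azimuth rotation."""
    dtor = np.pi / 180.0
    # grid position relative to the center module / center row
    x0 = (modWanted - 1) * scenex - scenex * (round(nMods / 1.99) - 1)
    y0 = (rowWanted - 1) * pitch - pitch * (round(nRows / 1.99) - 1)
    # rotate by (180 - azimuth) into scene coordinates
    ang = (180 - azimuth) * dtor
    x1 = x0 * np.cos(ang) - y0 * np.sin(ang)
    y1 = x0 * np.sin(ang) + y0 * np.cos(ang)
    return x1, y1

# The center module of the center row lands on the origin for a south-facing array:
x, y = module_center(10, 2, 20, 3, scenex=2.0, pitch=5.0, azimuth=180)
```

For azimuth = 180 the rotation angle is zero, so grid coordinates pass through unchanged; other azimuths spin the whole array about the origin.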
    def groundAnalysis(self, scene, modWanted=None, rowWanted=None, 
2✔
5088
                       sensorsground=None, sensorsgroundx=1):
5089
        """
5090
        run a single ground scan along the entire row-row pitch of the scene. 
5091

5092
        Parameters
5093
        ----------
5094
        scene : ``SceneObj``
5095
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
5096
        modWanted : int
5097
            Module wanted to sample. If none, defaults to center module (rounding down)
5098
        rowWanted : int
5099
            Row wanted to sample. If none, defaults to center row (rounding down)
5100
        sensorsground : int (default None)
5101
            Number of scan points along the scene pitch.  Default every 20cm
5102
        sensorsgroundx : int (default 1)
5103
            Number of scans in the x dimension, the side perpendicular
            to the collector width (CW) of the module(s)

        Returns
        -------
        groundscan : dictionary
            Scan dictionary for the ground, including beneath the modules. Used to pass into
            the :py:class:`~bifacial_radiance.AnalysisObj.analysis` function

        """

        dtor = np.pi/180.0

        # Internal scene parameters are stored in scene.sceneDict. Load these into local variables
        sceneDict = scene.sceneDict

        azimuth = sceneDict['azimuth']
        #tilt = sceneDict['tilt']
        nMods = sceneDict['nMods']
        nRows = sceneDict['nRows']
        originx = sceneDict['originx']
        originy = sceneDict['originy']

        sceney = scene.module.sceney
        scenex = scene.module.scenex

        # x needed for sensorsx>1 case
        #x = scene.module.x

        # Check for proper input variables in sceneDict
        if 'pitch' in sceneDict:
            pitch = sceneDict['pitch']
        elif 'gcr' in sceneDict:
            pitch = sceney / sceneDict['gcr']
        else:
            raise Exception("Error: no 'pitch' or 'gcr' passed in sceneDict")

        if sensorsground is None:
            sensorsground = max(1, round(pitch * 5))  # scan every ~20 cm
        if modWanted is None:
            modWanted = round(nMods / 1.99)
        if rowWanted is None:
            rowWanted = round(nRows / 1.99)
        self.modWanted = modWanted
        self.rowWanted = rowWanted

        x0 = (modWanted-1)*scenex - (scenex*(round(nMods/1.99)*1.0-1))
        y0 = (rowWanted-1)*pitch - (pitch*(round(nRows/1.99)*1.0-1))

        x1 = x0 * np.cos((180-azimuth)*dtor) - y0 * np.sin((180-azimuth)*dtor)
        y1 = x0 * np.sin((180-azimuth)*dtor) + y0 * np.cos((180-azimuth)*dtor)

        xstart = x1 + originx
        ystart = y1 + originy
        zstart = 0.05

        ground_orient = '0 0 -1'

        # guard against a single ground sensor (avoids division by zero)
        if sensorsground > 1:
            groundsensorspacing = pitch / (sensorsground - 1)
        else:
            groundsensorspacing = 0
        xinc = groundsensorspacing * np.sin(azimuth*dtor)
        yinc = groundsensorspacing * np.cos(azimuth*dtor)
        zinc = 0

        groundscan = {'xstart': xstart, 'ystart': ystart,
                      'zstart': zstart,
                      'xinc': xinc, 'yinc': yinc, 'zinc': zinc,
                      'sx_xinc': 0, 'sx_yinc': 0,
                      'sx_zinc': 0,
                      'Nx': sensorsgroundx, 'Ny': sensorsground, 'Nz': 1,
                      'orient': ground_orient}

        return groundscan
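
    # Note on the spacing math above: pitch / (sensorsground - 1) divides one
    # full row-to-row pitch into equal steps, so the first and last ground
    # sensors land exactly one pitch apart. A standalone sketch of that
    # spacing (hypothetical values, not tied to any particular scene):
    #
    #   pitch = 5.7                       # row-to-row distance, m
    #   sensorsground = round(pitch * 5)  # ~ every 20 cm
    #   spacing = pitch / (sensorsground - 1)
    #   positions = [i * spacing for i in range(sensorsground)]
    #   # positions[0] == 0.0 and positions[-1] == pitch: the scan spans
    #   # exactly one pitch, covering the ground beneath and between rows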

    def analyzeRow(self, octfile, scene, rowWanted=None, name=None,
                   sensorsy=None, sensorsx=None):
        '''
        Function to analyze every module in the row.

        Parameters
        ----------
        octfile : string
            Filename and extension of .oct file
        scene : ``SceneObj``
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
        rowWanted : int
            Row to sample. If None, defaults to the center row (rounding down)
        sensorsy : int or list
            Number of 'sensors' or scanning points along the collector width
            (CW) of the module(s). If multiple values are passed, the first
            value is the number of front sensors and the second the number of
            back sensors
        sensorsx : int or list
            Number of 'sensors' or scanning points along the length, the side
            perpendicular to the collector width (CW) of the module(s). If
            multiple values are passed, the first value is the number of front
            sensors and the second the number of back sensors.

        Returns
        -------
        df_row : dataframe
            Dataframe with all values sampled for the row.

        '''
        #allfront = []
        #allback = []

        nMods = scene.sceneDict['nMods']

        if rowWanted is None:
            rowWanted = round(self.nRows / 1.99)

        if name is None:
            name = 'RowAnalysis_' + str(rowWanted)

        row_keys = ['x', 'y', 'z', 'rearx', 'reary', 'rearZ', 'mattype',
                    'rearMat', 'Wm2Front', 'Wm2Back', 'ModNumber']
        dict_row = dict.fromkeys(row_keys)
        df_row = pd.DataFrame(dict_row, index=[j for j in range(nMods)])

        # moduleAnalysis is 1-indexed: it does not accept 0 for modWanted or rowWanted.
        for i in range(0, nMods):
            temp_dict = {}
            frontscan, backscan = self.moduleAnalysis(scene, sensorsy=sensorsy,
                                        sensorsx=sensorsx, modWanted=i+1,
                                        rowWanted=rowWanted)
            allscan = self.analysis(octfile, name, frontscan, backscan)
            front_dict = allscan[0]
            back_dict = allscan[1]
            temp_dict['x'] = front_dict['x']
            temp_dict['y'] = front_dict['y']
            temp_dict['z'] = front_dict['z']
            temp_dict['rearx'] = back_dict['x']
            temp_dict['reary'] = back_dict['y']
            temp_dict['rearZ'] = back_dict['z']
            temp_dict['mattype'] = front_dict['mattype']
            temp_dict['rearMat'] = back_dict['mattype']
            temp_dict['Wm2Front'] = front_dict['Wm2']
            temp_dict['Wm2Back'] = back_dict['Wm2']
            temp_dict['ModNumber'] = i+1
            df_row.iloc[i] = temp_dict

        # check for path in the new Radiance directory:
        rowpath = os.path.join("results", "CompiledResults")

        def _checkPath(rowpath):  # create the file structure if it doesn't exist
            if not os.path.exists(rowpath):
                os.makedirs(rowpath)
                print('Making path for compiled results: ' + rowpath)

        _checkPath(rowpath)

        savefile = 'compiledRow_{}.csv'.format(rowWanted)

        df_row.to_csv(os.path.join(rowpath, savefile), sep=',',
                      index=False)

        return df_row
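
    # Usage sketch for analyzeRow/analyzeField (illustrative names; assumes an
    # octfile and scene already built, e.g. as in quickExample() at the end of
    # this module):
    #
    #   analysis = bifacial_radiance.AnalysisObj(octfile, demo.name)
    #   df_row = analysis.analyzeRow(octfile, scene, rowWanted=2, sensorsy=9)
    #   df_field = analysis.analyzeField(octfile, scene)
    #   # each module contributes Wm2Front/Wm2Back sample lists plus scan
    #   # coordinates; results are also written to results/CompiledResults/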

    def analyzeField(self, octfile, scene, name=None,
                     sensorsy=None, sensorsx=None):
        '''
        Function to analyze every module in every row of a scene.

        Parameters
        ----------
        octfile : string
            Filename and extension of .oct file
        scene : ``SceneObj``
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
        name : string
            Name to append to the output files. If None, defaults to
            'FieldAnalysis'
        sensorsy : int or list
            Number of 'sensors' or scanning points along the collector width
            (CW) of the module(s). If multiple values are passed, the first
            value is the number of front sensors and the second the number of
            back sensors
        sensorsx : int or list
            Number of 'sensors' or scanning points along the length, the side
            perpendicular to the collector width (CW) of the module(s). If
            multiple values are passed, the first value is the number of front
            sensors and the second the number of back sensors.

        Returns
        -------
        result : dataframe
            Dataframe with all values sampled for every module in every row,
            with a 'Row' column identifying the row.

        '''
        #allfront = []
        #allback = []

        nRows = scene.sceneDict['nRows']

        if name is None:
            name = 'FieldAnalysis'

        frames = []

        for ii in range(1, nRows+1):
            dfrow = self.analyzeRow(octfile=octfile, scene=scene, rowWanted=ii,
                                    name=name+'_Row_'+str(ii),
                                    sensorsy=sensorsy, sensorsx=sensorsx)
            dfrow['Row'] = ii
            frames.append(dfrow)

        result = pd.concat(frames)

        # check for path in the new Radiance directory:
        fieldpath = os.path.join("results", "CompiledResults")
        savefile = 'compiledField_{}.csv'.format(name)

        result.to_csv(os.path.join(fieldpath, savefile), sep=',',
                      index=False)

        return result

    def analysis(self, octfile, name, frontscan, backscan=None,
                 plotflag=False, accuracy='low', RGB=False):
        """
        General analysis function, where linepts are passed in for calling the
        raytrace routine :py:class:`~bifacial_radiance.AnalysisObj._irrPlot`
        and saved into results with
        :py:class:`~bifacial_radiance.AnalysisObj._saveResults`.

        Parameters
        ------------
        octfile : string
            Filename and extension of .oct file
        name : string
            Name to append to output files
        frontscan : scene.frontscan object
            Object with the sensor location information for the
            front of the module
        backscan : scene.backscan object (optional)
            Object with the sensor location information for the
            rear side of the module.
        plotflag : boolean
            Include plot of resulting irradiance
        accuracy : string
            Either 'low' (default - faster) or 'high' (better for low light)
        RGB : Bool
            Set RGB to True if the raytrace is a spectral raytrace and the
            information for the three channels should be saved.

        Returns
        -------
        File saved in `\\results\\irr_name.csv`

        """

        if octfile is None:
            print('Analysis aborted - no octfile \n')
            return None, None
        linepts = self._linePtsMakeDict(frontscan)
        if self.rowWanted:
            name = name + f'_Row{self.rowWanted}'
        if self.modWanted:
            name = name + f'_Module{self.modWanted}'
        frontDict = self._irrPlot(octfile, linepts, name+'_Front',
                                  plotflag=plotflag, accuracy=accuracy)

        if backscan is None:  # only one scan
            if frontDict is not None:
                self.Wm2Front = np.mean(frontDict['Wm2'])
                self._saveResults(frontDict, reardata=None, savefile='irr_%s.csv'%(name), RGB=RGB)
            return frontDict
        # bottom view.
        linepts = self._linePtsMakeDict(backscan)
        backDict = self._irrPlot(octfile, linepts, name+'_Back',
                                 plotflag=plotflag, accuracy=accuracy)

        # don't save if _irrPlot returns an empty file.
        if frontDict is not None:
            if len(frontDict['Wm2']) != len(backDict['Wm2']):
                # mismatched scan lengths: save front and back separately
                self.Wm2Front = np.mean(frontDict['Wm2'])
                self.Wm2Back = np.mean(backDict['Wm2'])
                self.backRatio = self.Wm2Back / (self.Wm2Front + .001)
                self._saveResults(frontDict, reardata=None, savefile='irr_%s.csv'%(name+'_Front'), RGB=RGB)
                self._saveResults(data=None, reardata=backDict, savefile='irr_%s.csv'%(name+'_Back'), RGB=RGB)
            else:
                self._saveResults(frontDict, backDict, 'irr_%s.csv'%(name), RGB=RGB)

        return frontDict, backDict


    def calc_performance(self, meteo_data, cumulativesky, module,
                         CECMod2=None, agriPV=False):
        """
        For a given AnalysisObj, use performance.calculateResults to calculate
        performance, considering electrical mismatch, using PVLib. Cell
        temperature is calculated as part of the performance calculation.

        Parameters
        ----------
        meteo_data : Dict
            Dictionary with meteorological data needed to run the CEC model. Keys:
            'temp_air', 'wind_speed', 'dni', 'dhi', 'ghi'
        cumulativesky : bool
            True if the irradiance results were generated with gencumulativesky,
            False if generated with gendaylit.
        module : ModuleObj from scene.module
            Requires CEC module parameters to be set. If None, defaults to Prism Solar.
        CECMod2 : Dict
            Dictionary with CEC module parameters for a monofacial module. If None,
            the same module as CECMod is used for the BGE calculations, but using
            only the front irradiance (Gfront).
        agriPV : bool
            If True, include ground irradiance results in the performance
            compilation (for agrivoltaic setups).

        Returns
        -------
        performance : dictionary with performance results for that simulation.
            Keys:
            'POA_eff': mean of [(mean of clean Gfront) + clean Grear * bifaciality factor]
            'Gfront_mean': mean of clean Gfront
            'Grear_mean': mean of clean Grear
            'Mismatch': mismatch calculated from the MAD distribution of POA_total
            'Pout_raw': power output calculated from POA_total; considers wind speed and temp_amb if in trackerdict.
            'Pout': power output considering electrical mismatch

        """

        from bifacial_radiance import performance
        from bifacial_radiance import ModuleObj

        #TODO: Check that meteo_data only includes correct kwargs
        # 'dni', 'ghi', 'dhi', 'temp_air', 'wind_speed'

        if cumulativesky is False:

            # If CECMod details aren't passed, use a default Prism Solar value.
            #if type(module) is not ModuleObj:  # not working for some reason..
            if str(type(module)) != "<class 'bifacial_radiance.module.ModuleObj'>":
                raise TypeError('ModuleObj input required for AnalysisObj.calc_performance. '+
                                f'type passed: {type(module)}')

            self.power_data = performance.calculateResults(module=module, results=self.getResults(),
                                                           CECMod2=CECMod2, agriPV=agriPV,
                                                           **meteo_data)

        else:
            # TODO HERE: SUM all keys for rows that have the same rowWanted/modWanted

            self.power_data = performance.calculateResultsGencumsky1axis(results=self.getResults(),
                                                                         agriPV=agriPV)
            #results.to_csv(os.path.join('results', 'Cumulative_Results.csv'))

        #CompiledResults = results
        #trackerdict = trackerdict


def quickExample(testfolder=None):
    """
    Example of how to run a Radiance routine for a simple rooftop bifacial system

    """

    import bifacial_radiance

    if testfolder is None:
        testfolder = bifacial_radiance.main._interactive_directory(
            title='Select or create an empty directory for the Radiance tree')

    demo = bifacial_radiance.RadianceObj('simple_panel', path=testfolder)  # Create a RadianceObj 'object'

    # input albedo number or material name like 'concrete'.
    # To see options, run setGround without any input.
    demo.setGround(0.62)
    try:
        epwfile = demo.getEPW(lat=40.01667, lon=-105.25)  # pull TMY data for any global lat/lon
    except ConnectionError:  # no connection to automatically pull data
        raise RuntimeError('quickExample needs an internet connection to download EPW weather data')

    metdata = demo.readWeatherFile(epwfile, coerce_year=2001)  # read in the EPW weather data from above
    #metdata = demo.readTMY()  # select a TMY file using graphical picker
    # Now we either choose a single time point, or use cumulativesky for the entire year.
    cumulativeSky = False
    if cumulativeSky:
        demo.genCumSky()  # entire year.
    else:
        timeindex = metdata.datetime.index(pd.to_datetime('2001-06-17 12:0:0 -7'))
        demo.gendaylit(metdata=metdata, timeindex=timeindex)  # Noon, June 17th

    # create a scene using panels in landscape at 10 deg tilt, 1.5 m pitch, 0.2 m ground clearance
    moduletype = 'test-module'
    module = demo.makeModule(name=moduletype, x=1.59, y=0.95)
    sceneDict = {'tilt': 10, 'pitch': 1.5, 'clearance_height': 0.2,
                 'azimuth': 180, 'nMods': 10, 'nRows': 3}
    # makeScene creates a .rad file with 10 modules per row, 3 rows.
    scene = demo.makeScene(module=module, sceneDict=sceneDict)
    # makeOct combines all of the ground, sky and object files into a .oct file.
    octfile = demo.makeOct(demo.getfilelist())

    # return an analysis object including the scan dimensions for back irradiance
    analysis = bifacial_radiance.AnalysisObj(octfile, demo.name)
    frontscan, backscan = analysis.moduleAnalysis(scene, sensorsy=9)
    analysis.analysis(octfile, demo.name, frontscan, backscan, accuracy='low')
    # bifacial ratio should be 11.6% +/- 0.1% (+/- 1% absolute with glass-glass module)
    print('Annual bifacial ratio average:  %0.3f' % (
            sum(analysis.Wm2Back) / sum(analysis.Wm2Front)))

    return analysis