
NREL / bifacial_radiance / 10639800187

30 Aug 2024 10:15PM UTC coverage: 73.466% (+0.05%) from 73.42%
Pull #542

cdeline
bump to pandas 1.4.4 and numba 0.58.1 in requirements.txt
3209 of 4368 relevant lines covered (73.47%)

1.47 hits per line

Source File

/bifacial_radiance/main.py: 79.59% covered
#!/usr/bin/env python

"""
@author: cdeline

bifacial_radiance.py - module to develop radiance bifacial scenes, including gendaylit and gencumulativesky
7/5/2016 - test script based on G173_journal_height
5/1/2017 - standalone module

Pre-requisites:
    This software is written for Python >3.6, leveraging many Anaconda tools (e.g. pandas, numpy, etc.)

    * RADIANCE software should be installed from https://github.com/NREL/Radiance/releases

    * If you want to use gencumulativesky, move 'gencumulativesky.exe' from
      'bifacial_radiance\data' into your RADIANCE source directory.

    * If using a Windows machine you should download the Jaloxa executables at
      http://www.jaloxa.eu/resources/radiance/radwinexe.shtml#Download

    * Installation of bifacial_radiance from the repo:
      1. Clone the repo
      2. Navigate to the directory using the command prompt
      3. Run `pip install -e .`

Overview:
    Bifacial_radiance includes several helper functions to make it easier to evaluate
    different PV system orientations for rear bifacial irradiance.
    Note that this is simply an optical model - identifying available rear irradiance under different conditions.

    For a detailed demonstration example, look at the .ipynb notebook in \docs\

    There are two solar resource modes in bifacial_radiance: `gendaylit` uses hour-by-hour solar
    resource descriptions using the Perez diffuse tilted plane model.
    `gencumulativesky` is an annual average solar resource that combines hourly
    Perez skies into one single solar source, and computes an annual average.

    bifacial_radiance includes five object-oriented classes:

    RadianceObj:  top level class to work on radiance objects, keep track of filenames,
    sky values, PV module type etc.

    GroundObj:    details for the ground surface and reflectance

    SceneObj:     scene information including array configuration (row spacing, clearance or hub height)

    MetObj:       meteorological data from EPW (EnergyPlus) file.
        Future work: include other file support including TMY files

    AnalysisObj:  Analysis class for plotting and reporting
"""
import logging
logging.basicConfig()
LOGGER = logging.getLogger(__name__)
LOGGER.setLevel(logging.DEBUG)

import os, datetime
from subprocess import Popen, PIPE  # replacement for os.system()
import pandas as pd
import numpy as np
import warnings
#from input import *

# Mutual parameters across all processes
#daydate=sys.argv[1]


global DATA_PATH # path to data files including module.json.  Global context
#DATA_PATH = os.path.abspath(pkg_resources.resource_filename('bifacial_radiance', 'data/') )
DATA_PATH = os.path.abspath(os.path.join(os.path.dirname(__file__), 'data'))

def _findme(lst, a):  # find string match in a list. script from stackexchange
    return [i for i, x in enumerate(lst) if x == a]

def _missingKeyWarning(dictype, missingkey, newvalue):  # prints warnings
    if type(newvalue) is bool:
        valueunit = ''
    else:
        valueunit = 'm'
    print("Warning: {} Dictionary Parameters passed, but {} is missing. ".format(dictype, missingkey))
    print("Setting it to default value of {} {} to continue\n".format(newvalue, valueunit))

def _normRGB(r, g, b):  # normalize by each color for human vision sensitivity
    return r*0.216 + g*0.7152 + b*0.0722

87
def _popen(cmd, data_in, data_out=PIPE):
2✔
88
    """
89
    Helper function subprocess.popen replaces os.system
90
    - gives better input/output process control
91
    usage: pass <data_in> to process <cmd> and return results
92
    based on rgbeimage.py (Thomas Bleicher 2010)
93
    """
94
    if type(cmd) == str:
2✔
95
        cmd = str(cmd) # gets rid of unicode oddities
2✔
96
        shell=True
2✔
97
    else:
98
        shell=False
2✔
99

100
    p = Popen(cmd, bufsize=-1, stdin=PIPE, stdout=data_out, stderr=PIPE, shell=shell) #shell=True required for Linux? quick fix, but may be security concern
2✔
101
    data, err = p.communicate(data_in)
2✔
102
    #if err:
103
    #    return 'message: '+err.strip()
104
    #if data:
105
    #    return data. in Python3 this is returned as `bytes` and needs to be decoded
106
    if err:
2✔
107
        if data:
2✔
108
            returntuple = (data.decode('latin1'), 'message: '+err.decode('latin1').strip())
×
109
        else:
110
            returntuple = (None, 'message: '+err.decode('latin1').strip())
2✔
111
    else:
112
        if data:
2✔
113
            returntuple = (data.decode('latin1'), None) #Py3 requires decoding
2✔
114
        else:
115
            returntuple = (None, None)
2✔
116

117
    return returntuple
2✔
118

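A standalone sketch of the `_popen` pattern above: pipe optional input to a child process, then decode the captured stdout/stderr into a `(data, err)` tuple. `run` is a hypothetical name, and the Python interpreter itself is used as the command so the example works anywhere:

```python
import sys
from subprocess import Popen, PIPE

def run(cmd, data_in=None):
    # same communicate-and-decode idea as _popen above (simplified)
    p = Popen(cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE, shell=False)
    data, err = p.communicate(data_in)
    return (data.decode('latin1') if data else None,
            'message: ' + err.decode('latin1').strip() if err else None)

out, err = run([sys.executable, '-c', 'print(6*7)'])
print(out.strip())  # 42
print(err)          # None
```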
def _interactive_load(title=None):
    # Tkinter file picker
    import tkinter
    from tkinter import filedialog
    root = tkinter.Tk()
    root.withdraw()  # Start interactive file input
    root.attributes("-topmost", True)  # Bring window into foreground
    return filedialog.askopenfilename(parent=root, title=title)  # initialdir = data_dir

def _interactive_directory(title=None):
    # Tkinter directory picker.  Now Py3.6 compliant!
    import tkinter
    from tkinter import filedialog
    root = tkinter.Tk()
    root.withdraw()  # Start interactive file input
    root.attributes("-topmost", True)  # Bring to front
    return filedialog.askdirectory(parent=root, title=title)

def _modDict(originaldict, moddict, relative=False):
    '''
    Compares keys in originaldict with moddict and updates values of
    originaldict to moddict if existing.

    Parameters
    ----------
    originaldict : dictionary
        Original dictionary calculated, for example frontscan or backscan dictionaries.
    moddict : dictionary
        Modified dictionary, for example modscan['xstart'] = 0 to change position of x.
    relative : Bool
        If passing modscanfront and modscanback to modify dictionaries of positions,
        this sets whether the values passed to be updated are relative or absolute.
        Default is absolute value (relative=False)

    Returns
    -------
    originaldict : dictionary
        Updated original dictionary with values from moddict.
    '''
    newdict = originaldict.copy()

    for key in moddict:
        try:
            if relative:
                newdict[key] = moddict[key] + newdict[key]
            else:
                newdict[key] = moddict[key]
        except KeyError:
            print("Wrong key in modified dictionary")

    return newdict

def _heightCasesSwitcher(sceneDict, preferred='hub_height', nonpreferred='clearance_height',
                         suppress_warning=False):
    """
    Parameters
    ----------
    sceneDict : dictionary
        Dictionary that might contain more than one way of defining height for
        the array: `clearance_height`, `hub_height`, `height`*
        *`height` is deprecated from sceneDict. This function helps choose
        which definition to use.
    preferred : str, optional
        When sceneDict has hub_height and clearance_height, or it only has height,
        it will leave only the preferred option. The default is 'hub_height'.
    nonpreferred : str, optional
        When sceneDict has hub_height and clearance_height,
        it will delete this nonpreferred option. The default is 'clearance_height'.
    suppress_warning : Bool, default False
        If both heights are passed in sceneDict, suppress the warning.

    Returns
    -------
    sceneDict : dictionary
        Dictionary now containing the appropriate definition for system height.
    use_clearanceheight : Bool
        Helper variable to specify if the dictionary has only clearance_height, for
        use inside `makeScene1axis`. Will get deprecated once that internal
        function is streamlined.

    """
    # TODO: When we update to python 3.9.0, this could be a Switch Case (Structural Pattern Matching):

    heightCases = '_'
    if 'height' in sceneDict:
        heightCases = heightCases+'height__'
    if 'clearance_height' in sceneDict:
        heightCases = heightCases+'clearance_height__'
    if 'hub_height' in sceneDict:
        heightCases = heightCases+'hub_height__'

    use_clearanceheight = False
    # CASES:
    if heightCases == '_height__':
        print("sceneDict Warning: 'height' is being deprecated. "+
              "Renaming as "+preferred)
        sceneDict[preferred] = sceneDict['height']
        del sceneDict['height']

    elif heightCases == '_clearance_height__':
        #print("Using clearance_height.")
        use_clearanceheight = True

    elif heightCases == '_hub_height__':
        #print("Using hub_height.")
        pass

    elif heightCases == '_height__clearance_height__':
        print("sceneDict Warning: 'clearance_height' and 'height' "+
              "(deprecated) are being passed. Removing 'height' "+
              "from sceneDict for this tracking routine")
        del sceneDict['height']
        use_clearanceheight = True

    elif heightCases == '_height__hub_height__':
        print("sceneDict Warning: 'height' is being deprecated. Using 'hub_height'")
        del sceneDict['height']

    elif heightCases == '_height__clearance_height__hub_height__':
        print("sceneDict Warning: 'hub_height', 'clearance_height'"+
              ", and 'height' are being passed. Removing 'height'"+
              " (deprecated) and "+ nonpreferred + ", using "+preferred)
        del sceneDict['height']
        del sceneDict[nonpreferred]

    elif heightCases == '_clearance_height__hub_height__':
        if not suppress_warning:
            print("sceneDict Warning: 'hub_height' and 'clearance_height'"+
                  " are being passed. Using "+preferred+
                  " and removing "+ nonpreferred)
        del sceneDict[nonpreferred]

    else:
        print("sceneDict Error! No argument in sceneDict found "+
              "for 'hub_height', 'height' nor 'clearance_height'. "+
              "Exiting routine.")

    return sceneDict, use_clearanceheight

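The case logic above concatenates whichever height keys are present into a marker string and switches on it. A condensed, standalone reimplementation for illustration (`resolve_height` is a hypothetical name; warning prints are omitted):

```python
def resolve_height(sceneDict, preferred='hub_height', nonpreferred='clearance_height'):
    # build the same case key _heightCasesSwitcher uses: one token per key present
    case = '_' + ''.join(k + '__' for k in ('height', 'clearance_height', 'hub_height')
                         if k in sceneDict)
    d = dict(sceneDict)  # work on a copy
    if case == '_height__':                       # deprecated name: rename to preferred
        d[preferred] = d.pop('height')
    elif case == '_height__hub_height__':         # drop the deprecated duplicate
        d.pop('height')
    elif case == '_height__clearance_height__':   # keep clearance_height only
        d.pop('height')
    elif case == '_height__clearance_height__hub_height__':
        d.pop('height')
        d.pop(nonpreferred)
    elif case == '_clearance_height__hub_height__':
        d.pop(nonpreferred)                       # both given: keep preferred
    return d

print(resolve_height({'height': 2.5}))  # {'hub_height': 2.5}
print(resolve_height({'hub_height': 2.0, 'clearance_height': 0.8}))  # {'hub_height': 2.0}
```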
def _is_leap_and_29Feb(s):  # mask of Feb. 29 entries falling in a leap year
    return (s.index.year % 4 == 0) & \
           ((s.index.year % 100 != 0) | (s.index.year % 400 == 0)) & \
           (s.index.month == 2) & (s.index.day == 29)

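The same leap-day test, sketched for a single `datetime.date` with the standard library (the helper above applies the equivalent vectorized test to a pandas index; `is_leap_and_29feb` is a hypothetical name):

```python
import calendar
import datetime

def is_leap_and_29feb(d):
    # True only for Feb. 29 of a leap year; calendar.isleap applies the same
    # divisible-by-4 / century / divisible-by-400 rule as the mask above
    return calendar.isleap(d.year) and d.month == 2 and d.day == 29

print(is_leap_and_29feb(datetime.date(2020, 2, 29)))  # True
print(is_leap_and_29feb(datetime.date(2021, 2, 28)))  # False
```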
def _subhourlydatatoGencumskyformat(gencumskydata, label='right'):
    # Subroutine to resample, pad, remove leap year and get data in the
    # 8760 hourly format
    # for saving the temporary files for gencumsky in _saveTempTMY and
    # _makeTrackerCSV

    # Resample to hourly. Gencumsky wants right-labeled data.
    try:
        gencumskydata = gencumskydata.resample('60min', closed='right', label='right').mean()
    except TypeError:  # Pandas 2.0 error
        gencumskydata = gencumskydata.resample('60min', closed='right', label='right').mean(numeric_only=True)

    if label == 'left':  # switch from left to right labeled by adding an hour
        gencumskydata.index = gencumskydata.index + pd.to_timedelta('1H')

    # Padding
    tzinfo = gencumskydata.index.tzinfo
    padstart = pd.to_datetime('%s-%s-%s %s:%s' % (gencumskydata.index.year[0], 1, 1, 1, 0)).tz_localize(tzinfo)
    padend = pd.to_datetime('%s-%s-%s %s:%s' % (gencumskydata.index.year[0]+1, 1, 1, 0, 0)).tz_localize(tzinfo)
    gencumskydata.iloc[0] = 0   # set first datapt to zero to forward fill w zeros
    gencumskydata.iloc[-1] = 0  # set last datapt to zero to forward fill w zeros
    # check if index exists. I'm sure there is a way to do this backwards.
    if any(gencumskydata.index.isin([padstart])):
        print("Data starts on Jan. 01")
    else:
        #gencumskydata=gencumskydata.append(pd.DataFrame(index=[padstart]))
        gencumskydata = pd.concat([gencumskydata, pd.DataFrame(index=[padstart])])
    if any(gencumskydata.index.isin([padend])):
        print("Data ends on Dec. 31st")
    else:
        #gencumskydata=gencumskydata.append(pd.DataFrame(index=[padend]))
        gencumskydata = pd.concat([gencumskydata, pd.DataFrame(index=[padend])])
    gencumskydata.loc[padstart] = 0
    gencumskydata.loc[padend] = 0
    gencumskydata = gencumskydata.sort_index()
    # Fill empty timestamps with zeros
    gencumskydata = gencumskydata.resample('60min').asfreq().fillna(0)
    # Mask leap year
    leapmask = ~(_is_leap_and_29Feb(gencumskydata))
    gencumskydata = gencumskydata[leapmask]

    if (gencumskydata.index.year[-1] == gencumskydata.index.year[-2]+1) and len(gencumskydata) > 8760:
        gencumskydata = gencumskydata[:-1]
    return gencumskydata
    # end _subhourlydatatoGencumskyformat

def _checkRaypath():
    # Ensure that os.environ['RAYPATH'] exists and contains current directory '.'
    if os.name == 'nt':
        splitter = ';'
    else:
        splitter = ':'
    try:
        raypath = os.getenv('RAYPATH', default=None)
        if not raypath:
            raise KeyError()
        raysplit = raypath.split(splitter)
        if not '.' in raysplit:
            os.environ['RAYPATH'] = splitter.join(filter(None, raysplit + ['.'+splitter]))
    except (KeyError, AttributeError, TypeError):
        raise Exception('No RAYPATH set for RADIANCE.  Please check your RADIANCE installation.')

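The path-appending step in `_checkRaypath` can be sketched on a plain string, without touching the environment. `append_cwd` is a hypothetical name; it drops empty entries and appends `'.'` only when missing, the same idea as above (the `;` vs `:` splitter choice mirrors the `os.name` check):

```python
def append_cwd(raypath, splitter=':'):
    # split a RAYPATH-style string; append '.' only if not already present
    raysplit = raypath.split(splitter)
    if '.' not in raysplit:
        raysplit = [p for p in raysplit if p] + ['.']
    return splitter.join(raysplit)

print(append_cwd('/usr/local/lib/ray'))    # /usr/local/lib/ray:.
print(append_cwd('/usr/local/lib/ray:.'))  # unchanged: /usr/local/lib/ray:.
```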
class RadianceObj:
    """
    The RadianceObj top level class is used to work on radiance objects,
    keep track of filenames, sky values, PV module configuration, etc.

    Parameters
    ----------
    name : text to append to output files
    filelist : list of Radiance files to create oconv
    nowstr : current date/time string
    path : working directory with Radiance materials and objects

    Methods
    -------
    __init__ : initialize the object
    _setPath : change the working directory

    """
    def __repr__(self):
        return str(self.__dict__)

    def __init__(self, name=None, path=None, hpc=False):
        '''
        Initialize RadianceObj with path of Radiance materials and objects,
        as well as a basename to append to output files.

        Parameters
        ----------
        name : string, append temporary and output files with this value
        path : location of Radiance materials and objects
        hpc : keeps track of whether the user is running the simulation on HPC,
              so some file reading routines try reading a bit longer and some
              writing routines (makeModule) that overwrite themselves are inactivated.

        Returns
        -------
        none
        '''

        self.metdata = {}          # data from epw met file
        self.data = {}             # data stored at each timestep
        self.path = ""             # path of working directory
        self.name = ""             # basename to append
        #self.filelist = []        # list of files to include in the oconv
        self.materialfiles = []    # material files for oconv
        self.skyfiles = []         # skyfiles for oconv
        self.radfiles = []         # scene rad files for oconv
        self.octfile = []          # octfile name for analysis
        self.Wm2Front = 0          # cumulative tabulation of front W/m2
        self.Wm2Back = 0           # cumulative tabulation of rear W/m2
        self.backRatio = 0         # ratio of rear / front Wm2
        #self.nMods = None         # number of modules per row
        #self.nRows = None         # number of rows per scene
        self.hpc = hpc             # HPC simulation is being run. Some read/write functions are modified

        now = datetime.datetime.now()
        self.nowstr = str(now.date())+'_'+str(now.hour)+str(now.minute)+str(now.second)
        _checkRaypath()       # make sure we have RADIANCE path set up correctly

        # DEFAULTS

        if name is None:
            self.name = self.nowstr  # set default filename for output files
        else:
            self.name = name
        self.basename = name  # add backwards compatibility for prior versions
        #self.__name__ = self.name  #optional info
        #self.__str__ = self.__name__   #optional info
        if path is None:
            self._setPath(os.getcwd())
        else:
            self._setPath(path)
        # load files in the /materials/ directory
        self.materialfiles = self.returnMaterialFiles('materials')

    def _setPath(self, path):
        """
        setPath - move path and working directory

        """
        self.path = os.path.abspath(path)

        print('path = ' + path)
        try:
            os.chdir(self.path)
        except OSError as exc:
            LOGGER.error("Path doesn't exist: %s" % (path))
            LOGGER.exception(exc)
            raise(exc)

        # check for path in the new Radiance directory:
        def _checkPath(path):  # create the file structure if it doesn't exist
            if not os.path.exists(path):
                os.makedirs(path)
                print('Making path: '+path)

        _checkPath('images'); _checkPath('objects')
        _checkPath('results'); _checkPath('skies'); _checkPath('EPWs')
        # if materials directory doesn't exist, populate it with ground.rad
        # figure out where pip installed support files.
        from shutil import copy2

        if not os.path.exists('materials'):  # copy ground.rad to /materials
            os.makedirs('materials')
            print('Making path: materials')

            copy2(os.path.join(DATA_PATH, 'ground.rad'), 'materials')
        # if views directory doesn't exist, create it with two default views - side.vp and front.vp
        if not os.path.exists('views'):
            os.makedirs('views')
            with open(os.path.join('views', 'side.vp'), 'w') as f:
                f.write('rvu -vtv -vp -10 1.5 3 -vd 1.581 0 -0.519234 '+
                        '-vu 0 0 1 -vh 45 -vv 45 -vo 0 -va 0 -vs 0 -vl 0')
            with open(os.path.join('views', 'front.vp'), 'w') as f:
                f.write('rvu -vtv -vp 0 -3 5 -vd 0 0.894427 -0.894427 '+
                        '-vu 0 0 1 -vh 45 -vv 45 -vo 0 -va 0 -vs 0 -vl 0')

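The directory scaffolding in `_setPath` can be exercised standalone inside a temporary directory, so it is side-effect free (`check_path` mirrors the nested `_checkPath` helper above):

```python
import os
import tempfile

def check_path(path):
    # create the directory if it doesn't exist -- same as _checkPath above
    if not os.path.exists(path):
        os.makedirs(path)
        print('Making path: ' + path)

with tempfile.TemporaryDirectory() as tmp:
    for sub in ('images', 'objects', 'results', 'skies', 'EPWs', 'materials'):
        check_path(os.path.join(tmp, sub))
    made = sorted(os.listdir(tmp))
print(made)  # ['EPWs', 'images', 'materials', 'objects', 'results', 'skies']
```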
    def getfilelist(self):
        """
        Return concat of matfiles, radfiles and skyfiles
        """

        return self.materialfiles + self.skyfiles + self.radfiles

    def save(self, savefile=None):
        """
        Pickle the radiance object for further use.
        Very basic operation - not much use right now.

        Parameters
        ----------
        savefile : str
            Optional savefile name, with .pickle extension.
            Otherwise default to save.pickle

        """

        import pickle

        if savefile is None:
            savefile = 'save.pickle'

        with open(savefile, 'wb') as f:
            pickle.dump(self, f)
        print('Saved to file {}'.format(savefile))

    #def setHPC(self, hpc=True):
    #    self.hpc = hpc

    def addMaterial(self, material, Rrefl, Grefl, Brefl, materialtype='plastic',
                    specularity=0, roughness=0, material_file=None, comment=None, rewrite=True):
        """
        Function to add a material in Radiance format.

        Parameters
        ----------
        material : str
            Name of the material to add.
        Rrefl : str
            Reflectivity for first wavelength, or 'R' bin.
        Grefl : str
            Reflectivity for second wavelength, or 'G' bin.
        Brefl : str
            Reflectivity for third wavelength, or 'B' bin.
        materialtype : str, optional
            Type of material. The default is 'plastic'. Others can be mirror,
            trans, etc. See RADIANCE documentation.
        specularity : str, optional
            Ratio of reflection that is specular and not diffuse. The default is 0.
        roughness : str, optional
            This is the microscopic surface roughness: the more jagged the
            facets are, the rougher it is and more blurry reflections will appear.
        material_file : str, optional
            Material file to write to. The default is None, which uses ``ground.rad``.
        comment : str, optional
            Optional comment written above the material entry. The default is None.
        rewrite : bool, optional
            Overwrite the material if it already exists. The default is True.

        Returns
        -------
        None. Just adds the material to the material_file specified or the
        default in ``materials\ground.rad``.

        References:
            See examples of documentation for more materialtype details.
            http://www.jaloxa.eu/resources/radiance/documentation/docs/radiance_tutorial.pdf page 10

            Also, you can use https://www.jaloxa.eu/resources/radiance/colour_picker.shtml
            to have a sense of how the material would look with the RGB values as
            well as specularity and roughness.

            To understand more on reflectivity, specularity and roughness values
            https://thinkmoult.com/radiance-specularity-and-roughness-value-examples.html

        """
        if material_file is None:
            material_file = 'ground.rad'

        matfile = os.path.join('materials', material_file)

        with open(matfile, 'r') as fp:
            buffer = fp.readlines()

        # search buffer for material matching requested addition
        found = False
        for i in buffer:
            if materialtype and material in i:
                loc = buffer.index(i)
                found = True
                break
        if found:
            if rewrite:
                print('Material exists, overwriting...\n')
                if comment is None:
                    pre = loc - 1
                else:
                    pre = loc - 2
                # commit buffer without material match
                with open(matfile, 'w') as fp:
                    for i in buffer[0:pre]:
                        fp.write(i)
                    for i in buffer[loc+4:]:
                        fp.write(i)
        if (found and rewrite) or (not found):
            # append -- this will create the file if it doesn't exist
            file_object = open(matfile, 'a')
            file_object.write("\n\n")
            if comment is not None:
                file_object.write("#{}".format(comment))
            file_object.write("\nvoid {} {}".format(materialtype, material))
            if materialtype == 'glass' or materialtype == 'mirror':
                file_object.write("\n0\n0\n3 {} {} {}".format(Rrefl, Grefl, Brefl))
            else:
                file_object.write("\n0\n0\n5 {} {} {} {} {}".format(Rrefl, Grefl, Brefl, specularity, roughness))
            file_object.close()
            print('Added material {} to file {}'.format(material, material_file))
        if (found and not rewrite):
            print('Material already exists\n')

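The record `addMaterial` appends follows Radiance's primitive layout: `void <type> <name>` followed by argument-count blocks. A minimal standalone formatter showing the same layout (`format_material` is a hypothetical name and the litesoil RGB values are illustrative):

```python
def format_material(material, materialtype, Rrefl, Grefl, Brefl,
                    specularity=0, roughness=0):
    # mirrors addMaterial's write calls: glass/mirror take 3 reals,
    # other types take 5 (RGB reflectance + specularity + roughness)
    record = "void {} {}".format(materialtype, material)
    if materialtype in ('glass', 'mirror'):
        record += "\n0\n0\n3 {} {} {}".format(Rrefl, Grefl, Brefl)
    else:
        record += "\n0\n0\n5 {} {} {} {} {}".format(Rrefl, Grefl, Brefl,
                                                    specularity, roughness)
    return record

print(format_material('litesoil', 'plastic', 0.29, 0.187, 0.163))
# void plastic litesoil
# 0
# 0
# 5 0.29 0.187 0.163 0 0
```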
    def exportTrackerDict(self, trackerdict=None,
                          savefile=None, reindex=None):
        """
        Use :py:func:`~bifacial_radiance.load._exportTrackerDict` to save a
        TrackerDict output as a csv file.

        Parameters
        ----------
            trackerdict
                The tracker dictionary to save
            savefile : str
                path to .csv save file location
            reindex : bool
                True saves the trackerdict in TMY format, including rows for hours
                where there is no sun/irradiance results (empty)

        """

        import bifacial_radiance.load

        if trackerdict is None:
            trackerdict = self.trackerdict

        if savefile is None:
            savefile = _interactive_load(title='Select a .csv file to save to')

        if reindex is None:
            if self.cumulativesky is True:
                # don't re-index for cumulativesky,
                # which has angles for index
                reindex = False
            else:
                reindex = True

        if self.cumulativesky is True and reindex is True:
            # don't re-index for cumulativesky,
            # which has angles for index
            print("\n Warning: For cumulativesky simulations, exporting the "
                  "TrackerDict requires reindex = False. Setting reindex = "
                  "False and proceeding")
            reindex = False

        bifacial_radiance.load._exportTrackerDict(trackerdict,
                                                  savefile,
                                                  reindex)

    def loadtrackerdict(self, trackerdict=None, fileprefix=None):
        """
        Use :py:class:`bifacial_radiance.load._loadtrackerdict`
        to browse the results directory and load back any results saved in there.

        Parameters
        ----------
        trackerdict
        fileprefix : str

        """
        from bifacial_radiance.load import loadTrackerDict
        if trackerdict is None:
            trackerdict = self.trackerdict
        (trackerdict, totaldict) = loadTrackerDict(trackerdict, fileprefix)
        self.Wm2Front = totaldict['Wm2Front']
        self.Wm2Back = totaldict['Wm2Back']

    def returnOctFiles(self):
        """
        Return files in the root directory with `.oct` extension

        Returns
        -------
        oct_files : list
            List of .oct files

        """
        oct_files = [f for f in os.listdir(self.path) if f.endswith('.oct')]
        #self.oct_files = oct_files
        return oct_files

    def returnMaterialFiles(self, material_path=None):
        """
        Return files in the Materials directory with .rad extension
        appends materials files to the oconv file list

        Parameters
        ----------
        material_path : str
            Optional parameter to point to a specific materials directory.
            otherwise /materials/ is default

        Returns
        -------
        material_files : list
            List of .rad files

        """

        if material_path is None:
            material_path = 'materials'

        material_files = [f for f in os.listdir(os.path.join(self.path,
                                                             material_path)) if f.endswith('.rad')]

        materialfilelist = [os.path.join(material_path, f) for f in material_files]
        self.materialfiles = materialfilelist
        return materialfilelist

    def setGround(self, material=None, material_file=None):
        """
        Use GroundObj constructor class and return a ground object

        Parameters
        ------------
        material : numeric or str
            If a number between 0 and 1 is passed, albedo input is assumed and assigned.
            If a string is passed with the name of the material desired, e.g. 'litesoil',
            properties are searched in `material_file`.
            Default material names to choose from: litesoil, concrete, white_EPDM,
            beigeroof, beigeroof_lite, beigeroof_heavy, black, asphalt
        material_file : str
            Filename of the material information. Default `ground.rad`

        Returns
        -------
        self.ground : tuple
            self.ground.normval : numeric
                Normalized color value
            self.ground.ReflAvg : numeric
                Average reflectance
        """

        if material is None:
            try:
                if self.metdata.albedo is not None:
                    material = self.metdata.albedo
                    print(" Assigned Albedo from metdata.albedo")
            except AttributeError:
                pass

        self.ground = GroundObj(material, material_file)

711
    def getEPW(self, lat=None, lon=None, GetAll=False):
        """
        Subroutine to download the EPW file closest to the latitude and
        longitude provided, into the directory \\EPWs\\
        based on github/aahoo.

        .. warning::
            verify=False is required to operate within NREL's network.
            To avoid annoying warnings, InsecureRequestWarning is disabled.
            Currently this function is not working within NREL's network.

        Parameters
        ----------
        lat : decimal
            Latitude value to find closest EPW file.
        lon : decimal
            Longitude value to find closest EPW file.
        GetAll : boolean
            Download all available files. Note that no EPW file will be
            loaded into memory.

        """

        import requests, re
        from requests.packages.urllib3.exceptions import InsecureRequestWarning
        requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
        hdr = {'User-Agent' : "Magic Browser",
               'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'
               }

        path_to_save = 'EPWs' # create a directory and write the name of directory here
        if not os.path.exists(path_to_save):
            os.makedirs(path_to_save)

        def _returnEPWnames():
            ''' return a dataframe with the name, lat, lon, url of available files'''
            r = requests.get('https://github.com/NREL/EnergyPlus/raw/develop/weather/master.geojson', verify=False)
            data = r.json()  # metadata for available files
            # download lat/lon and url details for each .epw file into a dataframe
            df = pd.DataFrame({'url':[], 'lat':[], 'lon':[], 'name':[]})
            for location in data['features']:
                match = re.search(r'href=[\'"]?([^\'" >]+)', location['properties']['epw'])
                if match:
                    url = match.group(1)
                    name = url[url.rfind('/') + 1:]
                    lontemp = location['geometry']['coordinates'][0]
                    lattemp = location['geometry']['coordinates'][1]
                    dftemp = pd.DataFrame({'url':[url], 'lat':[lattemp], 'lon':[lontemp], 'name':[name]})
                    df = pd.concat([df, dftemp], ignore_index=True)
            return df

        def _findClosestEPW(lat, lon, df):
            # locate the record with the nearest lat/lon
            errorvec = np.sqrt(np.square(df.lat - lat) + np.square(df.lon - lon))
            index = errorvec.idxmin()
            url = df['url'][index]
            name = df['name'][index]
            return url, name

        def _downloadEPWfile(url, path_to_save, name):
            r = requests.get(url, verify=False, headers=hdr)
            if r.ok:
                filename = os.path.join(path_to_save, name)
                # py2 and 3 compatible: binary write, encode text first
                with open(filename, 'wb') as f:
                    f.write(r.text.encode('ascii', 'ignore'))
                print(' ... OK!')
            else:
                print(' connection error status code: %s' % (r.status_code))
                r.raise_for_status()

        # Get the list of EPW filenames and lat/lon
        df = _returnEPWnames()

        # find the closest EPW file to the given lat/lon
        if (lat is not None) & (lon is not None) & (GetAll is False):
            url, name = _findClosestEPW(lat, lon, df)

            # download the EPW file to the local drive.
            print('Getting weather file: ' + name)
            _downloadEPWfile(url, path_to_save, name)
            self.epwfile = os.path.join('EPWs', name)

        elif GetAll is True:
            if input('Downloading ALL EPW files available. OK? [y/n]') == 'y':
                # get all of the EPW files
                for index, row in df.iterrows():
                    print('Getting weather file: ' + row['name'])
                    _downloadEPWfile(row['url'], path_to_save, row['name'])
            self.epwfile = None
        else:
            print('Nothing returned. Proper usage: epwfile = getEPW(lat, lon)')
            self.epwfile = None

        return self.epwfile

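    # Usage sketch (editor's illustration, not part of the original source):
    #     epwfile = demo.getEPW(lat=39.74, lon=-105.17)
    # downloads the closest EPW file into ./EPWs/ and stores its path in
    # demo.epwfile for later use by readWeatherFile().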
    def readWeatherFile(self, weatherFile=None, starttime=None,
                        endtime=None, label=None, source=None,
                        coerce_year=None, tz_convert_val=None):
        """
        Read either an EPW or a TMY file, calling
        :py:class:`~bifacial_radiance.readTMY` or
        :py:class:`~bifacial_radiance.readEPW`
        according to the weather file extension.

        Parameters
        ----------
        weatherFile : str
            File containing the weather information. EPW, TMY or SolarGIS accepted.
        starttime : str
            Limited start time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
        endtime : str
            Limited end time option in 'YYYY-mm-dd_HHMM' or 'mm_dd_HH' format
        daydate : str  DEPRECATED
            For single day in 'MM/DD' or MM_DD format.  Now use starttime and
            endtime set to the same date.
        label : str
            'left', 'right', or 'center'. For data that is averaged, defines if
            the timestamp refers to the left edge, the right edge, or the
            center of the averaging interval, for purposes of calculating
            sun position. For example, TMY3 data is right-labeled, so 11 AM data
            represents data from 10 to 11, and sun position is calculated
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval
            data and NSRDB uses centered.
        source : str
            To help identify different types of .csv files. If None, it assumes
            TMY3-style formatted data. Current options: 'TMY3',
            'solargis', 'EPW'
        coerce_year : int
            Year to coerce weather data to, in YYYY format, e.g. 2021.
            If there is more than one year of data in the weather file, the
            year is NOT coerced.
        tz_convert_val : int
            Convert timezone to this fixed value, following ISO standard
            (negative values indicating West of UTC.)
        """
        #from datetime import datetime
        import warnings

        if weatherFile is None:
            if hasattr(self, 'epwfile'):
                weatherFile = self.epwfile
            else:
                try:
                    weatherFile = _interactive_load('Select EPW or TMY3 climate file')
                except Exception:
                    raise Exception('Interactive load failed. Tkinter not supported '
                                    'on this system. Try installing X-Quartz and reloading')
        if coerce_year is not None:
            coerce_year = int(coerce_year)
            if str(coerce_year).__len__() != 4:
                warnings.warn('Incorrect coerce_year. Setting to None')
                coerce_year = None

        def _parseTimes(t, hour, coerce_year):
            '''
            parse time input t, which can be a string in mm_dd_HH or
            YYYY-mm-dd_HHMM format, or a datetime.datetime object.  Return a
            pd.Timestamp.  Use `hour` as the hour if not passed directly.
            '''
            import re

            if type(t) == str:
                try:
                    tsplit = re.split('-|_| ', t)

                    # mm_dd format
                    if tsplit.__len__() == 2 and t.__len__() == 5:
                        if coerce_year is None:
                            coerce_year = 2021  # default year.
                        tsplit.insert(0, str(coerce_year))
                        tsplit.append(str(hour).rjust(2, '0')+'00')

                    # mm_dd_hh or YYYY_mm_dd format
                    elif tsplit.__len__() == 3:
                        if tsplit[0].__len__() == 2:
                            if coerce_year is None:
                                coerce_year = 2021  # default year.
                            tsplit.insert(0, str(coerce_year))
                        elif tsplit[0].__len__() == 4:
                            tsplit.append(str(hour).rjust(2, '0')+'00')

                    # YYYY-mm-dd_HHMM format
                    if tsplit.__len__() == 4 and tsplit[0].__len__() == 4:
                        t_out = pd.to_datetime(''.join(tsplit).ljust(12, '0'))

                    else:
                        raise Exception(f'incorrect time string passed {t}. '
                                        'Valid options: mm_dd, mm_dd_HH, '
                                        'mm_dd_HHMM, YYYY-mm-dd_HHMM')
                except Exception as e:
                    # Error for incorrect string passed:
                    raise e
            else:  # datetime or timestamp
                try:
                    t_out = pd.to_datetime(t)
                except pd.errors.ParserError:
                    print('incorrect time object passed.  Valid options: '
                          'string or datetime.datetime or pd.timeIndex. You '
                          f'passed {type(t)}.')
            return t_out, coerce_year
        # end _parseTimes

        def _tz_convert(metdata, metadata, tz_convert_val):
            """
            convert metdata to a different local timezone.  Particularly for
            SolarGIS weather files, which are returned in UTC by default.
            ----------
            tz_convert_val : int
                Convert timezone to this fixed value, following ISO standard
                (negative values indicating West of UTC.)
            Returns: metdata, metadata
            """
            import pytz
            if (type(tz_convert_val) == int) | (type(tz_convert_val) == float):
                metadata['TZ'] = tz_convert_val
                metdata = metdata.tz_convert(pytz.FixedOffset(tz_convert_val*60))
            return metdata, metadata
        # end _tz_convert

        if source is None:

            if weatherFile[-3:].lower() == 'epw':
                source = 'EPW'
            else:
                print('Warning: CSV file passed for input. Assuming it is '
                      'TMY3-style format')
                source = 'TMY3'
            if label is None:
                label = 'right'  # EPW and TMY are by default right-labeled.

        if source.lower() == 'solargis':
            if label is None:
                label = 'center'
            metdata, metadata = self._readSOLARGIS(weatherFile, label=label)

        if source.lower() == 'epw':
            metdata, metadata = self._readEPW(weatherFile, label=label)

        if source.lower() == 'tmy3':
            metdata, metadata = self._readTMY(weatherFile, label=label)

        metdata, metadata = _tz_convert(metdata, metadata, tz_convert_val)
        tzinfo = metdata.index.tzinfo
        tempMetDatatitle = 'metdata_temp.csv'

        # Parse the start and endtime strings.
        if starttime is not None:
            starttime, coerce_year = _parseTimes(starttime, 1, coerce_year)
            starttime = starttime.tz_localize(tzinfo)
        if endtime is not None:
            endtime, coerce_year = _parseTimes(endtime, 23, coerce_year)
            endtime = endtime.tz_localize(tzinfo)
        '''
        #TODO: do we really need this check?
        if coerce_year is not None and starttime is not None:
            if coerce_year != starttime.year or coerce_year != endtime.year:
                print("Warning: Coerce year does not match requested sampled "+
                      "date(s)'s years. Setting Coerce year to None.")
                coerce_year = None
        '''

        tmydata_trunc = self._saveTempTMY(metdata, filename=tempMetDatatitle,
                                          starttime=starttime, endtime=endtime,
                                          coerce_year=coerce_year,
                                          label=label)

        if tmydata_trunc.__len__() > 0:
            self.metdata = MetObj(tmydata_trunc, metadata, label=label)
        else:
            self.metdata = None
            raise Exception('Weather file returned zero points for the '
                            'starttime / endtime provided')

        return self.metdata

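    # Usage sketch (editor's illustration, not part of the original source):
    # restrict a simulation to a single day using the mm_dd_HH time format:
    #     metdata = demo.readWeatherFile(epwfile, starttime='06_21_05',
    #                                    endtime='06_21_20', coerce_year=2021)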
    def _saveTempTMY(self, tmydata, filename=None, starttime=None, endtime=None,
                     coerce_year=None, label=None):
        '''
        private function to save part or all of tmydata into /EPWs/ for use
        in gencumsky -G mode and return truncated tmydata. Gencumsky 8760
        starts with Jan 1, 1AM and ends Dec 31, 2400

        starttime:  tz-localized pd.TimeIndex
        endtime:    tz-localized pd.TimeIndex

        returns: tmydata_truncated  : subset of tmydata based on start & end
        '''

        if filename is None:
            filename = 'temp.csv'

        gencumskydata = None
        gencumdict = None
        if len(tmydata) == 8760:
            print("8760 lines in WeatherFile. Assuming this is a standard hourly"
                  " WeatherFile for the year for purposes of saving Gencumulativesky"
                  " temporary weather files in EPW folder.")
            if coerce_year is None and starttime is not None:
                coerce_year = starttime.year

            elif coerce_year is None and len(tmydata.index[:-1].year.unique()) > 1:
                coerce_year = 2021

            if coerce_year:
                print(f"Coercing year to {coerce_year}")
                tz = tmydata.index.tz
                year_vector = np.full(shape=len(tmydata), fill_value=coerce_year)
                year_vector[-1] = coerce_year + 1
                tmydata.index = pd.to_datetime({
                                    'year': year_vector,
                                    'month': tmydata.index.month,
                                    'day': tmydata.index.day,
                                    'hour': tmydata.index.hour})

                tmydata = tmydata.tz_localize(tz)

            # FilterDates
            filterdates = None
            if starttime is not None and endtime is not None:
                filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
            else:
                if starttime is not None:
                    filterdates = (tmydata.index >= starttime)
                if endtime is not None:
                    filterdates = (tmydata.index <= endtime)

            if filterdates is not None:
                print("Filtering dates")
                tmydata[~filterdates] = 0

            gencumskydata = tmydata.copy()

        else:
            if len(tmydata.index.year.unique()) == 1:
                if coerce_year:
                    # TODO: check why subhourly data still has 0 entries on the next day on _readTMY3
                    # in the meantime, let's make Silvana's life easy by just deleting 0 entries
                    tmydata = tmydata[~(tmydata.index.hour == 0)]
                    print(f"Coercing year to {coerce_year}")
                    # TODO: this coercing shows a python warning. Turn it off or find another method? bleh.
                    tmydata.index.values[:] = tmydata.index[:] + pd.DateOffset(year=(coerce_year))

                # FilterDates
                filterdates = None
                if starttime is not None and endtime is not None:
                    filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
                else:
                    if starttime is not None:
                        filterdates = (tmydata.index >= starttime)
                    if endtime is not None:
                        filterdates = (tmydata.index <= endtime)

                if filterdates is not None:
                    print("Filtering dates")
                    tmydata[~filterdates] = 0

                gencumskydata = tmydata.copy()
                gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata,
                                                                label=label)

            else:
                if coerce_year:
                    print("More than 1 year of data identified. Can't do coercing")

                # Check if years are consecutive
                l = list(tmydata.index.year.unique())
                if l != list(range(min(l), max(l)+1)):
                    print("Years are not consecutive. Won't be able to use Gencumsky"
                          " because who knows what's going on with this data.")
                else:
                    print("Years are consecutive. For Gencumsky, make sure to select"
                          " which yearly temporary weather file you want to use,"
                          " else they will all get accumulated to the same hour/day")

                    # FilterDates
                    filterdates = None
                    if starttime is not None and endtime is not None:
                        filterdates = (tmydata.index >= starttime) & (tmydata.index <= endtime)
                    else:
                        if starttime is not None:
                            filterdates = (tmydata.index >= starttime)
                        if endtime is not None:
                            filterdates = (tmydata.index <= endtime)

                    if filterdates is not None:
                        print("Filtering dates")
                        tmydata = tmydata[filterdates]  # Reducing years potentially

                    # Check if filtering reduced the data to just 1 year, to do the usual saving.
                    if len(tmydata.index.year.unique()) == 1:
                        gencumskydata = tmydata.copy()
                        gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata,
                                                                        label=label)

                    else:
                        gencumdict = [g for n, g in tmydata.groupby(pd.Grouper(freq='Y'))]

                        for ii in range(0, len(gencumdict)):
                            gencumskydata = gencumdict[ii]
                            gencumskydata = _subhourlydatatoGencumskyformat(gencumskydata,
                                                                            label=label)
                            gencumdict[ii] = gencumskydata

                        gencumskydata = None  # clearing so that the dictionary style can be activated.

        # Let's save files in EPWs folder for Gencumsky
        if gencumskydata is not None:
            csvfile = os.path.join('EPWs', filename)
            print('Saving file {}, # points: {}'.format(csvfile, gencumskydata.__len__()))
            gencumskydata.to_csv(csvfile, index=False, header=False, sep=' ', columns=['GHI', 'DHI'])
            self.gencumsky_metfile = csvfile

        if gencumdict is not None:
            self.gencumsky_metfile = []
            for ii in range(0, len(gencumdict)):
                gencumskydata = gencumdict[ii]
                newfilename = filename.split('.')[0]+'_year_'+str(ii)+'.csv'
                csvfile = os.path.join('EPWs', newfilename)
                print('Saving file {}, # points: {}'.format(csvfile, gencumskydata.__len__()))
                gencumskydata.to_csv(csvfile, index=False, header=False, sep=' ', columns=['GHI', 'DHI'])
                self.gencumsky_metfile.append(csvfile)

        return tmydata

    def _readTMY(self, tmyfile=None, label='right', coerce_year=None):
        '''
        use pvlib to read in a tmy3 file.
        Note: pvlib 0.7 does not currently support sub-hourly files. Until
        then, use _readTMYdate() to create the index

        Parameters
        ------------
        tmyfile : str
            Filename of tmy3 to be read with pvlib.tmy.readtmy3
        label : str
            'left', 'right', or 'center'. For data that is averaged, defines if
            the timestamp refers to the left edge, the right edge, or the
            center of the averaging interval, for purposes of calculating
            sun position. For example, TMY3 data is right-labeled, so 11 AM data
            represents data from 10 to 11, and sun position is calculated
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval
            data and NSRDB uses centered.
        coerce_year : int
            Year to coerce to. Default is 2021.

        Returns
        -------
        metdata - MetObj collected from TMY3 file
        '''
        def _convertTMYdate(data, meta):
            ''' requires pvlib 0.8, updated to handle subhourly timestamps '''
            # get the date column as a pd.Series of numpy datetime64
            data_ymd = pd.to_datetime(data['Date (MM/DD/YYYY)'])
            # shift the time column so that midnite is 00:00 instead of 24:00
            shifted_hour = data['Time (HH:MM)'].str[:2].astype(int) % 24
            minute = data['Time (HH:MM)'].str[3:].astype(int)
            # shift the dates at midnite so they correspond to the next day
            data_ymd[shifted_hour == 0] += datetime.timedelta(days=1)
            # NOTE: as of pandas>=0.24 the pd.Series.array has a month attribute, but
            # in pandas-0.18.1, only DatetimeIndex has month, but indices are immutable
            # so we need to continue to work with the pandas series of dates `data_ymd`
            data_index = pd.DatetimeIndex(data_ymd)
            # use indices to check for a leap day and advance it to March 1st
            leapday = (data_index.month == 2) & (data_index.day == 29)
            data_ymd[leapday] += datetime.timedelta(days=1)
            # shifted_hour is a pd.Series, so use pd.to_timedelta to get a pd.Series of
            # timedeltas
            # NOTE: as of pvlib-0.6.3, min req is pandas-0.18.1, so pd.to_timedelta
            # unit must be in (D,h,m,s,ms,us,ns), but pandas>=0.24 allows unit='hour'
            data.index = (data_ymd + pd.to_timedelta(shifted_hour, unit='h') +
                          pd.to_timedelta(minute, unit='min'))

            data = data.tz_localize(int(meta['TZ'] * 3600))

            return data

        import pvlib
        #(tmydata, metadata) = pvlib.tmy.readtmy3(filename=tmyfile) #pvlib<=0.6
        try:
            (tmydata, metadata) = pvlib.iotools.tmy.read_tmy3(filename=tmyfile,
                                                              coerce_year=coerce_year,
                                                              map_variables=True)
        except TypeError:  # pvlib < 0.10
            (tmydata, metadata) = pvlib.iotools.tmy.read_tmy3(filename=tmyfile,
                                                              coerce_year=coerce_year)

        try:
            tmydata = _convertTMYdate(tmydata, metadata)
        except KeyError:
            print('PVLib >= 0.8.0 is required for sub-hourly data input')

        tmydata.rename(columns={'dni': 'DNI',
                                'dhi': 'DHI',
                                'temp_air': 'DryBulb',
                                'wind_speed': 'Wspd',
                                'ghi': 'GHI',
                                'albedo': 'Alb'
                                }, inplace=True)  # as of v0.11, pvlib changed tmy3 column names.

        return tmydata, metadata

    def _readEPW(self, epwfile=None, label='right', coerce_year=None):
        """
        Uses read_epw from pvlib>0.6.1, but undoes the -1 hr offset and
        renames columns to match TMY3: DNI, DHI, GHI, DryBulb, Wspd

        Parameters
        ------------
        epwfile : str
            Directory and filename of the epwfile. If None, opens an interactive
            loading window.
        label : str
            'left', 'right', or 'center'. For data that is averaged, defines if
            the timestamp refers to the left edge, the right edge, or the
            center of the averaging interval, for purposes of calculating
            sun position. For example, TMY3 data is right-labeled, so 11 AM data
            represents data from 10 to 11, and sun position is calculated
            at 10:30 AM.  Currently SAM and PVSyst use left-labeled interval
            data and NSRDB uses centered.
        coerce_year : int
            Year to coerce data to.

        """

        import pvlib
        #import re

        '''
        NOTE: In pvlib > 0.6.1 the new epw.read_epw() function reads in time
        with a default -1 hour offset.  This is reflected in our existing
        workflow.
        '''
        #(tmydata, metadata) = readepw(epwfile)
        (tmydata, metadata) = pvlib.iotools.epw.read_epw(epwfile,
                                                         coerce_year=coerce_year)  # pvlib>0.6.1
        # pvlib uses a -1hr offset that needs to be un-done.
        tmydata.index = tmydata.index + pd.Timedelta(hours=1)

        # rename different field parameters to match output from
        # pvlib.tmy.readtmy: DNI, DHI, DryBulb, Wspd
        tmydata.rename(columns={'dni': 'DNI',
                                'dhi': 'DHI',
                                'temp_air': 'DryBulb',
                                'wind_speed': 'Wspd',
                                'ghi': 'GHI',
                                'albedo': 'Alb'
                                }, inplace=True)

        return tmydata, metadata

    def _readSOLARGIS(self, filename=None, label='center'):
        """
        Read a SolarGIS data file, which is timestamped in UTC, and
        rename columns to match TMY3: DNI, DHI, GHI, DryBulb, Wspd.
        Timezone is always returned as UTC. Use tz_convert in readWeatherFile
        to manually convert to local time.

        Parameters
        ------------
        filename : str
            filename of the SolarGIS file.
        label : str
            'left', 'right', or 'center'. For data that is averaged, defines if
            the timestamp refers to the left edge, the right edge, or the
            center of the averaging interval. SolarGIS default style is center,
            unless user requests a right label.

        """
        # file format: anything preceded by # is in the header
        header = []; lat = None; lon = None; elev = None; name = None
        with open(filename, 'r') as result:
            for line in result:
                if line.startswith('#'):
                    header.append(line)
                    if line.startswith('#Latitude:'):
                        lat = line[11:]
                    if line.startswith('#Longitude:'):
                        lon = line[12:]
                    if line.startswith('#Elevation:'):
                        elev = line[12:17]
                    if line.startswith('#Site name:'):
                        name = line[12:-1]
                else:
                    break
        metadata = {'latitude': float(lat),
                    'longitude': float(lon),
                    'altitude': float(elev),
                    'Name': name,
                    'TZ': 0.0}
        # read in remainder of data
        data = pd.read_csv(filename, skiprows=header.__len__(), delimiter=';')

        # rename different field parameters to match output from
        # pvlib.tmy.readtmy: DNI, DHI, DryBulb, Wspd
        data.rename(columns={'DIF': 'DHI',
                             'TEMP': 'DryBulb',
                             'WS': 'Wspd',
                             }, inplace=True)

        # Generate index from Date (DD.MM.YYYY) and Time
        data.index = pd.to_datetime(data.Date + ' ' + data.Time,
                                    dayfirst=True, utc=True,
                                    infer_datetime_format=True)

        return data, metadata

    def getSingleTimestampTrackerAngle(self, metdata, timeindex, gcr=None,
                                       azimuth=180, axis_tilt=0,
                                       limit_angle=45, backtrack=True):
        """
        Helper function to calculate a tracker's angle for use with the
        fixed-tilt routines of bifacial_radiance. It calculates the tracker
        angle for the sun position at the timeindex passed (no left or right
        time offset, label = 'center').

        Parameters
        ----------
        metdata : :py:class:`~bifacial_radiance.MetObj`
            Meteorological object to set up geometry. Usually set automatically by
            `bifacial_radiance` after running :py:class:`bifacial_radiance.readepw`.
            Default = self.metdata
        timeindex : int
            Index between 0 and 8760 indicating the hour to simulate.
        gcr : float
            Ground coverage ratio for the backtracking calculation.
            Default [1.0/3.0]
        azimuth : float or int
            Orientation axis of the tracker torque tube. Default North-South
            (180 deg)
        axis_tilt : float or int
            Default 0. Axis tilt -- not implemented in sensor locations, so
            changing it has no effect in this release.
        limit_angle : float or int
            Limit angle (+/-) of the 1-axis tracker in degrees. Default 45
        backtrack : boolean
            Whether backtracking is enabled (default = True)

        """

        import pvlib

        solpos = metdata.solpos.iloc[timeindex]
        sunzen = float(solpos.apparent_zenith)
        sunaz = float(solpos.azimuth)  # not subtracting the 180

        trackingdata = pvlib.tracking.singleaxis(sunzen, sunaz,
                                                 axis_tilt, azimuth,
                                                 limit_angle, backtrack, gcr)

        tracker_theta = float(np.round(trackingdata['tracker_theta'], 2))
        tracker_theta = tracker_theta*-1  # bifacial_radiance uses East (morning) theta as positive

        return tracker_theta

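The sign flip at the end of getSingleTimestampTrackerAngle can be illustrated without pvlib: for the N-S axis with azimuth=180 used here, pvlib's `singleaxis` reports a morning (east-facing) rotation as negative, while bifacial_radiance treats East/morning tilt as positive. A sketch with a hypothetical morning value (`to_bifacial_radiance_theta` is an illustrative helper, not library code):

```python
def to_bifacial_radiance_theta(pvlib_tracker_theta):
    # Round to 2 decimals and flip the sign: bifacial_radiance uses
    # East (morning) theta as positive, the opposite of pvlib here.
    return round(pvlib_tracker_theta, 2) * -1

# hypothetical pvlib output for a morning hour on a N-S tracker:
print(to_bifacial_radiance_theta(-43.2171))  # 43.22
```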
    def gendaylit(self, timeindex, metdata=None, debug=False):
        """
        Sets and returns sky information using gendaylit.
        Uses PVLib to calculate the sun position angles instead of
        Radiance's internal sun-position calculation.

        Parameters
        ----------
        timeindex : int
            Index from 0 to ~4000 of the MetObj (daylight hours only)
        metdata : ``MetObj``
            MetObj object with list of dni, dhi, ghi and location
        debug : bool
            Flag to print output of sky DHI and DNI

        Returns
        -------
        skyname : str
            Saves the sky filename to self.skyfiles and returns the filename
            of the sky in the /skies/ directory. If errors exist, such as
            DNI = 0 or sun below the horizon, this skyname is None.

        """
        import warnings

        if metdata is None:
            try:
                metdata = self.metdata
            except AttributeError:
                print('usage: pass metdata, or run after running '
                      'readWeatherFile()')
                return

        ground = self.ground

        locName = metdata.city
        dni = metdata.dni[timeindex]
        dhi = metdata.dhi[timeindex]
        ghi = metdata.ghi[timeindex]
        elev = metdata.elevation
        lat = metdata.latitude
        lon = metdata.longitude

        # Assign albedos
        try:
            if ground.ReflAvg.shape == metdata.dni.shape:
                groundindex = timeindex
            elif self.ground.ReflAvg.shape[0] == 1:  # just 1 entry
                groundindex = 0
            else:
                warnings.warn("Shape of ground albedos and TMY data do not match.")
                return
        except AttributeError:
            print('usage: make sure to run setGround() before gendaylit()')
            return

        if debug is True:
            print('Sky generated with gendaylit, with DNI: %0.1f, DHI: %0.1f' % (dni, dhi))
            print("Datetime TimeIndex", metdata.datetime[timeindex])

        # No time conversion needed: the data are already sunrise/sunset
        # corrected and offset by the appropriate interval.

        # get solar position zenith and azimuth based on site metadata
        solpos = metdata.solpos.iloc[timeindex]
        sunalt = float(solpos.elevation)
        # Radiance expects azimuth South = 0; PVLib gives South = 180.
        # Must subtract 180 to match.
        sunaz = float(solpos.azimuth) - 180.0

        sky_path = 'skies'

        if dhi <= 0:
            self.skyfiles = [None]
            return None
        # We should already be filtering for elevation > 0. But just in case...
        if sunalt <= 0:
            sunalt = np.arcsin((ghi-dhi)/(dni+.001))*180/np.pi  # reverse-engineer elevation from ghi, dhi, dni
            print('Warning: negative sun elevation at '
                  '{}.  '.format(metdata.datetime[timeindex]) +
                  'Re-calculated elevation: {:0.2}'.format(sunalt))

        # Note: the -W and -O 1 options are used to create a full-spectrum
        # analysis in units of W/m^2
        skyStr = ("# start of sky definition for daylighting studies\n"
                  "# location name: " + str(locName) + " LAT: " + str(lat) +
                  " LON: " + str(lon) + " Elev: " + str(elev) + "\n"
                  "# Sun position calculated w. PVLib\n"
                  "!gendaylit -ang %s %s" % (sunalt, sunaz) +
                  " -W %s %s -g %s -O 1 \n" % (dni, dhi, ground.ReflAvg[groundindex]) +
                  "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n"
                  "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" +
                  ground._makeGroundString(index=groundindex, cumulativesky=False))

        time = metdata.datetime[timeindex]
        filename = time.strftime('%Y-%m-%d_%H%M')
        skyname = os.path.join(sky_path, "sky2_%s_%s_%s.rad" % (lat, lon, filename))

        with open(skyname, 'w') as skyFile:
            skyFile.write(skyStr)

        self.skyfiles = [skyname]

        return skyname

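The coordinate conversion and the `!gendaylit` call embedded in `skyStr` can be sketched in isolation: PVLib reports azimuth with North = 0 / South = 180, while Radiance's `-ang` option expects South = 0, hence the -180 offset. The `gendaylit_line` helper and the numeric values are hypothetical:

```python
def gendaylit_line(sunalt, pvlib_azimuth, dni, dhi, albedo):
    # Radiance expects azimuth South = 0; PVLib gives South = 180.
    sunaz = pvlib_azimuth - 180.0
    # -W takes direct-normal and diffuse-horizontal irradiance (W/m^2);
    # -O 1 requests full-spectrum output in W/m^2.
    return "!gendaylit -ang %s %s -W %s %s -g %s -O 1" % (
        sunalt, sunaz, dni, dhi, albedo)

line = gendaylit_line(35.0, 135.0, 700, 120, 0.2)
print(line)  # !gendaylit -ang 35.0 -45.0 -W 700 120 -g 0.2 -O 1
```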
    def gendaylit2manual(self, dni, dhi, sunalt, sunaz):
        """
        Sets and returns sky information using gendaylit.
        Uses user-provided data for sun position and irradiance.

        .. warning::
            This generates the sky at the sun altitude & azimuth provided;
            make sure it is the right position relative to how the weather
            data were created and read (i.e. label right, left or center).


        Parameters
        ------------
        dni: int or float
           Direct Normal Irradiance (DNI) value, in W/m^2
        dhi : int or float
           Diffuse Horizontal Irradiance (DHI) value, in W/m^2
        sunalt : int or float
           Sun altitude (degrees)
        sunaz : int or float
           Sun azimuth (degrees)

        Returns
        -------
        skyname : string
           Filename of the sky in the /skies/ directory
        """

        print('Sky generated with gendaylit2manual, with DNI: %0.1f, DHI: %0.1f' % (dni, dhi))

        sky_path = 'skies'

        if sunalt <= 0 or dhi <= 0:
            self.skyfiles = [None]
            return None

        # Assign albedos
        try:
            if self.ground.ReflAvg.shape[0] == 1:  # just 1 entry
                groundindex = 0
            else:
                print("Ambiguous albedo entry; set albedo to a single value "
                      "in setGround()")
                return
        except AttributeError:
            print('usage: make sure to run setGround() before gendaylit2manual()')
            return

        # Note: the -W and -O 1 options are used to create a full-spectrum
        # analysis in units of W/m^2
        skyStr = ("# start of sky definition for daylighting studies\n"
                  "# Manual inputs of DNI, DHI, SunAlt and SunAz into gendaylit used\n"
                  "!gendaylit -ang %s %s" % (sunalt, sunaz) +
                  " -W %s %s -g %s -O 1 \n" % (dni, dhi, self.ground.ReflAvg[groundindex]) +
                  "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n"
                  "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" +
                  self.ground._makeGroundString(index=groundindex, cumulativesky=False))

        skyname = os.path.join(sky_path, "sky2_%s.rad" % (self.name))

        with open(skyname, 'w') as skyFile:
            skyFile.write(skyStr)

        self.skyfiles = [skyname]

        return skyname

    def genCumSky(self, gencumsky_metfile=None, savefile=None):
        """
        Generate a skydome using gencumulativesky.

        .. warning::
            gencumulativesky.exe is required to be installed, and is not part
            of the standard Radiance distribution. You can find the program
            in the bifacial_radiance distribution directory, in
            \\Lib\\site-packages\\bifacial_radiance\\data


        Use :func:`readWeatherFile(filename, starttime='YYYY-mm-dd_HHMM', endtime='YYYY-mm-dd_HHMM')`
        to limit gencumsky simulations instead.

        Parameters
        ------------
        gencumsky_metfile : str
            Filename with path to the temporary meteorological file, usually
            created in the EPWs folder. This csv file has no headers, no
            index, and two space-separated columns with values for GHI and
            DNI for each hour of the year, and MUST be 8760 entries long or
            gencumulativesky.exe will fail.
        savefile : string
            If savefile is None, defaults to "cumulative"

        Returns
        --------
        skyname : str
            Filename of the .rad file containing the cumulative sky info

        """

        # TODO: error checking and auto-install of gencumulativesky.exe
        # TODO: add a check for whether readWeatherFile has not been run
        # TODO: check if it fails if the gcc module has been loaded (common hpc issue)

        if gencumsky_metfile is None:
            gencumsky_metfile = self.gencumsky_metfile
            if isinstance(gencumsky_metfile, str):
                print("Loaded ", gencumsky_metfile)

        if isinstance(gencumsky_metfile, list):
            print("There is more than one year of gencumsky temporal weather file saved. "
                  "You can pass the file you want with the gencumsky_metfile input. Since "
                  "no year was selected, defaulting to the first year of the list.")
            gencumsky_metfile = gencumsky_metfile[0]
            print("Loaded ", gencumsky_metfile)


        if savefile is None:
            savefile = "cumulative"
        sky_path = 'skies'
        lat = self.metdata.latitude
        lon = self.metdata.longitude
        timeZone = self.metdata.timezone
        cmd = (f"gencumulativesky +s1 -h 0 -a {lat} -o {lon} -m "
               f"{float(timeZone)*15} -G {gencumsky_metfile}")

        with open(savefile + ".cal", "w") as f:
            _, err = _popen(cmd, None, f)
            if err is not None:
                print(err)

        # Assign albedos
        try:
            groundstring = self.ground._makeGroundString(cumulativesky=True)
        except AttributeError:
            raise Exception('Error: ground reflection not defined.  '
                            'Run RadianceObj.setGround() first')


        skyStr = ("#Cumulative Sky Definition\n"
                  "void brightfunc skyfunc\n"
                  "2 skybright " + "%s.cal\n" % (savefile) +
                  "0\n"
                  "0\n"
                  "\nskyfunc glow sky_glow\n"
                  "0\n"
                  "0\n"
                  "4 1 1 1 0\n"
                  "\nsky_glow source sky\n"
                  "0\n"
                  "0\n"
                  "4 0 0 1 180\n" +
                  groundstring)

        skyname = os.path.join(sky_path, savefile + ".rad")

        with open(skyname, 'w') as skyFile:
            skyFile.write(skyStr)

        self.skyfiles = [skyname]  # , 'SunFile.rad'

        return skyname

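The gencumulativesky invocation above boils down to one formatted command string: the `-m` standard meridian is the time zone in hours times 15 degrees, and `-G` points at the two-column GHI/DNI metfile. A sketch of that assembly with a hypothetical file name (`gencumsky_cmd` is an illustrative helper, not library code):

```python
def gencumsky_cmd(lat, lon, timezone_hours, metfile):
    # -a/-o: site latitude/longitude; -m: standard meridian in degrees
    # (timezone * 15); -G: space-separated GHI/DNI file, 8760 rows.
    return (f"gencumulativesky +s1 -h 0 -a {lat} -o {lon} -m "
            f"{float(timezone_hours)*15} -G {metfile}")

cmd = gencumsky_cmd(37.5, -77.6, -5, 'EPWs/metfile_temp.csv')
print(cmd)  # gencumulativesky +s1 -h 0 -a 37.5 -o -77.6 -m -75.0 -G EPWs/metfile_temp.csv
```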
    def set1axis(self, metdata=None, azimuth=180, limit_angle=45,
                 angledelta=5, backtrack=True, gcr=1.0 / 3, cumulativesky=True,
                 fixed_tilt_angle=None, useMeasuredTrackerAngle=False,
                 axis_azimuth=None):
        """
        Set up geometry for 1-axis tracking.  Pull in tracking angle details from
        pvlib, create multiple 8760 metdata sub-files where the datetime of the
        met data matches the tracking angle.  Returns 'trackerdict', which has
        keys equal to either the tracker angles (gencumsky workflow) or
        timestamps (gendaylit hourly workflow).

        Parameters
        ------------
        metdata : :py:class:`~bifacial_radiance.MetObj`
            Meteorological object to set up geometry. Usually set automatically by
            `bifacial_radiance` after running :py:class:`bifacial_radiance.readepw`.
            Default = self.metdata
        azimuth : numeric
            Orientation axis of the tracker torque tube. Default North-South (180 deg).
            For a fixed-tilt configuration, input is the fixed azimuth (180 is south).
        limit_angle : numeric
            Limit angle (+/-) of the 1-axis tracker in degrees. Default 45
        angledelta : numeric
            Degree of rotation increment to parse irradiance bins. Default 5 degrees
            (0.4% error for DNI).  Other options: 4 (0.25%), 2.5 (0.1%).
            Note: the smaller the angledelta, the more simulations must be run.
        backtrack : bool
            Whether backtracking is enabled (default = True)
        gcr : float
            Ground coverage ratio for the backtracking calculation. Default [1.0/3.0]
        cumulativesky : bool
            [True] Whether individual csv files are created with constant tilt
            angle for the cumulativesky approach.
            If False, the gendaylit tracking approach must be used.
        fixed_tilt_angle : numeric
            If passed, this changes to a fixed-tilt simulation where each hour
            uses fixed_tilt_angle and azimuth as the tilt and azimuth
        useMeasuredTrackerAngle : bool
            If True, and data for tracker angles has been passed by being included
            in the WeatherFile object (column name 'Tracker Angle (degrees)'),
            then tracker angles will be set to these values instead of being
            calculated. NOTE that the value for azimuth passed to set1axis must
            be the surface azimuth in the morning and not the axis_azimuth
            (i.e. for a N-S HSAT, azimuth = 90).
        axis_azimuth : numeric
            DEPRECATED. Returns a deprecation warning. Pass the tracker
            axis_azimuth through the azimuth input instead.


        Returns
        -------
        trackerdict : dictionary
            Keys represent tracker tilt angles (gencumsky) or timestamps (gendaylit)
            and list of csv metfile, and datetimes at that angle
            trackerdict[angle]['csvfile';'surf_azm';'surf_tilt';'UTCtime']
            - or -
            trackerdict[time]['tracker_theta';'surf_azm';'surf_tilt']
        """

        # Documentation check:
        # Removed         Internal variables
        # -------
        # metdata.solpos          dataframe with solar position data
        # metdata.surface_azimuth list of tracker azimuth data
        # metdata.surface_tilt    list of tracker surface tilt data
        # metdata.tracker_theta   list of tracker tilt angle
        import warnings

        if metdata is None:
            metdata = self.metdata

        if metdata == {}:
            raise Exception("metdata doesn't exist yet.  "
                            "Run RadianceObj.readWeatherFile() ")

        if axis_azimuth:
            azimuth = axis_azimuth
            warnings.warn("axis_azimuth is deprecated in set1axis; use the "
                          "azimuth input instead.", DeprecationWarning)

        # get 1-axis tracker angles for this location, rounded to nearest 'angledelta'
        trackerdict = metdata._set1axis(cumulativesky=cumulativesky,
                                        azimuth=azimuth,
                                        limit_angle=limit_angle,
                                        angledelta=angledelta,
                                        backtrack=backtrack,
                                        gcr=gcr,
                                        fixed_tilt_angle=fixed_tilt_angle,
                                        useMeasuredTrackerAngle=useMeasuredTrackerAngle
                                        )
        self.trackerdict = trackerdict
        self.cumulativesky = cumulativesky

        return trackerdict

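The two trackerdict shapes described in the Returns section can be written out explicitly. A hypothetical example of each workflow, with invented field values purely for illustration (the keys follow the docstring above):

```python
# gencumsky workflow: keys are rounded tracker angles
trackerdict_cumsky = {
    -20: {'csvfile': 'EPWs/1axis_-20.csv', 'surf_azm': 90.0,
          'surf_tilt': 20.0, 'UTCtime': ['2021-06-21_0800']},
    20:  {'csvfile': 'EPWs/1axis_20.csv', 'surf_azm': 270.0,
          'surf_tilt': 20.0, 'UTCtime': ['2021-06-21_1600']},
}

# gendaylit workflow: keys are 'YYYY-MM-DD_HHMM' timestamps
trackerdict_hourly = {
    '2021-06-21_0800': {'tracker_theta': 40.0, 'surf_azm': 90.0,
                        'surf_tilt': 40.0},
}

print(sorted(trackerdict_cumsky))                          # [-20, 20]
print(trackerdict_hourly['2021-06-21_0800']['surf_tilt'])  # 40.0
```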
    def gendaylit1axis(self, metdata=None, trackerdict=None, startdate=None,
                       enddate=None, debug=False):
        """
        1-axis tracking implementation of gendaylit.
        Creates multiple sky files, one for each time of day.

        Parameters
        ------------
        metdata
            MetObj output from readWeatherFile.  Needs to have
            RadianceObj.set1axis() run on it first.
        startdate : str
            DEPRECATED, does not do anything now.
            Recommended to downselect metdata when reading the weather file.
        enddate : str
            DEPRECATED, does not do anything now.
            Recommended to downselect metdata when reading the weather file.
        trackerdict : dictionary
            Dictionary with keys for tracker tilt angles (gencumsky) or
            timestamps (gendaylit)

        Returns
        -------
        Updated trackerdict dictionary
            Dictionary with keys for tracker tilt angles (gencumsky) or
            timestamps (gendaylit), with the additional dictionary value
            ['skyfile'] added

        """

        if metdata is None:
            metdata = self.metdata
        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')

        if startdate is not None or enddate is not None:
            print("Deprecation Warning: gendaylit1axis no longer downselects "
                  "entries by startdate and enddate. Downselect your data "
                  "when loading with readWeatherFile")
            return

        try:
            metdata.tracker_theta  # this may not exist
        except AttributeError:
            print("metdata.tracker_theta doesn't exist. Run RadianceObj.set1axis() first")

        if debug is False:
            print('Creating ~%d skyfiles. ' % (len(trackerdict.keys())))
        count = 0  # counter for the number of skyfiles created

        trackerdict2 = {}
        for i in range(0, len(trackerdict.keys())):
            try:
                time = metdata.datetime[i]
            except IndexError:  # out of range error
                break
            filename = time.strftime('%Y-%m-%d_%H%M')
            self.name = filename

            # check for GHI > 0 and a valid tracker angle
            if (metdata.ghi[i] > 0) & (~np.isnan(metdata.tracker_theta[i])):
                skyfile = self.gendaylit(metdata=metdata, timeindex=i, debug=debug)
                # trackerdict2 reduces the dict to only the range specified.
                trackerdict2[filename] = trackerdict[filename]
                trackerdict2[filename]['skyfile'] = skyfile
                count += 1

        print('Created {} skyfiles in /skies/'.format(count))
        self.trackerdict = trackerdict2
        return trackerdict2

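The loop in gendaylit1axis keeps only timestamps with daylight (GHI > 0) and a defined tracker angle (theta not NaN). That filter can be sketched over plain lists; `daylight_indices` and the sample values are hypothetical:

```python
import math

def daylight_indices(ghi, tracker_theta):
    # keep hour i only if GHI > 0 and the tracker angle is defined
    return [i for i, (g, t) in enumerate(zip(ghi, tracker_theta))
            if g > 0 and not math.isnan(t)]

ghi = [0, 150, 600, 80, 0]
theta = [float('nan'), -40.0, 5.0, float('nan'), float('nan')]
print(daylight_indices(ghi, theta))  # [1, 2]
```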
    def genCumSky1axis(self, trackerdict=None):
        """
        1-axis tracking implementation of gencumulativesky.
        Creates multiple .cal files and .rad files, one for each tracker angle.

        Use :func:`readWeatherFile` to limit gencumsky simulations


        Parameters
        ------------
        trackerdict : dictionary
            Trackerdict generated as output by RadianceObj.set1axis()

        Returns
        -------
        trackerdict : dictionary
            Trackerdict with the new entry 'skyfile' appended to the 1-axis
            dict, giving the location of the sky .rad file

        """

        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')

        for theta in sorted(trackerdict):
            # call gencumulativesky with a new .cal and .rad name
            csvfile = trackerdict[theta]['csvfile']
            savefile = '1axis_%s' % (theta)  # prefix for .cal file and skies\*.rad file
            skyfile = self.genCumSky(gencumsky_metfile=csvfile, savefile=savefile)
            trackerdict[theta]['skyfile'] = skyfile
            print('Created skyfile %s' % (skyfile))
        # delete default skyfile (not strictly necessary)
        self.skyfiles = None
        self.trackerdict = trackerdict
        return trackerdict

    def makeOct(self, filelist=None, octname=None):
        """
        Combine everything together into a .oct file

        Parameters
        ----------
        filelist : list
            Files to include.  Otherwise takes self.filelist
        octname : str
            filename (without .oct extension)


        Returns
        -------
        octname : str
            filename of the .oct file in the root directory, including extension
        err : str
            Error message returned from oconv (if any)
        """

        if filelist is None:
            filelist = self.getfilelist()
        if octname is None:
            octname = self.name

        debug = False
        # JSS. This waits up to 10 seconds total for hours whose sky files
        # were not generated.
        if self.hpc:
            import time
            time_to_wait = 10
            time_counter = 0
            for file in filelist:
                if debug:
                    print("HPC Checking for file %s" % (file))
                if None in filelist:  # are we missing any files? abort!
                    print('Missing files, skipping...')
                    self.octfile = None
                    return None
                # Filesky is being saved as 'none', so it crashes!
                while not os.path.exists(file):
                    if time_counter > time_to_wait:
                        break
                    time.sleep(1)
                    time_counter += 1
                if time_counter > time_to_wait:
                    print("filenotfound")
                    break

        if None in filelist:  # are we missing any files? abort!
            print('Missing files, skipping...')
            self.octfile = None
            return None

        filelist.insert(0, 'oconv')
        with open('%s.oct' % (octname), "w") as f:
            _, err = _popen(filelist, None, f)
            # TODO: exception handling for no sun up
            if err is not None:
                if err[0:5] == 'error':
                    raise Exception(err[7:])
                if err[0:7] == 'message':
                    warnings.warn(err[9:], Warning)

        # Use rvu to check whether everything looks good; use cmd for this
        # since it locks out the terminal:
        # 'rvu -vf views\side.vp -e .01 monopanel_test.oct'
        print("Created %s.oct" % (octname))
        self.octfile = '%s.oct' % (octname)
        return '%s.oct' % (octname)

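makeOct builds the oconv invocation as an argument list by prepending the executable name to the material/sky/scene file list, after aborting if any expected file is missing (a sky that failed to generate comes through as None). A sketch of that assembly with hypothetical file names (`build_oconv_args` is an illustrative helper, not library code):

```python
def build_oconv_args(filelist):
    # Abort if any expected file is missing (e.g. a sky file that was
    # never generated appears in the list as None).
    if None in filelist:
        return None
    return ['oconv'] + list(filelist)

files = ['materials/ground.rad', 'skies/sky2_lat_lon_ts.rad', 'objects/scene.rad']
print(build_oconv_args(files)[0])        # oconv
print(build_oconv_args(files + [None]))  # None
```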
    def makeOct1axis(self, trackerdict=None, singleindex=None, customname=None):
        """
        Combine files listed in trackerdict into multiple .oct files

        Parameters
        ------------
        trackerdict
            Output from :py:class:`~bifacial_radiance.RadianceObj.makeScene1axis`
        singleindex : str
            Single index for trackerdict to run makeOct1axis in single-value mode,
            format 'YYYY-MM-DD_HHMM'.
        customname : str
            Custom text string added to the end of the OCT file name.

        Returns
        -------
        trackerdict
            Appends 'octfile' to the 1-axis dict with the location of the scene .oct file
        """

        if customname is None:
            customname = ''

        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')
        if singleindex is None:   # loop through all values in the tracker dictionary
            indexlist = trackerdict.keys()
        else:  # just loop through one single index in the tracker dictionary
            indexlist = [singleindex]

        print('\nMaking {} octfiles in root directory.'.format(len(indexlist)))
        for index in sorted(indexlist):  # run through either the entire key list of trackerdict, or just a single value
            try:
                filelist = self.materialfiles + [trackerdict[index]['skyfile'], trackerdict[index]['radfile']]
                octname = '1axis_%s%s' % (index, customname)
                trackerdict[index]['octfile'] = self.makeOct(filelist, octname)
            except KeyError as e:
                print('Trackerdict key error: {}'.format(e))

        return trackerdict


    def makeModule(self, name=None, x=None, y=None, z=None, modulefile=None,
                   text=None, customtext='', xgap=0.01, ygap=0.0,
                   zgap=0.1, numpanels=1, rewriteModulefile=True,
                   glass=False, modulematerial=None, bifi=1, **kwargs):
        """
        Pass module generation details into ModuleObj(). See the ModuleObj()
        docstring for more details.
        """
        from bifacial_radiance import ModuleObj

        if name is None:
            print("usage:  makeModule(name, x, y, z, modulefile='\objects\*.rad', "
                  "zgap=0.1 (module offset), "
                  "numpanels=1 (# of panels in portrait), ygap=0.05 "
                  "(slope distance between panels when arrayed), "
                  "rewriteModulefile=True (or False), bifi=1")
            print("You can also override module_type info by passing the 'text' "
                  "variable, or add on at the end for racking details with "
                  "'customtext'. See the function definition for more details.")
            print("Optional: tubeParams={} (torque tube details including "
                  "diameter (torque tube dia. in meters), tubetype='Round' "
                  "(or 'square', 'hex'), material='Metal_Grey' (or 'black'), "
                  "axisofrotation=True (does the scene rotate around the tube))")
            print("Optional: cellModule={} (create a cell-level module by "
                  "passing in a dictionary with keys 'numcellsx' (#cells in "
                  "X-dir.), 'numcellsy', 'xcell' (cell size in X-dir. in meters), "
                  "'ycell', 'xcellgap' (spacing between cells in X-dir.), 'ycellgap')")
            print("Optional: omegaParams={} (create the omega support structure by "
                  "passing in a dictionary with keys 'omega_material' (the material of "
                  "the omega), 'mod_overlap' (the length of the module-adjacent piece of "
                  "the omega that overlaps with the module), 'x_omega1', 'y_omega' (ideally the same "
                  "for all parts of the omega), 'z_omega1', 'x_omega2' (X-dir. length of the "
                  "vertical piece), 'x_omega3', 'z_omega3')")

            return

        """
2✔
2036
        # TODO: check for deprecated torquetube and axisofrotationTorqueTube in
2037
          kwargs.  
2038
        """
2039
        if 'tubeParams' in kwargs:
2✔
2040
            tubeParams = kwargs.pop('tubeParams')
2✔
2041
        else:
2042
            tubeParams = None
2✔
2043
        if 'torquetube' in kwargs:
2✔
2044
            torquetube = kwargs.pop('torquetube')
×
2045
            print("\nWarning: boolean input `torquetube` passed into makeModule"
×
2046
                  ". Starting in v0.4.0 this boolean parameter is deprecated."
2047
                  " Use module.addTorquetube() with `visible` parameter instead.")
2048
            if tubeParams:
×
2049
                tubeParams['visible'] =  torquetube
×
2050
            elif (tubeParams is None) & (torquetube is True):
×
2051
                tubeParams = {'visible':True} # create default TT
×
2052
            
2053
        if 'axisofrotationTorqueTube' in kwargs:
2✔
2054
            axisofrotation = kwargs.pop('axisofrotationTorqueTube')
×
2055
            print("\nWarning: input boolean `axisofrotationTorqueTube` passed "
×
2056
                "into makeModule. Starting in v0.4.0 this boolean parameter is"
2057
                " deprecated. Use module.addTorquetube() with `axisofrotation`"
2058
                "parameter instead.")
2059
            if tubeParams:  #this kwarg only does somehting if there's a TT.
×
2060
                tubeParams['axisofrotation'] = axisofrotation
×
2061
        
2062
        if self.hpc:  # trigger HPC simulation in ModuleObj
2✔
2063
            kwargs['hpc']=True
2✔
2064
            
2065
        self.module = ModuleObj(name=name, x=x, y=y, z=z, bifi=bifi, modulefile=modulefile,
2✔
2066
                   text=text, customtext=customtext, xgap=xgap, ygap=ygap, 
2067
                   zgap=zgap, numpanels=numpanels, 
2068
                   rewriteModulefile=rewriteModulefile, glass=glass, 
2069
                   modulematerial=modulematerial, tubeParams=tubeParams,
2070
                   **kwargs)
2071
        return self.module
2✔
2072
    
2073
    
2074
    
2075
    def makeCustomObject(self, name=None, text=None):
        """
        Function for development and experimenting with extraneous objects in the scene.
        This function creates a `name.rad` textfile in the objects folder
        with whatever text is passed to it.
        It is up to the user to pass text in the correct Radiance format.

        For example, to create a box at coordinates 0,0 (with its bottom surface
        on the plane z=0):

        .. code-block:

            name = 'box'
            text = '! genbox black PVmodule 0.5 0.5 0.5 | xform -t -0.25 -0.25 0'

        Parameters
        ----------
        name : str
            String input to name the module type
        text : str
            Text used in the radfile to generate the module

        """

        customradfile = os.path.join('objects', '%s.rad'%(name)) # update in 0.2.3 to shorten radnames
        # py2 and 3 compatible: binary write, encode text first
        with open(customradfile, 'wb') as f:
            f.write(text.encode('ascii'))

        print("\nCustom Object Name", customradfile)
        self.customradfile = customradfile
        return customradfile

2109
    def printModules(self):
        # print available module types from ModuleObj
        from bifacial_radiance import ModuleObj
        modulenames = ModuleObj().readModule()
        print('Available module names: {}'.format([str(x) for x in modulenames]))
        return modulenames

    def makeScene(self, module=None, sceneDict=None, radname=None,
                  moduletype=None):
        """
        Create a SceneObj which contains details of the PV system configuration including
        tilt, row pitch, height, nMods per row, nRows in the system...

        Parameters
        ----------
        module : str or ModuleObj
            String name of module created with makeModule()
        sceneDict : dictionary
            Dictionary with keys: `tilt`, `clearance_height`*, `pitch`,
            `azimuth`, `nMods`, `nRows`, `hub_height`*, `height`*
            * `height` is deprecated from sceneDict. For makeScene (fixed systems),
            if passed it is assumed to refer to clearance_height.
            `clearance_height` is recommended for fixed-tilt systems.
            `hub_height` can also be passed as a possibility.
        radname : str
            Gives a custom name to the scene file. Useful when parallelizing.
        moduletype : DEPRECATED. Use the `module` kwarg instead.

        Returns
        -------
        SceneObj
            'scene' with configuration details

        """
        if moduletype is not None:
            module = moduletype
            print("Warning:  input `moduletype` is deprecated. Use kwarg "
                  "`module` instead")
        if module is None:
            try:
                module = self.module
                print(f'Using last saved module, name: {module.name}')
            except AttributeError:
                print('makeScene(module, sceneDict, nMods, nRows).  '
                      'Available moduletypes: ')
                self.printModules()  # print available module types
                return
        self.scene = SceneObj(module)
        self.scene.hpc = self.hpc  # pass HPC mode from parent

        if sceneDict is None:
            print('makeScene(moduletype, sceneDict, nMods, nRows).  '
                  'sceneDict inputs: .tilt .clearance_height .pitch .azimuth')
            return self.scene

        if 'azimuth' not in sceneDict:
            sceneDict['azimuth'] = 180

        if 'nRows' not in sceneDict:
            sceneDict['nRows'] = 7

        if 'nMods' not in sceneDict:
            sceneDict['nMods'] = 20

        # Fixed tilt routine
        # Preferred: clearance_height.
        # If only height is passed, it is assumed to be clearance_height.

        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict,
                                                              preferred='clearance_height',
                                                              nonpreferred='hub_height')

        #self.nMods = sceneDict['nMods']
        #self.nRows = sceneDict['nRows']
        sceneRAD = self.scene._makeSceneNxR(sceneDict=sceneDict,
                                            radname=radname)

        if 'appendRadfile' not in sceneDict:
            appendRadfile = False
        else:
            appendRadfile = sceneDict['appendRadfile']

        if appendRadfile:
            debug = False
            try:
                self.radfiles.append(sceneRAD)
                if debug:
                    print("Radfile APPENDED!")
            except AttributeError:
                # TODO: Manage the situation where the radfile was first
                # created with appendRadfile=False.
                self.radfiles = []
                self.radfiles.append(sceneRAD)
                if debug:
                    print("Radfile list created!")
        else:
            self.radfiles = [sceneRAD]
        return self.scene

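The defaulting logic above (azimuth 180, 7 rows, 20 modules whenever the user omits those keys) can be sketched in isolation. `apply_scene_defaults` is a hypothetical helper written only to illustrate the behavior, not a bifacial_radiance function:

```python
def apply_scene_defaults(sceneDict):
    # Mirror the fallback values used in makeScene above: fill in only
    # the keys the user did not supply.
    defaults = {'azimuth': 180, 'nRows': 7, 'nMods': 20}
    for key, value in defaults.items():
        sceneDict.setdefault(key, value)
    return sceneDict

sd = apply_scene_defaults({'tilt': 10, 'pitch': 5.7,
                           'clearance_height': 0.2, 'nRows': 3})
print(sd['azimuth'], sd['nRows'], sd['nMods'])  # 180 3 20
```

Note that, like makeScene itself, this mutates the caller's dictionary in place rather than copying it.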
    def appendtoScene(self, radfile=None, customObject=None, text=''):
        """
        Appends a user-written Radiance command to the scene radfile in the
        `\objects` folder.
        Useful for adding objects created with makeCustomObject to the scene.

        Parameters
        ----------
        radfile : str
            Directory and name of where the .rad scene file is stored
        customObject : str
            Directory and name of where the custom object .rad file is stored
        text : str
            Command to be appended to the radfile. Do not leave empty spaces
            at the end.

        Returns
        -------
        Nothing; the radfile must already be created and assigned when running this.

        """

        # TODO: Add a custom name and replace radfile name

        # append the command plus the custom object path to the radfile
        text2 = '\n' + text + ' ' + customObject

        debug = False
        if debug:
            print(text2)

        with open(radfile, 'a+') as f:
            f.write(text2)

    def makeScene1axis(self, trackerdict=None, module=None, sceneDict=None,
                       cumulativesky=None, moduletype=None):
        """
        Creates a SceneObj for each tracking angle which contains details of the PV
        system configuration including row pitch, hub_height, nMods per row, nRows in the system...

        Parameters
        ------------
        trackerdict
            Output from GenCumSky1axis
        module : str or ModuleObj
            Name or ModuleObj created with makeModule()
        sceneDict :
            Dictionary with keys: `tilt`, `hub_height`, `pitch`, `azimuth`
        cumulativesky : bool
            Defines if the sky will be generated with cumulativesky or gendaylit.
        moduletype : DEPRECATED. Use the `module` kwarg instead.

        Returns
        --------
        trackerdict
            Appends the following keys
                'radfile'
                    directory where the .rad scene file is stored
                'scene'
                    SceneObj for each tracker theta
                'clearance_height'
                    Calculated ground clearance based on
                    `hub height`, `tilt` angle and overall collector width `sceney`

        """

        import math

        if sceneDict is None:
            print('usage: makeScene1axis(module, sceneDict, nMods, nRows). '
                  'sceneDict inputs: .hub_height .azimuth .nMods .nRows '
                  'and .pitch or .gcr')
            return

        # If no nRows or nMods assigned on deprecated variable or dictionary,
        # assign defaults.
        if 'nRows' not in sceneDict:
            sceneDict['nRows'] = 7
        if 'nMods' not in sceneDict:
            sceneDict['nMods'] = 20

        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')

        if cumulativesky is None:
            try:
                # see if cumulativesky = False was set earlier,
                # e.g. in RadianceObj.set1axis
                cumulativesky = self.cumulativesky
            except AttributeError:
                # default cumulativesky = True to maintain backward compatibility.
                cumulativesky = True

        if moduletype is not None:
            module = moduletype
            print("Warning:  input `moduletype` is deprecated. Use kwarg "
                  "`module` instead")
        if module is None:
            try:
                module = self.module
                print(f'Using last saved module, name: {module.name}')
            except AttributeError:
                print('usage:  makeScene1axis(trackerdict, module, '
                      'sceneDict, nMods, nRows). ')
                self.printModules()  # print available module types
                return

        if 'orientation' in sceneDict:
            raise Exception('\n\n ERROR: Orientation format has been '
                'deprecated since version 0.2.4. If you want to flip your '
                'modules, on makeModule switch the x and y values.\n\n')

        # 1-axis routine
        # Preferred: hub_height
        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict,
                                                              preferred='hub_height',
                                                              nonpreferred='clearance_height')

        if use_clearanceheight:
            simplefix = 0
            hubheight = sceneDict['clearance_height']  # Not really, but this is the fastest
            # way to make it work with the simplefix: below, the actual clearance
            # height gets calculated, and the 0 sets the cosine correction to 0.
            # TODO: CLEAN THIS UP.

        else:
            # the hub height is the tracker height at the center of rotation.
            hubheight = sceneDict['hub_height']
            simplefix = 1

        # we no longer need sceneDict['hub_height'] - it'll be replaced by 'clearance_height' below
        sceneDict.pop('hub_height', None)
        if cumulativesky is True:        # cumulativesky workflow
            print('\nMaking .rad files for cumulativesky 1-axis workflow')
            for theta in trackerdict:
                scene = SceneObj(module)
                if trackerdict[theta]['surf_azm'] >= 180:
                    trackerdict[theta]['surf_azm'] = trackerdict[theta]['surf_azm']-180
                    trackerdict[theta]['surf_tilt'] = trackerdict[theta]['surf_tilt']*-1
                radname = '1axis%s_'%(theta,)

                # Calculate the ground clearance height for this theta, based on
                # the hub height. Use abs(theta) to avoid negative tilt angle errors.
                height = hubheight - simplefix*0.5* math.sin(abs(theta) * math.pi / 180) \
                        * scene.module.sceney + scene.module.offsetfromaxis \
                        * math.sin(abs(theta)*math.pi/180)
                #trackerdict[theta]['clearance_height'] = height

                sceneDict.update({'tilt' : trackerdict[theta]['surf_tilt'],
                                 'clearance_height' :  height,
                                 'azimuth' : trackerdict[theta]['surf_azm'],
                                 'modulez' :  scene.module.z})

                radfile = scene._makeSceneNxR(sceneDict=sceneDict,
                                              radname=radname, addhubheight=True)
                trackerdict[theta]['radfile'] = radfile
                trackerdict[theta]['scene'] = scene

            print('{} Radfiles created in /objects/'.format(len(trackerdict)))

        else:  # gendaylit workflow
            print('\nMaking ~%s .rad files for gendaylit 1-axis workflow (this takes a minute..)' % (len(trackerdict)))
            count = 0
            for time in trackerdict:
                scene = SceneObj(module)

                if trackerdict[time]['surf_azm'] >= 180:
                    trackerdict[time]['surf_azm'] = trackerdict[time]['surf_azm']-180
                    trackerdict[time]['surf_tilt'] = trackerdict[time]['surf_tilt']*-1
                theta = trackerdict[time]['theta']
                radname = '1axis%s_'%(time,)

                # Calculate the ground clearance height for this time.
                height = hubheight - simplefix*0.5* math.sin(abs(theta) * math.pi / 180) \
                        * scene.module.sceney + scene.module.offsetfromaxis \
                        * math.sin(abs(theta)*math.pi/180)

                if trackerdict[time]['ghi'] > 0:
                    #trackerdict[time]['clearance_height'] = height

                    sceneDict.update({'tilt' : trackerdict[time]['surf_tilt'],
                                     'clearance_height' :  height,
                                     'azimuth' : trackerdict[time]['surf_azm'],
                                     'modulez' :  scene.module.z})

                    # if sceneDict isn't copied, it will change inside the SceneObj since dicts are mutable!
                    radfile = scene._makeSceneNxR(sceneDict=sceneDict,
                                                  radname=radname, addhubheight=True)
                    trackerdict[time]['radfile'] = radfile
                    trackerdict[time]['scene'] = scene
                    count += 1
            print('{} Radfiles created in /objects/'.format(count))

        self.trackerdict = trackerdict
        #self.nMods = sceneDict['nMods']  #assign nMods and nRows to RadianceObj
        #self.nRows = sceneDict['nRows']
        self.hub_height = hubheight

        return trackerdict

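The clearance-height expression used in both loops above (for the `simplefix=1` case, i.e. a true hub height) can be isolated as pure trigonometry. `clearance_from_hub` is a hypothetical helper shown only to illustrate the geometry; in the actual code, `sceney` and `offsetfromaxis` come from the scene's module:

```python
import math

def clearance_from_hub(hub_height, theta_deg, sceney, offsetfromaxis=0.0):
    # Subtract the vertical projection of half the collector width (sceney),
    # then add back the projection of the module's offset from the rotation
    # axis. abs() avoids negative-tilt sign errors, as in the loops above.
    s = math.sin(math.radians(abs(theta_deg)))
    return hub_height - 0.5 * s * sceney + offsetfromaxis * s

print(clearance_from_hub(1.5, 0, 2.0))   # 1.5 (flat tracker: clearance == hub height)
print(round(clearance_from_hub(2.0, 30, 2.0), 6))  # 1.5
```

At zero rotation the clearance equals the hub height; as the tracker rotates, the lower edge of the 2 m collector swings closer to the ground.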
    def analysis1axis(self, trackerdict=None, singleindex=None, accuracy='low',
                      customname=None, modWanted=None, rowWanted=None,
                      sensorsy=9, sensorsx=1,
                      modscanfront=None, modscanback=None, relative=False,
                      debug=False):
        """
        Loop through trackerdict and run linescans for each scene and scan in there.

        Parameters
        ----------------
        trackerdict
        singleindex : str
            For single-index mode, just the one index we want to run (new in 0.2.3).
            Example format '21_06_14_12_30' for 2021 June 14th 12:30 pm
        accuracy : str
            'low' or 'high', resolution option used during _irrPlot and rtrace
        customname : str
            Custom text string to be added to the file name for the results .CSV files
        modWanted : int
            Module to be sampled. Index starts at 1.
        rowWanted : int
            Row to be sampled. Index starts at 1. (row 1)
        sensorsy : int or list
            Number of 'sensors' or scanning points along the collector width
            (CW) of the module(s). If multiple values are passed, the first value
            is the number of front sensors and the second value is the number of
            back sensors.
        sensorsx : int or list
            Number of 'sensors' or scanning points along the length, the side
            perpendicular to the collector width (CW) of the module(s), for the
            back side of the module. If multiple values are passed, the first
            value is the number of front sensors and the second value is the
            number of back sensors.
        modscanfront : dict
            Dictionary with one or more of the following keys: xstart, ystart, zstart,
            xinc, yinc, zinc, Nx, Ny, Nz, orient. All of these keys take ints or
            floats except for 'orient', which takes x y z values as the string
            'x y z', for example '0 0 -1'. These values will overwrite the internally
            calculated frontscan dictionary for the module & row selected. If modifying
            Nx, Ny or Nz, make sure to also modify modscanback to avoid issues at the
            results-writing stage.
        modscanback : dict
            Dictionary with one or more of the following keys: xstart, ystart, zstart,
            xinc, yinc, zinc, Nx, Ny, Nz, orient. All of these keys take ints or
            floats except for 'orient', which takes x y z values as the string
            'x y z', for example '0 0 -1'. These values will overwrite the internally
            calculated backscan dictionary for the module & row selected. If modifying
            Nx, Ny or Nz, make sure to also modify modscanfront to avoid issues at the
            results-writing stage.
        relative : Bool
            If passing modscanfront and modscanback to modify the dictionaries of
            positions, this sets whether the values passed are relative or absolute.
            Default is absolute value (relative=False)
        debug : Bool
            Activates internal printing of the function to help debugging.


        Returns
        -------
        trackerdict with new keys:

            'AnalysisObj'  : analysis object for this tracker theta
            'Wm2Front'     : list of front Wm-2 irradiances, len=sensorsy_back
            'Wm2Back'      : list of rear Wm-2 irradiances, len=sensorsy_back
            'backRatio'    : list of rear irradiance ratios, len=sensorsy_back
        RadianceObj with new appended values:
            'Wm2Front'     : np array with cumulative front irradiance
            'Wm2Back'      : np array with cumulative rear irradiance
            'backRatio'    : np array with rear irradiance ratios
        """

        import warnings

        if customname is None:
            customname = ''

        if trackerdict is None:
            try:
                trackerdict = self.trackerdict
            except AttributeError:
                print('No trackerdict value passed or available in self')

        if singleindex is None:  # run over all values in trackerdict
            trackerkeys = sorted(trackerdict.keys())
        else:                    # run in single index mode.
            trackerkeys = [singleindex]

        if modWanted is None:
            modWanted = round(trackerdict[trackerkeys[0]]['scene'].sceneDict['nMods'] / 1.99)
        if rowWanted is None:
            rowWanted = round(trackerdict[trackerkeys[0]]['scene'].sceneDict['nRows'] / 1.99)

        frontWm2 = 0  # container for tracking front irradiance across the module chord. Dynamically sized based on the first analysis run.
        backWm2 = 0   # container for tracking rear irradiance across the module chord.

        for index in trackerkeys:   # either full list of trackerdict keys, or single index
            name = '1axis_%s%s'%(index,customname)
            octfile = trackerdict[index]['octfile']
            scene = trackerdict[index]['scene']
            if octfile is None:
                continue  # don't run analysis if the octfile is None
            try:  # look for missing data
                analysis = AnalysisObj(octfile, name)
                name = '1axis_%s%s'%(index,customname,)
                frontscanind, backscanind = analysis.moduleAnalysis(scene=scene, modWanted=modWanted,
                                                rowWanted=rowWanted,
                                                sensorsy=sensorsy,
                                                sensorsx=sensorsx,
                                                modscanfront=modscanfront, modscanback=modscanback,
                                                relative=relative, debug=debug)
                analysis.analysis(octfile=octfile, name=name, frontscan=frontscanind,
                                  backscan=backscanind, accuracy=accuracy)
                trackerdict[index]['AnalysisObj'] = analysis
            except Exception as e:  # problem with file. TODO: only catch specific error types here.
                warnings.warn('Index: {}. Problem with file. Error: {}. Skipping'.format(index, e), Warning)
                return

            # combine cumulative front and back irradiance for each tracker angle
            try:  # on error, trackerdict[index] is returned empty
                trackerdict[index]['Wm2Front'] = analysis.Wm2Front
                trackerdict[index]['Wm2Back'] = analysis.Wm2Back
                trackerdict[index]['backRatio'] = analysis.backRatio
            except AttributeError as e:  # no key Wm2Front.
                warnings.warn('Index: {}. Trackerdict key not found: {}. Skipping'.format(index, e), Warning)
                return

            if np.sum(frontWm2) == 0:  # define frontWm2 the first time through
                frontWm2 = np.array(analysis.Wm2Front)
                backWm2 = np.array(analysis.Wm2Back)
            else:
                frontWm2 += np.array(analysis.Wm2Front)
                backWm2 += np.array(analysis.Wm2Back)
            print('Index: {}. Wm2Front: {}. Wm2Back: {}'.format(index,
                  np.mean(analysis.Wm2Front), np.mean(analysis.Wm2Back)))

        if np.sum(self.Wm2Front) == 0:
            self.Wm2Front = frontWm2   # these are accumulated over all indices passed in.
            self.Wm2Back = backWm2
        else:
            self.Wm2Front += frontWm2   # these are accumulated over all indices passed in.
            self.Wm2Back += backWm2
        self.backRatio = np.mean(backWm2)/np.mean(frontWm2+.001)

        # Save compiled results using _saveresults
        if singleindex is None:

            print("Saving a cumulative-results file in the main simulation folder. " +
                  "This adds up, by sensor location, the irradiance over all hours " +
                  "or configurations considered." +
                  "\nWarning: This file-saving routine does not clean results, so " +
                  "if your setup has ygaps, or 2+ modules or torque tubes, doing " +
                  "a deeper cleaning and working with the individual results " +
                  "files in the results folder is highly suggested.")
            cumfilename = 'cumulative_results_%s.csv'%(customname)
            if self.cumulativesky is True:
                frontcum = pd.DataFrame()
                rearcum = pd.DataFrame()
                temptrackerdict = trackerdict[list(trackerdict)[0]]['AnalysisObj']
                #temptrackerdict = trackerdict[0.0]['AnalysisObj']
                frontcum['x'] = temptrackerdict.x
                frontcum['y'] = temptrackerdict.y
                frontcum['z'] = temptrackerdict.z
                frontcum['mattype'] = temptrackerdict.mattype
                frontcum['Wm2'] = self.Wm2Front
                rearcum['x'] = temptrackerdict.x
                rearcum['y'] = temptrackerdict.y
                rearcum['z'] = temptrackerdict.rearZ
                rearcum['mattype'] = temptrackerdict.rearMat
                rearcum['Wm2'] = self.Wm2Back
                cumanalysisobj = AnalysisObj()
                print("\nSaving Cumulative results")
                cumanalysisobj._saveResultsCumulative(frontcum, rearcum, savefile=cumfilename)
            else:  # trackerkeys are day/hour/min, and there's no easy way to find
                # a tilt of 0, so make a fake linepoint object for tilt 0
                # and then save.
                try:
                    cumscene = trackerdict[trackerkeys[0]]['scene']
                    cumscene.sceneDict['tilt'] = 0
                    cumscene.sceneDict['clearance_height'] = self.hub_height
                    cumanalysisobj = AnalysisObj()
                    frontscancum, backscancum = cumanalysisobj.moduleAnalysis(scene=cumscene, modWanted=modWanted,
                                                rowWanted=rowWanted,
                                                sensorsy=sensorsy,
                                                sensorsx=sensorsx,
                                                modscanfront=modscanfront, modscanback=modscanback,
                                                relative=relative, debug=debug)
                    x, y, z = cumanalysisobj._linePtsArray(frontscancum)
                    x, y, rearz = cumanalysisobj._linePtsArray(backscancum)

                    frontcum = pd.DataFrame()
                    rearcum = pd.DataFrame()
                    frontcum['x'] = x
                    frontcum['y'] = y
                    frontcum['z'] = z
                    frontcum['mattype'] = trackerdict[trackerkeys[0]]['AnalysisObj'].mattype
                    frontcum['Wm2'] = self.Wm2Front
                    rearcum['x'] = x
                    rearcum['y'] = y
                    rearcum['z'] = rearz
                    rearcum['mattype'] = trackerdict[trackerkeys[0]]['AnalysisObj'].rearMat
                    rearcum['Wm2'] = self.Wm2Back
                    print("\nSaving Cumulative results")
                    cumanalysisobj._saveResultsCumulative(frontcum, rearcum, savefile=cumfilename)
                except Exception:
                    print("Not able to save a cumulative result for this simulation.")
        return trackerdict


# End RadianceObj definition
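The cumulative bookkeeping in `analysis1axis` above can be sketched in isolation: per-index front/back scan arrays are summed element-wise, and the overall bifacial ratio is mean(back) / mean(front + .001), with the .001 guarding against division by zero. The scan values below are made up purely for illustration:

```python
# Hypothetical standalone sketch of analysis1axis's accumulation, using
# plain lists instead of numpy arrays.
scans = [
    ([800.0, 810.0], [80.0, 90.0]),   # (front, back) Wm-2 for a first index
    ([900.0, 890.0], [100.0, 95.0]),  # a second index
]
frontWm2 = [0.0, 0.0]
backWm2 = [0.0, 0.0]
for front, back in scans:
    # element-wise accumulation across all indices
    frontWm2 = [a + b for a, b in zip(frontWm2, front)]
    backWm2 = [a + b for a, b in zip(backWm2, back)]

def mean(xs):
    return sum(xs) / len(xs)

# adding .001 to every front value before averaging, as the code above does
backRatio = mean(backWm2) / (mean(frontWm2) + .001)
print(round(backRatio, 4))  # 0.1074
```

With a cumulative sky, these sums represent season- or year-long irradiance totals per sensor position rather than single-timestep values.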
class GroundObj:
    """
    Class to set and return details for the ground surface materials and reflectance.
    If a single albedo value is passed, it is used as the default.
    If three albedo values are passed, they are assigned to the three
    wavelength placeholders (RGB).

    If the material type is known, it is used to look up reflectance info.
    If the material type isn't known, material_info.list is returned.

    Parameters
    ------------
    materialOrAlbedo : numeric or str
        If a number between 0 and 1 is passed, it is assumed to be an albedo
        and assigned. If a string is passed with the name of the desired
        material, e.g. 'litesoil', its properties are looked up in
        `material_file`.
        Default material names to choose from: litesoil, concrete, white_EPDM,
        beigeroof, beigeroof_lite, beigeroof_heavy, black, asphalt
    material_file : str
        Filename of the material information. Default `ground.rad`
    silent : bool
        Suppress print statements. Default False

    Returns
    -------

    """

    def __init__(self, materialOrAlbedo=None, material_file=None, silent=False):
        import warnings
        from numbers import Number

        self.normval = None
        self.ReflAvg = None
        self.Rrefl = None
        self.Grefl = None
        self.Brefl = None

        self.ground_type = 'custom'

        if material_file is None:
            material_file = 'ground.rad'

        self.material_file = material_file
        if materialOrAlbedo is None:  # Case where it's None.
            print('\nInput albedo 0-1, or string from ground.printGroundMaterials().'
                  '\nAlternatively, run setGround after readWeatherData() '
                  'and setGround will read metdata.albedo if available')
            return

        if isinstance(materialOrAlbedo, str):
            self.ground_type = materialOrAlbedo
            # Return the RGB albedo for material ground_type
            materialOrAlbedo = self.printGroundMaterials(self.ground_type)

        # Check for float and int.
        if isinstance(materialOrAlbedo, Number):
            materialOrAlbedo = np.array([[materialOrAlbedo,
                                          materialOrAlbedo, materialOrAlbedo]])

        if isinstance(materialOrAlbedo, list):
            materialOrAlbedo = np.asarray(materialOrAlbedo)

        # By this point, materialOrAlbedo should be a np.ndarray:
        if isinstance(materialOrAlbedo, np.ndarray):

            if materialOrAlbedo.ndim == 0:
                # numpy array of one single value, i.e. np.array(0.62)
                # after this if, np.array([0.62])
                materialOrAlbedo = materialOrAlbedo.reshape([1])

            if materialOrAlbedo.ndim == 1:
                # If np.array is ([0.62]), this repeats it so at the end it's
                # np.array([0.62, 0.62, 0.62])
                materialOrAlbedo = np.repeat(np.array([materialOrAlbedo]),
                                             3, axis=1).reshape(
                                                     len(materialOrAlbedo), 3)

            if (materialOrAlbedo.ndim == 2) and (materialOrAlbedo.shape[1] > 3):
                warnings.warn("Radiance only raytraces 3 wavelengths at "
                              "a time. Trimming albedo np.array input to "
                              "3 wavelengths.")
                materialOrAlbedo = materialOrAlbedo[:, 0:3]
        # By this point we should have np.array of ndim=2 and shape[1] = 3.
        # Check for invalid values
        if (materialOrAlbedo > 1).any() or (materialOrAlbedo < 0).any():
            if not silent:
                print('Warning: albedo values greater than 1 or less than 0. '
                      'Constraining to [0..1]')
            materialOrAlbedo = materialOrAlbedo.clip(min=0, max=1)
        try:
            self.Rrefl = materialOrAlbedo[:, 0]
            self.Grefl = materialOrAlbedo[:, 1]
            self.Brefl = materialOrAlbedo[:, 2]
            self.normval = _normRGB(materialOrAlbedo[:, 0], materialOrAlbedo[:, 1],
                                    materialOrAlbedo[:, 2])
            self.ReflAvg = np.round(np.mean(materialOrAlbedo, axis=1), 4)
            if not silent:
                print(f'Loading albedo, {len(self.ReflAvg)} value(s), '
                      f'{self._nonzeromean(self.ReflAvg):0.3f} avg\n'
                      f'{len(self.ReflAvg[self.ReflAvg != 0])} nonzero albedo values.')
        except IndexError as e:
            print('albedo.shape should be 3 columns (N x 3)')
            raise e
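The shape coercion in `__init__` above (scalar, 1-D time series, or 2-D array, all ending as an (N, 3) RGB array clipped to [0, 1]) can be sketched as a standalone helper. A minimal sketch using only numpy; `normalize_albedo` is a hypothetical name, not part of bifacial_radiance:

```python
import numpy as np

def normalize_albedo(materialOrAlbedo):
    """Coerce a scalar, 1-D, or 2-D albedo input into an (N, 3) RGB array
    clipped to [0, 1], mirroring the shape handling in GroundObj.__init__."""
    arr = np.asarray(materialOrAlbedo, dtype=float)
    if arr.ndim == 0:
        # single value, e.g. np.array(0.62) -> np.array([0.62])
        arr = arr.reshape([1])
    if arr.ndim == 1:
        # time series of scalars -> one RGB triple per timestep
        arr = np.repeat(arr[:, np.newaxis], 3, axis=1)
    if arr.shape[1] > 3:
        # Radiance only raytraces 3 wavelengths at a time
        arr = arr[:, 0:3]
    return arr.clip(min=0, max=1)
```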
    def printGroundMaterials(self, materialString=None):
        """
        printGroundMaterials(materialString=None)

        input: None or materialString.  If None, return list of acceptable
        material types from ground.rad.  If valid string, return RGB albedo
        of the material type selected.
        """

        import warnings
        material_path = 'materials'

        f = open(os.path.join(material_path, self.material_file))
        keys = []  # list of material key names
        Rreflall = []; Greflall = []; Breflall = []  # RGB material reflectance
        temp = f.read().split()
        f.close()
        # return indices for 'plastic' definition
        index = _findme(temp, 'plastic')
        for i in index:
            keys.append(temp[i+1])  # after 'plastic' comes the material name
            Rreflall.append(float(temp[i+5]))  # RGB reflectance comes a few tokens down the list
            Greflall.append(float(temp[i+6]))
            Breflall.append(float(temp[i+7]))

        if materialString is not None:
            try:
                index = _findme(keys, materialString)[0]
            except IndexError:
                warnings.warn('Error - materialString not in '
                              f'{self.material_file}: {materialString}')
            return(np.array([[Rreflall[index], Greflall[index], Breflall[index]]]))
        else:
            return(keys)
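The token offsets used above (material name at `i+1`, RGB reflectance at `i+5` through `i+7`) can be exercised directly on a Radiance `plastic` definition string. A minimal sketch assuming the same whitespace-tokenized format; `parse_plastic_materials` is a hypothetical helper, not part of the package, and the sample values below are fabricated:

```python
def parse_plastic_materials(radtext):
    """Extract {name: (R, G, B)} reflectances from Radiance 'plastic'
    definitions, using the same token offsets as printGroundMaterials."""
    temp = radtext.split()
    materials = {}
    for i, tok in enumerate(temp):
        if tok == 'plastic':
            # name follows 'plastic'; RGB values sit 5-7 tokens later
            materials[temp[i + 1]] = (float(temp[i + 5]),
                                      float(temp[i + 6]),
                                      float(temp[i + 7]))
    return materials
```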
    def _nonzeromean(self, val):
        '''Array mean excluding zeros. Returns zero if everything is zero.'''
        tempmean = np.nanmean(val)
        if tempmean > 0:
            tempmean = np.nanmean(val[val != 0])
        return tempmean

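The cumulativesky branch of `_makeGroundString` below averages each color channel with this nonzero mean, so that nighttime (zero-albedo) timesteps don't drag the cumulative ground reflectance down. A minimal standalone sketch with numpy; `nonzero_mean` is a hypothetical name:

```python
import numpy as np

def nonzero_mean(val):
    """Array mean excluding zeros; returns the plain mean (zero) if
    every value is zero, matching GroundObj._nonzeromean."""
    val = np.asarray(val, dtype=float)
    if np.nanmean(val) > 0:
        return np.nanmean(val[val != 0])
    return np.nanmean(val)
```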
    def _makeGroundString(self, index=0, cumulativesky=False):
        '''
        Create string with ground reflectance parameters for use in
        gendaylit and gencumsky.

        Parameters
        -----------
        index : integer
            Index of time for time-series albedo. Default 0
        cumulativesky : Boolean
            If true, set albedo to average of time-series values.

        Returns
        -------
        groundstring : text with albedo details to append to sky.rad in
                       gendaylit
        '''

        try:
            if cumulativesky is True:
                Rrefl = self._nonzeromean(self.Rrefl)
                Grefl = self._nonzeromean(self.Grefl)
                Brefl = self._nonzeromean(self.Brefl)
                normval = _normRGB(Rrefl, Grefl, Brefl)
            else:
                Rrefl = self.Rrefl[index]
                Grefl = self.Grefl[index]
                Brefl = self.Brefl[index]
                normval = _normRGB(Rrefl, Grefl, Brefl)

            # Check for the all-zero albedo case
            if normval == 0:
                normval = 1

            groundstring = ( f'\nskyfunc glow ground_glow\n0\n0\n4 '
                f'{Rrefl/normval} {Grefl/normval} {Brefl/normval} 0\n'
                '\nground_glow source ground\n0\n0\n4 0 0 -1 180\n'
                f'\nvoid plastic {self.ground_type}\n0\n0\n5 '
                f'{Rrefl:0.3f} {Grefl:0.3f} {Brefl:0.3f} 0 0\n'
                f"\n{self.ground_type} ring groundplane\n"
                '0\n0\n8\n0 0 -.01\n0 0 1\n0 100' )
        except IndexError as err:
            print(f'Index {index} passed to albedo with only '
                  f'{len(self.Rrefl)} values.')
            raise err
        return groundstring


class SceneObj:
    '''
    Scene information including PV module type, bifaciality, and array info.
    PV module orientation default: azimuth = 180 (south).
    PV module origin: z = 0 at bottom of frame, y = 0 at lower edge of frame,
    x = 0 at vertical centerline of module.

    The scene includes module details (x, y, bifi, sceney (collector_width), scenex).

    Parameters
    ------------
    module : str or ModuleObj
            String name of module created with makeModule()
    name : str
           Identifier of scene in case of multiple scenes. Default 'Scene0'.
           Automatically increments if makeScene is run multiple times.

    Returns
    -------

    '''
    def __repr__(self):
        return str(self.__dict__)

    def __init__(self, module=None, name=None):
        ''' initialize SceneObj
        '''
        from bifacial_radiance import ModuleObj
        # should sceneDict be initialized here? This is set in _makeSceneNxR
        if module is None:
            return
        elif isinstance(module, str):
            self.module = ModuleObj(name=module)
        elif isinstance(module, ModuleObj):
            self.module = module

        #self.moduleDict = self.module.getDataDict()
        #self.scenex = self.module.scenex
        #self.sceney = self.module.sceney
        #self.offsetfromaxis = self.moduleDict['offsetfromaxis']
        #TODO: get rid of these 4 values

        self.modulefile = self.module.modulefile
        self.hpc = False  # default False. Set True by makeScene after sceneobj created.
        if name is None:
            self.name = 'Scene0'
        else:
            self.name = name

    def _makeSceneNxR(self, modulename=None, sceneDict=None, radname=None, addhubheight=False):
        """
        Arrange the module defined in :py:class:`bifacial_radiance.SceneObj` into an N x R array.
        Returns a :py:class:`bifacial_radiance.SceneObj` which contains details
        of the PV system configuration including `tilt`, `row pitch`, `hub_height`
        or `clearance_height`, `nMods` per row, `nRows` in the system.

        The returned scene has (0,0) coordinates centered at the module at the
        center of the array. For 5 rows, that is row 3; for 4 rows, that is
        row 2 also (rounds down). For 5 modules in the row, that is module 3;
        for 4 modules in the row, that is module 2 also (rounds down).

        Parameters
        ------------
        modulename: str
            Name of module created with :py:class:`~bifacial_radiance.RadianceObj.makeModule`.
        sceneDict : dictionary
            Dictionary of scene parameters.
                clearance_height : numeric
                    (meters).
                pitch : numeric
                    Separation between rows
                tilt : numeric
                    Valid input ranges -90 to 90 degrees
                azimuth : numeric
                    A value denoting the compass direction along which the
                    axis of rotation lies. Measured in decimal degrees East
                    of North. [0 to 180) possible.
                nMods : int
                    Number of modules per row (default = 20)
                nRows : int
                    Number of rows in system (default = 7)
        radname : str
            String name for the radfile.
        addhubheight : Bool, default False
            Add hub_height back to the sceneDict since it was stripped out
            by makeScene1axis

        Returns
        -------
        radfile : str
             Filename of .RAD scene in /objects/
        scene : :py:class:`~bifacial_radiance.SceneObj`
             Returns a `SceneObj` 'scene' with configuration details

        """
        import copy

        if modulename is None:
            modulename = self.module.name

        if sceneDict is None:
            print('makeScene(modulename, sceneDict, nMods, nRows).  sceneDict'
                  ' inputs: .tilt .azimuth .nMods .nRows'
                  ' AND .tilt or .gcr ; AND .hub_height or .clearance_height')
        else:
            sceneDict = copy.deepcopy(sceneDict)

        if 'orientation' in sceneDict:
            raise Exception('\n\n ERROR: Orientation format has been '
                'deprecated since version 0.2.4. If you want to flip your '
                'modules, on makeModule switch the x and y values.\n\n')

        if 'azimuth' not in sceneDict:
            sceneDict['azimuth'] = 180

        if 'axis_tilt' not in sceneDict:
            sceneDict['axis_tilt'] = 0

        if 'originx' not in sceneDict:
            sceneDict['originx'] = 0

        if 'originy' not in sceneDict:
            sceneDict['originy'] = 0

        if radname is None:
            radname = str(self.module.name).strip().replace(' ', '_')

        # loading variables
        tilt = sceneDict['tilt']
        azimuth = sceneDict['azimuth']
        nMods = sceneDict['nMods']
        nRows = sceneDict['nRows']
        axis_tilt = sceneDict['axis_tilt']
        originx = sceneDict['originx']
        originy = sceneDict['originy']

        # hub_height, clearance_height and height logic.
        # This routine uses hub_height to move the panels up, so it's important
        # to have a value for that, either obtaining it from clearance_height
        # (if coming from makeScene) or from hub_height itself.
        # It is assumed that if no clearance_height or hub_height is passed,
        # hub_height = height.

        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict, preferred='hub_height',
                                                     nonpreferred='clearance_height')

        if use_clearanceheight:
            hubheight = sceneDict['clearance_height'] + 0.5 * np.sin(abs(tilt) * np.pi / 180) \
            * self.module.sceney - self.module.offsetfromaxis * np.sin(abs(tilt) * np.pi / 180)

            title_clearance_height = sceneDict['clearance_height']
            if addhubheight:
                sceneDict['hub_height'] = np.round(hubheight, 3)
        else:
            hubheight = sceneDict['hub_height']
            # this calculates clearance_height, used for the title
            title_clearance_height = sceneDict['hub_height'] - 0.5 * np.sin(abs(tilt) * np.pi / 180) \
            * self.module.sceney + self.module.offsetfromaxis * np.sin(abs(tilt) * np.pi / 180)

        try:
            if sceneDict['pitch'] > 0:
                pitch = sceneDict['pitch']
            else:
                raise Exception('default to gcr')

        except Exception:

            if 'gcr' in sceneDict:
                pitch = np.round(self.module.sceney / sceneDict['gcr'], 3)
            else:
                raise Exception('No valid `pitch` or `gcr` in sceneDict')

        ''' INITIALIZE VARIABLES '''
        text = '!xform '

        text += '-rx %s -t %s %s %s ' % (tilt, 0, 0, hubheight)

        # create nMods-element array along x, nRows along y. 1cm module gap.
        text += '-a %s -t %s 0 0 -a %s -t 0 %s 0 ' % (nMods, self.module.scenex, nRows, pitch)

        # azimuth rotation of the entire shebang. Select the row to scan here based on y-translation.
        # Modifying so center row is centered in the array. (i.e. 3 rows, row 2. 4 rows, row 2 too)
        # Since the array is already centered on row 1, module 1, we need to increment by Nrows/2-1 and Nmods/2-1

        text += (f'-i 1 -t {-self.module.scenex*(round(nMods/1.999)*1.0-1)} '
                 f'{-pitch*(round(nRows / 1.999)*1.0-1)} 0 -rz {180-azimuth} '
                 f'-t {originx} {originy} 0 ' )

        # axis tilt only working for N-S trackers
        if axis_tilt != 0 and azimuth == 90:
            print("Axis_Tilt is still under development. The scene will be "
                  "created with the proper axis tilt, and the tracking angle "
                  "will consider the axis_tilt, but the sensors for the "
                  "analysis might not fall on the correct surfaces unless you"
                  " manually position them for this version. Sorry! :D ")

            text += (f'-rx {axis_tilt} -t 0 0 %s ' % (
                self.module.scenex*(round(nMods/1.99)*1.0-1)*np.sin(
                        axis_tilt * np.pi/180) ) )

        filename = (f'{radname}_C_{title_clearance_height:0.2f}_rtr_{pitch:0.2f}_tilt_{tilt:0.0f}_'
                    f'{nMods}modsx{nRows}rows_origin{originx},{originy}.rad' )

        if self.hpc:
            text += f'"{os.path.join(os.getcwd(), self.modulefile)}"'
            radfile = os.path.join(os.getcwd(), 'objects', filename)
        else:
            text += f'"{os.path.join(self.modulefile)}"'
            radfile = os.path.join('objects', filename)

        # py2 and 3 compatible: binary write, encode text first
        with open(radfile, 'wb') as f:
            f.write(text.encode('ascii'))

        self.gcr = self.module.sceney / pitch
        self.text = text
        self.radfiles = radfile
        self.sceneDict = sceneDict
#        self.hub_height = hubheight
        return radfile

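The clearance-to-hub-height conversion used in `_makeSceneNxR` raises the rotation axis by half the tilted collector width, minus the axis-offset projection. A minimal sketch of that trigonometry, assuming rotation about the module center; `clearance_to_hub_height` is a hypothetical helper, not part of the package:

```python
import numpy as np

def clearance_to_hub_height(clearance_height, tilt, sceney, offsetfromaxis=0):
    """Hub height (rotation-axis height) from clearance height (lowest
    module edge), per the formula in _makeSceneNxR."""
    s = np.sin(abs(tilt) * np.pi / 180)  # projection factor for the tilt
    return clearance_height + 0.5 * s * sceney - offsetfromaxis * s
```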
    def showScene(self):
        """
        Method to call objview on the scene included in self

        """
        cmd = 'objview %s %s' % (os.path.join('materials', 'ground.rad'),
                                         self.radfiles)
        print('Rendering scene. This may take a moment...')
        _, err = _popen(cmd, None)
        if err is not None:
            print('Error: {}'.format(err))
            print('possible solution: install radwinexe binary package from '
                  'http://www.jaloxa.eu/resources/radiance/radwinexe.shtml'
                  ' into your RADIANCE binaries path')
            return

    def saveImage(self, filename=None, view=None):
        """
        Save an image of the scene to /images/. A default ground (concrete material)
        and sun (due East or West azimuth and 65 degree elevation) are created.

        Parameters:
            filename : string, optional. Name for image file; defaults to scene name
            view     : string, optional. Name for view file in /views. Default 'side.vp'.
                       Passing 'XYZ' as the view will render a zoomed-out view of the whole scene

        """
        import tempfile

        temp_dir = tempfile.TemporaryDirectory()
        pid = os.getpid()
        if filename is None:
            filename = f'{self.name}'

        if view is None:
            view = 'side.vp'

        # fake lighting temporary .rad file.  Use 65 elevation and +/- 90 azimuth.
        # use a concrete ground surface
        if (self.sceneDict['azimuth'] > 100 and self.sceneDict['tilt'] >= 0) or \
            (self.sceneDict['azimuth'] <= 100 and self.sceneDict['tilt'] < 0):
            sunaz = 90
        else:
            sunaz = -90
        ground = GroundObj('concrete', silent=True)
        ltfile = os.path.join(temp_dir.name, f'lt{pid}.rad')
        with open(ltfile, 'w') as f:
            f.write("!gensky -ang %s %s +s\n" % (65, sunaz) + \
            "skyfunc glow sky_mat\n0\n0\n4 1 1 1 0\n" + \
            "\nsky_mat source sky\n0\n0\n4 0 0 1 180\n" + \
            ground._makeGroundString())

        # make .rif and run RAD
        riffile = os.path.join(temp_dir.name, f'ov{pid}.rif')
        with open(riffile, 'w') as f:
                f.write("scene= materials/ground.rad " +\
                        f"{self.radfiles} {ltfile}\n".replace("\\", '/') +\
                    f"EXPOSURE= .5\nUP= Z\nview= {view.replace('.vp','')} -vf views/{view}\n" +\
                    f"oconv= -f\nPICT= images/{filename}")
        _, err = _popen(["rad", '-s', riffile], None)
        if err:
            print(err)
        else:
            print(f"Scene image saved: images/{filename}_{view.replace('.vp','')}.hdr")

        temp_dir.cleanup()

# end of SceneObj


class MetObj:
    """
    Meteorological data from EPW file.

    Initialize the MetObj from tmy data already read in.

    Parameters
    -----------
    tmydata : DataFrame
        TMY3 output from :py:class:`~bifacial_radiance.RadianceObj.readTMY` or
        from :py:class:`~bifacial_radiance.RadianceObj.readEPW`.
    metadata : Dictionary
        Metadata output from :py:class:`~bifacial_radiance.RadianceObj.readTMY`
        or from :py:class:`~bifacial_radiance.RadianceObj.readEPW`.
    label : str
        'left', 'right', or 'center'. For data that is averaged, defines if the
        timestamp refers to the left edge, the right edge, or the center of the
        averaging interval, for purposes of calculating sun position. For
        example, TMY3 data is right-labeled, so 11 AM data represents data from
        10 to 11, and sun position should be calculated at 10:30 AM.  Currently
        SAM and PVSyst use left-labeled interval data and NSRDB uses centered.

    """

    def __init__(self, tmydata, metadata, label='right'):

        import pytz
        import pvlib

        # First prune all GHI = 0 timepoints.  New as of 0.4.0
        # TODO: is this a good idea?  This changes default behavior...
        tmydata = tmydata[tmydata.GHI > 0]

        # location data.  So far needed:
        # latitude, longitude, elevation, timezone, city
        self.latitude = metadata['latitude']; lat = self.latitude
        self.longitude = metadata['longitude']; lon = self.longitude
        self.elevation = metadata['altitude']; elev = self.elevation
        self.timezone = metadata['TZ']
        try:
            self.city = metadata['Name']  # readepw version
        except KeyError:
            self.city = metadata['city']  # pvlib version
        #self.location.state_province_region = metadata['State'] # unnecessary
        self.datetime = tmydata.index.tolist()  # this is tz-aware.
        self.ghi = np.array(tmydata.GHI)
        self.dhi = np.array(tmydata.DHI)
        self.dni = np.array(tmydata.DNI)
        try:
            self.albedo = np.array(tmydata.Alb)
        except AttributeError:  # no TMY albedo data
            self.albedo = None

        # Try to retrieve dewpoint and pressure
        try:
            self.dewpoint = np.array(tmydata['temp_dew'])
        except KeyError:
            self.dewpoint = None

        try:
            self.pressure = np.array(tmydata['atmospheric_pressure'])
        except KeyError:
            self.pressure = None

        try:
            self.temp_air = np.array(tmydata['temp_air'])
        except KeyError:
            self.temp_air = None

        if self.temp_air is None:
            try:
                self.temp_air = np.array(tmydata['DryBulb'])
            except KeyError:
                self.temp_air = None

        try:
            self.wind_speed = np.array(tmydata['wind_speed'])
        except KeyError:
            self.wind_speed = None

        if self.wind_speed is None:
            try:
                self.wind_speed = np.array(tmydata['Wspd'])
            except KeyError:
                self.wind_speed = None

        # Try to retrieve TrackerAngle
        try:
            self.meastracker_angle = np.array(tmydata['Tracker Angle (degrees)'])
        except KeyError:
            self.meastracker_angle = None

        # v0.2.5: initialize MetObj with solpos, sunrise/set and corrected time
        datetimetz = pd.DatetimeIndex(self.datetime)
        try:  # make sure the data is tz-localized.
            datetimetz = datetimetz.tz_localize(pytz.FixedOffset(self.timezone*60))  # use pytz.FixedOffset (in minutes)
        except TypeError:  # data is tz-localized already. Just put it in local time.
            datetimetz = datetimetz.tz_convert(pytz.FixedOffset(self.timezone*60))
        # check for data interval. default 1h.
        try:
            interval = datetimetz[1] - datetimetz[0]
        except IndexError:
            interval = pd.Timedelta('1h')  # ISSUE: if 1 datapoint is passed, are we sure it's hourly data?
            print("WARNING: TMY interval was unable to be defined, so setting it to 1h.")
        # TODO: Refactor this into a subfunction. First calculate minutedelta
        # based on label and interval (-30, 0, +30, +7.5 etc.), then correct all.
        if label.lower() == 'center':
            print("Calculating Sun position for center-labeled data, at exact timestamp in input Weather File")
            sunup = pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon)  # new for pvlib >= 0.6.1
            sunup['corrected_timestamp'] = datetimetz
        else:
            if interval == pd.Timedelta('1h'):

                if label.lower() == 'right':
                    print("Calculating Sun position for Metdata that is right-labeled ",
                          "with a delta of -30 mins. i.e. 12 is 11:30 sunpos")
                    sunup = pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon)  # new for pvlib >= 0.6.1
                    sunup['minutedelta'] = int(interval.seconds/2/60)  # default sun angle 30 minutes before timestamp
                    # vector update of minutedelta at sunrise
                    sunrisemask = sunup.index.hour-1 == sunup['sunrise'].dt.hour
                    sunup['minutedelta'] = sunup['minutedelta'].mask(sunrisemask, np.floor((60-(sunup['sunrise'].dt.minute))/2))
                    # vector update of minutedelta at sunset
                    sunsetmask = sunup.index.hour-1 == sunup['sunset'].dt.hour
                    sunup['minutedelta'] = sunup['minutedelta'].mask(sunsetmask, np.floor((60-(sunup['sunset'].dt.minute))/2))
                    # save corrected timestamp
                    sunup['corrected_timestamp'] = sunup.index - pd.to_timedelta(sunup['minutedelta'], unit='m')

                elif label.lower() == 'left':
                    print("Calculating Sun position for Metdata that is left-labeled ",
                          "with a delta of +30 mins. i.e. 12 is 12:30 sunpos.")
                    sunup = pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon)
                    sunup['minutedelta'] = int(interval.seconds/2/60)  # default sun angle 30 minutes after timestamp
                    # vector update of minutedelta at sunrise
                    sunrisemask = sunup.index.hour == sunup['sunrise'].dt.hour
                    sunup['minutedelta'] = sunup['minutedelta'].mask(sunrisemask, np.ceil((60+sunup['sunrise'].dt.minute)/2))
                    # vector update of minutedelta at sunset
                    sunsetmask = sunup.index.hour == sunup['sunset'].dt.hour
                    sunup['minutedelta'] = sunup['minutedelta'].mask(sunsetmask, np.ceil((60+sunup['sunset'].dt.minute)/2))
                    # save corrected timestamp
                    sunup['corrected_timestamp'] = sunup.index + pd.to_timedelta(sunup['minutedelta'], unit='m')
                else:
                    raise ValueError('Error: invalid weather label passed. Valid inputs: right, left or center')
            else:
                minutedelta = int(interval.seconds/2/60)
                print("Interval in weather data is less than 1 hr, calculating"
                      f" Sun position with a delta of -{minutedelta} minutes.")
                print("If you want no delta for sunposition, use "
                      "readWeatherFile(label='center').")
                #datetimetz=datetimetz-pd.Timedelta(minutes = minutedelta)   # This doesn't check for Sunrise or Sunset
                #sunup= pvlib.irradiance.solarposition.get_sun_rise_set_transit(datetimetz, lat, lon) # deprecated in pvlib 0.6.1
                sunup = pvlib.irradiance.solarposition.sun_rise_set_transit_spa(datetimetz, lat, lon)  # new for pvlib >= 0.6.1
                sunup['corrected_timestamp'] = sunup.index - pd.Timedelta(minutes=minutedelta)

        self.solpos = pvlib.irradiance.solarposition.get_solarposition(sunup['corrected_timestamp'], lat, lon, elev)
        self.sunrisesetdata = sunup
        self.label = label

    def _set1axis(self, azimuth=180, limit_angle=45, angledelta=None,
                  backtrack=True, gcr=1.0/3.0, cumulativesky=True,
                  fixed_tilt_angle=None, axis_tilt=0, useMeasuredTrackerAngle=False):

        """
        Set up geometry for 1-axis tracking cumulativesky.  Solpos data
        already stored in `metdata.solpos`. Pull in tracking angle details from
        pvlib, create multiple 8760 metdata sub-files where datetime of met
        data matches the tracking angle.

        Parameters
        ------------
        cumulativesky : bool
            Whether individual csv files are created
            with constant tilt angle for the cumulativesky approach.
            If False, the gendaylit tracking approach must be used.
        azimuth : numerical
            Orientation axis of tracker torque tube. Default North-South (180 deg).
            For fixed-tilt simulations this is the orientation azimuth.
        limit_angle : numerical
            +/- limit angle of the 1-axis tracker in degrees. Default 45.
        angledelta : numerical
            Degree of rotation increment to parse irradiance bins.
            Default 5 degrees (0.4% error for DNI).
            Other options: 4 (0.25%), 2.5 (0.1%).
            (The smaller the angledelta, the more simulations.)
        backtrack : bool
            Whether backtracking is enabled (default = True)
        gcr : float
            Ground coverage ratio for calculating backtracking. Default 1.0/3.0
        axis_tilt : float
            Tilt of the axis. While it can be considered for the tracking
            calculation, scene geometry creation does not support tilted-axis
            trackers yet (though it can be done manually; see the tutorials).
        fixed_tilt_angle : numeric
            If passed, this changes to a fixed tilt simulation where each hour
            uses fixed_tilt_angle and azimuth as the tilt and azimuth

        Returns
        -------
        trackerdict : dictionary
            Keys for tracker tilt angles and
            list of csv metfile, and datetimes at that angle
            trackerdict[angle]['csvfile';'surf_azm';'surf_tilt';'UTCtime']
        metdata.solpos : dataframe
            Dataframe with output from pvlib solar position for each timestep
        metdata.sunrisesetdata :
            Pandas dataframe with sunrise, sunset and adjusted time data.
        metdata.tracker_theta : list
            Tracker tilt angle from pvlib for each timestep
        metdata.surface_tilt : list
            Tracker surface tilt angle from pvlib for each timestep
        metdata.surface_azimuth : list
            Tracker surface azimuth angle from pvlib for each timestep
        """

        #axis_tilt = 0       # only support 0 tilt trackers for now
        self.cumulativesky = cumulativesky   # track whether we're using cumulativesky or gendaylit

        if (cumulativesky is True) & (angledelta is None):
            angledelta = 5  # round angle to 5 degrees for cumulativesky

        # get 1-axis tracker angles for this location,
        # round to nearest 'angledelta'
        if self.meastracker_angle is not None and useMeasuredTrackerAngle is True:
            print("Tracking Data: Reading from provided Tracker Angles")
        elif self.meastracker_angle is None and useMeasuredTrackerAngle is True:
            useMeasuredTrackerAngle = False
            print("Warning: Using Measured Tracker Angles was specified but data"
                  " for trackers has not yet been assigned. Assign it by adding"
                  " a column named 'Tracker Angle (degrees)' to your weather"
                  " data file and run readWeatherFile again.")

        trackingdata = self._getTrackingAngles(azimuth,
                                               limit_angle,
                                               angledelta,
                                               axis_tilt = axis_tilt,
                                               backtrack = backtrack,
                                               gcr = gcr,
                                               fixed_tilt_angle=fixed_tilt_angle,
                                               useMeasuredTrackerAngle=useMeasuredTrackerAngle)

        # get list of unique rounded tracker angles
        theta_list = trackingdata.dropna()['theta_round'].unique()

        if cumulativesky is True:
            # create a separate metfile for each unique tracker theta angle.
            # return dict of filenames and details
            trackerdict = self._makeTrackerCSV(theta_list,trackingdata)
        else:
            # trackerdict uses timestamp as keys. return azimuth
            # and tilt for each timestamp
            times = [i.strftime('%Y-%m-%d_%H%M') for i in self.datetime]
            trackerdict = {}
            for i,time in enumerate(times):
                # remove NaN tracker theta from trackerdict
                if (self.ghi[i] > 0) & (~np.isnan(self.tracker_theta[i])):
                    trackerdict[time] = {
                                        'surf_azm':self.surface_azimuth[i],
                                        'surf_tilt':self.surface_tilt[i],
                                        'theta':self.tracker_theta[i],
                                        'ghi':self.ghi[i],
                                        'dhi':self.dhi[i],
                                        'temp_air':self.temp_air[i],
                                        'wind_speed':self.wind_speed[i]
                                        }

        return trackerdict

    def _getTrackingAngles(self, azimuth=180, limit_angle=45,
                           angledelta=None, axis_tilt=0, backtrack=True,
                           gcr = 1.0/3.0, fixed_tilt_angle=None,
                           useMeasuredTrackerAngle=False):
        '''
        Helper subroutine to return 1-axis tracker tilt and azimuth data.

        Parameters
        ----------
        Same as pvlib.tracking.singleaxis, plus:
        angledelta : degrees
            Angle to round tracker_theta to.  This is for
            cumulativesky simulations. Pass None for no
            rounding of the tracker angle.
        fixed_tilt_angle : (Optional) degrees
            This changes to a fixed tilt simulation where each hour uses
            fixed_tilt_angle and azimuth as the tilt and azimuth

        Returns
        -------
        DataFrame with the following columns:
            * tracker_theta: The rotation angle of the tracker.
                tracker_theta = 0 is horizontal, and positive rotation angles
                are clockwise.
            * aoi: The angle-of-incidence of direct irradiance onto the
                rotated panel surface.
            * surface_tilt: The angle between the panel surface and the earth
                surface, accounting for panel rotation.
            * surface_azimuth: The azimuth of the rotated panel, determined by
                projecting the vector normal to the panel's surface to the
                earth's surface.
            * theta_round: tracker_theta rounded to the nearest 'angledelta'.
                If no angledelta is specified, it is rounded to the nearest degree.
        '''
        import pvlib
        import warnings
        from pvlib.irradiance import aoi

        solpos = self.solpos

        # New as of 0.3.2: pass fixed_tilt_angle to switch to FIXED TILT mode

        if fixed_tilt_angle is not None:
            # system with fixed tilt = fixed_tilt_angle
            surface_tilt=fixed_tilt_angle
            surface_azimuth=azimuth
            # trackingdata keys: 'tracker_theta', 'aoi', 'surface_azimuth', 'surface_tilt'
            trackingdata = pd.DataFrame({'tracker_theta':fixed_tilt_angle,
                                         'aoi':aoi(surface_tilt, surface_azimuth,
                                                   solpos['zenith'],
                                                   solpos['azimuth']),
                                         'surface_azimuth':azimuth,
                                         'surface_tilt':fixed_tilt_angle})
        elif useMeasuredTrackerAngle:
            # tracked system
            surface_tilt=self.meastracker_angle
            surface_azimuth=azimuth

            trackingdata = pd.DataFrame({'tracker_theta':self.meastracker_angle,
                                         'aoi':aoi(surface_tilt, surface_azimuth,
                                                   solpos['zenith'],
                                                   solpos['azimuth']),
                                         'surface_azimuth':azimuth,
                                         'surface_tilt':abs(self.meastracker_angle)})

        else:
            # get 1-axis tracker tracker_theta, surface_tilt and surface_azimuth
            with warnings.catch_warnings():
                warnings.filterwarnings("ignore", category=RuntimeWarning)
                trackingdata = pvlib.tracking.singleaxis(solpos['zenith'],
                                                     solpos['azimuth'],
                                                     axis_tilt,
                                                     azimuth,
                                                     limit_angle,
                                                     backtrack,
                                                     gcr)

        # save tracker tilt information to metdata.tracker_theta,
        # metdata.surface_tilt and metdata.surface_azimuth
        self.tracker_theta = np.round(trackingdata['tracker_theta'],2).tolist()
        self.surface_tilt = np.round(trackingdata['surface_tilt'],2).tolist()
        self.surface_azimuth = np.round(trackingdata['surface_azimuth'],2).tolist()
        # undo the timestamp offset put in by solpos --
        # it may not be exactly 30 minutes any more.
        trackingdata.index = self.sunrisesetdata.index  # this has the original time data in it

        # round tracker_theta to increments of angledelta for use in cumulativesky
        def _roundArbitrary(x, base=angledelta):
            # round to nearest 'base' value.
            # mask NaN's to avoid rounding error message
            return base * (x/float(base)).round()

        if angledelta == 0:
            raise ZeroDivisionError('Angledelta = 0. Use None instead')
        elif angledelta is None: # don't round theta
            trackingdata['theta_round'] = trackingdata['tracker_theta']
        else:  # round theta
            trackingdata['theta_round'] = \
                _roundArbitrary(trackingdata['tracker_theta'], angledelta)

        return trackingdata

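    # A minimal sketch (with made-up angles, not part of the class API) of the
    # rounding that _roundArbitrary applies above: tracker_theta is binned to
    # the nearest `angledelta` so cumulativesky only needs one sky per bin.
    #   >>> import numpy as np
    #   >>> theta = np.array([-27.3, -12.6, 0.4, 18.2])
    #   >>> (5 * (theta / 5.0).round()).tolist()
    #   [-25.0, -15.0, 0.0, 20.0]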
    def _makeTrackerCSV(self, theta_list, trackingdata):
        '''
        Create multiple new irradiance csv files with data for each unique
        rounded tracker angle. Return a dictionary with the new csv filenames
        and other details. Used for cumulativesky tracking.

        Parameters
        -----------
        theta_list : array
             Array of unique tracker angle values

        trackingdata : Pandas
             Pandas Series with hourly tracker angles from
             :py:func:`pvlib.tracking.singleaxis`

        Returns
        --------
        trackerdict : dictionary
              keys: *theta_round tracker angle  (default: -45 to +45 in
                                                 5 degree increments).
              sub-array keys:
                  *datetime:  array of datetime strings in this group of angles
                  *count:  number of datapoints in this group of angles
                  *surf_azm:  tracker surface azimuth during this group of angles
                  *surf_tilt:  tilt angle average during this group of angles
                  *csvfile:  name of csv met data file saved in /EPWs/
        '''

        dt = pd.to_datetime(self.datetime)

        trackerdict = dict.fromkeys(theta_list)

        for theta in sorted(trackerdict):
            trackerdict[theta] = {}
            csvfile = os.path.join('EPWs', '1axis_{}.csv'.format(theta))
            tempdata = trackingdata[trackingdata['theta_round'] == theta]

            # Set up trackerdict output for each value of theta
            trackerdict[theta]['csvfile'] = csvfile
            trackerdict[theta]['surf_azm'] = tempdata['surface_azimuth'].median()
            trackerdict[theta]['surf_tilt'] = abs(theta)
            datetimetemp = tempdata.index.strftime('%Y-%m-%d %H:%M:%S') # local time
            trackerdict[theta]['datetime'] = datetimetemp
            trackerdict[theta]['count'] = len(datetimetemp)
            # Create new temp csv file with zero values for all times not equal to datetimetemp
            # write 8760 2-column csv:  GHI,DHI
            ghi_temp = []
            dhi_temp = []
            for g, d, time in zip(self.ghi, self.dhi,
                                  dt.strftime('%Y-%m-%d %H:%M:%S')):

                # is this time included in a particular theta_round angle?
                if time in datetimetemp:
                    ghi_temp.append(g)
                    dhi_temp.append(d)
                else:
                    # mask out irradiance at this time, since it
                    # belongs to a different bin
                    ghi_temp.append(0.0)
                    dhi_temp.append(0.0)
            # save in 2-column GHI,DHI format for gencumulativesky -G
            savedata = pd.DataFrame({'GHI':ghi_temp, 'DHI':dhi_temp},
                                    index = self.datetime).tz_localize(None)
            # Fill partial year. Requires 2021 measurement year.
            savedata = _subhourlydatatoGencumskyformat(savedata,
                                                       label=self.label)
            print('Saving file {}, # points: {}'.format(
                  trackerdict[theta]['csvfile'], len(datetimetemp)))

            savedata.to_csv(csvfile,
                            index=False,
                            header=False,
                            sep=' ',
                            columns=['GHI','DHI'])

        return trackerdict

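    # Toy illustration (hypothetical values) of the masking loop above: hours
    # whose timestamp falls outside the current theta bin get GHI = DHI = 0,
    # so each '1axis_<theta>.csv' only carries irradiance for its own angle.
    #   >>> ghi = [0.0, 250.0, 600.0, 420.0]
    #   >>> inbin = [False, True, True, False]
    #   >>> [g if m else 0.0 for g, m in zip(ghi, inbin)]
    #   [0.0, 250.0, 600.0, 0.0]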
class AnalysisObj:
    """
    Analysis class for performing raytrace to obtain irradiance measurements
    at the array, as well as plotting and reporting results.
    """
    def __repr__(self):
        return str(self.__dict__)

    def __init__(self, octfile=None, name=None, hpc=False):
        """
        Initialize AnalysisObj by pointing to the octfile.  Scan information
        is defined separately by passing scene details into AnalysisObj.moduleAnalysis()

        Parameters
        ------------
        octfile : string
            Filename and extension of .oct file
        name : string
        hpc : boolean, default False
            Waits longer for the octfile to appear when parallel processing.
        """

        self.octfile = octfile
        self.name = name
        self.hpc = hpc

    def makeImage(self, viewfile, octfile=None, name=None):
        """
        Makes a visible image (rendering) of octfile, viewfile
        """

        import time

        if octfile is None:
            octfile = self.octfile
        if name is None:
            name = self.name

        #TODO: update this for cross-platform compatibility w/ os.path.join
        if self.hpc:
            time_to_wait = 10
            time_counter = 0
            filelist = [octfile, "views/"+viewfile]
            for file in filelist:
                while not os.path.exists(file):
                    time.sleep(1)
                    time_counter += 1
                    if time_counter > time_to_wait:
                        break

        print('Generating visible render of scene')
        #TODO: update this for cross-platform compatibility w/ os.path.join
        os.system("rpict -dp 256 -ar 48 -ms 1 -ds .2 -dj .9 -dt .1 "+
                  "-dc .5 -dr 1 -ss 1 -st .1 -ab 3  -aa .1 "+
                  "-ad 1536 -as 392 -av 25 25 25 -lr 8 -lw 1e-4 -vf views/"
                  +viewfile+ " " + octfile +
                  " > images/"+name+viewfile[:-3] +".hdr")

    def makeFalseColor(self, viewfile, octfile=None, name=None):
        """
        Makes a false-color plot of octfile, viewfile

        .. note::
            For Windows requires installation of falsecolor.exe,
            which is part of radwinexe-5.0.a.8-win64.zip found at
            http://www.jaloxa.eu/resources/radiance/radwinexe.shtml
        """
        #TODO: error checking for installation of falsecolor.exe

        if octfile is None:
            octfile = self.octfile
        if name is None:
            name = self.name

        print('Generating scene in WM-2. This may take some time.')
        #TODO: update and test this for cross-platform compatibility using os.path.join
        cmd = "rpict -i -dp 256 -ar 48 -ms 1 -ds .2 -dj .9 -dt .1 "+\
              "-dc .5 -dr 1 -ss 1 -st .1 -ab 3  -aa .1 -ad 1536 -as 392 " +\
              "-av 25 25 25 -lr 8 -lw 1e-4 -vf views/"+viewfile + " " + octfile

        WM2_out,err = _popen(cmd,None)
        if err is not None:
            print('Error: {}'.format(err))
            return

        # determine the extreme maximum value to help with falsecolor autoscale
        extrm_out,err = _popen("pextrem",WM2_out.encode('latin1'))
        # cast the pextrem string as floats and find the max value
        WM2max = max(map(float,extrm_out.split()))
        print('Saving scene in false color')
        # auto-scale the false color map
        if WM2max < 1100:
            cmd = "falsecolor -l W/m2 -m 1 -s 1100 -n 11"
        else:
            cmd = "falsecolor -l W/m2 -m 1 -s %s"%(WM2max,)
        with open(os.path.join("images","%s%s_FC.hdr"%(name,viewfile[:-3]) ),"w") as f:
            data,err = _popen(cmd,WM2_out.encode('latin1'),f)
            if err is not None:
                print(err)
                print('possible solution: install radwinexe binary package from '
                      'http://www.jaloxa.eu/resources/radiance/radwinexe.shtml')

    def _linePtsArray(self, linePtsDict):
        """
        Helper function to print the x, y and z values in an array format,
        just as they will appear in the .csv result files.
        """
        xstart = linePtsDict['xstart']
        ystart = linePtsDict['ystart']
        zstart = linePtsDict['zstart']
        xinc = linePtsDict['xinc']
        yinc = linePtsDict['yinc']
        zinc = linePtsDict['zinc']
        sx_xinc = linePtsDict['sx_xinc']
        sx_yinc = linePtsDict['sx_yinc']
        sx_zinc = linePtsDict['sx_zinc']
        Nx = int(linePtsDict['Nx'])
        Ny = int(linePtsDict['Ny'])
        Nz = int(linePtsDict['Nz'])

        x = []
        y = []
        z = []

        for iz in range(0,Nz):
            for ix in range(0,Nx):
                for iy in range(0,Ny):
                    x.append(xstart + iy*xinc + ix*sx_xinc)
                    y.append(ystart + iy*yinc + ix*sx_yinc)
                    z.append(zstart + iy*zinc + ix*sx_zinc)

        return x, y, z

    def _linePtsMakeDict(self, linePtsDict):
        a = linePtsDict
        linepts = self._linePtsMake3D(a['xstart'],a['ystart'],a['zstart'],
                            a['xinc'], a['yinc'], a['zinc'],
                            a['sx_xinc'], a['sx_yinc'], a['sx_zinc'],
                            a['Nx'],a['Ny'],a['Nz'],a['orient'])
        return linepts

    def _linePtsMake3D(self, xstart, ystart, zstart, xinc, yinc, zinc,
                       sx_xinc, sx_yinc, sx_zinc,
                       Nx, Ny, Nz, orient):
        # create linepts text input with variable x,y,z.
        # If you don't want to iterate over a variable, set inc = 0 and N = 1.

        linepts = ""
        # make sure Nx, Ny, Nz are ints.
        Nx = int(Nx)
        Ny = int(Ny)
        Nz = int(Nz)

        for iz in range(0,Nz):
            for ix in range(0,Nx):
                for iy in range(0,Ny):
                    xpos = xstart + iy*xinc + ix*sx_xinc
                    ypos = ystart + iy*yinc + ix*sx_yinc
                    zpos = zstart + iy*zinc + ix*sx_zinc
                    linepts = linepts + str(xpos) + ' ' + str(ypos) + \
                          ' ' + str(zpos) + ' ' + orient + " \r"
        return(linepts)

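    # Worked example (hypothetical scan values) of the string _linePtsMake3D
    # builds above: one "x y z xorient yorient zorient" record per sensor
    # point, '\r'-separated, fed to rtrace on stdin. Here Ny=3 points step
    # 1.0 m in y with a downward-facing orient '0 0 -1':
    #   >>> linepts = ""
    #   >>> for iy in range(3):
    #   ...     linepts += str(0.0) + ' ' + str(-1.0 + iy*1.0) + ' ' + str(1.5) + ' ' + '0 0 -1' + " \r"
    #   >>> linepts
    #   '0.0 -1.0 1.5 0 0 -1 \r0.0 0.0 1.5 0 0 -1 \r0.0 1.0 1.5 0 0 -1 \r'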
    def _irrPlot(self, octfile, linepts, mytitle=None, plotflag=None,
                   accuracy='low'):
        """
        (plotdict) = _irrPlot(linepts,title,time,plotflag, accuracy)
        Irradiance plotting using rtrace.
        Pass in the linepts structure of the view along with a title string
        for the plots.

        Parameters
        ------------
        octfile : string
            Filename and extension of .oct file
        linepts :
            Output from :py:class:`bifacial_radiance.AnalysisObj._linePtsMake3D`
        mytitle : string
            Title to append to results files
        plotflag : Boolean
            Include plot of resulting irradiance
        accuracy : string
            Either 'low' (default - faster) or 'high'
            (better for low light)

        Returns
        -------
        out : dictionary
            out.x,y,z  - coordinates of point
            .r,g,b     - r,g,b values in Wm-2
            .Wm2       - equal-weight irradiance
            .mattype   - material intersected
            .title     - title passed in
        """

        if mytitle is None:
            mytitle = octfile[:-4]

        if plotflag is None:
            plotflag = False

        if self.hpc:
            import time
            time_to_wait = 10
            time_counter = 0
            while not os.path.exists(octfile):
                time.sleep(1)
                time_counter += 1
                if time_counter > time_to_wait:
                    print('Warning: OCTFILE NOT FOUND')
                    break

        if octfile is None:
            print('Analysis aborted. octfile = None')
            return None

        keys = ['Wm2','x','y','z','r','g','b','mattype']
        out = {key: [] for key in keys}
        out['title'] = mytitle
        print('Linescan in process: %s' % (mytitle))

        if accuracy == 'low':
            # rtrace optimized for faster scans:
            cmd = "rtrace -i -ab 2 -aa .1 -ar 256 -ad 2048 -as 256 -h -oovs "+ octfile
        elif accuracy == 'high':
            # rtrace ambient values set for 'very accurate':
            cmd = "rtrace -i -ab 5 -aa .08 -ar 512 -ad 2048 -as 512 -h -oovs "+ octfile
        else:
            print('_irrPlot accuracy options: "low" or "high"')
            return({})

        temp_out,err = _popen(cmd,linepts.encode())
        if err is not None:
            if err[0:5] == 'error':
                raise Exception(err[7:])
            else:
                print(err)

        # when file errors occur, temp_out is None, and err message is printed.
        if temp_out is not None:
            for line in temp_out.splitlines():
                temp = line.split('\t')
                out['x'].append(float(temp[0]))
                out['y'].append(float(temp[1]))
                out['z'].append(float(temp[2]))
                out['r'].append(float(temp[3]))
                out['g'].append(float(temp[4]))
                out['b'].append(float(temp[5]))
                out['mattype'].append(temp[6])
                out['Wm2'].append(sum([float(i) for i in temp[3:6]])/3.0)

            if plotflag is True:
                import matplotlib.pyplot as plt
                plt.figure()
                plt.plot(out['Wm2'])
                plt.ylabel('Wm2 irradiance')
                plt.xlabel('variable')
                plt.title(mytitle)
                plt.show()
        else:
            out = None   # return None if an error message occurred.

        return(out)

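    # Sketch of the rtrace '-oovs' output parsing above, using one fabricated
    # tab-separated sample line (x, y, z, r, g, b, material): Wm2 is the
    # equal-weight mean of the r,g,b channels.
    #   >>> line = "0.0\t-1.0\t1.5\t210.0\t215.0\t220.0\tPVmodule.2310"
    #   >>> temp = line.split('\t')
    #   >>> sum([float(i) for i in temp[3:6]]) / 3.0
    #   215.0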
    def _saveResults(self, data=None, reardata=None, savefile=None, RGB = False):
        """
        Function to save output from _irrPlot.
        If reardata is passed in, the back/front ratio is saved.
        If data = None then only reardata is saved.

        Returns
        --------
        savefile : str
            If set to None, will write to default .csv filename in results folder.
        """

        if savefile is None:
            savefile = data['title'] + '.csv'

        if data is None and reardata is not None: # only rear data is passed.
            data = reardata
            reardata = None
            # run process like normal but swap labels at the end
            rearswapflag = True
        else:
            rearswapflag = False

        # make savefile dataframe and set self.attributes

        if RGB:
            data_sub = {key:data[key] for key in ['x', 'y', 'z', 'mattype', 'Wm2', 'r', 'g', 'b']}
        else:
            data_sub = {key:data[key] for key in ['x', 'y', 'z', 'mattype', 'Wm2']}

        df = pd.DataFrame(data_sub)
        df = df.rename(columns={'Wm2':'Wm2Front'})

        if reardata is not None:
            df.insert(3, 'rearZ', reardata['z'])
            df.insert(5, 'rearMat', reardata['mattype'])
            df.insert(7, 'Wm2Back',  reardata['Wm2'])
            # add 1mW/m2 to avoid divide-by-zero
            df.insert(8, 'Back/FrontRatio',  df['Wm2Back'] / (df['Wm2Front']+.001))
            df['backRatio'] = df['Back/FrontRatio']
            df['rearX'] = reardata['x']
            df['rearY'] = reardata['y']
            if RGB:
                df['rearR'] = reardata['r']
                df['rearG'] = reardata['g']
                df['rearB'] = reardata['b']

        # rename columns if only rear data was originally passed
        if rearswapflag:
            df = df.rename(columns={'Wm2Front':'Wm2Back','mattype':'rearMat'})
        # set attributes of analysis to equal columns of df
        for col in df.columns:
            setattr(self, col, list(df[col]))
        # only save a subset
        df = df.drop(columns=['rearX','rearY','backRatio'], errors='ignore')
        df.to_csv(os.path.join("results", savefile), sep=',',
                           index=False, float_format='%0.3f')

        print('Saved: %s'%(os.path.join("results", savefile)))
        return os.path.join("results", savefile)

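    # Minimal numeric sketch (toy irradiance values) of the bifacial ratio
    # computed above, including the 1 mW/m2 guard that avoids dividing by
    # zero for night-time rows:
    #   >>> front = [800.0, 0.0]
    #   >>> back = [96.0, 0.0]
    #   >>> [round(b / (f + .001), 4) for f, b in zip(front, back)]
    #   [0.12, 0.0]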
    def _saveResultsCumulative(self, data, reardata=None, savefile=None):
        """
        TEMPORARY FUNCTION -- this is a fix to save ONE cumulative results csv
        in the main working folder for when doing multiple entries in a
        tracker dict.

        Returns
        --------
        savefile : str
            If set to None, will write to default .csv filename in results folder.
        """

        if savefile is None:
            savefile = data['title'] + '.csv'
        # make dataframe from results
        data_sub = {key:data[key] for key in ['x', 'y', 'z', 'Wm2', 'mattype']}
        self.x = data['x']
        self.y = data['y']
        self.z = data['z']
        self.mattype = data['mattype']
        #TODO: data_sub front values don't seem to be saved to self.
        if reardata is not None:
            self.rearX = reardata['x']
            self.rearY = reardata['y']
            self.rearMat = reardata['mattype']
            data_sub['rearMat'] = self.rearMat
            self.rearZ = reardata['z']
            data_sub['rearZ'] = self.rearZ
            self.Wm2Front = data_sub.pop('Wm2')
            data_sub['Wm2Front'] = self.Wm2Front
            self.Wm2Back = reardata['Wm2']
            data_sub['Wm2Back'] = self.Wm2Back
            # add 1mW/m2 to avoid divide-by-zero
            self.backRatio = [x/(y+.001) for x,y in zip(reardata['Wm2'],data['Wm2'])]
            data_sub['Back/FrontRatio'] = self.backRatio
            df = pd.DataFrame.from_dict(data_sub)
            df.to_csv(savefile, sep = ',',
                      columns = ['x','y','z','rearZ','mattype','rearMat',
                                 'Wm2Front','Wm2Back','Back/FrontRatio'],
                                 index=False, float_format='%0.3f') # new in 0.2.3
        else:
            df = pd.DataFrame.from_dict(data_sub)
            df.to_csv(savefile, sep=',', float_format='%0.3f',
                      columns=['x','y','z', 'mattype','Wm2'], index=False)

        print('Saved: %s'%(savefile))
        return (savefile)

    def moduleAnalysis(self, scene, modWanted=None, rowWanted=None,
                       sensorsy=9, sensorsx=1,
                       frontsurfaceoffset=0.001, backsurfaceoffset=0.001,
                       modscanfront=None, modscanback=None, relative=False,
                       debug=False):

        """
        Handler function that decides how to handle different numbers of front
        and back sensors. If the number of front sensors is not provided, or is
        the same as for the back, _moduleAnalysis is called only once.
        Otherwise it is called twice to get the different front and back
        dictionaries.

        This function defines the scan points to be used in the
        :py:class:`~bifacial_radiance.AnalysisObj.analysis` function,
        to perform the raytrace through the Radiance function `rtrace`

        Parameters
        ------------
        scene : ``SceneObj``
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
        modWanted : int
            Module wanted to sample. If None, defaults to center module (rounding down)
        rowWanted : int
            Row wanted to sample. If None, defaults to center row (rounding down)
        sensorsy : int or list
            Number of 'sensors' or scanning points along the collector width
            (CW) of the module(s). If multiple values are passed, first value
            represents number of front sensors, second value is number of back sensors
        sensorsx : int or list
            Number of 'sensors' or scanning points along the length, the side perpendicular
            to the collector width (CW) of the module(s) for the back side of the module.
            If multiple values are passed, first value represents number of
            front sensors, second value is number of back sensors.
        debug : bool
            Activates various print statements for debugging this function.
        modscanfront : dict
            Dictionary to modify the frontscan values established by this routine
            and set a specific value. Possible keys are 'xstart', 'ystart', 'zstart',
            'xinc', 'yinc', 'zinc', 'Nx', 'Ny', 'Nz', and 'orient'. If modifying
            Nx, Ny or Nz, make sure to also modify modscanback to avoid issues at
            the results writing stage. All of these keys are ints or
            floats except for 'orient', which takes x y z values as the string
            'x y z', for example '0 0 -1'. These values will overwrite the internally
            calculated frontscan dictionary for the module & row selected.
        modscanback: dict
            Dictionary to modify the backscan values established by this routine
            and set a specific value. Possible keys are 'xstart', 'ystart', 'zstart',
            'xinc', 'yinc', 'zinc', 'Nx', 'Ny', 'Nz', and 'orient'. If modifying
            Nx, Ny or Nz, make sure to also modify modscanfront to avoid issues at
            the results writing stage. All of these keys are ints or
            floats except for 'orient', which takes x y z values as the string
            'x y z', for example '0 0 -1'. These values will overwrite the internally
            calculated backscan dictionary for the module & row selected.
        relative : Bool
            If passing modscanfront and modscanback to modify dictionaries of positions,
            this sets whether the values passed are relative or absolute.
            Default is absolute value (relative=False)


        Returns
        -------
        frontscan : dictionary
            Scan dictionary for module's front side. Used to pass into
            :py:class:`~bifacial_radiance.AnalysisObj.analysis` function
        backscan : dictionary
            Scan dictionary for module's back side. Used to pass into
            :py:class:`~bifacial_radiance.AnalysisObj.analysis` function

        """

        # Height:  clearance height for fixed tilt systems, or torque tube
        #           height for single-axis tracked systems.
        #   Single axis tracked systems will consider the offset to calculate the final height.

        def _checkSensors(sensors):
            # Checking Sensors input data for list or tuple
            if (type(sensors) == tuple or type(sensors) == list):
                try:
                    sensors_back = sensors[1]
                    sensors_front = sensors[0]
                except IndexError:  # only 1 value passed
                    sensors_back = sensors_front = sensors[0]
            elif (type(sensors) == int or type(sensors) == float):
                # Ensure sensors are positive int values.
                if int(sensors) < 1:
                    raise Exception('input sensorsy must be numeric >0')
                sensors_back = sensors_front = int(sensors)
            else:
                print('Warning: invalid value passed for sensors. Setting = 1')
                sensors_back = sensors_front = 1
            return sensors_front, sensors_back

        sensorsy_front, sensorsy_back = _checkSensors(sensorsy)
        sensorsx_front, sensorsx_back = _checkSensors(sensorsx)

        if (sensorsx_back != sensorsx_front) or (sensorsy_back != sensorsy_front):
            sensors_diff = True
        else:
            sensors_diff = False

        dtor = np.pi/180.0

        # Internal scene parameters are stored in scene.sceneDict. Load these into local variables
        sceneDict = scene.sceneDict

        azimuth = sceneDict['azimuth']
        tilt = sceneDict['tilt']
        nMods = sceneDict['nMods']
        nRows = sceneDict['nRows']
        originx = sceneDict['originx']
        originy = sceneDict['originy']

        # offset = moduleDict['offsetfromaxis']
        offset = scene.module.offsetfromaxis
        sceney = scene.module.sceney
        scenex = scene.module.scenex

        # x needed for sensorsx>1 case
        x = scene.module.x

        ## Check for proper input variables in sceneDict
        if 'pitch' in sceneDict:
            pitch = sceneDict['pitch']
        elif 'gcr' in sceneDict:
            pitch = sceney / sceneDict['gcr']
        else:
            raise Exception("Error: no 'pitch' or 'gcr' passed in sceneDict")

        if 'axis_tilt' in sceneDict:
            axis_tilt = sceneDict['axis_tilt']
        else:
            axis_tilt = 0

        if hasattr(scene.module, 'z'):
            modulez = scene.module.z
        else:
            print("Module's z not set on sceneDict internal dictionary. Setting to default")
            modulez = 0.02

        if frontsurfaceoffset is None:
            frontsurfaceoffset = 0.001
        if backsurfaceoffset is None:
            backsurfaceoffset = 0.001

        # The sensor routine below needs a hub height, not a clearance height.
        # The check below handles the deprecated 'height' input, as well as
        # 'clearance_height' and 'hub_height'.

        sceneDict, use_clearanceheight = _heightCasesSwitcher(sceneDict,
                                                              preferred='hub_height',
                                                              nonpreferred='clearance_height',
                                                              suppress_warning=True)

        if use_clearanceheight:
            height = sceneDict['clearance_height'] + 0.5* \
                np.sin(abs(tilt) * np.pi / 180) * \
                sceney - offset*np.sin(abs(tilt)*np.pi/180)
        else:
            height = sceneDict['hub_height']

        if debug:
            print("For debug:\n hub_height, Azimuth, Tilt, nMods, nRows, "
                  "Pitch, Offset, SceneY, SceneX")
            print(height, azimuth, tilt, nMods, nRows,
                  pitch, offset, sceney, scenex)

        if modWanted == 0:
            print(" FYI Modules and Rows start at index 1. "
                  "Reindexing to modWanted 1")
            modWanted = modWanted+1  # otherwise it gives results on empty space.

        if rowWanted == 0:
            print(" FYI Modules and Rows start at index 1. "
                  "Reindexing to rowWanted 1")
            rowWanted = rowWanted+1

        if modWanted is None:
            modWanted = round(nMods / 1.99)
        if rowWanted is None:
            rowWanted = round(nRows / 1.99)

        if debug is True:
            print(f"Sampling: modWanted {modWanted}, rowWanted {rowWanted} "
                  f"out of {nMods} modules, {nRows} rows")

        x0 = (modWanted-1)*scenex - (scenex*(round(nMods/1.99)*1.0-1))
        y0 = (rowWanted-1)*pitch - (pitch*(round(nRows / 1.99)*1.0-1))

        x1 = x0 * np.cos((180-azimuth)*dtor) - y0 * np.sin((180-azimuth)*dtor)
        y1 = x0 * np.sin((180-azimuth)*dtor) + y0 * np.cos((180-azimuth)*dtor)
        z1 = 0

        if axis_tilt != 0 and azimuth == 90:
            print("fixing height for axis_tilt")
            z1 = (modWanted-1)*scenex * np.sin(axis_tilt*dtor)

        # Edge of panel
        x2 = (sceney/2.0) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
        y2 = (sceney/2.0) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
        z2 = -(sceney/2.0) * np.sin(tilt*dtor)


        # Axis of rotation offset (if offset is not 0) for the front of the module
        x3 = (offset + modulez + frontsurfaceoffset) * np.sin(tilt*dtor) * np.sin((azimuth)*dtor)
        y3 = (offset + modulez + frontsurfaceoffset) * np.sin(tilt*dtor) * np.cos((azimuth)*dtor)
        z3 = (offset + modulez + frontsurfaceoffset) * np.cos(tilt*dtor)

        # Axis of rotation offset, for the back of the module
        x4 = (offset - backsurfaceoffset) * np.sin(tilt*dtor) * np.sin((azimuth)*dtor)
        y4 = (offset - backsurfaceoffset) * np.sin(tilt*dtor) * np.cos((azimuth)*dtor)
        z4 = (offset - backsurfaceoffset) * np.cos(tilt*dtor)

        xstartfront = x1 + x2 + x3 + originx
        xstartback = x1 + x2 + x4 + originx

        ystartfront = y1 + y2 + y3 + originy
        ystartback = y1 + y2 + y4 + originy

        zstartfront = height + z1 + z2 + z3
        zstartback = height + z1 + z2 + z4

        # Adjust orientation of scan depending on tilt & azimuth
        zdir = np.cos((tilt)*dtor)
        ydir = np.sin((tilt)*dtor) * np.cos((azimuth)*dtor)
        xdir = np.sin((tilt)*dtor) * np.sin((azimuth)*dtor)
        front_orient = '%0.3f %0.3f %0.3f' % (-xdir, -ydir, -zdir)
        back_orient = '%0.3f %0.3f %0.3f' % (xdir, ydir, zdir)

        # If cellmodule:
        # TODO: Add check for sensorsx_back
        # temp = scene.moduleDict.get('cellModule')  # error-free way to query it
        # if ((temp is not None) and
        if ((getattr(scene.module, 'cellModule', None)) and
            (sensorsy_back == scene.module.cellModule.numcellsy)):
            ycell = scene.module.cellModule.ycell
            xinc_back = -((sceney - ycell) / (scene.module.cellModule.numcellsy-1)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
            yinc_back = -((sceney - ycell) / (scene.module.cellModule.numcellsy-1)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
            zinc_back = ((sceney - ycell) / (scene.module.cellModule.numcellsy-1)) * np.sin(tilt*dtor)
            firstsensorxstartfront = xstartfront - ycell/2 * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
            firstsensorxstartback = xstartback - ycell/2 * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
            firstsensorystartfront = ystartfront - ycell/2 * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
            firstsensorystartback = ystartback - ycell/2 * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
            firstsensorzstartfront = zstartfront + ycell/2 * np.sin(tilt*dtor)
            firstsensorzstartback = zstartback + ycell/2 * np.sin(tilt*dtor)
            xinc_front = xinc_back
            yinc_front = yinc_back
            zinc_front = zinc_back

            sx_xinc_front = 0.0
            sx_yinc_front = 0.0
            sx_zinc_front = 0.0
            sx_xinc_back = 0.0
            sx_yinc_back = 0.0
            sx_zinc_back = 0.0

            if (sensorsx_back != 1.0):
                print("Warning: Cell-level module analysis for sensorsx > 1 not "+
                      "fine-tuned yet. Use at own risk, some of the x positions "+
                      "might fall in spacing between cells.")
        else:
            xinc_back = -(sceney/(sensorsy_back + 1.0)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
            yinc_back = -(sceney/(sensorsy_back + 1.0)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
            zinc_back = (sceney/(sensorsy_back + 1.0)) * np.sin(tilt*dtor)

            if sensors_diff:
                xinc_front = -(sceney/(sensorsy_front + 1.0)) * np.cos((tilt)*dtor) * np.sin((azimuth)*dtor)
                yinc_front = -(sceney/(sensorsy_front + 1.0)) * np.cos((tilt)*dtor) * np.cos((azimuth)*dtor)
                zinc_front = (sceney/(sensorsy_front + 1.0)) * np.sin(tilt*dtor)
            else:
                xinc_front = xinc_back
                yinc_front = yinc_back
                zinc_front = zinc_back

            firstsensorxstartfront = xstartfront+xinc_front
            firstsensorxstartback = xstartback+xinc_back
            firstsensorystartfront = ystartfront+yinc_front
            firstsensorystartback = ystartback+yinc_back
            firstsensorzstartfront = zstartfront + zinc_front
            firstsensorzstartback = zstartback + zinc_back

            ## Correct positions for sensorsx other than 1
            # TODO: At some point these equations can include the case where
            # sensorsx = 1, and the original position calculation can be cleaned
            # up to place firstsensorxstartback on the edge rather than the center
            # before this section. That would save some multiplications and
            # divisions, but it works.

            if sensorsx_back > 1.0:
                sx_xinc_back = -(x/(sensorsx_back*1.0+1)) * np.cos((azimuth)*dtor)
                sx_yinc_back = (x/(sensorsx_back*1.0+1)) * np.sin((azimuth)*dtor)
                # Not needed unless axis_tilt != 0, which is not a current option
                sx_zinc_back = 0.0

                firstsensorxstartback = firstsensorxstartback + (x/2.0) * np.cos((azimuth)*dtor) + sx_xinc_back
                firstsensorystartback = firstsensorystartback - (x/2.0) * np.sin((azimuth)*dtor) + sx_yinc_back
                # firstsensorzstartback not needed unless axis_tilt != 0, which is not a current option
                #firstsensorxstartfront = firstsensorxstartback
                #firstsensorystartfront = firstsensorystartback
            else:
                sx_xinc_back = 0.0
                sx_yinc_back = 0.0
                sx_zinc_back = 0.0

            if sensorsx_front > 1.0:
                sx_xinc_front = -(x/(sensorsx_front*1.0+1)) * np.cos((azimuth)*dtor)
                sx_yinc_front = (x/(sensorsx_front*1.0+1)) * np.sin((azimuth)*dtor)
                # Not needed unless axis_tilt != 0, which is not a current option
                sx_zinc_front = 0.0

                firstsensorxstartfront = firstsensorxstartfront + (x/2.0) * np.cos((azimuth)*dtor) + sx_xinc_front
                firstsensorystartfront = firstsensorystartfront - (x/2.0) * np.sin((azimuth)*dtor) + sx_yinc_front
                # firstsensorzstartfront not needed unless axis_tilt != 0, which is not a current option
            else:
                sx_xinc_front = 0.0
                sx_yinc_front = 0.0
                sx_zinc_front = 0.0

        if debug is True:
            print("Azimuth", azimuth)
            print("Coordinate Center Point of Desired Panel before azm rotation", x0, y0)
            print("Coordinate Center Point of Desired Panel after azm rotation", x1, y1)
            print("Edge of Panel", x2, y2, z2)
            print("Offset Shift", x3, y3, z3)
            print("Final Start Coordinate Front", xstartfront, ystartfront, zstartfront)
            print("Increase Coordinates", xinc_front, yinc_front, zinc_front)

        frontscan = {'xstart': firstsensorxstartfront, 'ystart': firstsensorystartfront,
                     'zstart': firstsensorzstartfront,
                     'xinc':xinc_front, 'yinc': yinc_front, 'zinc':zinc_front,
                     'sx_xinc':sx_xinc_front, 'sx_yinc':sx_yinc_front,
                     'sx_zinc':sx_zinc_front,
                     'Nx': sensorsx_front, 'Ny':sensorsy_front, 'Nz':1, 'orient':front_orient }
        backscan = {'xstart':firstsensorxstartback, 'ystart': firstsensorystartback,
                     'zstart': firstsensorzstartback,
                     'xinc':xinc_back, 'yinc': yinc_back, 'zinc':zinc_back,
                     'sx_xinc':sx_xinc_back, 'sx_yinc':sx_yinc_back,
                     'sx_zinc':sx_zinc_back,
                     'Nx': sensorsx_back, 'Ny':sensorsy_back, 'Nz':1, 'orient':back_orient }

        if modscanfront is not None:
            frontscan2 = _modDict(originaldict=frontscan, moddict=modscanfront, relative=relative)
        else:
            frontscan2 = frontscan.copy()
        if modscanback is not None:
            backscan2 = _modDict(originaldict=backscan, moddict=modscanback, relative=relative)
        else:
            backscan2 = backscan.copy()

        return frontscan2, backscan2

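The orientation strings assembled above encode the module's surface normal: the front scan ray direction is the negated normal derived from tilt and azimuth, and the back scan is its opposite. A minimal standalone sketch of that math (`scan_orients` is a made-up helper name for illustration only, not part of bifacial_radiance):

```python
# Minimal sketch of the scan-orientation math used in moduleAnalysis.
# scan_orients is a hypothetical name introduced only for this illustration.
import numpy as np

def scan_orients(tilt, azimuth):
    dtor = np.pi / 180.0
    zdir = np.cos(tilt * dtor)
    ydir = np.sin(tilt * dtor) * np.cos(azimuth * dtor)
    xdir = np.sin(tilt * dtor) * np.sin(azimuth * dtor)
    front = (-xdir, -ydir, -zdir)  # front scan looks down onto the module face
    back = (xdir, ydir, zdir)      # back scan looks up at the rear face
    return front, back

# For a flat module (tilt=0) the front scan points straight down, roughly (0, 0, -1)
```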
    def analyzeRow(self, octfile, scene, rowWanted=None, name=None,
                   sensorsy=None, sensorsx=None):
        '''
        Function to analyze every module in the row.

        Parameters
        ----------
        octfile : string
            Filename and extension of .oct file
        scene : ``SceneObj``
            Generated with :py:class:`~bifacial_radiance.RadianceObj.makeScene`.
        rowWanted : int
            Row wanted to sample. If None, defaults to center row (rounding down)
        sensorsy : int or list
            Number of 'sensors' or scanning points along the collector width
            (CW) of the module(s). If multiple values are passed, first value
            represents number of front sensors, second value is number of back sensors
        sensorsx : int or list
            Number of 'sensors' or scanning points along the length, the side perpendicular
            to the collector width (CW) of the module(s) for the back side of the module.
            If multiple values are passed, first value represents number of
            front sensors, second value is number of back sensors.

        Returns
        -------
        df_row : dataframe
            Dataframe with all values sampled for the row.

        '''
        nMods = scene.sceneDict['nMods']

        if rowWanted is None:
            rowWanted = round(self.nRows / 1.99)
        row_keys = ['x','y','z','rearZ','mattype','rearMat','Wm2Front','Wm2Back','Back/FrontRatio']
        dict_row = dict.fromkeys(row_keys)
        df_row = pd.DataFrame(dict_row, index=[j for j in range(nMods)])

        for i in range(nMods):
            temp_dict = {}
            frontscan, backscan = self.moduleAnalysis(scene, sensorsy=sensorsy,
                                        sensorsx=sensorsx, modWanted=i+1,
                                        rowWanted=rowWanted)
            allscan = self.analysis(octfile, name+'_Module_'+str(i), frontscan, backscan)
            front_dict = allscan[0]
            back_dict = allscan[1]
            temp_dict['x'] = front_dict['x']
            temp_dict['y'] = front_dict['y']
            temp_dict['z'] = front_dict['z']
            temp_dict['rearZ'] = back_dict['z']
            temp_dict['mattype'] = front_dict['mattype']
            temp_dict['rearMat'] = back_dict['mattype']
            temp_dict['Wm2Front'] = front_dict['Wm2']
            temp_dict['Wm2Back'] = back_dict['Wm2']
            temp_dict['Back/FrontRatio'] = list(np.array(back_dict['Wm2'])/np.array(front_dict['Wm2']))
            df_row.iloc[i] = temp_dict
        return df_row

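The `rowWanted` default above, `round(nRows / 1.99)`, picks the center row with 1-based indexing; the 1.99 divisor makes even counts round down to the lower-middle element. A quick standalone check of that arithmetic (`center_index` is a made-up name for illustration only):

```python
# Standalone check of the round(n / 1.99) centering used for the default
# modWanted / rowWanted. center_index is a hypothetical illustrative name.
def center_index(n):
    """1-based center element; rounds down to the lower-middle for even n."""
    return round(n / 1.99)

# 3 rows -> row 2 (the middle); 10 modules -> module 5; 4 modules -> module 2
```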
    def analysis(self, octfile, name, frontscan, backscan,
                 plotflag=False, accuracy='low', RGB=False):
        """
        General analysis function, where linepts are passed in for calling the
        raytrace routine :py:class:`~bifacial_radiance.AnalysisObj._irrPlot`
        and saved into results with
        :py:class:`~bifacial_radiance.AnalysisObj._saveResults`.


        Parameters
        ------------
        octfile : string
            Filename and extension of .oct file
        name : string
            Name to append to output files
        frontscan : scene.frontscan object
            Object with the sensor location information for the
            front of the module
        backscan : scene.backscan object
            Object with the sensor location information for the
            rear side of the module
        plotflag : boolean
            Include plot of resulting irradiance
        accuracy : string
            Either 'low' (default - faster) or 'high' (better for low light)
        RGB : Bool
            Set to True if the raytrace is a spectral raytrace and the
            information for the three channels should be saved.


        Returns
        -------
         File saved in `\\results\\irr_name.csv`

        """

        if octfile is None:
            print('Analysis aborted - no octfile \n')
            return None, None
        linepts = self._linePtsMakeDict(frontscan)
        frontDict = self._irrPlot(octfile, linepts, name+'_Front',
                                    plotflag=plotflag, accuracy=accuracy)

        # bottom view.
        linepts = self._linePtsMakeDict(backscan)
        backDict = self._irrPlot(octfile, linepts, name+'_Back',
                                   plotflag=plotflag, accuracy=accuracy)
        # don't save if _irrPlot returns an empty file.
        if frontDict is not None:
            if len(frontDict['Wm2']) != len(backDict['Wm2']):
                self.Wm2Front = np.mean(frontDict['Wm2'])
                self.Wm2Back = np.mean(backDict['Wm2'])
                self.backRatio = self.Wm2Back / (self.Wm2Front + .001)
                self._saveResults(frontDict, reardata=None, savefile='irr_%s.csv'%(name+'_Front'), RGB=RGB)
                self._saveResults(data=None, reardata=backDict, savefile='irr_%s.csv'%(name+'_Back'), RGB=RGB)
            else:
                self._saveResults(frontDict, backDict, 'irr_%s.csv'%(name), RGB=RGB)

        return frontDict, backDict

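When the front and back scans have different sensor counts, `analysis()` above falls back to storing scalar means rather than per-sensor lists, and the ratio is computed from those means. A standalone sketch of that fallback (the irradiance values below are hypothetical):

```python
# Standalone sketch of the mismatched-length fallback in analysis():
# with different sensor counts, scalar means are kept and the back/front
# ratio is computed from the means (values below are hypothetical).
import numpy as np

front = [810.0, 805.0, 800.0]   # e.g. 3 front sensors
back = [95.0, 90.0]             # e.g. 2 back sensors
if len(front) != len(back):
    Wm2Front = np.mean(front)
    Wm2Back = np.mean(back)
    backRatio = Wm2Back / (Wm2Front + .001)
```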
def quickExample(testfolder=None):
    """
    Example of how to run a Radiance routine for a simple rooftop bifacial system

    """

    import bifacial_radiance

    if testfolder is None:
        testfolder = bifacial_radiance.main._interactive_directory(
            title='Select or create an empty directory for the Radiance tree')

    demo = bifacial_radiance.RadianceObj('simple_panel', path=testfolder)  # Create a RadianceObj 'object'

    # input albedo number or material name like 'concrete'.
    # To see options, run setGround without any input.
    demo.setGround(0.62)
    try:
        epwfile = demo.getEPW(lat=40.01667, lon=-105.25)  # pull TMY data for any global lat/lon
    except ConnectionError:  # no connection to automatically pull data
        pass

    metdata = demo.readWeatherFile(epwfile, coerce_year=2001)  # read in the EPW weather data from above
    # metdata = demo.readTMY()  # select a TMY file using graphical picker
    # Now we either choose a single time point, or use cumulativesky for the entire year.
    cumulativeSky = False
    if cumulativeSky:
        demo.genCumSky()  # entire year.
    else:
        timeindex = metdata.datetime.index(pd.to_datetime('2001-06-17 12:0:0 -7'))
        demo.gendaylit(metdata=metdata, timeindex=timeindex)  # Noon, June 17th


    # create a scene using panels in landscape at 10 deg tilt, 1.5m pitch, 0.2 m ground clearance
    moduletype = 'test-module'
    module = demo.makeModule(name=moduletype, x=1.59, y=0.95)
    sceneDict = {'tilt':10, 'pitch':1.5, 'clearance_height':0.2,
                 'azimuth':180, 'nMods': 10, 'nRows': 3}
    # makeScene creates a .rad file with 10 modules per row, 3 rows.
    scene = demo.makeScene(module=module, sceneDict=sceneDict)
    # makeOct combines all of the ground, sky and object files into .oct file.
    octfile = demo.makeOct(demo.getfilelist())

    # return an analysis object including the scan dimensions for back irradiance
    analysis = bifacial_radiance.AnalysisObj(octfile, demo.name)
    frontscan, backscan = analysis.moduleAnalysis(scene, sensorsy=9)
    analysis.analysis(octfile, demo.name, frontscan, backscan, accuracy='low')
    # bifacial ratio should be 11.6% +/- 0.1% (+/- 1% absolute with glass-glass module)
    print('Annual bifacial ratio average:  %0.3f' % (
            sum(analysis.Wm2Back) / sum(analysis.Wm2Front)))

    return analysis
