ATLAS Offline Software
sct_calib_tf.SCTCalibExecutor Class Reference
Inheritance diagram for sct_calib_tf.SCTCalibExecutor:
Collaboration diagram for sct_calib_tf.SCTCalibExecutor:

Public Member Functions

def __init__ (self, skeleton)
 
def preExecute (self, input=set(), output=set())
 
def execute (self)
 
def postExecute (self)
 
def validate (self)
 

Private Attributes

 _errMsg
 Try to avoid validation of output files: self.skipOutputFileValidation=True. More...
 
 _isValidated
 
 _hasValidated
 
 _logScan
 

Detailed Description

Definition at line 313 of file sct_calib_tf.py.

Constructor & Destructor Documentation

◆ __init__()

def sct_calib_tf.SCTCalibExecutor.__init__ (   self,
  skeleton 
)

Definition at line 314 of file sct_calib_tf.py.

314  def __init__(self, skeleton):
315  athenaExecutor.__init__(self,
316  name = 'sctcalib',
317  skeletonCA='SCT_CalibAlgs.SCTCalib_Skeleton')
318 

Member Function Documentation

◆ execute()

def sct_calib_tf.SCTCalibExecutor.execute (   self)

Definition at line 446 of file sct_calib_tf.py.

446  def execute(self):
447 
448  runArgs=self.conf._argdict
449  # Check the run for criteria in runSelector
450  if runArgs['doRunSelector']._value:
451  import SCT_CalibAlgs.runSelector as runSelector
452  part=runArgs['part']._value
453  if runArgs['splitHitMap']._value == 1 :
454  skipQueue = 1
455  else:
456  skipQueue = 0
457  checkRun=runSelector.main(RunNumber,part,skipQueue,Stream)
458  if not checkRun:
459 
460  print ("Run ", RunNumber, " didn't pass run selection criteria. It will not be processed and no output will be generated. Finish execution and exit gracefully")
461  emptyDic = {}
462  self._trf._dataDictionary = emptyDic
463 
464  self._isValidated = True
465  self._trf._exitMsg = 'Did not pass run selection criteria. Finish execution and exit gracefully.'
466  self._trf._exitCode = 0
468  self._trf.generateReport(fast=True)
469  sys.exit(0)
470 
471  rootHitmapFiles = []
472  rootLbFiles = []
473  rootBSerrFiles = []
474  for inputFileName in runArgs['input'] :
475  if inputFileName.find("SCTHitMaps") != -1:
476  rootHitmapFiles.append(inputFileName)
477  if inputFileName.find("SCTLB") != -1:
478  rootLbFiles.append(inputFileName)
479  if inputFileName.find("SCTBSErrors") != -1:
480  rootBSerrFiles.append(inputFileName)
481 
482  if runArgs['splitHitMap']._value ==2 :
483 
484  if len(rootHitmapFiles) > 0 :
485 
486  fileutil.remove('SCTHitMaps.root')
487 
488  cmd = "cp -v $ROOTSYS/bin/hadd . \n"
489  cmd += "hadd -n 10 SCTHitMaps.root "
490  for inputFileName in rootHitmapFiles :
491  cmd += "%s " %(inputFileName)
492  cmd += "\n"
493 
494  print (cmd)
495  self._echologger.info('Merging Hitmap files!')
496  retcode=1
497  try:
498  retcode = os.system(cmd)
499  except OSError:
500  retcode = 1
501  if retcode == 0:
502  self._echologger.info('Root merge successful')
503  else:
504  self._echologger.error("FAILED to merge root files")
505 
506  if ( len(rootLbFiles) > 0 and (len(rootLbFiles) == len(rootHitmapFiles)) ):
507 
508  fileutil.remove('SCTLB.root')
509 
510  cmd = "cp -v $ROOTSYS/bin/hadd . \n"
511  cmd += "hadd -n 10 SCTLB.root "
512  for inputFileName in rootLbFiles :
513  cmd += "%s " %(inputFileName)
514  cmd += "\n"
515 
516  print (cmd)
517  self._echologger.info('Merging LBHitmap files!')
518  retcode=1
519  try:
520  retcode = os.system(cmd)
521  except OSError:
522  retcode = 1
523  if retcode == 0:
524  self._echologger.info('Root merge successful')
525  else:
526  self._echologger.error("FAILED to merge root files")
527 
528  if ( len(rootBSerrFiles) > 0 and (len(rootBSerrFiles) == len(rootHitmapFiles)) ):
529 
530  fileutil.remove('SCTBSErrors.root')
531 
532  cmd = "cp -v $ROOTSYS/bin/hadd . \n"
533  cmd += "hadd -n 10 SCTBSErrors.root "
534  for inputFileName in rootBSerrFiles :
535  cmd += "%s " %(inputFileName)
536  cmd += "\n"
537 
538  print (cmd)
539  self._echologger.info('Merging BSerr files!')
540  retcode=1
541  try:
542  retcode = os.system(cmd)
543  except OSError:
544  retcode = 1
545  if retcode == 0:
546  self._echologger.info('Root merge successful')
547  else:
548  self._echologger.error("FAILED to merge root files")
549 
550  super(SCTCalibExecutor, self).execute()
551 
552  if self._rc != 0:
553  try:
554  if 'less than the required minimum number of events' in open('log.sctcalib').read():
555  self._errMsg = 'Successful but warrants further investigation'
556  raise trfExceptions.TransformValidationException(trfExit.nameToCode('TRF_UNKNOWN'), self._errMsg)
557  except trfExceptions.TransformValidationException:
558  pass
559 
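The merge step above is repeated three times in execute(), varying only the output file name and the list of inputs. The sketch below isolates the common pattern; `build_hadd_cmd` and `select_inputs` are hypothetical helper names, not part of sct_calib_tf.py, and the command string mirrors the `hadd -n 10` invocation in the listing:

```python
# Sketch of the repeated ROOT-merge step in execute().
# build_hadd_cmd and select_inputs are hypothetical helpers, not part of
# the actual transform code.

def select_inputs(input_files, tag):
    """Group input files by a substring tag, as execute() does with
    'SCTHitMaps', 'SCTLB' and 'SCTBSErrors'."""
    return [f for f in input_files if tag in f]

def build_hadd_cmd(output, input_files, max_open=10):
    """Build the hadd command line used to merge ROOT files.
    'hadd -n <N>' limits how many input files are kept open at once."""
    return "hadd -n {} {} {}".format(max_open, output, " ".join(input_files))

if __name__ == "__main__":
    inputs = ["run1.SCTHitMaps.root", "run2.SCTHitMaps.root", "run1.SCTLB.root"]
    hitmaps = select_inputs(inputs, "SCTHitMaps")
    print(build_hadd_cmd("SCTHitMaps.root", hitmaps))
```

In newer code one would typically run the command with `subprocess.run` and check its return code, rather than `os.system` as in the listing.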

◆ postExecute()

def sct_calib_tf.SCTCalibExecutor.postExecute (   self)

Definition at line 560 of file sct_calib_tf.py.

560  def postExecute(self):
561 
562  runArgs=self.conf._argdict
563  prefix=runArgs['prefix']._value
564 
565  # After processing Hitmaps, change Metadata of SCTHitMaps and SCTLB (and SCTBSErrors) files so
566  # they contain the number of events. This value can be used when processing
567  # noisy strips to avoid running over empty files
568 
569  listOfKeys = self._trf.dataDictionary
570 
571  if 'doNoisyStrip' in runArgs['part']._value and runArgs['splitHitMap']._value == 1:
572  outInstance0 = self.conf.dataDictionary[list(self._output)[0]]
573  outTFile0 = TFile(outInstance0._value[0])
574  print (outTFile0.GetName())
575  outNentries0 = int(outTFile0.Get('GENERAL/events').GetEntries())
576  outInstance0._setMetadata(outInstance0._value,{'nentries': outNentries0})
577 
578  outInstance1 = self.conf.dataDictionary[list(self._output)[1]]
579  outTFile1 = TFile(outInstance1._value[0])
580  print (outTFile1.GetName())
581  outNentries1 = int(outTFile1.Get('GENERAL/events').GetEntries())
582  outInstance1._setMetadata(outInstance1._value,{'nentries': outNentries1})
583 
584  if ('doDeadStrip' in runArgs['part']._value or 'doDeadChip' in runArgs['part']._value or 'doQuietStrip' in runArgs['part']._value or 'doQuietChip' in runArgs['part']._value ) and runArgs['splitHitMap']._value == 1:
585  outInstance0 = self.conf.dataDictionary[list(self._output)[0]]
586  outTFile0 = TFile(outInstance0._value[0])
587  print (outTFile0.GetName())
588  outNentries0 = int(outTFile0.Get('GENERAL/events').GetEntries())
589  outInstance0._setMetadata(outInstance0._value,{'nentries': outNentries0})
590 
591  outInstance1 = self.conf.dataDictionary[list(self._output)[1]]
592  outTFile1 = TFile(outInstance1._value[0])
593  print (outTFile1.GetName())
594  outNentries1 = int(outTFile1.Get('GENERAL/events').GetEntries())
595  outInstance1._setMetadata(outInstance1._value,{'nentries': outNentries1})
596 
597  if 'doDeadStrip' in runArgs['part']._value and runArgs['splitHitMap']._value != 1:
598  pwd=os.getcwd()
599  deadFile=pwd+'/'+prefix+'.DeadStripsFile.xml'
600  deadSummary=pwd+'/'+prefix+'.DeadSummaryFile.xml'
601 
602  numLinesFile = 0
603  numLinesSummary = 0
604  if os.path.exists(deadFile):
605  numLinesFile = sum(1 for line in open(deadFile))
606  if os.path.exists(deadSummary):
607  numLinesSummary = sum(1 for line in open(deadSummary))
608 
609  # If the files exist but there were no dead strips, there won't be a COOL file, making the job fail.
610  # Remove the COOL file from the list of output files. Clunky, but a temporary fix.
611 
612  if ( numLinesFile == 2 and numLinesSummary == 20 ):
613  dataDic = self._trf.dataDictionary
614  listOfKeys = []
615 
616  for key in dataDic:
617  if key != 'COOL':
618  listOfKeys.append(key)
619 
620  redDict = {key:dataDic[key] for key in listOfKeys}
621  self._trf._dataDictionary = redDict
622 
623  if 'doDeadChip' in runArgs['part']._value and runArgs['splitHitMap']._value != 1:
624  pwd=os.getcwd()
625  deadFile=pwd+'/'+prefix+'.DeadChipsFile.xml'
626  deadSummary=pwd+'/'+prefix+'.DeadSummaryFile.xml'
627 
628  numLinesFile = 0
629  numLinesSummary = 0
630  if os.path.exists(deadFile):
631  numLinesFile = sum(1 for line in open(deadFile))
632  if os.path.exists(deadSummary):
633  numLinesSummary = sum(1 for line in open(deadSummary))
634 
635  # If the files exist but there were no dead strips, there won't be a COOL file, making the job fail.
636  # Remove the COOL file from the list of output files. Clunky, but a temporary fix.
637 
638  if ( numLinesFile == 2 and numLinesSummary == 20 ):
639  dataDic = self._trf.dataDictionary
640  listOfKeys = []
641 
642  for key in dataDic:
643  if key != 'COOL':
644  listOfKeys.append(key)
645 
646  redDict = {key:dataDic[key] for key in listOfKeys}
647  self._trf._dataDictionary = redDict
648 
649  if 'doQuietStrip' in runArgs['part']._value and runArgs['splitHitMap']._value != 1:
650  pwd=os.getcwd()
651  deadFile=pwd+'/'+prefix+'.QuietStripsFile.xml'
652  deadSummary=pwd+'/'+prefix+'.QuietSummaryFile.xml'
653 
654  numLinesFile = 0
655  numLinesSummary = 0
656  if os.path.exists(deadFile):
657  numLinesFile = sum(1 for line in open(deadFile))
658  if os.path.exists(deadSummary):
659  numLinesSummary = sum(1 for line in open(deadSummary))
660 
661  # If the files exist but there were no dead strips, there won't be a COOL file, making the job fail.
662  # Remove the COOL file from the list of output files. Clunky, but a temporary fix.
663 
664  if ( numLinesFile == 2 and numLinesSummary == 20 ):
665  dataDic = self._trf.dataDictionary
666  listOfKeys = []
667 
668  for key in dataDic:
669  if key != 'COOL':
670  listOfKeys.append(key)
671 
672  redDict = {key:dataDic[key] for key in listOfKeys}
673  self._trf._dataDictionary = redDict
674 
675  if 'doQuietChip' in runArgs['part']._value and runArgs['splitHitMap']._value != 1:
676  pwd=os.getcwd()
677  deadFile=pwd+'/'+prefix+'.QuietChipsFile.xml'
678  deadSummary=pwd+'/'+prefix+'.QuietSummaryFile.xml'
679 
680  numLinesFile = 0
681  numLinesSummary = 0
682  if os.path.exists(deadFile):
683  numLinesFile = sum(1 for line in open(deadFile))
684  if os.path.exists(deadSummary):
685  numLinesSummary = sum(1 for line in open(deadSummary))
686 
687  # If the files exist but there were no dead strips, there won't be a COOL file, making the job fail.
688  # Remove the COOL file from the list of output files. Clunky, but a temporary fix.
689 
690  if ( numLinesFile == 2 and numLinesSummary == 20 ):
691  dataDic = self._trf.dataDictionary
692  listOfKeys = []
693 
694  for key in dataDic:
695  if key != 'COOL':
696  listOfKeys.append(key)
697 
698  redDict = {key:dataDic[key] for key in listOfKeys}
699  self._trf._dataDictionary = redDict
700 
701  if prefix != '':
702  try:
703  if runArgs['splitHitMap']._value !=1 and 'COOL' in listOfKeys:
704  os.rename('mycool.db',prefix+'.mycool.db')
705  if runArgs['splitHitMap']._value == 2:
706  os.rename('SCTHitMaps.root',prefix+'.SCTHitMaps.root')
707  if 'doNoisyStrip' in runArgs['part']._value:
708  os.rename('SCTLB.root',prefix+'.SCTLB.root')
709  if ('doDeadStrip' in runArgs['part']._value or 'doDeadChip' in runArgs['part']._value or 'doQuietStrip' in runArgs['part']._value or 'doQuietChip' in runArgs['part']._value ):
710  os.rename('SCTBSErrors.root',prefix+'.SCTBSErrors.root')
711  except OSError:
712  self._echologger.warning('failed to rename DB, ROOT or LOG file.' )
713 
714  super(SCTCalibExecutor, self).postExecute()
715 
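The four dead/quiet blocks in postExecute() all end with the same two steps: a line-count heuristic on the XML outputs, then dropping the 'COOL' key from the data dictionary when no defects were found. A standalone sketch of those steps; `drop_cool_output` and `xml_indicates_empty` are hypothetical names:

```python
# Sketch of the COOL-removal step repeated four times in postExecute().
# drop_cool_output and xml_indicates_empty are hypothetical helpers,
# not part of the actual transform code.

def xml_indicates_empty(num_lines_file, num_lines_summary):
    """Heuristic from postExecute(): a 2-line defects file together with a
    20-line summary file means no dead/quiet strips were found."""
    return num_lines_file == 2 and num_lines_summary == 20

def drop_cool_output(data_dic):
    """Return a copy of the data dictionary without the 'COOL' entry, so
    the transform does not expect a COOL output file that was never made."""
    return {key: value for key, value in data_dic.items() if key != 'COOL'}

if __name__ == "__main__":
    data = {'COOL': 'mycool.db', 'XML': 'defects.xml'}
    if xml_indicates_empty(2, 20):
        data = drop_cool_output(data)
    print(sorted(data))
```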

◆ preExecute()

def sct_calib_tf.SCTCalibExecutor.preExecute (   self,
  input = set(),
  output = set() 
)
Execute runInfo, set environment and check input type

Definition at line 319 of file sct_calib_tf.py.

319  def preExecute(self, input=set(), output=set()):
320 
321  """ Execute runInfo, set environment and check input type"""
322  # Execute runInfo.py
323  runArgs=self.conf._argdict
324 
325  checkFileList(runArgs['input'])
326  namelist=[]
327  for i in range(0,len(dsDict['input'])):
328  namelist.append(dsDict['input'][i]['file'])
329 
330  self.conf.addToArgdict('inputNames', trfArgClasses.argList(namelist))
331 
332  nName=namelist[0].count('/')
333  fileName=namelist[0].split('/')[nName]
334  projectName=str(fileName.split('.')[0])
335 
336 
338 
339  if 'doRunInfo' not in runArgs:
340  self.conf.addToArgdict('doRunInfo', trfArgClasses.argBool(False))
341  else:
342  if runArgs['doRunInfo']._value:
343  import SCT_CalibAlgs.runInfo as runInfo
344 
345  print ("RunNumber for the runInfo = ", str(RunNumber), " ", Stream)
346  runInfo.main(RunNumber, projectName)
347 
348  if 'splitHitMap' not in runArgs:
349  self.conf.addToArgdict('splitHitMap', trfArgClasses.argInt(0))
350  if 'doRunSelector' not in runArgs:
351  self.conf.addToArgdict('doRunSelector', trfArgClasses.argBool(False))
352 
353 
354  if 'EventNumber' not in runArgs:
355  self.conf.addToArgdict('EventNumber', trfArgClasses.argInt(0))
356 
357  # Set STAGE_SVCCLASS
358  if SvcClass != '' and SvcClass is not None:
359  os.environ['STAGE_SVCCLASS']=SvcClass
360 
361  # Check input type
362  inputtype=dsDict['input'][0]['dataset'].split('.')[4]
363  print ("Input type = ", inputtype)
364  self.conf.addToArgdict('InputType', trfArgClasses.argString(inputtype))
365 
366  # check which parts to be run
367  if 'part' not in runArgs:
368  self.conf.addToArgdict('part', trfArgClasses.argString('doNoisyStrip'))
369 
370  part=runArgs['part']._value
371 
372  for ipart in part:
373  if ipart not in ['doNoisyStrip','doNoiseOccupancy','doDeadChip','doDeadStrip','doQuietChip','doQuietStrip','doHV','doBSErrorDB','doRawOccupancy','doEfficiency','doLorentzAngle','doNoisyLB']:
374  self._errMsg = 'Argument part=%s does not match any of the possible candidates' % ipart
375  raise trfExceptions.TransformValidationException(trfExit.nameToCode('TRF_ARG_ERROR'), self._errMsg)
376 
377  # get prefix
378  if 'prefix' not in runArgs:
379  self.conf.addToArgdict('prefix', trfArgClasses.argString(''))
380 
381  prefix=runArgs['prefix']._value
382 
383  # set job number
384  jobnb=''
385  # find separator for job number
386  if prefix != '' :
387  sep=prefix.find('._')
388  if ( sep != -1 ) :
389  jobnb=prefix[sep+1:]
390  elif ( prefix.rfind('#_') != -1 ):
391  sep=prefix.rfind('#_')
392  jobnb=prefix[sep+1:]
393 
394  # find separator for prefix
395  sep=prefix.find('#')
396  if (sep != -1) :
397  prefix=prefix[:sep]
398  elif (prefix.find('._') != -1):
399  sep=prefix.rfind('._')
400  prefix=prefix[:sep]
401 
402  # set prefix and jobnumber
403  prefix+='.'+jobnb
404  runArgs['prefix']._value = prefix
405 
406  # When ATLAS is NOT in standby but the SCT is, the hitmap root files have 0 events,
407  # even though the calibration_SCTNoise stream has 10k+ events.
408  # If the noisy strips task is generated, the jobs will fail. A.N. has implemented
409  # a condition at t0 level so they won't be defined. However,
410  # when runSelector uses AtlRunQuery to look for the runs that have 10k+ events
411  # in the calibration_SCTNoise stream, those runs that failed or were skipped
412  # will appear as waiting to be uploaded, keeping the rest on hold.
413 
414  # We include a protection against those cases: if the summed number of events
415  # of the hitmap files is <10k, we don't execute the noisy strips. Rather, we exit
416  # with 'success' status, so the job won't fail at t0, and update the value
417  # of the last run uploaded as if this run had been uploaded, to avoid the
418  # next run being indefinitely on hold.
419  # print 'Number of events: ', NumberOfEvents
420 
421  if ('doNoisyStrip' in part or 'doDeadStrip' in part or 'doDeadChip' in part or 'doQuietStrip' in part or 'doQuietChip' in part) and runArgs['splitHitMap']._value==2 and NumberOfEvents<1:
422  self._isValidated = True
423  self._trf._exitCode = 0
424  self._trf._exitMsg = 'Noisy/dead/quiet strips/chips trying to read root files with 0 events. Gracefully exit and update lastRun counter to %s' %(RunNumber)
425 
426  updateLastRun(RunNumber)
427  emptyDic = {}
428  self._trf._dataDictionary = emptyDic
429 
431  self._trf.generateReport(fast=True)
432  sys.exit(0)
433 
434  if jobnb != '':
435  self.conf.addToArgdict('JobNumber', trfArgClasses.argString(jobnb))
436 
437  # get RunNumber from datasetName
438  if not RunNumber == -1:
439  self.conf.addToArgdict('RunNumber', trfArgClasses.argInt(RunNumber))
440  if not Stream == '':
441  self.conf.addToArgdict('Stream', trfArgClasses.argString(Stream))
442 
443  # Do other prerun actions
444  super(SCTCalibExecutor, self).preExecute(input,output)
445 
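The prefix and job-number parsing in the listing (lines 384-404) can be condensed into a pure function for illustration. `split_prefix` is a hypothetical name; the logic below mirrors the listing's separator conventions ('._' and '#_'):

```python
# Sketch of the prefix / job-number parsing in preExecute() (lines 384-404).
# split_prefix is a hypothetical helper, not part of the actual transform.

def split_prefix(prefix):
    """Split a task prefix such as 'myrun#_0001' into (new_prefix, jobnb),
    following the '._' and '#_' separator conventions in preExecute()."""
    jobnb = ''
    if prefix:
        # find separator for the job number
        sep = prefix.find('._')
        if sep != -1:
            jobnb = prefix[sep + 1:]
        elif prefix.rfind('#_') != -1:
            sep = prefix.rfind('#_')
            jobnb = prefix[sep + 1:]
        # find separator for the prefix itself
        sep = prefix.find('#')
        if sep != -1:
            prefix = prefix[:sep]
        elif prefix.find('._') != -1:
            prefix = prefix[:prefix.rfind('._')]
    # preExecute() always rejoins with '.', even when jobnb is empty
    return prefix + '.' + jobnb, jobnb

if __name__ == "__main__":
    print(split_prefix('myrun#_0001'))  # ('myrun._0001', '_0001')
```

Both separator styles converge on the same result: 'myrun#_0001' and 'myrun._0001' each yield the prefix 'myrun._0001' with job number '_0001'.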

◆ validate()

def sct_calib_tf.SCTCalibExecutor.validate (   self)

Definition at line 716 of file sct_calib_tf.py.

716  def validate(self):
717  self._hasValidated = True
718  deferredException = None
719 
720  if 'ignorePatterns' in self.conf._argdict:
721  igPat = self.conf.argdict['ignorePatterns'].value
722  else:
723  igPat = []
724  if 'ignoreFiles' in self.conf._argdict:
725  ignorePatterns = trfValidation.ignorePatterns(files = self.conf._argdict['ignoreFiles'].value, extraSearch=igPat)
726  elif self._errorMaskFiles is not None:
727  ignorePatterns = trfValidation.ignorePatterns(files = self._errorMaskFiles, extraSearch=igPat)
728  else:
729  ignorePatterns = trfValidation.ignorePatterns(files = athenaExecutor._defaultIgnorePatternFile, extraSearch=igPat)
730 
731  # Now actually scan my logfile
732  msg.info('Scanning logfile {0} for errors'.format(self._logFileName))
733  self._logScan = trfValidation.athenaLogFileReport(logfile = self._logFileName, ignoreList = ignorePatterns)
734  worstError = self._logScan.worstError()
735 
736  # In general we add the error message to the exit message, but if it's too long then don't do
737  # that and just say look in the jobReport
738  if worstError['firstError']:
739  if len(worstError['firstError']['message']) > athenaExecutor._exitMessageLimit:
740  if 'CoreDumpSvc' in worstError['firstError']['message']:
741  exitErrorMessage = "Core dump at line {0} (see jobReport for further details)".format(worstError['firstError']['firstLine'])
742  elif 'G4Exception' in worstError['firstError']['message']:
743  exitErrorMessage = "G4 exception at line {0} (see jobReport for further details)".format(worstError['firstError']['firstLine'])
744  else:
745  exitErrorMessage = "Long {0} message at line {1} (see jobReport for further details)".format(worstError['level'], worstError['firstError']['firstLine'])
746  else:
747  exitErrorMessage = "Logfile error in {0}: \"{1}\"".format(self._logFileName, worstError['firstError']['message'])
748  else:
749  exitErrorMessage = "Error level {0} found (see athena logfile for details)".format(worstError['level'])
750 
751  # If we failed on the rc, then abort now
752  if deferredException is not None:
753  # Add any logfile information we have
754  if worstError['nLevel'] >= stdLogLevels['ERROR']:
755  deferredException.errMsg = deferredException.errMsg + "; {0}".format(exitErrorMessage)
756  raise deferredException
757 
758  # ignore instances of "unknown offline id..."
759  # fewer than ~10/event are admissible
760  # if > 10/event, the event is skipped in SCT_CalibEventInfo
761 
762  if worstError['firstError'] is not None:
763  if 'ERROR Unknown offlineId for OnlineId' in worstError['firstError']['message']:
764  worstError['nLevel'] = 30
765  worstError['level'] = 'WARNING'
766 
767  # Very simple: if we get ERROR or worse, we're dead, except if ignoreErrors=True
768  if worstError['nLevel'] == stdLogLevels['ERROR'] and ('ignoreErrors' in self.conf._argdict and self.conf._argdict['ignoreErrors'].value is True):
769  msg.warning('Found ERRORs in the logfile, but ignoring this as ignoreErrors=True (see jobReport for details)')
770  elif worstError['nLevel'] >= stdLogLevels['ERROR']:
771  self._isValidated = False
772  msg.error('Fatal error in athena logfile (level {0})'.format(worstError['level']))
773  raise trfExceptions.TransformLogfileErrorException(trfExit.nameToCode('TRF_EXEC_LOGERROR'),
774  ' Fatal error in athena logfile: "{0}"'.format(exitErrorMessage))
775 
776  # Must be ok if we got here!
777  msg.info('Executor {0} has validated successfully'.format(self.name))
778  self._isValidated = True
779 
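validate() downgrades one known-benign message ('Unknown offlineId for OnlineId') from ERROR to WARNING before applying the usual severity gate. A self-contained sketch of that downgrade-then-gate logic; the function names are hypothetical and `STD_LOG_LEVELS` is a minimal stand-in for the transform framework's `stdLogLevels`:

```python
# Sketch of the downgrade-then-gate logic in validate(). Names are
# hypothetical; STD_LOG_LEVELS stands in for trfLogger's stdLogLevels.
STD_LOG_LEVELS = {'WARNING': 30, 'ERROR': 40, 'FATAL': 50}

def downgrade_known_errors(worst_error):
    """Demote the benign 'Unknown offlineId' error to WARNING,
    as validate() does before the severity check."""
    first = worst_error.get('firstError')
    if first and 'ERROR Unknown offlineId for OnlineId' in first['message']:
        worst_error['nLevel'] = STD_LOG_LEVELS['WARNING']
        worst_error['level'] = 'WARNING'
    return worst_error

def is_fatal(worst_error, ignore_errors=False):
    """Apply the severity gate: ERROR (or worse) fails validation,
    except that plain ERROR is forgiven when ignoreErrors=True."""
    n = worst_error['nLevel']
    if n == STD_LOG_LEVELS['ERROR'] and ignore_errors:
        return False
    return n >= STD_LOG_LEVELS['ERROR']

if __name__ == "__main__":
    we = {'nLevel': 40, 'level': 'ERROR',
          'firstError': {'message': 'ERROR Unknown offlineId for OnlineId'}}
    print(is_fatal(downgrade_known_errors(we)))  # False
```

Note that FATAL is never forgiven: `ignore_errors` only bypasses the exact ERROR level, matching the `worstError['nLevel'] == stdLogLevels['ERROR']` comparison in the listing.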

Member Data Documentation

◆ _errMsg

sct_calib_tf.SCTCalibExecutor._errMsg
private

Try to avoid validation of output files: self.skipOutputFileValidation=True.

This is an attempt to set the event number manually when running over HIST files.

Definition at line 374 of file sct_calib_tf.py.

◆ _hasValidated

sct_calib_tf.SCTCalibExecutor._hasValidated
private

Definition at line 717 of file sct_calib_tf.py.

◆ _isValidated

sct_calib_tf.SCTCalibExecutor._isValidated
private

Definition at line 422 of file sct_calib_tf.py.

◆ _logScan

sct_calib_tf.SCTCalibExecutor._logScan
private

Definition at line 733 of file sct_calib_tf.py.


The documentation for this class was generated from the following file:
sct_calib_tf.py