#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

""" 

Module defining global singleton classes. 

 

This module raises a RuntimeError if an attempt to reload it is made. In that 

way the identities of the classes defined here are fixed and will remain so 

even if pyspark itself is reloaded. In particular, a function like the following 

will still work correctly after pyspark is reloaded: 

 

def foo(arg=pyspark._NoValue): 

if arg is pyspark._NoValue: 

... 

 

See gh-7844 for a discussion of the reload problem that motivated this module. 

 

Note that this approach is taken after from NumPy. 

""" 

 

__all__ = ['_NoValue']


# Disallow reloading this module so as to preserve the identities of the
# classes defined here.
if '_is_loaded' in globals():
    raise RuntimeError('Reloading pyspark._globals is not allowed')
_is_loaded = True


class _NoValueType(object):
    """Special keyword value.

    The instance of this class may be used as the default value assigned to a
    deprecated keyword in order to check if it has been given a user defined
    value.

    This class was copied from NumPy.
    """
    __instance = None

    def __new__(cls):
        # ensure that only one instance exists
        if not cls.__instance:
            cls.__instance = super(_NoValueType, cls).__new__(cls)
        return cls.__instance

    # needed for python 2 to preserve identity through a pickle
    def __reduce__(self):
        return (self.__class__, ())

    def __repr__(self):
        return "<no value>"


_NoValue = _NoValueType()
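The three mechanisms above (the reload guard, the singleton `__new__`, and the pickle-preserving `__reduce__`) can be exercised with a standalone sketch. The module body below is a condensed replica of this file written to a temporary directory; the module name `demo_globals` and the source string are illustrative inventions for the demo, not part of pyspark.

```python
import importlib
import os
import pickle
import sys
import tempfile

# Condensed replica of the pattern above, as an importable module source.
MODULE_SOURCE = '''
if '_is_loaded' in globals():
    raise RuntimeError('Reloading is not allowed')
_is_loaded = True


class _NoValueType(object):
    __instance = None

    def __new__(cls):
        if not cls.__instance:
            cls.__instance = super(_NoValueType, cls).__new__(cls)
        return cls.__instance

    def __reduce__(self):
        return (self.__class__, ())

    def __repr__(self):
        return "<no value>"


_NoValue = _NoValueType()
'''

with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "demo_globals.py"), "w") as f:
        f.write(MODULE_SOURCE)
    sys.path.insert(0, tmp)
    importlib.invalidate_caches()
    mod = importlib.import_module("demo_globals")

    # Every construction returns the one cached instance.
    assert mod._NoValueType() is mod._NoValue

    # __reduce__ makes identity survive a pickle round trip.
    assert pickle.loads(pickle.dumps(mod._NoValue)) is mod._NoValue

    # reload() re-executes the module body against the existing globals,
    # which now contain _is_loaded, so the guard fires.
    try:
        importlib.reload(mod)
        raised = False
    except RuntimeError:
        raised = True
    assert raised

    sys.path.remove(tmp)
```

Because `reload()` reuses the module's existing namespace, `_is_loaded` is already present on the second execution; a plain second `import` would just return the cached module and never re-run the body, which is why the guard only triggers on an explicit reload.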