This article explores Python's support for modular programming: modules and packages. Modular programming is the practice of breaking a large, unwieldy programming task into separate, smaller, more manageable subtasks, called modules. Individual modules can then be used on their own or assembled like building blocks to create a larger application.

There are several advantages to modularizing code in a large application:

- Simplicity: Rather than focusing on the entire problem at hand, a module typically focuses on one relatively small portion of it. If you're working on a single module, you have a smaller problem domain to wrap your head around. This makes development easier and less error-prone.

- Maintainability: Modules are typically designed so that they enforce logical boundaries between different problem domains. If modules are written in a way that minimizes interdependency, there is less likelihood that modifications to a single module will have an impact on other parts of the program. (You may even be able to make changes to a module without having any knowledge of the application outside that module.) This makes it more viable for a team of many programmers to work collaboratively on a large application.

- Reusability: Functionality defined in a single module can easily be reused (through an appropriately defined interface) by other parts of the application. This eliminates the need to duplicate code.

- Scoping: Modules typically define a separate namespace, which helps avoid collisions between identifiers in different areas of a program. (One of the tenets in the Zen of Python is "Namespaces are one honking great idea -- let's do more of those!")

Functions, modules, and packages are all constructs in Python that promote code modularization.
Python Modules: Overview
There are actually three different ways to define a module in Python:

- A module can be written in Python itself.
- A module can be written in C and loaded dynamically at run-time, like the re (regular expression) module.
- A built-in module is intrinsically contained in the interpreter, like the itertools module.

A module's contents are accessed the same way in all three cases: with the import statement.
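For instance, here is one quick way to see the difference between a built-in module and one loaded from a file; sys.builtin_module_names is a tuple listing the modules compiled into the interpreter (this check is merely illustrative and not part of the discussion that follows):

>>> import sys
>>> 'itertools' in sys.builtin_module_names
True
>>> 're' in sys.builtin_module_names
False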
Here, the focus will mostly be on modules that are written in Python. The cool thing about modules written in Python is that they are exceedingly straightforward to build. All you need to do is create a file that contains legitimate Python code and then give the file a name with a .py extension. That's it! No special syntax or voodoo is necessary.
Say you have a file named mod.py that contains the following: mod.py
s = "If Comrade Napoleon says it, it must be right."
a = [100, 200, 300]
def foo(arg):
    print(f'arg = {arg}')

class Foo:
    pass
Several objects are defined in mod.py:

- s (a string)
- a (a list)
- foo() (a function)
- Foo (a class)
Assuming mod.py is in an appropriate location (more on that shortly), these objects can be accessed by importing the module as follows:
>>>
>>> import mod
>>> print(mod.s)
If Comrade Napoleon says it, it must be right.
>>> mod.a
[100, 200, 300]
>>> mod.foo(['quux', 'corge', 'grault'])
arg = ['quux', 'corge', 'grault']
>>> x = mod.Foo()
>>> x
<mod.Foo object at 0x03C181F0>
The Module Search Path
Continuing with the above example, let's take a look at what happens when Python executes the statement:

import mod

When the interpreter executes the above import statement, it searches for mod.py in a list of directories assembled from the following sources:
- The directory from which the input script was run, or the current directory if the interpreter is being run interactively
- The list of directories contained in the PYTHONPATH environment variable, if it is set (the format for PYTHONPATH is OS-dependent but should mimic the PATH environment variable)
- An installation-dependent list of directories configured at the time Python is installed
The resulting search path is accessible in the Python variable sys.path, which is obtained from a module named sys:
>>>
>>> import sys
>>> sys.path
['', 'C:\\Users\\john\\Documents\\Python\\doc', 'C:\\Python36\\Lib\\idlelib',
'C:\\Python36\\python36.zip', 'C:\\Python36\\DLLs', 'C:\\Python36\\lib',
'C:\\Python36', 'C:\\Python36\\lib\\site-packages']
Note: The exact contents of sys.path are installation-dependent. The above will almost certainly look slightly different on your computer.
Thus, to ensure that your module is found, you need to do one of the following:

- Put mod.py in the directory where the input script is located, or the current directory if running interactively
- Modify the PYTHONPATH environment variable to contain the directory where mod.py is located before starting the interpreter, or put mod.py in one of the directories already contained in the PYTHONPATH variable (see the shell example after this list)
- Put mod.py in one of the installation-dependent directories, which you may or may not have write access to, depending on the OS
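As a minimal sketch of the PYTHONPATH option, here is what it might look like using Windows cmd syntax, matching the example paths used throughout this article (on macOS or Linux you would use export PYTHONPATH=... instead), and assuming mod.py has been saved to C:\Users\john:

C:\Users\john\Documents>set PYTHONPATH=C:\Users\john
C:\Users\john\Documents>python
>>> import sys
>>> 'C:\\Users\\john' in sys.path
True
>>> import mod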
There is actually one additional option: you can put the module file in any directory of your choice and then modify sys.path at run-time so that it contains that directory. For example, in this case, you could put mod.py in directory C:\Users\john and then issue the following statements:
>>>
>>> sys.path.append(r'C:\Users\john')
>>> sys.path
['', 'C:\\Users\\john\\Documents\\Python\\doc', 'C:\\Python36\\Lib\\idlelib',
'C:\\Python36\\python36.zip', 'C:\\Python36\\DLLs', 'C:\\Python36\\lib',
'C:\\Python36', 'C:\\Python36\\lib\\site-packages', 'C:\\Users\\john']
>>> import mod
Once a module has been imported, you can determine the location where it was found with the module's __file__ attribute:
>>>
>>> import mod
>>> mod.__file__
'C:\\Users\\john\\mod.py'
>>> import re
>>> re.__file__
'C:\\Python36\\lib\\re.py'
The directory portion of __file__ should be one of the directories in sys.path.
The import Statement

Module contents are made available to the caller with the import statement. The import statement takes many different forms, shown below.

The simplest form is the one already illustrated above:
import <module_name>
Note that this does not make the module contents directly accessible to the caller. Each module has its own private symbol table, which serves as the global symbol table for all objects defined in the module. Thus, a module creates a separate namespace, as already noted.

The statement import <module_name> only places <module_name> in the caller's symbol table. The objects that are defined in the module remain in the module's private symbol table.

From the caller, objects in the module are only accessible when prefixed with <module_name> via dot notation, as illustrated below.
After the following import statement, mod is placed into the local symbol table. Thus, mod has meaning in the caller's local context:
>>>
>>> import mod
>>> mod
<module 'mod' from 'C:\\Users\\john\\Documents\\Python\\doc\\mod.py'>
But s and foo remain in the module's private symbol table and are not meaningful in the local context:
>>>
>>> s
NameError: name 's' is not defined
>>> foo('quux')
NameError: name 'foo' is not defined
To be accessed in the local context, names of objects defined in the module must be prefixed with mod:
>>>
>>> mod.s
'If Comrade Napoleon says it, it must be right.'
>>> mod.foo('quux')
arg = quux
Several comma-separated modules may be specified in a single import statement:
import <module_name>[, <module_name> ...]
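For instance, here is a quick illustrative snippet using a couple of standard-library modules:

>>> import math, string
>>> math.pi
3.141592653589793
>>> string.digits
'0123456789'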
An alternate form of the import statement allows individual objects from the module to be imported directly into the caller's symbol table:
from <module_name> import <name(s)>
Following execution of the above statement, <name(s)> can be referenced in the caller's environment without the <module_name> prefix:
>>>
>>> from mod import s, foo
>>> s
'If Comrade Napoleon says it, it must be right.'
>>> foo('quux')
arg = quux
>>> from mod import Foo
>>> x = Foo()
>>> x
<mod.Foo object at 0x02E3AD50>
Because this form of import places the object names directly into the caller's symbol table, any objects that already exist with the same name will be overwritten:
>>>
>>> a = ['foo', 'bar', 'baz']
>>> a
['foo', 'bar', 'baz']
>>> from mod import a
>>> a
[100, 200, 300]
It is even possible to indiscriminately import everything from a module in one fell swoop:
from <module_name> import *
This will place the names of all objects from <module_name> into the local symbol table, with the exception of any that begin with the underscore (_) character.

For example:
>>>
>>> from mod import *
>>> s
'If Comrade Napoleon says it, it must be right.'
>>> a
[100, 200, 300]
>>> foo
<function foo at 0x03B449C0>
>>> Foo
<class 'mod.Foo'>
This isn't necessarily recommended in large-scale production code. It's a bit dangerous because you are entering names into the local symbol table en masse. Unless you know them all well and can be confident there won't be a conflict, you have a decent chance of overwriting an existing name inadvertently. However, this syntax is quite handy when you are just mucking around with the interactive interpreter for testing or discovery purposes, because it quickly gives you access to everything a module has to offer without a lot of typing.
It is also possible to import individual objects but enter them into the local symbol table with alternate names:
from <module_name> import <name> as <alt_name>[, <name> as <alt_name> …]
This makes it possible to place names directly into the local symbol table but avoid conflicts with previously existing names:
>>>
>>> s = 'foo'
>>> a = ['foo', 'bar', 'baz']
>>> from mod import s as string, a as alist
>>> s
'foo'
>>> string
'If Comrade Napoleon says it, it must be right.'
>>> a
['foo', 'bar', 'baz']
>>> alist
[100, 200, 300]
You can also import an entire module under an alternate name:
import <module_name> as <alt_name>
>>>
>>> import mod as my_module
>>> my_module.a
[100, 200, 300]
>>> my_module.foo('qux')
arg = qux
Module contents can be imported from within a function definition. In that case, the import does not occur until the function is called:
>>>
>>> def bar():
... from mod import foo
... foo('corge')
...
>>> bar()
arg = corge
However, Python 3 does not allow the indiscriminate import * syntax from within a function:
>>>
>>> def bar():
... from mod import *
...
SyntaxError: import * only allowed at module level
Lastly, a try statement with an except ImportError clause can be used to guard against unsuccessful import attempts:
>>>
>>> try:
... # Non-existent module
... import baz
... except ImportError:
... print('Module not found')
...
Module not found
>>>
>>> try:
... # Existing module, but non-existent object
... from mod import baz
... except ImportError:
... print('Object not found in module')
...
Object not found in module
The dir() Function

The built-in function dir() returns a list of defined names in a namespace. Without arguments, it produces an alphabetically sorted list of names in the current local symbol table:
>>>
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> qux = [1, 2, 3, 4, 5]
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'qux']
>>> class Bar():
... pass
...
>>> x = Bar()
>>> dir()
['Bar', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'qux', 'x']
Note how the first call to dir() above lists several names that are automatically defined and already in the namespace when the interpreter starts. As new names are defined (qux, Bar, x), they appear on subsequent invocations of dir().
This can be useful for identifying what exactly has been added to the namespace by an import statement:
>>>
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> import mod
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'mod']
>>> mod.s
'If Comrade Napoleon says it, it must be right.'
>>> mod.foo([1, 2, 3])
arg = [1, 2, 3]
>>> from mod import a, Foo
>>> dir()
['Foo', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'a', 'mod']
>>> a
[100, 200, 300]
>>> x = Foo()
>>> x
<mod.Foo object at 0x002EAD50>
>>> from mod import s as string
>>> dir()
['Foo', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'a', 'mod', 'string', 'x']
>>> string
'If Comrade Napoleon says it, it must be right.'
When given an argument that is the name of a module, dir() lists the names defined in that module:
>>>
>>> import mod
>>> dir(mod)
['Foo', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__',
'__name__', '__package__', '__spec__', 'a', 'foo', 's']
>>>
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from mod import *
>>> dir()
['Foo', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'a', 'foo', 's']
Executing a Module as a Script
Any .py file that contains a module is essentially also a Python script, and there isn't any reason it can't be executed like one.

Here again is mod.py as it was defined above:

mod.py
s = "If Comrade Napoleon says it, it must be right."
a = [100, 200, 300]
def foo(arg):
    print(f'arg = {arg}')

class Foo:
    pass
This can be run as a script:
C:\Users\john\Documents>python mod.py
C:\Users\john\Documents>
There are no errors, so it apparently worked. Granted, it's not very exciting. As it stands, the module only defines objects; it doesn't do anything with them, and it doesn't generate any output.

Let's modify the above Python module so it does generate some output when run as a script:

mod.py
s = "If Comrade Napoleon says it, it must be right."
a = [100, 200, 300]
def foo(arg):
    print(f'arg = {arg}')

class Foo:
    pass
print(s)
print(a)
foo('quux')
x = Foo()
print(x)
Now it should be a little more interesting when run as a script:
C:\Users\john\Documents>python mod.py
If Comrade Napoleon says it, it must be right.
[100, 200, 300]
arg = quux
<__main__.Foo object at 0x02F101D0>
Unfortunately, it now also generates output when imported as a module:
>>>
>>> import mod
If Comrade Napoleon says it, it must be right.
[100, 200, 300]
arg = quux
<mod.Foo object at 0x0169AD50>
This is probably not what you want. It isn't usual for a module to generate output when it is imported.

Wouldn't it be nice if you could distinguish between when the file is loaded as a module and when it is run as a standalone script?

Ask and you shall receive.

When a .py file is imported as a module, Python sets the special dunder variable __name__ to the name of the module. However, if a file is run as a standalone script, __name__ is (creatively) set to the string '__main__'. Using this fact, you can discern which is the case at run-time and alter behavior accordingly:

mod.py
s = "If Comrade Napoleon says it, it must be right."
a = [100, 200, 300]
def foo(arg):
    print(f'arg = {arg}')

class Foo:
    pass

if (__name__ == '__main__'):
    print('Executing as standalone script')
    print(s)
    print(a)
    foo('quux')
    x = Foo()
    print(x)
Now, when the file is executed as a script, you see this output:
C:\Users\john\Documents>python mod.py
Executing as standalone script
If Comrade Napoleon says it, it must be right.
[100, 200, 300]
arg = quux
<__main__.Foo object at 0x03450690>
But when it is imported as a module, you don't:
>>>
>>> import mod
>>> mod.foo('grault')
arg = grault
Modules are often designed with the capability to run as a standalone script for purposes of testing the functionality contained in the module. This is referred to as unit testing. For example, suppose you have created a module fact.py containing a factorial function, as follows:

fact.py
def fact(n):
    # Recursive factorial; n <= 1 is the base case, so fact(0) terminates as well
    return 1 if n <= 1 else n * fact(n-1)

if (__name__ == '__main__'):
    import sys
    if len(sys.argv) > 1:
        print(fact(int(sys.argv[1])))
The file can be treated as a module, and the fact() function imported:
>>>
>>> from fact import fact
>>> fact(6)
720
But it can also be run standalone by passing an integer argument on the command line for testing:
C:\Users\john\Documents>python fact.py 6
720
Reloading a Module
For reasons of efficiency, a module is only loaded once per interpreter session. That is fine for function and class definitions, which typically make up the bulk of a module's contents. But a module can contain executable statements as well, usually for initialization. Be aware that these statements will only be executed the first time a module is imported.

Consider the following file mod.py, for example:

mod.py
a = [100, 200, 300]
print('a =', a)
>>>
>>> import mod
a = [100, 200, 300]
>>> import mod
>>> import mod
>>> mod.a
[100, 200, 300]
The print() statement is not executed on subsequent imports. (For that matter, neither is the assignment statement, but as the final display of the value of mod.a shows, that doesn't matter: once the assignment is made, it sticks.)
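Behind the scenes, the already-loaded module object is cached in the sys.modules dictionary, which is why repeating import mod is essentially a no-op. Here is a small illustrative check of that cache (not part of the original example):

>>> import sys
>>> 'mod' in sys.modules
True
>>> sys.modules['mod'] is mod
True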
If you make a change to a module and need to reload it, you need to either restart the interpreter or use a function called reload() from the importlib module:
>>>
>>> import mod
a = [100, 200, 300]
>>> import mod
>>> import importlib
>>> importlib.reload(mod)
a = [100, 200, 300]
<module 'mod' from 'C:\\Users\\john\\Documents\\Python\\doc\\mod.py'>
Python Packages
Suppose you have developed a very large application that includes many modules. As the number of modules grows, it becomes difficult to keep track of them all if they are dumped into one location. This is particularly so if they have similar names or functionality. You might wish for a means of grouping and organizing them. Packages allow for a hierarchical structuring of the module namespace using dot notation. In the same way that modules help avoid collisions between global variable names, packages help avoid collisions between module names.
Creating a package is quite straightforward, since it makes use of the operating system's inherent hierarchical file structure. Consider the following arrangement:

pkg/
    mod1.py
    mod2.py

Here, there is a directory named pkg that contains two modules, mod1.py and mod2.py. The contents of the modules are:

mod1.py
def foo():
    print('[mod1] foo()')

class Foo:
    pass

mod2.py

def bar():
    print('[mod2] bar()')

class Bar:
    pass
Given this structure, if the pkg directory resides in a location where it can be found (in one of the directories contained in sys.path), you can refer to the two modules with dot notation (pkg.mod1, pkg.mod2) and import them with the syntax you are already familiar with:
import <module_name>[, <module_name> ...]
>>>
>>> import pkg.mod1, pkg.mod2
>>> pkg.mod1.foo()
[mod1] foo()
>>> x = pkg.mod2.Bar()
>>> x
<pkg.mod2.Bar object at 0x033F7290>
from <module_name> import <name(s)>
>>>
>>> from pkg.mod1 import foo
>>> foo()
[mod1] foo()
from <module_name> import <name> as <alt_name>
>>>
>>> from pkg.mod2 import Bar as Qux
>>> x = Qux()
>>> x
<pkg.mod2.Bar object at 0x036DFFD0>
You can import modules with these statements as well:
from <package_name> import <modules_name>[, <module_name> ...]
from <package_name> import <module_name> as <alt_name>
>>>
>>> from pkg import mod1
>>> mod1.foo()
[mod1] foo()
>>> from pkg import mod2 as quux
>>> quux.bar()
[mod2] bar()
Actually, the package itself may be imported as well:
>>>
>>> import pkg
>>> pkg
<module 'pkg' (namespace)>
But this is of little avail. Though it is, strictly speaking, a syntactically correct Python statement, it doesn't do much of anything useful. In particular, it doesn't place any of the modules in pkg into the local namespace:
>>>
>>> pkg.mod1
Traceback (most recent call last):
File "<pyshell#34>", line 1, in <module>
pkg.mod1
AttributeError: module 'pkg' has no attribute 'mod1'
>>> pkg.mod1.foo()
Traceback (most recent call last):
File "<pyshell#35>", line 1, in <module>
pkg.mod1.foo()
AttributeError: module 'pkg' has no attribute 'mod1'
>>> pkg.mod2.Bar()
Traceback (most recent call last):
File "<pyshell#36>", line 1, in <module>
pkg.mod2.Bar()
AttributeError: module 'pkg' has no attribute 'mod2'
To actually import the modules or their contents, you need to use one of the forms shown above.
Package Initialization
If a file named __init__.py is present in a package directory, it is invoked when the package or a module in the package is imported. This can be used for execution of package initialization code, such as initialization of package-level data.

For example, consider the following __init__.py file:

pkg/__init__.py
print(f'Invoking __init__.py for {__name__}')
A = ['quux', 'corge', 'grault']
Let's add this file to the pkg directory from the above example. Now when the package is imported, the global list A is initialized:
>>>
>>> import pkg
Invoking __init__.py for pkg
>>> pkg.A
['quux', 'corge', 'grault']
A module in the package can access the global variable by importing it in turn. For example, mod1.py could be modified as follows:

pkg/mod1.py
def foo():
    from pkg import A
    print('[mod1] foo() / A = ', A)

class Foo:
    pass
>>>
>>> from pkg import mod1
Invoking __init__.py for pkg
>>> mod1.foo()
[mod1] foo() / A = ['quux', 'corge', 'grault']
__init__.py can also be used to effect automatic importing of modules from a package. Earlier, you saw that the statement import pkg only places the name pkg in the caller's local symbol table and doesn't import any modules. But if __init__.py in the package directory contains the following:

pkg/__init__.py
print(f'Invoking __init__.py for {__name__}')
import pkg.mod1, pkg.mod2
then when you execute import pkg, modules mod1 and mod2 are imported automatically:
>>>
>>> import pkg
Invoking __init__.py for pkg
>>> pkg.mod1.foo()
[mod1] foo()
>>> pkg.mod2.bar()
[mod2] bar()
Note: Much of the Python documentation and other resources state that an __init__.py file must be present in the package directory when creating a package. This was once true. It used to be that the very presence of __init__.py signified to Python that a package was being defined; the file could contain initialization code or even be empty, but it had to be present.
Starting with Python 3.3, Implicit Namespace Packages were introduced. These allow a package to exist without any __init__.py file. Of course, __init__.py can still be present if package initialization is needed, but it is no longer required. See the short sketch below, and What Is a Python Namespace Package?, to learn more.
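As a minimal illustration, consider a hypothetical directory ns_pkg that contains only a module mod.py and no __init__.py, placed in one of the directories on sys.path. Python 3.3+ still treats the directory as a (namespace) package:

ns_pkg/
    mod.py

>>> import ns_pkg.mod
>>> import ns_pkg
>>> ns_pkg
<module 'ns_pkg' (namespace)>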
Importing * From a Package

For the purposes of the remainder of this article, the previously defined package is expanded to contain some additional modules:
There are now four modules defined in the pkg directory. Their contents are as follows:

mod1.py

def foo():
    print('[mod1] foo()')

class Foo:
    pass

mod2.py

def bar():
    print('[mod2] bar()')

class Bar:
    pass

mod3.py

def baz():
    print('[mod3] baz()')

class Baz:
    pass

mod4.py

def qux():
    print('[mod4] qux()')

class Qux:
    pass
(Imaginative, aren't they?)
You have already seen that when import * is used for a module, all objects from the module are imported into the local symbol table, except those whose names begin with an underscore, as always:
>>>
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from pkg.mod3 import *
>>> dir()
['Baz', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'baz']
>>> baz()
[mod3] baz()
>>> Baz
<class 'pkg.mod3.Baz'>
The analogous statement for a package is this:
from <package_name> import *
What does that do?
>>>
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from pkg import *
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
Hmm. Not much. You might have expected (assuming you had any expectations at all) that Python would dive down into the package directory, find all the modules it could, and import them all. But as you can see, by default that is not what happens.
Instead, Python follows this convention: if the __init__.py file in the package directory contains a list named __all__, it is taken to be a list of modules that should be imported when the statement from <package_name> import * is encountered.
For the present example, suppose you create an __init__.py in the pkg directory like this:

pkg/__init__.py
__all__ = [
'mod1',
'mod2',
'mod3',
'mod4'
]
Now from pkg import * imports all four modules:
>>>
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from pkg import *
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'mod1', 'mod2', 'mod3', 'mod4']
>>> mod2.bar()
[mod2] bar()
>>> mod4.Qux
<class 'pkg.mod4.Qux'>
Using import * still isn't considered terrific form, any more for packages than for modules. But this facility at least gives the creator of the package some control over what happens when import * is specified. (In fact, it provides the capability to disallow it entirely, simply by declining to define __all__ at all. As you have seen, the default behavior for packages is to import nothing.)
By the way, __all__ can be defined in a module as well, and it serves the same purpose: to control what is imported with import *. For example, modify mod1.py as follows:

pkg/mod1.py
__all__ = ['foo']

def foo():
    print('[mod1] foo()')

class Foo:
    pass
Now an import * statement from pkg.mod1 will only import what is contained in __all__:
>>>
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from pkg.mod1 import *
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'foo']
>>> foo()
[mod1] foo()
>>> Foo
Traceback (most recent call last):
File "<pyshell#37>", line 1, in <module>
Foo
NameError: name 'Foo' is not defined
foo() (the function) is now defined in the local namespace, but Foo (the class) is not, because the latter is not in __all__.
In summary, __all__ is used by both packages and modules to control what is imported when import * is specified. But the default behavior differs:
- For a package, when __all__ is not defined, import * does not import anything.
- For a module, when __all__ is not defined, import * imports everything (except, you guessed it, names starting with an underscore).
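To make the underscore rule concrete, here is a small hypothetical variant: suppose mod.py (which defines no __all__) also contained a name _secret = 'shh'. import * would skip it, although an explicit import still works:

>>> from mod import *
>>> _secret
NameError: name '_secret' is not defined
>>> from mod import _secret
>>> _secret
'shh'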
Subpackages
Packages can contain nested subpackages to arbitrary depth. For example, let's make one more modification to the example package directory as follows:

pkg/
    sub_pkg1/
        mod1.py
        mod2.py
    sub_pkg2/
        mod3.py
        mod4.py

The four modules (mod1.py, mod2.py, mod3.py, and mod4.py) are defined as previously. But now, instead of being lumped together into the pkg directory, they are split out into two subpackage directories, sub_pkg1 and sub_pkg2.
Importing still works the same as shown previously. Syntax is similar, but additional dot notation is used to separate the package name from the subpackage name:
>>>
>>> import pkg.sub_pkg1.mod1
>>> pkg.sub_pkg1.mod1.foo()
[mod1] foo()
>>> from pkg.sub_pkg1 import mod2
>>> mod2.bar()
[mod2] bar()
>>> from pkg.sub_pkg2.mod3 import baz
>>> baz()
[mod3] baz()
>>> from pkg.sub_pkg2.mod4 import qux as grault
>>> grault()
[mod4] qux()
In addition, a module in one subpackage can reference objects in a sibling subpackage (in the event that the sibling contains some functionality that you need). For example, suppose you want to call the function foo(), defined in module mod1, from within module mod3. You can either use an absolute import:

pkg/sub_pkg2/mod3.py
def baz():
    print('[mod3] baz()')

class Baz:
    pass

from pkg.sub_pkg1.mod1 import foo
foo()
>>>
>>> from pkg.sub_pkg2 import mod3
[mod1] foo()
>>> mod3.foo()
[mod1] foo()
Or you can use a relative import, where .. refers to the package one level up. From within mod3.py, which is in subpackage sub_pkg2,

- .. evaluates to the parent package (pkg), and
- ..sub_pkg1 evaluates to subpackage sub_pkg1 of the parent package.

pkg/sub_pkg2/mod3.py
def baz():
    print('[mod3] baz()')

class Baz:
    pass

from .. import sub_pkg1
print(sub_pkg1)

from ..sub_pkg1.mod1 import foo
foo()
>>>
>>> from pkg.sub_pkg2 import mod3
<module 'pkg.sub_pkg1' (namespace)>
[mod1] foo()