
Commit 01603e4

pyupgrade code (#270)
* pyupgrade 3.7+
* black format
* black format of docs
* update gitignore
* updated flake8 ignore
* addressed comments
* minor flake8 fixes
1 parent 9506f3e commit 01603e4

29 files changed: +150 −178 lines changed

.flake8  (+1)

@@ -3,4 +3,5 @@ extend-exclude =
     # OptView and related files need to be fixed eventually
     pyoptsparse/postprocessing/OptView.py
     pyoptsparse/postprocessing/OptView_baseclass.py
+    pyoptsparse/postprocessing/OptView_dash.py
     pyoptsparse/postprocessing/view_saved_figure.py

.gitignore  (+5 −2)

@@ -1,5 +1,6 @@
 syntax: glob
-build/*
+build
+doc/_build
 *.pyc
 *.so
 *.out
@@ -8,4 +9,6 @@ build/*
 *.gz
 *.c
 *.cpp
-env
+env
+pyoptsparse/pySNOPT/source
+pyoptsparse/pyNLPQLP/source

doc/guide.rst  (+30 −28)

@@ -22,19 +22,19 @@ The optimization class is created using the following call:
 
 .. code-block:: python
 
-    >>> optProb = Optimization('name', objFun)
+    optProb = Optimization("name", objFun)
 
 The general template of the objective function is as follows:
 
 .. code-block:: python
 
     def obj_fun(xdict):
-        funcs = {}
-        funcs['obj_name'] = function(xdict)
-        funcs['con_name'] = function(xdict)
-        fail = False # Or True if an analysis failed
+        funcs = {}
+        funcs["obj_name"] = function(xdict)
+        funcs["con_name"] = function(xdict)
+        fail = False  # Or True if an analysis failed
 
-        return funcs, fail
+        return funcs, fail
 
 where:
 
@@ -51,19 +51,19 @@ to simply call :meth:`addVar <pyoptsparse.pyOpt_optimization.Optimization.addVar
 
 .. code-block:: python
 
-    >>> optProb.addVar('var_name')
+    optProb.addVar("var_name")
 
 This will result in a scalar variable included in the ``x`` dictionary call to ``obj_fun`` which can be accessed by doing
 
 .. code-block:: python
 
-    >>> x['var_name']
+    x["var_name"]
 
 A more complex example will include lower bounds, upper bounds and a non-zero initial value:
 
 .. code-block:: python
 
-    >>> optProb.addVar('var_name',lower=-10, upper=5, value=-2)
+    optProb.addVar("var_name", lower=-10, upper=5, value=-2)
 
 The ``lower`` or ``upper`` keywords may be specified as ``None`` to signify there is no bound on the variable.
 
@@ -84,7 +84,7 @@ For example, to add 10 variables with no lower bound, and a scale factor of 0.1:
 
 .. code-block:: python
 
-    >>> optProb.addVarGroup('con_group', 10, upper=2.5, scale=0.1)
+    optProb.addVarGroup("con_group", 10, upper=2.5, scale=0.1)
 
 
 Constraints
@@ -95,22 +95,22 @@ to use the function :meth:`addCon <pyoptsparse.pyOpt_optimization.Optimization.a
 
 .. code-block:: python
 
-    >>> optProb.addCon('not_a_real_constraint')
+    optProb.addCon("not_a_real_constraint")
 
 To include bounds on the constraints, use the ``lower`` and ``upper`` keyword arguments.
 If ``lower`` and ``upper`` are the same, it will be treated as an equality constraint:
 
 .. code-block:: python
 
-    >>> optProb.addCon('inequality_constraint', upper=10)
-    >>> optProb.addCon('equality_constraint', lower=5, upper=5)
+    optProb.addCon("inequality_constraint", upper=10)
+    optProb.addCon("equality_constraint", lower=5, upper=5)
 
 Like design variables, it is often necessary to scale constraints such that all constraint values are approximately the same order of magnitude.
 This can be specified using the ``scale`` keyword:
 
 .. code-block:: python
 
-    >>> optProb.addCon('scaled_constraint', upper=10000, scale=1.0/10000)
+    optProb.addCon("scaled_constraint", upper=10000, scale=1.0 / 10000)
 
 Even if the ``scale`` keyword is given, the ``lower`` and ``upper`` bounds are given in their un-scaled form.
 Internally, pyOptSparse will use the scaling factor to produce the following constraint:
@@ -153,14 +153,14 @@ By way of example, the code that generates the hypothetical optimization proble
 
 .. code-block:: python
 
-    optProb.addVarGroup('varA', 3)
-    optProb.addVarGroup('varB', 1)
-    optProb.addVarGroup('varC', 3)
+    optProb.addVarGroup("varA", 3)
+    optProb.addVarGroup("varB", 1)
+    optProb.addVarGroup("varC", 3)
 
-    optProb.addConGroup('conA', 2, upper=0.0, wrt=['varB', 'varC'])
-    optProb.addConGroup('conB', 2, upper=0.0, wrt=['varC', 'varA'])
-    optProb.addConGroup('conC', 4, upper=0.0)
-    optProb.addConGroup('conD', 3, upper=0.0, wrt=['varC'])
+    optProb.addConGroup("conA", 2, upper=0.0, wrt=["varB", "varC"])
+    optProb.addConGroup("conB", 2, upper=0.0, wrt=["varC", "varA"])
+    optProb.addConGroup("conC", 4, upper=0.0)
+    optProb.addConGroup("conD", 3, upper=0.0, wrt=["varC"])
 
 Note that the order of the ``wrt`` (which stands for with-respect-to) is not significant.
 Furthermore, if the ``wrt`` argument is omitted altogether, pyOptSparse assumes that the constraint is dense.
@@ -170,7 +170,7 @@ To do so, use the following call after adding all the design variables, objectiv
 
 .. code-block:: python
 
-    >>> optProb.printSparsity()
+    optProb.printSparsity()
 
 Using the ``wrt`` keyword allows the user to determine the overall sparsity structure of the constraint Jacobian.
 However, we have currently assumed that each of the blocks with an ``X`` in is a dense sub-block.
@@ -182,12 +182,12 @@ By way of example, the call instead may be as follows:
 
 .. code-block:: python
 
-    jac = sparse.lil_matrix((3,3))
-    jac[0,0] = 1.0
-    jac[1,1] = 4.0
-    jac[2,2] = 5.0
+    jac = sparse.lil_matrix((3, 3))
+    jac[0, 0] = 1.0
+    jac[1, 1] = 4.0
+    jac[2, 2] = 5.0
 
-    optProb.addConGroup('conD', 3, upper=0.0, wrt=['varC'], linear=True, jac={'varC':jac})
+    optProb.addConGroup("conD", 3, upper=0.0, wrt=["varC"], linear=True, jac={"varC": jac})
 
 We have created a linked list sparse matrix using ``scipy.sparse``.
 Any SciPy sparse matrix format can be accepted.
@@ -215,7 +215,7 @@ This is accomplished using a the call to :meth:`addObj <pyoptsparse.pyOpt_optimi
 
 .. code-block:: python
 
-    optProb.addObj('obj_name')
+    optProb.addObj("obj_name")
 
 What this does is tell pyOptSparse that the key ``obj_name`` in the function returns will be taken as the objective.
 For optimizers that can do multi-objective optimization (e.g. NSGA2), multiple objectives can be added.
@@ -264,6 +264,7 @@ The first, and most explicit approach is to directly import the optimizer class,
 .. code-block:: python
 
     from pyoptsparse import SLSQP
+
     opt = SLSQP(...)
 
 
@@ -273,6 +274,7 @@ It accepts a string argument in addition to the usual options, and instantiates
 .. code-block:: python
 
     from pyoptsparse import OPT
+
    opt = OPT("SLSQP", ...)
 
 Note that the name of the optimizer is case-insensitive, so ``slsqp`` can also be used.
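The objective-function template reformatted in this file can be exercised on its own, without an optimizer. Below is a minimal sketch; the names ``xvars``, ``obj``, and ``con`` are borrowed from the quickstart example further down, and the function bodies are illustrative rather than code from this commit:

```python
def obj_fun(xdict):
    # Mirrors the guide.rst template: build a funcs dict keyed by
    # objective/constraint names and return it with a fail flag.
    x = xdict["xvars"]
    funcs = {}
    funcs["obj"] = -x[0] * x[1] * x[2]
    funcs["con"] = [x[0] + x[1], x[1] + x[2]]
    fail = False  # Or True if an analysis failed
    return funcs, fail


funcs, fail = obj_fun({"xvars": [1.0, 2.0, 4.0]})
print(funcs["obj"])  # -8.0
```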

doc/postprocessing.rst  (+1 −1)

@@ -150,7 +150,7 @@ To extract the stored information in Python, first initialize a History object:
 
 .. code-block:: python
 
-    >>> hist = History('path/to/opt_hist.hst', flag='r')
+    hist = History("path/to/opt_hist.hst", flag="r")
 
 From here, various information can be extracted, using the various ``get_`` methods.
 To extract iteration history, use the function ``getValues()``.

doc/quickstart.rst  (+4 −4)

@@ -33,15 +33,15 @@ Notes:
 
 .. code-block:: python
 
-    x = xdict['xvars']
+    x = xdict["xvars"]
 
 retrieves an array of length 3 which are all the variables for this optimization.
 
 * The line
 
 .. code-block:: python
 
-    conval = [0]*2
+    conval = [0] * 2
 
 creates a list of length 2, which stores the numerical values of the two constraints.
 The ``funcs`` dictionary return must contain keys that match the constraint names from
@@ -52,8 +52,8 @@ Notes:
 
 .. code-block:: python
 
-    funcs['obj'] = -x[0]*x[1]*x[2]
-    funcs['con'] = conval
+    funcs["obj"] = -x[0] * x[1] * x[2]
+    funcs["con"] = conval
 
 Now the optimization problem can be initialized:
 

examples/hs015VarPlot.py  (+1 −1)

@@ -20,7 +20,7 @@
 db = {}
 opts = ["ipopt", "slsqp", "snopt", "conmin", "nlpqlp", "psqp"]
 for opt in opts:
-    fileName = "{}_hs015_Hist.hst".format(opt)
+    fileName = f"{opt}_hs015_Hist.hst"
     try:
         db[opt] = History(fileName)
     except FileNotFoundError:
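The `.format` → f-string rewrite above is behavior-preserving; a quick standalone check of the equivalence, looping over the same `opts` values as the example:

```python
# Both spellings produce identical filenames for every optimizer name.
for opt in ["ipopt", "slsqp", "snopt", "conmin", "nlpqlp", "psqp"]:
    old_name = "{}_hs015_Hist.hst".format(opt)  # pre-commit spelling
    new_name = f"{opt}_hs015_Hist.hst"          # pyupgrade result
    assert old_name == new_name
```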

pyoptsparse/postprocessing/OptView.py  (+4 −4)

@@ -634,7 +634,7 @@ def onselect_arr(self, evt):
         self.arr_data = {}
         self.val_names = []
         for i, val in enumerate(values):
-            self.val_names.append(values_orig[0] + "_{0}".format(val))
+            self.val_names.append(values_orig[0] + f"_{val}")
             self.arr_data[self.val_names[i]] = []
             for ind_dat in dat:
                 self.arr_data[self.val_names[i]].append(ind_dat[val])
@@ -768,7 +768,7 @@ def save_tec(self):
             for j in range(m):
                 indiv_data[j] = small_data[j][i]
             full_data = np.c_[full_data, indiv_data]
-            var_names.append(key + "_{}".format(i))
+            var_names.append(key + f"_{i}")
 
         filename = "OptView_tec.dat"
         self._file = open(filename, "w")
@@ -778,7 +778,7 @@ def save_tec(self):
             self._file.write('"' + name + '" ')
         self._file.write("\n")
 
-        self._file.write('Zone T= "OptView_tec_data", ' + "I={}, ".format(num_iters) + "F=POINT\n")
+        self._file.write('Zone T= "OptView_tec_data", ' + f"I={num_iters}, " + "F=POINT\n")
         np.savetxt(self._file, full_data)
         self._file.close()
 
@@ -895,7 +895,7 @@ def on_move(self, event):
         iter_count = np.round(event.xdata, 0)
         ind = np.where(xdat == iter_count)[0][0]
 
-        label = label + "\niter: {0:d}\nvalue: {1}".format(int(iter_count), ydat[ind])
+        label = label + f"\niter: {int(iter_count):d}\nvalue: {ydat[ind]}"
 
         # Get the width of the window so we can scale the label placement
         size = self.f.get_size_inches() * self.f.dpi

pyoptsparse/postprocessing/OptView_baseclass.py  (+5 −5)

@@ -17,7 +17,7 @@
 from ..pyOpt_error import pyOptSparseWarning
 
 
-class OVBaseClass(object):
+class OVBaseClass:
 
     """
     Container for display parameters, properties, and objects.
@@ -235,7 +235,7 @@ def DetermineMajorIterations(self, db, OpenMDAO):
 
         else:  # this is if it's OpenMDAO
             for i, iter_type in enumerate(self.iter_type):
-                key = "{}|{}".format(self.solver_name, i + 1)  # OpenMDAO uses 1-indexing
+                key = f"{self.solver_name}|{i + 1}"  # OpenMDAO uses 1-indexing
                 if i in self.deriv_keys:
                     self.iter_type[i] = 1.0
 
@@ -256,7 +256,7 @@ def SaveDBData(self, db, data_all, data_major, OpenMDAO, data_str):
             # If this is an OpenMDAO file, the keys are of the format
             # 'rank0:SNOPT|1', etc
             if OpenMDAO:
-                key = "{}|{}".format(self.solver_name, i + 1)  # OpenMDAO uses 1-indexing
+                key = f"{self.solver_name}|{i + 1}"  # OpenMDAO uses 1-indexing
             else:  # Otherwise the keys are simply a number
                 key = "%d" % i
 
@@ -271,7 +271,7 @@ def SaveDBData(self, db, data_all, data_major, OpenMDAO, data_str):
 
             # Format a new_key string where we append a modifier
             # if we have multiple history files
-            new_key = key + "{}".format(self.histIndex)
+            new_key = key + f"{self.histIndex}"
 
             # If this key is not in the data dictionaries, add it
             if new_key not in data_all:
@@ -301,7 +301,7 @@ def SaveOpenMDAOData(self, db):
 
             # We'll rename each item, so we need to get the old item
             # name and modify it
-            item = old_item + "{}".format(self.histIndex)
+            item = old_item + f"{self.histIndex}"
 
             # Here we just have an open parenthesis, and then we will
             # add o, c, or dv. Note that we could add multiple flags

pyoptsparse/pyALPSO/alpso.py  (+1 −1)

@@ -60,7 +60,7 @@ def alpso(dimensions, constraints, neqcons, xtype, x0, xmin, xmax, swarmsize, nh
         x0 = np.array(x0)
     elif not isinstance(x0, np.ndarray):
         pyOptSparseWarning(
-            ("Initial x must be either list or numpy.array, " "all initial positions randomly generated")
+            "Initial x must be either list or numpy.array, all initial positions randomly generated"
         )
 
     #

pyoptsparse/pyALPSO/alpso_ext.py  (+1 −1)

@@ -60,7 +60,7 @@ def alpso(dimensions, constraints, neqcons, xtype, x0, xmin, xmax, swarmsize, nh
         x0 = np.array(x0)
     elif not isinstance(x0, np.ndarray):
         pyOptSparseWarning(
-            ("Initial x must be either list or numpy.array, all initial positions randomly generated")
+            "Initial x must be either list or numpy.array, all initial positions randomly generated"
         )
 
     #

pyoptsparse/pyIPOPT/setup.py  (+1 −1)

@@ -52,7 +52,7 @@ def configuration(parent_package="", top_path=None):
         FILES,
         library_dirs=[IPOPT_LIB],
         libraries=["ipopt"],
-        extra_link_args=["-Wl,-rpath,%s -L%s" % (IPOPT_LIB, IPOPT_LIB)],
+        extra_link_args=[f"-Wl,-rpath,{IPOPT_LIB} -L{IPOPT_LIB}"],
         include_dirs=[numpy_include, IPOPT_INC],
     )
     return config

pyoptsparse/pyNSGA2/setup.py  (+4 −4)

@@ -56,14 +56,14 @@ def swig_sources(self, sources, extension):
                 else:
                     typ2 = get_swig_target(source)
                     if typ != typ2:
-                        log.warn("expected %r but source %r defines %r swig target" % (typ, source, typ2))
+                        log.warn(f"expected {typ!r} but source {source!r} defines {typ2!r} swig target")
                         if typ2 == "c++":
                             log.warn("resetting swig target to c++ (some targets may have .c extension)")
                             is_cpp = True
                             target_ext = ".cpp"
                 else:
                     log.warn("assuming that %r has c++ swig target" % (source))
-                target_file = os.path.join(target_dir, "%s_wrap%s" % (name, target_ext))
+                target_file = os.path.join(target_dir, f"{name}_wrap{target_ext}")
             else:
                 log.warn(" source %s does not exist: skipping swig'ing." % (source))
                 name = ext_name
@@ -79,7 +79,7 @@ def swig_sources(self, sources, extension):
                 target_dir = os.path.dirname(base)
                 target_file = _find_swig_target(target_dir, name)
                 if not os.path.isfile(target_file):
-                    raise DistutilsSetupError("%r missing" % (target_file,))
+                    raise DistutilsSetupError(f"{target_file!r} missing")
                 log.warn(" Yes! Using %r as up-to-date target." % (target_file))
                 target_dirs.append(target_dir)
                 new_sources.append(target_file)
@@ -108,7 +108,7 @@ def swig_sources(self, sources, extension):
             target = swig_targets[source]
             depends = [source] + extension.depends
             if self.force or newer_group(depends, target, "newer"):
-                log.info("%s: %s" % (os.path.basename(swig) + (is_cpp and "++" or ""), source))
+                log.info("{}: {}".format(os.path.basename(swig) + (is_cpp and "++" or ""), source))
                 self.spawn(swig_cmd + self.swig_opts + ["-o", target, "-outdir", py_target_dir, source])
             else:
                 log.debug(" skipping '%s' swig interface (up-to-date)" % (source))
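The `%r` placeholders in these messages map onto the `!r` conversion in f-strings, so the rewrite is string-for-string identical. A standalone sketch of the equivalence, with made-up values in place of the setup script's variables:

```python
# Illustrative values; in setup.py these come from the swig source scan.
typ, source, typ2 = "c", "nsga2.i", "c++"

old_msg = "expected %r but source %r defines %r swig target" % (typ, source, typ2)
new_msg = f"expected {typ!r} but source {source!r} defines {typ2!r} swig target"

# !r applies repr(), just as %r did, so the two messages match exactly.
assert old_msg == new_msg
```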

pyoptsparse/pyOpt_MPI.py  (+2 −2)

@@ -13,7 +13,7 @@
 # isort: off
 
 
-class COMM(object):
+class COMM:
     def __init__(self):
         self.rank = 0
         self.size = 1
@@ -38,7 +38,7 @@ def Barrier(self):
         return
 
 
-class myMPI(object):
+class myMPI:
     def __init__(self):
         self.COMM_WORLD = COMM()
         self.SUM = "SUM"
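Dropping the explicit `object` base, as pyupgrade does here, is a no-op in Python 3: every class inherits from `object` implicitly. A minimal check (class names are illustrative, not taken from the commit):

```python
class OldStyle(object):  # pre-commit spelling
    pass


class NewStyle:  # post-pyupgrade spelling; the object base is implicit in Python 3
    pass


# Both classes have exactly the same base and method resolution order tail.
assert OldStyle.__bases__ == NewStyle.__bases__ == (object,)
```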
