mirror of https://github.com/gsi-upm/soil synced 2024-11-12 22:42:28 +00:00

Large set of changes for v0.30

The examples weren't being properly tested in the last commit. Fixing that
uncovered a lot of bugs in the new implementation of environment and agent,
which account for most of these changes.

The main difference is the mechanism to load simulations from a configuration
file. For that to work, we had to rework our module loading code in
`serialization` and add a `source_file` attribute to configurations (and
simulations, for that matter).
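
As an illustration of the new mechanism (the class and file names below are hypothetical, and the exact usage is an assumption based on the `Simulation` fields touched in this diff), a simulation whose model lives in a separate Python file might be declared like this:

from soil import Simulation

# Hypothetical: the model class "MyEnv" is defined in my_models.py,
# next to the file that declares this simulation.
sim = Simulation(
    name="example",
    model="MyEnv",               # resolved through the reworked serialization module
    source_file="my_models.py",  # new attribute introduced by this commit
    max_steps=10,
)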
This commit is contained in:
J. Fernando Sánchez 2023-04-14 19:41:24 +02:00
parent 73282530fd
commit feab0ba79e
36 changed files with 739 additions and 875 deletions

View File

@ -6,15 +6,12 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
## [0.30 UNRELEASED]
### Added
* Simple debugging capabilities in `soil.debugging`, with a custom `pdb.Debugger` subclass that exposes commands to list agents and their status and set breakpoints on states (for FSM agents). Try it with `soil --debug <simulation file>`
* Ability to run
* Ability to
* Ability to run mesa simulations
* The `soil.exporters` module to export the results of datacollectors (model.datacollector) into files at the end of trials/simulations
* A modular set of classes for environments/models. Now the ability to configure the agents through an agent definition and a topology through a network configuration is split into two classes (`soil.agents.BaseEnvironment` for agents, `soil.agents.NetworkEnvironment` to add topology).
* FSM agents can now have generators as states. They work similarly to normal states, with one caveat: only `time` values can be yielded, not a state. This is because the state will not change; it will be resumed after the yield, at the appropriate time. The return value *can* be a state, or a `(state, time)` tuple, just like in normal states (see the sketch below).
### Changed
* Configuration schema is very different now. Check `soil.config` for more information. We are also using Pydantic for (de)serialization.
* There may be more than one topology/network in the simulation
* Ability
* Configuration schema is very simplified
### Removed
* Any `tsih` and `History` integration in the main classes. To record the state of environments/agents, just use a datacollector. In some cases this may be slower or consume more memory than the previous system. However, few cases actually used the full potential of the history, and it came at the cost of unnecessary complexity and worse performance for the majority of cases.
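
A minimal sketch of the generator-state feature described above (the agent and state names are made up, and it assumes the `soil.agents.FSM` class and the `state`/`default_state` decorators that appear elsewhere in this diff):

from soil import agents

class Pinger(agents.FSM):
    @agents.default_state
    @agents.state
    def pinging(self):
        # Generator state: only time values may be yielded, never a state.
        # The agent stays in this state and resumes right after the yield.
        for _ in range(3):
            self.info("ping")
            yield 1  # assumed here to mean "wait one time unit"
        # The return value may be a state or a (state, time) tuple,
        # just like in a regular state.
        return self.done

    @agents.state
    def done(self):
        return self.die()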

View File

@ -1,5 +1,3 @@
What are the main changes between version 0.3 and 0.2?
######################################################
@ -22,22 +20,12 @@ It aims to provide more modular and convenient functions, most of which can be u
How are agents assigned to nodes in the network
###############################################
In principle, the generation of the network topology and the assignment of agents to nodes are two separate processes.
There is a mechanism to initialize the agents, a mechanism to initialize the topology, and a mechanism to assign agents to nodes.
However, there are a myriad of ways to do this, and it is not clear which is the best way to do it.
Earlier versions of Soil approached it by providing a fairly complex method of agent and node generation.
The result was a very complex and difficult-to-understand system, which was also prone to bugs and to changes between versions.
Starting with version 0.3, the approach is to provide a simplified yet flexible system for generating the network topology and assigning agents to nodes.
This is based on these methods:
- `create_network`
- `add_agents` (and `add_agent`)
- `populate_network`
The default implementation of `soil.Environment` accepts some parameters that will automatically do these steps for the most common case.
All other cases can be handled by overriding the `init(self)` method and explicitly using these methods.
The constructor of the `NetworkAgent` class has two arguments: `node_id` and `topology`.
If `topology` is not provided, it will default to `self.model.topology`.
This assignment may fail if the model does not have a `topology` attribute, but most Soil environments derive from `NetworkEnvironment`, so they include a topology by default.
If `node_id` is not provided, nodes are picked at random from the topology until one without an agent is found, and that node's id is assigned to the agent.
If every node already has an agent, a new node is automatically added to the topology.
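
A minimal sketch of the explicit route (the `Walker` and `MyEnv` names are hypothetical; it assumes the `create_network`, `populate_network` and `add_agent` signatures used elsewhere in this diff):

import networkx as nx
from soil import Environment, agents

class Walker(agents.NetworkAgent, agents.FSM):
    @agents.default_state
    @agents.state
    def wandering(self):
        pass

class MyEnv(Environment):
    def init(self):
        # 1. Build the topology.
        self.create_network(generator=nx.complete_graph, n=10)
        # 2. Assign one Walker to every node that does not have an agent yet.
        self.populate_network(agent_class=Walker)
        # 3. An explicitly added NetworkAgent picks a free node on its own;
        #    with every node taken at this point, a new node is added for it.
        self.add_agent(agent_class=Walker)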
Can Soil environments include more than one network / topology?

View File

@ -26,14 +26,14 @@ def mygenerator(n=5, n_edges=5):
class GeneratorEnv(Environment):
"""Using a custom generator for the network"""
generator: parameters.function = mygenerator
generator: parameters.function = staticmethod(mygenerator)
def init(self):
self.create_network(network_generator=self.generator, n=10, n_edges=5)
self.init_agents(CounterModel)
self.create_network(generator=self.generator, n=10, n_edges=5)
self.add_agents(CounterModel)
sim = Simulation(model=GeneratorEnv, max_steps=10, interval=1)
if __name__ == '__main__':
sim.run(dry_run=True)
sim.run(dump=False)

View File

@ -30,7 +30,7 @@ from networkx import complete_graph
class TimeoutsEnv(Environment):
def init(self):
self.init_network(generator=complete_graph, n=2)
self.create_network(generator=complete_graph, n=2)
self.add_agent(agent_class=Fibonacci, node_id=0)
self.add_agent(agent_class=Odds, node_id=1)
@ -38,4 +38,4 @@ class TimeoutsEnv(Environment):
sim = Simulation(model=TimeoutsEnv, max_steps=10, interval=1)
if __name__ == "__main__":
sim.run(dry_run=True)
sim.run(dump=False)

View File

@ -56,41 +56,25 @@ class City(EventedEnvironment):
:param int height: Height of the internal grid
:param int width: Width of the internal grid
"""
n_cars = 1
n_passengers = 10
height = 100
width = 100
def init(self):
self.grid = MultiGrid(width=self.width, height=self.height, torus=False)
if not self.agents:
self.add_agents(Driver, k=self.n_cars)
self.add_agents(Passenger, k=self.n_passengers)
def __init__(
self,
*args,
n_cars=1,
n_passengers=10,
height=100,
width=100,
agents=None,
model_reporters=None,
**kwargs,
):
self.grid = MultiGrid(width=width, height=height, torus=False)
if agents is None:
agents = []
for i in range(n_cars):
agents.append({"agent_class": Driver})
for i in range(n_passengers):
agents.append({"agent_class": Passenger})
model_reporters = model_reporters or {
"earnings": "total_earnings",
"n_passengers": "number_passengers",
}
print("REPORTERS", model_reporters)
super().__init__(
*args, agents=agents, model_reporters=model_reporters, **kwargs
)
for agent in self.agents:
self.grid.place_agent(agent, (0, 0))
self.grid.move_to_empty(agent)
self.total_earnings = 0
self.add_model_reporter("total_earnings")
@property
def total_earnings(self):
return sum(d.earnings for d in self.agents(agent_class=Driver))
@report
@property
def number_passengers(self):
return self.count_agents(agent_class=Passenger)
@ -150,6 +134,7 @@ class Driver(Evented, FSM):
while self.move_towards(self.journey.destination, with_passenger=True):
yield
self.earnings += self.journey.tip
self.model.total_earnings += self.journey.tip
self.check_passengers()
return self.wandering
@ -228,13 +213,13 @@ class Passenger(Evented, FSM):
except events.TimedOut:
pass
self.info("Got home safe!")
self.die()
self.die("Got home safe!")
simulation = Simulation(name="RideHailing",
model=City,
seed="carsSeed",
max_time=1000,
model_params=dict(n_passengers=2))
if __name__ == "__main__":

View File

@ -1,7 +1,7 @@
from soil import Simulation
from social_wealth import MoneyEnv, graph_generator
sim = Simulation(name="mesa_sim", dry_run=True, max_steps=10, interval=2, model=MoneyEnv, model_params=dict(generator=graph_generator, N=10, width=50, height=50))
sim = Simulation(name="mesa_sim", dump=False, max_steps=10, interval=2, model=MoneyEnv, model_params=dict(generator=graph_generator, N=10, width=50, height=50))
if __name__ == "__main__":
sim.run()

View File

@ -53,7 +53,7 @@ class MoneyAgent(MesaAgent):
self.give_money()
class SocialMoneyAgent(NetworkAgent, MoneyAgent):
class SocialMoneyAgent(MoneyAgent, NetworkAgent):
wealth = 1
def give_money(self):

View File

@ -91,10 +91,11 @@ class NewsSpread(Environment):
prob_neighbor_cure: probability = 0.05,
def init(self):
self.populate_network([DumbViewer, HerdViewer, WiseViewer], [self.ratio_dumb, self.ratio_herd, self.ratio_wise])
self.populate_network([DumbViewer, HerdViewer, WiseViewer],
[self.ratio_dumb, self.ratio_herd, self.ratio_wise])
from itertools import permutations
from itertools import product
from soil import Simulation
@ -103,27 +104,31 @@ from soil import Simulation
# Because the effect of these agents might also depend on the network structure, we will run our simulations on two different networks:
# one with a small-world structure and one with a connected structure.
for [r1, r2, r3] in permutations([0, 0.5, 1.0], 3):
counter = 0
for [r1, r2] in product([0, 0.5, 1.0], repeat=2):
for (generator, netparams) in {
"barabasi_albert_graph": {"m": 5},
"erdos_renyi_graph": {"p": 0.1},
}.items():
print(r1, r2, r3, generator)
print(r1, r2, 1-r1-r2, generator)
# Create new simulation
netparams["n"] = 500
sim = Simulation(
Simulation(
name='newspread_sim',
model=NewsSpread,
model_params={
"ratio_dumb": r1,
"ratio_herd": r2,
"ratio_wise": r3,
"network_generator": generator,
"network_params": netparams,
"prob_neighbor_spread": 0,
},
num_trials=50,
model_params=dict(
ratio_dumb=r1,
ratio_herd=r2,
ratio_wise=1-r1-r2,
network_generator=generator,
network_params=netparams,
prob_neighbor_spread=0,
),
num_trials=5,
max_steps=300,
dry_run=True,
)
dump=False,
).run()
counter += 1
# Run all the necessary instances
sim.run()
print(f"A total of {counter} simulations were run.")

View File

@ -14,7 +14,7 @@ def mygenerator():
return G
class MyAgent(agents.FSM):
class MyAgent(agents.NetworkAgent, agents.FSM):
times_run = 0
@agents.default_state
@agents.state
@ -29,6 +29,7 @@ class ProgrammaticEnv(Environment):
def init(self):
self.create_network(generator=mygenerator)
assert len(self.G)
self.populate_network(agent_class=MyAgent)
self.add_agent_reporter('times_run')
@ -39,7 +40,7 @@ simulation = Simulation(
seed='Program',
num_trials=1,
max_time=100,
dry_run=True,
dump=False,
)
if __name__ == "__main__":

View File

@ -14,7 +14,7 @@ class CityPubs(Environment):
pub_capacity: parameters.Integer = 10
def init(self):
pubs = {}
self.pubs = {}
for i in range(self.number_of_pubs):
newpub = {
"name": "The awesome pub #{}".format(i),
@ -22,10 +22,11 @@ class CityPubs(Environment):
"capacity": self.pub_capacity,
"occupancy": 0,
}
pubs[newpub["name"]] = newpub
self.add_agent(agent_class=Police, node_id=0)
self["pubs"] = pubs
self.populate_network([{"openness": 0.1}, {"openness": 1}], [self.ratio_extroverted, 1-self.ratio_extroverted], agent_class=Patron)
self.pubs[newpub["name"]] = newpub
self.add_agent(agent_class=Police)
self.populate_network([Patron.w(openness=0.1), Patron.w(openness=1)],
[self.ratio_extroverted, 1-self.ratio_extroverted])
assert all(["agent" in node and isinstance(node["agent"], Patron) for (_, node) in self.G.nodes(data=True)])
def enter(self, pub_id, *nodes):
"""Agents will try to enter. The pub checks if it is possible"""
@ -151,10 +152,10 @@ class Patron(FSM, NetworkAgent):
continue
if friend.befriend(self):
self.befriend(friend, force=True)
self.debug("Hooray! new friend: {}".format(friend.id))
self.debug("Hooray! new friend: {}".format(friend.unique_id))
befriended = True
else:
self.debug("{} does not want to be friends".format(friend.id))
self.debug("{} does not want to be friends".format(friend.unique_id))
return befriended
@ -168,19 +169,20 @@ class Police(FSM):
def patrol(self):
drunksters = list(self.get_agents(drunk=True, state_id=Patron.drunk_in_pub.id))
for drunk in drunksters:
self.info("Kicking out the trash: {}".format(drunk.id))
self.info("Kicking out the trash: {}".format(drunk.unique_id))
drunk.kick_out()
else:
self.info("No trash to take out. Too bad.")
sim = Simulation(
model=CityPubs,
name="pubcrawl",
num_trials=3,
max_steps=10,
dry_run=True,
dump=False,
model_params=dict(
generator=nx.empty_graph,
network_generator=nx.empty_graph,
network_params={"n": 30},
model=CityPubs,
altercations=0,

View File

@ -40,7 +40,7 @@ s = Simulation(
model=RandomEnv,
num_trials=1,
max_time=100,
dry_run=True,
dump=False,
)

View File

@ -5,7 +5,6 @@ from soil.parameters import *
class TerroristEnvironment(Environment):
generator: function = nx.random_geometric_graph
n: Integer = 100
radius: Float = 0.2
@ -37,8 +36,11 @@ class TerroristEnvironment(Environment):
TerroristNetworkModel.w(state_id='leader'),
TrainingAreaModel,
HavenModel
], [self.ratio_civil, self.ratio_leader, self.ratio_trainig, self.ratio_heaven])
], [self.ratio_civil, self.ratio_leader, self.ratio_training, self.ratio_haven])
@staticmethod
def generator(*args, **kwargs):
return nx.random_geometric_graph(*args, **kwargs)
class TerroristSpreadModel(FSM, Geo):
"""
@ -50,10 +52,13 @@ class TerroristSpreadModel(FSM, Geo):
min_vulnerability (optional else zero)
max_vulnerability
prob_interaction
"""
information_spread_intensity = 0.1
terrorist_additional_influence = 0.1
min_vulnerability = 0
max_vulnerability = 1
def init(self):
if self.state_id == self.civilian.id: # Civilian
self.mean_belief = self.model.random.uniform(0.00, 0.5)
@ -75,7 +80,7 @@ class TerroristSpreadModel(FSM, Geo):
if len(neighbours) > 0:
# Only interact with some of the neighbors
interactions = list(
n for n in neighbours if self.random.random() <= self.prob_interaction
n for n in neighbours if self.random.random() <= self.model.prob_interaction
)
influence = sum(self.degree(i) for i in interactions)
mean_belief = sum(
@ -121,7 +126,7 @@ class TerroristSpreadModel(FSM, Geo):
)
# Check if there are any leaders in the group
leaders = list(filter(lambda x: x.state.id == self.leader.id, neighbours))
leaders = list(filter(lambda x: x.state_id == self.leader.id, neighbours))
if not leaders:
# Check if this is the potential leader
# Stop once it's found. Otherwise, set self as leader
@ -132,12 +137,11 @@ class TerroristSpreadModel(FSM, Geo):
def ego_search(self, steps=1, center=False, agent=None, **kwargs):
"""Get a list of nodes in the ego network of *node* of radius *steps*"""
node = agent.node
node = agent.node_id
G = self.subgraph(**kwargs)
return nx.ego_graph(G, node, center=center, radius=steps).nodes()
def degree(self, agent, force=False):
node = agent.node
if (
force
or (not hasattr(self.model, "_degree"))
@ -145,10 +149,9 @@ class TerroristSpreadModel(FSM, Geo):
):
self.model._degree = nx.degree_centrality(self.G)
self.model._last_step = self.now
return self.model._degree[node]
return self.model._degree[agent.node_id]
def betweenness(self, agent, force=False):
node = agent.node
if (
force
or (not hasattr(self.model, "_betweenness"))
@ -156,7 +159,7 @@ class TerroristSpreadModel(FSM, Geo):
):
self.model._betweenness = nx.betweenness_centrality(self.G)
self.model._last_step = self.now
return self.model._betweenness[node]
return self.model._betweenness[agent.node_id]
class TrainingAreaModel(FSM, Geo):
@ -169,13 +172,12 @@ class TrainingAreaModel(FSM, Geo):
Requires TerroristSpreadModel.
"""
def __init__(self, model=None, unique_id=0, state=()):
super().__init__(model=model, unique_id=unique_id, state=state)
self.training_influence = model.environment_params["training_influence"]
if "min_vulnerability" in model.environment_params:
self.min_vulnerability = model.environment_params["min_vulnerability"]
else:
self.min_vulnerability = 0
training_influence = 0.1
min_vulnerability = 0
def init(self):
self.mean_believe = 1
self.vulnerability = 0
@default_state
@state
@ -199,18 +201,19 @@ class HavenModel(FSM, Geo):
Requires TerroristSpreadModel.
"""
def __init__(self, model=None, unique_id=0, state=()):
super().__init__(model=model, unique_id=unique_id, state=state)
self.haven_influence = model.environment_params["haven_influence"]
if "min_vulnerability" in model.environment_params:
self.min_vulnerability = model.environment_params["min_vulnerability"]
else:
self.min_vulnerability = 0
self.max_vulnerability = model.environment_params["max_vulnerability"]
min_vulnerability = 0
haven_influence = 0.1
max_vulnerability = 0.5
def init(self):
self.mean_believe = 0
self.vulnerability = 0
def get_occupants(self, **kwargs):
return self.get_neighbors(agent_class=TerroristSpreadModel, **kwargs)
return self.get_neighbors(agent_class=TerroristSpreadModel,
**kwargs)
@default_state
@state
def civilian(self):
civilians = self.get_occupants(state_id=self.civilian.id)
@ -246,13 +249,10 @@ class TerroristNetworkModel(TerroristSpreadModel):
weight_link_distance
"""
def __init__(self, model=None, unique_id=0, state=()):
super().__init__(model=model, unique_id=unique_id, state=state)
self.vision_range = model.environment_params["vision_range"]
self.sphere_influence = model.environment_params["sphere_influence"]
self.weight_social_distance = model.environment_params["weight_social_distance"]
self.weight_link_distance = model.environment_params["weight_link_distance"]
sphere_influence: float
vision_range: float
weight_social_distance: float
weight_link_distance: float
@state
def terrorist(self):
@ -316,8 +316,8 @@ sim = Simulation(
num_trials=1,
name="TerroristNetworkModel_sim",
max_steps=150,
skip_test=True,
dry_run=True,
skip_test=False,
dump=False,
)
# TODO: integrate visualization

View File

@ -1,14 +1,23 @@
from soil import Environment, Simulation, CounterModel
from soil import Environment, Simulation, CounterModel, report
# Get directory path for current file
import os, sys, inspect
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
class TorvaldsEnv(Environment):
def init(self):
self.create_network(path='torvalds.edgelist')
self.create_network(path=os.path.join(currentdir, 'torvalds.edgelist'))
self.populate_network(CounterModel, skill_level='beginner')
print("Agentes: ", list(self.network_agents))
self.find_one(node_id="Torvalds").skill_level = 'God'
self.find_one(node_id="balkian").skill_level = 'developer'
self.agent(node_id="Torvalds").skill_level = 'God'
self.agent(node_id="balkian").skill_level = 'developer'
self.add_agent_reporter("times")
@report
def god_developers(self):
return self.count_agents(skill_level='God')
sim = Simulation(name='torvalds_example',
max_steps=10,

File diff suppressed because one or more lines are too long

View File

@ -30,7 +30,7 @@ from .decorators import *
def main(
cfg="simulation.yml",
exporters=None,
parallel=None,
num_processes=1,
output="soil_output",
*,
do_run=False,
@ -69,6 +69,11 @@ def main(
"--dry-run",
"--dry",
action="store_true",
help="Do not run the simulation",
)
parser.add_argument(
"--no-dump",
action="store_true",
help="Do not store the results of the simulation to disk, show in terminal instead.",
)
parser.add_argument(
@ -98,12 +103,11 @@ def main(
default=output or "soil_output",
help="folder to write results to. It defaults to the current directory.",
)
if parallel is None:
parser.add_argument(
"--synchronous",
action="store_true",
help="Run trials serially and synchronously instead of in parallel. Defaults to false.",
)
parser.add_argument(
"--num-processes",
default=num_processes,
help="Number of processes to use for parallel execution. Defaults to 1.",
)
parser.add_argument(
"-e",
@ -112,6 +116,17 @@ def main(
default=[],
help="Export environment and/or simulations using this exporter",
)
parser.add_argument(
"--until",
default="",
help="Set maximum time for the simulation to run. ",
)
parser.add_argument(
"--seed",
default=None,
help="Manually set a seed for the simulation.",
)
parser.add_argument(
"--only-convert",
@ -138,9 +153,6 @@ def main(
if args.version:
return
if parallel is None:
parallel = not args.synchronous
exporters = exporters or [
"default",
]
@ -168,42 +180,46 @@ def main(
res = []
try:
exp_params = {}
opts = dict(
dry_run=args.dry_run,
dump=not args.no_dump,
debug=debug,
exporters=exporters,
num_processes=args.num_processes,
outdir=output,
exporter_params=exp_params,
**kwargs)
if args.seed is not None:
opts["seed"] = args.seed
if sim:
logger.info("Loading simulation instance")
sim.dry_run = args.dry_run
sim.exporters = exporters
sim.parallel = parallel
sim.outdir = output
sims = [
sim,
]
for (k, v) in opts.items():
setattr(sim, k, v)
sims = [sim]
else:
logger.info("Loading config file: {}".format(args.file))
if not os.path.exists(args.file):
logger.error("Please, input a valid file")
return
assert opts["debug"] == debug
sims = list(
simulation.iter_from_file(
args.file,
dry_run=args.dry_run,
exporters=exporters,
parallel=parallel,
outdir=output,
exporter_params=exp_params,
**kwargs,
**opts,
)
)
for sim in sims:
assert sim.debug == debug
if args.set:
for s in args.set:
k, v = s.split("=", 1)[:2]
v = eval(v)
tail, *head = k.rsplit(".", 1)[::-1]
target = sim
target = sim.model_params
if head:
for part in head[0].split("."):
try:
@ -219,9 +235,8 @@ def main(
print(sim.to_yaml())
continue
if do_run:
res.append(sim.run())
res.append(sim.run(until=args.until))
else:
print("not running")
res.append(sim)
except Exception as ex:

View File

@ -11,6 +11,8 @@ import inspect
import types
import textwrap
import networkx as nx
import warnings
import sys
from typing import Any
@ -90,7 +92,7 @@ class BaseAgent(MesaAgent, MutableMapping, metaclass=MetaAgent):
Any attribute that is not preceded by an underscore (`_`) will also be added to its state.
"""
def __init__(self, unique_id, model, name=None, interval=None, **kwargs):
def __init__(self, unique_id, model, name=None, init=True, interval=None, **kwargs):
assert isinstance(unique_id, int)
super().__init__(unique_id=unique_id, model=model)
@ -116,6 +118,11 @@ class BaseAgent(MesaAgent, MutableMapping, metaclass=MetaAgent):
for (k, v) in kwargs.items():
setattr(self, k, v)
if init:
self.init()
def init(self):
pass
def __hash__(self):
return hash(self.unique_id)
@ -130,11 +137,10 @@ class BaseAgent(MesaAgent, MutableMapping, metaclass=MetaAgent):
# TODO: refactor to clean up mesa compatibility
@property
def id(self):
msg = "This attribute is deprecated. Use `unique_id` instead"
warnings.warn(msg, DeprecationWarning)
print(msg, file=sys.stderr)
return self.unique_id
@id.setter
def id(self, value):
self.unique_id = value
@classmethod
def from_dict(cls, model, attrs, warn_extra=True):
@ -197,8 +203,10 @@ class BaseAgent(MesaAgent, MutableMapping, metaclass=MetaAgent):
# No environment
return None
def die(self):
self.info(f"agent dying")
def die(self, msg=None):
if msg:
self.info("Agent dying:", msg)
self.debug(f"agent dying")
self.alive = False
try:
self.model.schedule.remove(self)
@ -207,15 +215,16 @@ class BaseAgent(MesaAgent, MutableMapping, metaclass=MetaAgent):
return time.NEVER
def step(self):
raise NotImplementedError("Agent must implement step method")
def _check_alive(self):
if not self.alive:
raise time.DeadAgent(self.unique_id)
super().step()
return time.Delta(self.interval)
def log(self, message, *args, level=logging.INFO, **kwargs):
def log(self, *message, level=logging.INFO, **kwargs):
if not self.logger.isEnabledFor(level):
return
message = message + " ".join(str(i) for i in args)
message = " ".join(str(i) for i in message)
message = "[@{:>4}]\t{:>10}: {}".format(self.now, repr(self), message)
for k, v in kwargs:
message += " {k}={v} ".format(k, v)
@ -388,7 +397,7 @@ class AgentView(Mapping, Set):
def filter_agents(
agents,
agents: dict,
*id_args,
unique_id=None,
state_id=None,

View File

@ -1,4 +1,5 @@
from . import MetaAgent, BaseAgent
from ..time import Delta
from functools import partial, wraps
import inspect
@ -85,8 +86,8 @@ class MetaFSM(MetaAgent):
class FSM(BaseAgent, metaclass=MetaFSM):
def __init__(self, **kwargs):
super(FSM, self).__init__(**kwargs)
def __init__(self, init=True, **kwargs):
super().__init__(**kwargs, init=False)
if not hasattr(self, "state_id"):
if not self._default_state:
raise ValueError(
@ -95,12 +96,15 @@ class FSM(BaseAgent, metaclass=MetaFSM):
self.state_id = self._default_state.id
self._coroutine = None
self.default_interval = Delta(self.model.interval)
self._set_state(self.state_id)
if init:
self.init()
def step(self):
self.debug(f"Agent {self.unique_id} @ state {self.state_id}")
default_interval = super().step()
self._check_alive()
next_state = self._states[self.state_id](self)
when = None
@ -120,7 +124,7 @@ class FSM(BaseAgent, metaclass=MetaFSM):
if next_state is not None:
self._set_state(next_state)
return when or default_interval
return when or self.default_interval
def _set_state(self, state, when=None):
if hasattr(state, "id"):
@ -132,8 +136,8 @@ class FSM(BaseAgent, metaclass=MetaFSM):
self.model.schedule.add(self, when=when)
return state
def die(self):
return self.dead, super().die()
def die(self, *args, **kwargs):
return self.dead, super().die(*args, **kwargs)
@state
def dead(self):

View File

@ -2,23 +2,37 @@ from . import BaseAgent
class NetworkAgent(BaseAgent):
def __init__(self, *args, topology, node_id, **kwargs):
super().__init__(*args, **kwargs)
def __init__(self, *args, topology=None, init=True, node_id=None, **kwargs):
super().__init__(*args, init=False, **kwargs)
assert topology is not None
assert node_id is not None
self.G = topology
self.G = topology or self.model.G
assert self.G
if node_id is None:
nodes = self.random.choices(list(self.G.nodes), k=len(self.G))
for n_id in nodes:
if "agent" not in self.G.nodes[n_id] or self.G.nodes[n_id]["agent"] is None:
node_id = n_id
break
else:
node_id = len(self.G)
self.info(f"All nodes ({len(self.G)}) have an agent assigned, adding a new node to the graph for agent {self.unique_id}")
self.G.add_node(node_id)
assert node_id is not None
self.G.nodes[node_id]["agent"] = self
self.node_id = node_id
if init:
self.init()
def count_neighbors(self, state_id=None, **kwargs):
return len(self.get_neighbors(state_id=state_id, **kwargs))
if init:
self.init()
def iter_neighbors(self, **kwargs):
return self.iter_agents(limit_neighbors=True, **kwargs)
def get_neighbors(self, **kwargs):
return list(self.iter_neighbors())
return list(self.iter_neighbors(**kwargs))
@property
def node(self):
@ -40,7 +54,7 @@ class NetworkAgent(BaseAgent):
for node_id in self.G.neighbors(self.node_id):
agent = self.G.nodes[node_id].get("agent")
if agent is not None:
neighbor_ids.add(agent.id)
neighbor_ids.add(agent.unique_id)
if unique_ids:
unique_ids = unique_ids & neighbor_ids
else:

View File

@ -1,267 +1,2 @@
from __future__ import annotations
from enum import Enum
from pydantic import BaseModel, ValidationError, validator, root_validator
import yaml
import os
import sys
from typing import Any, Callable, Dict, List, Optional, Union, Type
from pydantic import BaseModel, Extra
from . import environment, utils
import networkx as nx
# Could use TypeAlias in python >= 3.10
nodeId = int
class Node(BaseModel):
id: nodeId
state: Optional[Dict[str, Any]] = {}
class Edge(BaseModel):
source: nodeId
target: nodeId
value: Optional[float] = 1
class Topology(BaseModel):
nodes: List[Node]
directed: bool
links: List[Edge]
class NetConfig(BaseModel):
params: Optional[Dict[str, Any]]
fixed: Optional[Union[Topology, nx.Graph]]
path: Optional[str]
class Config:
arbitrary_types_allowed = True
@staticmethod
def default():
return NetConfig(topology=None, params=None)
@root_validator
def validate_all(cls, values):
if "params" not in values and "topology" not in values:
raise ValueError(
"You must specify either a topology or the parameters to generate a graph"
)
return values
class EnvConfig(BaseModel):
@staticmethod
def default():
return EnvConfig()
class SingleAgentConfig(BaseModel):
agent_class: Optional[Union[Type, str]] = None
unique_id: Optional[int] = None
topology: Optional[bool] = False
node_id: Optional[Union[int, str]] = None
state: Optional[Dict[str, Any]] = {}
class FixedAgentConfig(SingleAgentConfig):
n: Optional[int] = 1
hidden: Optional[bool] = False # Do not count this agent towards total agent count
@root_validator
def validate_all(cls, values):
if values.get("unique_id", None) is not None and values.get("n", 1) > 1:
raise ValueError(
f"An unique_id can only be provided when there is only one agent ({values.get('n')} given)"
)
return values
class OverrideAgentConfig(FixedAgentConfig):
filter: Optional[Dict[str, Any]] = None
class Strategy(Enum):
topology = "topology"
total = "total"
class AgentDistro(SingleAgentConfig):
weight: Optional[float] = 1
strategy: Strategy = Strategy.topology
class AgentConfig(SingleAgentConfig):
n: Optional[int] = None
distribution: Optional[List[AgentDistro]] = None
fixed: Optional[List[FixedAgentConfig]] = None
override: Optional[List[OverrideAgentConfig]] = None
@staticmethod
def default():
return AgentConfig()
@root_validator
def validate_all(cls, values):
if "distribution" in values and (
"n" not in values and "topology" not in values
):
raise ValueError(
"You need to provide the number of agents or a topology to extract the value from."
)
return values
class Config(BaseModel, extra=Extra.allow):
version: Optional[str] = "1"
name: str = "Unnamed Simulation"
description: Optional[str] = None
group: str = None
dir_path: Optional[str] = None
num_trials: int = 1
max_time: float = 100
max_steps: int = -1
num_processes: int = 1
interval: float = 1
seed: str = ""
dry_run: bool = False
skip_test: bool = False
model_class: Union[Type, str] = environment.Environment
model_params: Optional[Dict[str, Any]] = {}
visualization_params: Optional[Dict[str, Any]] = {}
@classmethod
def from_raw(cls, cfg):
if isinstance(cfg, Config):
return cfg
if cfg.get("version", "1") == "1" and any(
k in cfg for k in ["agents", "agent_class", "topology", "environment_class"]
):
return convert_old(cfg)
return Config(**cfg)
def convert_old(old, strict=True):
"""
Try to convert old style configs into the new format.
This is still a work in progress and might not work in many cases.
"""
utils.logger.warning(
"The old configuration format is deprecated. The converted file MAY NOT yield the right results"
)
new = old.copy()
network = {}
if "topology" in old:
del new["topology"]
network["topology"] = old["topology"]
if "network_params" in old and old["network_params"]:
del new["network_params"]
for (k, v) in old["network_params"].items():
if k == "path":
network["path"] = v
else:
network.setdefault("params", {})[k] = v
topology = None
if network:
topology = network
agents = {"fixed": [], "distribution": []}
def updated_agent(agent):
"""Convert an agent definition"""
newagent = dict(agent)
return newagent
by_weight = []
fixed = []
override = []
if "environment_agents" in new:
for agent in new["environment_agents"]:
agent.setdefault("state", {})["group"] = "environment"
if "agent_id" in agent:
agent["state"]["name"] = agent["agent_id"]
del agent["agent_id"]
agent["hidden"] = True
agent["topology"] = False
fixed.append(updated_agent(agent))
del new["environment_agents"]
if "agent_class" in old:
del new["agent_class"]
agents["agent_class"] = old["agent_class"]
if "default_state" in old:
del new["default_state"]
agents["state"] = old["default_state"]
if "network_agents" in old:
agents["topology"] = True
agents.setdefault("state", {})["group"] = "network"
for agent in new["network_agents"]:
agent = updated_agent(agent)
if "agent_id" in agent:
agent["state"]["name"] = agent["agent_id"]
del agent["agent_id"]
fixed.append(agent)
else:
by_weight.append(agent)
del new["network_agents"]
if "agent_class" in old and (not fixed and not by_weight):
agents["topology"] = True
by_weight = [{"agent_class": old["agent_class"], "weight": 1}]
# TODO: translate states properly
if "states" in old:
del new["states"]
states = old["states"]
if isinstance(states, dict):
states = states.items()
else:
states = enumerate(states)
for (k, v) in states:
override.append({"filter": {"node_id": k}, "state": v})
agents["override"] = override
agents["fixed"] = fixed
agents["distribution"] = by_weight
model_params = {}
if "environment_params" in new:
del new["environment_params"]
model_params = dict(old["environment_params"])
if "environment_class" in old:
del new["environment_class"]
new["model_class"] = old["environment_class"]
if "dump" in old:
del new["dump"]
new["dry_run"] = not old["dump"]
model_params["topology"] = topology
model_params["agents"] = agents
return Config(version="2", model_params=model_params, **new)
def load_config(cfg):
return cfg

View File

@ -9,7 +9,7 @@ class SoilCollector(MDC):
if 'agent_count' not in model_reporters:
model_reporters['agent_count'] = lambda m: m.schedule.get_agent_count()
if 'state_id' not in agent_reporters:
agent_reporters['agent_id'] = lambda agent: agent.get('state_id', None)
agent_reporters['agent_id'] = lambda agent: getattr(agent, 'state_id', None)
super().__init__(model_reporters=model_reporters,
agent_reporters=agent_reporters,

View File

@ -8,6 +8,7 @@ from textwrap import indent
from functools import wraps
from .agents import FSM, MetaFSM
from mesa import Model, Agent
def wrapcmd(func):
@ -15,14 +16,22 @@ def wrapcmd(func):
def wrapper(self, arg: str, temporary=False):
sys.settrace(self.trace_dispatch)
lastself = self
known = globals()
known.update(self.curframe.f_globals)
known.update(self.curframe.f_locals)
known["agent"] = known.get("self", None)
known["model"] = known.get("self", {}).get("model")
known["attrs"] = arg.strip().split()
exec(func.__code__, known, known)
this = known.get("self", None)
if isinstance(this, Model):
known["model"] = this
elif isinstance(this, Agent):
known["agent"] = this
known["model"] = this.model
known["self"] = lastself
return exec(func.__code__, known, known)
return wrapper
@ -57,6 +66,7 @@ class Debug(pdb.Pdb):
do_sl = do_soil_list
def do_continue_state(self, arg):
"""Continue until next time this state is reached"""
self.do_break_state(arg, temporary=True)
return self.do_continue("")
@ -80,6 +90,49 @@ class Debug(pdb.Pdb):
do_aa = do_soil_agent
def do_break_step(self, arg: str):
"""
Break before the next step.
"""
try:
known = globals()
known.update(self.curframe.f_globals)
known.update(self.curframe.f_locals)
func = getattr(known["model"], "step")
except AttributeError as ex:
self.error(f"The model does not have a step function: {ex}")
return
if hasattr(func, "__func__"):
func = func.__func__
code = func.__code__
# use co_name to identify the bkpt (function names
# could be aliased, but co_name is invariant)
funcname = code.co_name
lineno = code.co_firstlineno
filename = code.co_filename
# Check for reasonable breakpoint
line = self.checkline(filename, lineno)
if not line:
raise ValueError("no line found")
# now set the break point
existing = self.get_breaks(filename, line)
if existing:
self.message("Breakpoint already exists at %s:%d" % (filename, line))
return
cond = f"self.schedule.steps > {model.schedule.steps}"
err = self.set_break(filename, line, True, cond, funcname)
if err:
self.error(err)
else:
bp = self.get_breaks(filename, line)[-1]
self.message("Breakpoint %d at %s:%d" % (bp.number, bp.file, bp.line))
return self.do_continue("")
do_bstep = do_break_step
def do_break_state(self, arg: str, instances=None, temporary=False):
"""
Break before a specified state is stepped into.

View File

@ -1,4 +1,6 @@
def report(f: property):
print(f.fget)
setattr(f.fget, "add_to_report", True)
if isinstance(f, property):
setattr(f.fget, "add_to_report", True)
else:
setattr(f, "add_to_report", True)
return f

View File

@ -19,8 +19,7 @@ from mesa import Model, Agent
from . import agents as agentmod, datacollection, serialization, utils, time, network, events
# TODO: add metaclass to read attributes of a model
# TODO: read "report" attributes from the model
# TODO: maybe add metaclass to read attributes of a model
class BaseEnvironment(Model):
"""
@ -35,10 +34,31 @@ class BaseEnvironment(Model):
:meth:`soil.environment.Environment.get` method.
"""
def __new__(cls, *args: Any, seed="default", dir_path=None, **kwargs: Any) -> Any:
def __new__(cls,
*args: Any,
seed="default",
dir_path=None,
collector_class: type = datacollection.SoilCollector,
agent_reporters: Optional[Any] = None,
model_reporters: Optional[Any] = None,
tables: Optional[Any] = None,
**kwargs: Any) -> Any:
"""Create a new model with a default seed value"""
self = super().__new__(cls, *args, seed=seed, **kwargs)
self.dir_path = dir_path or os.getcwd()
collector_class = serialization.deserialize(collector_class)
self.datacollector = collector_class(
model_reporters=model_reporters,
agent_reporters=agent_reporters,
tables=tables,
)
for k in dir(cls):
v = getattr(cls, k)
if isinstance(v, property):
v = v.fget
if getattr(v, "add_to_report", False):
self.add_model_reporter(k, v)
return self
def __init__(
@ -69,18 +89,12 @@ class BaseEnvironment(Model):
schedule_class = time.TimedActivation
else:
schedule_class = serialization.deserialize(schedule_class)
self.schedule = schedule_class(self)
self.interval = interval
self.schedule = schedule_class(self)
self.logger = utils.logger.getChild(self.id)
collector_class = serialization.deserialize(collector_class)
self.datacollector = collector_class(
model_reporters=model_reporters,
agent_reporters=agent_reporters,
tables=tables,
)
for (k, v) in env_params.items():
self[k] = v
@ -96,7 +110,7 @@ class BaseEnvironment(Model):
def agents(self):
return agentmod.AgentView(self.schedule._agents)
def find_one(self, *args, **kwargs):
def agent(self, *args, **kwargs):
return agentmod.AgentView(self.schedule._agents).one(*args, **kwargs)
def count_agents(self, *args, **kwargs):
@ -109,6 +123,8 @@ class BaseEnvironment(Model):
raise Exception(
"The environment has not been scheduled, so it has no sense of time"
)
def init_agents(self):
pass
def add_agent(self, agent_class, unique_id=None, **agent):
if unique_id is None:
@ -127,6 +143,8 @@ class BaseEnvironment(Model):
return a
def add_agents(self, agent_classes: List[type], k, weights: Optional[List[float]] = None, **kwargs):
if isinstance(agent_classes, type):
agent_classes = [agent_classes]
if weights is None:
weights = [1] * len(agent_classes)
@ -150,12 +168,27 @@ class BaseEnvironment(Model):
Advance one step in the simulation, and update the data collection and scheduler appropriately
"""
super().step()
# self.logger.info(
# "--- Step: {:^5} - Time: {now:^5} ---", steps=self.schedule.steps, now=self.now
# )
self.schedule.step()
self.datacollector.collect(self)
msg = "Model data:\n"
max_width = max(len(k) for k in self.datacollector.model_vars.keys())
for (k, v) in self.datacollector.model_vars.items():
msg += f"\t{k:<{max_width}}: {v[-1]:>6}\n"
self.logger.info(f"--- Steps: {self.schedule.steps:^5} - Time: {self.now:^5} --- " + msg)
def add_model_reporter(self, name, func=None):
if not func:
func = lambda env: getattr(env, name)
self.datacollector._new_model_reporter(name, func)
def add_agent_reporter(self, name, agent_type=None):
if agent_type:
reporter = lambda a: getattr(a, name) if isinstance(a, agent_type) else None
else:
reporter = name
self.datacollector._new_agent_reporter(name, reporter)
def __getitem__(self, key):
try:
return getattr(self, key)
@ -192,18 +225,19 @@ class NetworkEnvironment(BaseEnvironment):
and methods to associate agents to nodes and vice versa.
"""
def __init__(
self, *args,
topology: Optional[Union[nx.Graph, str]] = None,
agent_class: Optional[Type[agentmod.Agent]] = None,
network_generator: Optional[Callable] = None,
network_params: Optional[Dict] = None, **kwargs
):
def __init__(self,
*args,
topology: Optional[Union[nx.Graph, str]] = None,
agent_class: Optional[Type[agentmod.Agent]] = None,
network_generator: Optional[Callable] = None,
network_params: Optional[Dict] = {},
init=True,
**kwargs):
self.topology = topology
self.network_generator = network_generator
self.network_params = network_params
if topology or network_params or network_generator:
self.create_network(topology, network_params=network_params, network_generator=network_generator)
self.create_network(topology, generator=network_generator, **network_params)
else:
self.G = nx.Graph()
super().__init__(*args, **kwargs, init=False)
@ -211,23 +245,35 @@ class NetworkEnvironment(BaseEnvironment):
self.agent_class = agent_class
if agent_class:
self.agent_class = serialization.deserialize(agent_class)
self.init()
if self.agent_class:
self.populate_network(self.agent_class)
self._check_agent_nodes()
if init:
self.init()
def add_agent(self, agent_class, *args, node_id=None, topology=None, **kwargs):
if node_id is None and topology is None:
return super().add_agent(agent_class, *args, **kwargs)
try:
a = super().add_agent(agent_class, *args, node_id=node_id, **kwargs)
except TypeError:
self.logger.warning(f"Agent constructor for {agent_class} does not have a node_id attribute. Might be a bug.")
a = super().add_agent(agent_class, *args, **kwargs)
self.G.nodes[node_id]["agent"] = a
return a
def add_agents(self, *args, k=None, **kwargs):
if not k and not self.G:
raise ValueError("Cannot add agents to an empty network")
super().add_agents(*args, k=k or len(self.G), **kwargs)
def create_network(self, topology=None, network_generator=None, path=None, network_params=None):
def create_network(self, topology=None, generator=None, path=None, **network_params):
if topology is not None:
topology = network.from_topology(topology, dir_path=self.dir_path)
elif path is not None:
topology = network.from_topology(path, dir_path=self.dir_path)
elif network_generator is not None:
topology = network.from_params(network_generator, dir_path=self.dir_path, **network_params)
elif generator is not None:
topology = network.from_params(generator=generator, dir_path=self.dir_path, **network_params)
else:
raise ValueError("topology must be a networkx.Graph or a string, or network_generator must be provided")
self.G = topology
@ -235,21 +281,15 @@ class NetworkEnvironment(BaseEnvironment):
def init_agents(self, *args, **kwargs):
"""Initialize the agents from a"""
super().init_agents(*args, **kwargs)
for agent in self.schedule._agents.values():
self._assign_node(agent)
def _assign_node(self, agent):
"""
Make sure the node for a given agent has the proper attributes.
"""
if hasattr(agent, "node_id"):
self.G.nodes[agent.node_id]["agent"] = agent
@property
def network_agents(self):
for a in self.schedule._agents.values():
if isinstance(a, agentmod.NetworkAgent):
yield a
"""Return agents still alive and assigned to a node in the network."""
for (id, data) in self.G.nodes(data=True):
if "agent" in data:
agent = data["agent"]
if getattr(agent, "alive", True):
yield agent
def add_node(self, agent_class, unique_id=None, node_id=None, **kwargs):
if unique_id is None:
@ -265,7 +305,6 @@ class NetworkEnvironment(BaseEnvironment):
self.G.add_node(node_id)
assert "agent" not in self.G.nodes[node_id]
self.G.nodes[node_id]["agent"] = None # Reserve
a = self.add_agent(
unique_id=unique_id,
@ -277,17 +316,32 @@ class NetworkEnvironment(BaseEnvironment):
a["visible"] = True
return a
def add_agent(self, agent_class, *args, **kwargs):
if issubclass(agent_class, agentmod.NetworkAgent) and "node_id" not in kwargs:
return self.add_node(agent_class, *args, **kwargs)
a = super().add_agent(agent_class, *args, **kwargs)
if hasattr(a, "node_id"):
assigned = self.G.nodes[a.node_id].get("agent")
if not assigned:
self.G.nodes[a.node_id]["agent"] = a
elif assigned != a:
raise ValueError(f"Node {a.node_id} already has an agent assigned: {assigned}")
return a
def _check_agent_nodes(self):
"""
Detect nodes that have agents assigned to them.
"""
for (id, data) in self.G.nodes(data=True):
if "agent_id" in data:
agent = self.agents(data["agent_id"])
self.G.nodes[id]["agent"] = agent
assert not getattr(agent, "node_id", None) or agent.node_id == id
agent.node_id = id
for agent in self.agents():
if hasattr(agent, "node_id"):
node_id = agent["node_id"]
if node_id not in self.G.nodes:
raise ValueError(f"Agent {agent} is assigned to node {agent.node_id} which is not in the network")
node = self.G.nodes[node_id]
if node.get("agent") is not None and node["agent"] != agent:
raise ValueError(f"Node {node_id} already has a different agent assigned to it")
self.G.nodes[node_id]["agent"] = agent
def add_agents(self, agent_classes: List[type], k=None, weights: Optional[List[float]] = None, **kwargs):
if k is None:
k = len(self.G)
if not k:
raise ValueError("Cannot add agents to an empty network")
super().add_agents(agent_classes, k=k, weights=weights, **kwargs)
def agent_for_node_id(self, node_id):
return self.G.nodes[node_id].get("agent")
@ -301,11 +355,15 @@ class NetworkEnvironment(BaseEnvironment):
weights = [1] * len(agent_class)
assert len(self.G)
classes = self.random.choices(agent_class, weights, k=len(self.G))
toadd = []
for (cls, (node_id, node)) in zip(classes, self.G.nodes(data=True)):
if "agent" in node:
continue
a = self.add_agent(node_id=node_id, topology=self.G, agent_class=cls, **agent_params)
node["agent"] = a
node["agent"] = None # Reserve
toadd.append(dict(node_id=node_id, topology=self.G, agent_class=cls, **agent_params))
for d in toadd:
a = self.add_agent(**d)
self.G.nodes[d["node_id"]]["agent"] = a
assert all("agent" in node for (_, node) in self.G.nodes(data=True))
assert len(list(self.network_agents))

View File

@ -38,7 +38,7 @@ class DryRunner(BytesIO):
except UnicodeDecodeError:
pass
logger.info(
"**Not** written to {} (dry run mode):\n\n{}\n\n".format(
"**Not** written to {} (no_dump mode):\n\n{}\n\n".format(
self.__fname, content
)
)
@ -51,12 +51,12 @@ class Exporter:
if you don't plan to implement all the methods.
"""
def __init__(self, simulation, outdir=None, dry_run=None, copy_to=None):
def __init__(self, simulation, outdir=None, dump=True, copy_to=None):
self.simulation = simulation
outdir = outdir or os.path.join(os.getcwd(), "soil_output")
self.outdir = os.path.join(outdir, simulation.group or "", simulation.name)
self.dry_run = dry_run
if copy_to is None and dry_run:
self.dump = dump
if copy_to is None and not dump:
copy_to = sys.stdout
self.copy_to = copy_to
@ -77,7 +77,7 @@ class Exporter:
pass
def output(self, f, mode="w", **kwargs):
if self.dry_run:
if not self.dump:
f = DryRunner(f, copy_to=self.copy_to)
else:
try:
@ -108,16 +108,16 @@ class SQLite(Exporter):
"""Writes sqlite results"""
def sim_start(self):
if self.dry_run:
logger.info("NOT dumping results")
if not self.dump:
logger.debug("NOT dumping results")
return
self.dbpath = os.path.join(self.outdir, f"{self.simulation.name}.sqlite")
logger.info("Dumping results to %s", self.dbpath)
try_backup(self.dbpath, remove=True)
def trial_end(self, env):
if self.dry_run:
logger.info("Running in DRY_RUN mode, the database will NOT be created")
if not self.dump:
logger.info("Running in NO DUMP mode, the database will NOT be created")
return
with timer(
@ -147,8 +147,8 @@ class csv(Exporter):
# TODO: reimplement GEXF exporting without history
class gexf(Exporter):
def trial_end(self, env):
if self.dry_run:
logger.info("Not dumping GEXF in dry_run mode")
if not self.dump:
logger.info("Not dumping GEXF (NO_DUMP mode)")
return
with timer(
@ -224,8 +224,8 @@ class YAML(Exporter):
"""Writes the configuration of the simulation to a YAML file"""
def sim_start(self):
if self.dry_run:
logger.info("NOT dumping results")
if not self.dump:
logger.debug("NOT dumping results")
return
with self.output(self.simulation.name + ".dumped.yml") as f:
logger.info(f"Dumping simulation configuration to {self.outdir}")
@ -235,7 +235,7 @@ class default(Exporter):
"""Default exporter. Writes sqlite results, as well as the simulation YAML"""
def __init__(self, *args, exporter_cls=[], **kwargs):
exporter_cls = exporter_cls or [YAML, SQLite, summary]
exporter_cls = exporter_cls or [YAML, SQLite]
self.inner = [cls(*args, **kwargs) for cls in exporter_cls]
def sim_start(self):

View File

@ -4,14 +4,15 @@ import ast
import sys
import re
import importlib
import importlib.machinery, importlib.util
from glob import glob
from itertools import product, chain
from .config import Config
import yaml
import networkx as nx
from . import config
from jinja2 import Template
@ -90,24 +91,56 @@ def load_files(*patterns, **kwargs):
for i in glob(pattern, **kwargs, recursive=True):
for cfg in load_file(i):
path = os.path.abspath(i)
yield Config.from_raw(cfg), path
yield cfg, path
def load_config(cfg):
if isinstance(cfg, Config):
yield cfg, os.getcwd()
elif isinstance(cfg, dict):
yield Config.from_raw(cfg), os.getcwd()
if isinstance(cfg, dict):
yield config.load_config(cfg), os.getcwd()
else:
yield from load_files(cfg)
builtins = importlib.import_module("builtins")
KNOWN_MODULES = [
"soil",
]
KNOWN_MODULES = {
'soil': None,
}
MODULE_FILES = {}
def add_source_file(file):
"""Add a file to the list of known modules"""
file = os.path.abspath(file)
if file in MODULE_FILES:
logger.warning(f"File {file} already added as module {MODULE_FILES[file]}. Reloading")
remove_source_file(file)
modname = f"imported_module_{len(MODULE_FILES)}"
loader = importlib.machinery.SourceFileLoader(modname, file)
spec = importlib.util.spec_from_loader(loader.name, loader)
my_module = importlib.util.module_from_spec(spec)
loader.exec_module(my_module)
MODULE_FILES[file] = modname
KNOWN_MODULES[modname] = my_module
def remove_source_file(file):
"""Remove a file from the list of known modules"""
file = os.path.abspath(file)
modname = None
try:
modname = MODULE_FILES.pop(file)
KNOWN_MODULES.pop(modname)
except KeyError as ex:
raise ValueError(f"File {file} had not been added as a module: {ex}")
def get_module(modname):
"""Get a module from the list of known modules"""
if modname not in KNOWN_MODULES or KNOWN_MODULES[modname] is None:
module = importlib.import_module(modname)
KNOWN_MODULES[modname] = module
return KNOWN_MODULES[modname]
def name(value, known_modules=KNOWN_MODULES):
"""Return a name that can be imported, to serialize/deserialize an object"""
@ -124,9 +157,7 @@ def name(value, known_modules=KNOWN_MODULES):
if known_modules and modname in known_modules:
return tname
for kmod in known_modules:
if not kmod:
continue
module = importlib.import_module(kmod)
module = get_module(kmod)
if hasattr(module, tname):
return tname
return "{}.{}".format(modname, tname)
@ -177,7 +208,7 @@ def deserializer(type_, known_modules=KNOWN_MODULES):
match = IS_CLASS.match(type_)
if match:
modname, tname = match.group(1).rsplit(".", 1)
module = importlib.import_module(modname)
module = get_module(modname)
cls = getattr(module, tname)
return getattr(cls, "deserialize", cls)
@ -195,7 +226,7 @@ def deserializer(type_, known_modules=KNOWN_MODULES):
errors = []
for modname, tname in options:
try:
module = importlib.import_module(modname)
module = get_module(modname)
cls = getattr(module, tname)
return getattr(cls, "deserialize", cls)
except (ImportError, AttributeError) as ex:

View File

@ -10,7 +10,7 @@ import networkx as nx
from textwrap import dedent
from dataclasses import dataclass, field, asdict
from dataclasses import dataclass, field, asdict, replace
from typing import Any, Dict, Union, Optional, List
@ -22,7 +22,7 @@ import pickle
from . import serialization, exporters, utils, basestring, agents
from .environment import Environment
from .utils import logger, run_and_return_exceptions
from .config import Config, convert_old
from .debugging import set_trace
_AVOID_RUNNING = False
_QUEUED = []
@ -31,24 +31,50 @@ _QUEUED = []
def do_not_run():
global _AVOID_RUNNING
_AVOID_RUNNING = True
yield
_AVOID_RUNNING = False
try:
logger.debug("NOT RUNNING")
yield
finally:
logger.debug("RUNNING AGAIN")
_AVOID_RUNNING = False
def _iter_queued():
while _QUEUED:
(cls, args, kwargs) = _QUEUED.pop(0)
yield replace(cls, **kwargs)
# TODO: change documentation for simulation
@dataclass
class Simulation:
"""
Parameters
---------
config (optional): :class:`config.Config`
name of the Simulation
A simulation is a collection of agents and a model. It is responsible for running the model and agents, and collecting data from them.
kwargs: parameters to use to initialize a new configuration, if one not been provided.
Args:
version: The version of the simulation. This is used to determine how to load the simulation.
name: The name of the simulation.
description: A description of the simulation.
group: The group that the simulation belongs to.
model: The model to use for the simulation. This can be a string or a class.
model_params: The parameters to pass to the model.
seed: The seed to use for the simulation.
dir_path: The directory path to use for the simulation.
max_time: The maximum time to run the simulation.
max_steps: The maximum number of steps to run the simulation.
interval: The interval to use for the simulation.
num_trials: The number of trials (times) to run the simulation.
num_processes: The number of processes to use for the simulation. If greater than one, simulations will be performed in parallel. This may make debugging and error handling difficult.
tables: The tables to use in the simulation datacollector
agent_reporters: The agent reporters to use in the datacollector
model_reporters: The model reporters to use in the datacollector
dry_run: Whether or not to run the simulation. If True, the simulation will not be run.
source_file: Python file to use to find additional classes.
"""
version: str = "2"
name: str = "Unnamed simulation"
source_file: Optional[str] = None
name: Optional[str] = None
description: Optional[str] = ""
group: str = None
model: Union[str, type] = "soil.Environment"
@ -67,24 +93,17 @@ class Simulation:
outdir: Optional[str] = None
exporter_params: Optional[Dict[str, Any]] = field(default_factory=dict)
dry_run: bool = False
dump: bool = False
extra: Dict[str, Any] = field(default_factory=dict)
skip_test: Optional[bool] = False
debug: Optional[bool] = False
@classmethod
def from_dict(cls, env, **kwargs):
ignored = {
k: v for k, v in env.items() if k not in inspect.signature(cls).parameters
}
d = {k: v for k, v in env.items() if k not in ignored}
if ignored:
d.setdefault("extra", {}).update(ignored)
if ignored:
logger.warning(f'Ignoring these parameters (added to "extra"): { ignored }')
d.update(kwargs)
return cls(**d)
def __post_init__(self):
if self.name is None:
if isinstance(self.model, str):
self.name = self.model
else:
self.name = self.model.__class__.__name__
def run_simulation(self, *args, **kwargs):
return self.run(*args, **kwargs)
@ -102,13 +121,14 @@ class Simulation:
)
if _AVOID_RUNNING:
_QUEUED.append((self, args, kwargs))
return list()
return list(self.run_gen(*args, **kwargs))
return []
return list(self._run_gen(*args, **kwargs))
def run_gen(
def _run_gen(
self,
num_processes=1,
dry_run=None,
dump=None,
exporters=None,
outdir=None,
exporter_params={},
@ -123,6 +143,8 @@ class Simulation:
logger.info("Output directory: %s", outdir)
if dry_run is None:
dry_run = self.dry_run
if dump is None:
dump = self.dump
if exporters is None:
exporters = self.exporters
if not exporter_params:
@ -134,33 +156,50 @@ class Simulation:
known_modules=[
"soil.exporters",
],
dry_run=dry_run,
dump=dump and not dry_run,
outdir=outdir,
**exporter_params,
)
with utils.timer("simulation {}".format(self.name)):
for exporter in exporters:
exporter.sim_start()
if self.source_file:
source_file = self.source_file
if not os.path.isabs(source_file):
source_file = os.path.abspath(os.path.join(self.dir_path, source_file))
serialization.add_source_file(source_file)
try:
for env in utils.run_parallel(
func=self.run_trial,
iterable=range(int(self.num_trials)),
num_processes=num_processes,
log_level=log_level,
**kwargs,
):
with utils.timer("simulation {}".format(self.name)):
for exporter in exporters:
exporter.sim_start()
if dry_run:
def func(*args, **kwargs):
return None
else:
func = self.run_trial
for env in utils.run_parallel(
func=self.run_trial,
iterable=range(int(self.num_trials)),
num_processes=num_processes,
log_level=log_level,
**kwargs,
):
if env is None and dry_run:
continue
for exporter in exporters:
exporter.trial_end(env)
yield env
for exporter in exporters:
exporter.trial_start(env)
for exporter in exporters:
exporter.trial_end(env)
yield env
for exporter in exporters:
exporter.sim_end()
exporter.sim_end()
finally:
pass
# TODO: reintroduce
# if self.source_file:
# serialization.remove_source_file(self.source_file)
def get_env(self, trial_id=0, model_params=None, **kwargs):
"""Create an environment for a trial of the simulation"""
@ -188,6 +227,7 @@ class Simulation:
id=f"{self.name}_trial_{trial_id}",
seed=f"{self.seed}_trial_{trial_id}",
dir_path=self.dir_path,
interval=self.interval,
agent_reporters=agent_reporters,
model_reporters=model_reporters,
tables=tables,
@ -223,6 +263,9 @@ class Simulation:
def is_done():
return prev() or model.schedule.time >= until
if not model.schedule.agents:
raise Exception("No agents in model. This is probably a bug. Make sure that the model has agents scheduled after its initialization.")
if self.max_steps and self.max_steps > 0 and hasattr(model.schedule, "steps"):
prev_steps = is_done
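The check above expects agents to be scheduled while the environment initializes. A minimal sketch of an environment that satisfies it, using the same pattern as the tests below:

from soil import Environment, agents

class PopulatedEnv(Environment):
    def init(self):
        # init() runs during environment construction; scheduling at least one
        # agent here avoids the "No agents in model" exception raised above.
        self.add_agent(agent_class=agents.BaseAgent)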
@@ -235,24 +278,21 @@ class Simulation:
dedent(
f"""
Model stats:
Agents (total: { model.schedule.get_agent_count() }):
- { (newline + ' - ').join(str(a) for a in model.schedule.agents) }
Agent count: { model.schedule.get_agent_count() }):
Topology size: { len(model.G) if hasattr(model, "G") else 0 }
"""
)
)
if self.debug:
set_trace()
while not is_done():
utils.logger.debug(
f'Simulation time {model.schedule.time}/{until}. Next: {getattr(model.schedule, "next_time", model.schedule.time + self.interval)}'
f'Simulation time {model.schedule.time}/{until}.'
)
model.step()
if (
model.schedule.time < until
): # Simulation ended (no more steps) before the expected time
model.schedule.time = until
return model
def to_dict(self):
@@ -271,14 +311,19 @@ def iter_from_file(*files, **kwargs):
yield from iter_from_config(f, **kwargs)
def from_file(*args, **kwargs):
return list(iter_from_file(*args, **kwargs))
def iter_from_config(*cfgs, **kwargs):
for config in cfgs:
configs = list(serialization.load_config(config))
for config, path in configs:
d = dict(config)
d.update(kwargs)
if "dir_path" not in d:
d["dir_path"] = os.path.dirname(path)
yield Simulation.from_dict(d, **kwargs)
yield Simulation(**d)
def from_config(conf_or_path):
@@ -293,7 +338,10 @@ def iter_from_py(pyfile, module_name='custom_simulation', **kwargs):
import importlib
import inspect
added = False
sims = []
assert not _AVOID_RUNNING
with do_not_run():
assert _AVOID_RUNNING
spec = importlib.util.spec_from_file_location(module_name, pyfile)
folder = os.path.dirname(pyfile)
if folder not in sys.path:
@@ -304,28 +352,27 @@ def iter_from_py(pyfile, module_name='custom_simulation', **kwargs):
module = importlib.util.module_from_spec(spec)
sys.modules[module_name] = module
spec.loader.exec_module(module)
# import pdb;pdb.set_trace()
loaded = False
sims = []
for (_name, sim) in inspect.getmembers(module, lambda x: isinstance(x, Simulation)):
loaded = True
sims.append(sim)
for (_name, sim) in inspect.getmembers(module, lambda x: inspect.isclass(x) and issubclass(x, Simulation)):
loaded = True
sims.append(sim(**kwargs))
if not loaded:
raise AttributeError(f"No valid configurations found in {pyfile}")
for sim in _iter_queued():
sims.append(sim)
if not sims:
for (_name, sim) in inspect.getmembers(module, lambda x: inspect.isclass(x) and issubclass(x, Simulation)):
sims.append(sim(**kwargs))
del sys.modules[module_name]
assert not _AVOID_RUNNING
if not sims:
raise AttributeError(f"No valid configurations found in {pyfile}")
if added:
sys.path.remove(folder)
yield from sims
for sim in sims:
yield replace(sim, **kwargs)
def from_py(pyfile):
return next(iter_from_py(pyfile))
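A hypothetical sketch of what iter_from_py and from_py expect to find in a *_sim.py module: any module-level Simulation instance is collected, and run() calls made at import time are only queued, because loading happens inside the do_not_run() guard:

# my_sim.py (hypothetical example file)
from soil import Environment, agents, simulation

class MyEnv(Environment):
    def init(self):
        self.add_agent(agent_class=agents.BaseAgent)

# Picked up by inspect.getmembers because it is a module-level Simulation instance
sim = simulation.Simulation(name="my_sim", model=MyEnv, num_trials=1, max_steps=10)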
def run_from_file(*files, **kwargs):
for sim in iter_from_file(*files):
logger.info(f"Using config(s): {sim.name}")

@@ -97,7 +97,8 @@ class TimedActivation(BaseScheduler):
self._next = {}
self._queue = []
self._shuffle = shuffle
self.step_interval = 1
# self.step_interval = getattr(self.model, "interval", 1)
self.step_interval = self.model.interval
self.logger = logger.getChild(f"time_{ self.model }")
def add(self, agent: MesaAgent, when=None):
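In other words, the scheduler no longer hard-codes a step interval of 1 but takes it from the model, so an environment created with interval=2 advances simulation time in increments of 2 (this is what test_torvalds_config checks further down: 10 steps with interval 2 end at now == 20). A tiny sketch:

from soil import Environment

env = Environment(interval=2)
# env.schedule.step_interval is now 2, taken from env.interval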

@@ -1,49 +0,0 @@
---
version: '2'
name: simple
group: tests
dir_path: "/tmp/"
num_trials: 3
max_time: 100
interval: 1
seed: "CompleteSeed!"
model_class: Environment
model_params:
topology:
params:
generator: complete_graph
n: 4
agents:
agent_class: CounterModel
state:
group: network
times: 1
topology: true
distribution:
- agent_class: CounterModel
weight: 0.25
state:
state_id: 0
times: 1
- agent_class: AggregatedCounter
weight: 0.5
state:
times: 2
override:
- filter:
node_id: 1
state:
name: 'Node 1'
- filter:
node_id: 2
state:
name: 'Node 2'
fixed:
- agent_class: BaseAgent
hidden: true
topology: false
state:
name: 'Environment Agent 1'
times: 10
group: environment
am_i_complete: true

@@ -1,37 +0,0 @@
---
name: simple
group: tests
dir_path: "/tmp/"
num_trials: 3
max_time: 100
interval: 1
seed: "CompleteSeed!"
network_params:
generator: complete_graph
n: 4
network_agents:
- agent_class: CounterModel
weight: 0.25
state:
state_id: 0
times: 1
- agent_class: AggregatedCounter
weight: 0.5
state:
times: 2
environment_agents:
- agent_id: 'Environment Agent 1'
agent_class: BaseAgent
state:
times: 10
environment_class: Environment
environment_params:
am_i_complete: true
agent_class: CounterModel
default_state:
times: 1
states:
1:
name: 'Node 1'
2:
name: 'Node 2'

@@ -22,7 +22,9 @@ class TestAgents(TestCase):
def test_die_raises_exception(self):
"""A dead agent should raise an exception if it is stepped after death"""
d = Dead(unique_id=0, model=environment.Environment())
assert d.alive
d.step()
assert not d.alive
with pytest.raises(stime.DeadAgent):
d.step()
@@ -161,3 +163,15 @@ class TestAgents(TestCase):
assert sum(pings) == sum(range(time)) * 2
# It is the same as pings, without the leading 0
assert sum(pongs) == sum(range(time)) * 2
def test_agent_filter(self):
e = environment.Environment()
e.add_agent(agent_class=agents.BaseAgent)
e.add_agent(agent_class=agents.Evented)
base = list(e.agents(agent_class=agents.BaseAgent))
assert len(base) == 2
ev = list(e.agents(agent_class=agents.Evented))
assert len(ev) == 1
assert ev[0].unique_id == 1
null = list(e.agents(unique_ids=[0, 1], agent_class=agents.NetworkAgent))
assert not null

@@ -23,86 +23,18 @@ def isequal(a, b):
assert a == b
@skip("new versions of soil do not rely on configuration files")
# @skip("new versions of soil do not rely on configuration files")
class TestConfig(TestCase):
def test_conversion(self):
expected = serialization.load_file(join(ROOT, "complete_converted.yml"))[0]
old = serialization.load_file(join(ROOT, "old_complete.yml"))[0]
converted_defaults = config.convert_old(old, strict=False)
converted = converted_defaults.dict(exclude_unset=True)
isequal(converted, expected)
def test_configuration_changes(self):
"""
The configuration should not change after running
the simulation.
"""
config = serialization.load_file(join(EXAMPLES, "complete.yml"))[0]
s = simulation.from_config(config)
init_config = copy.copy(s.to_dict())
s.run_simulation(dry_run=True)
nconfig = s.to_dict()
# del nconfig['to
isequal(init_config, nconfig)
def test_topology_config(self):
netconfig = config.NetConfig(**{"path": join(ROOT, "test.gexf")})
net = network.from_config(netconfig, dir_path=ROOT)
assert len(net.nodes) == 2
assert len(net.edges) == 1
def test_env_from_config(self):
"""
Simple configuration that tests that the graph is loaded, and that
network agents are initialized properly.
"""
cfg = {
"name": "CounterAgent",
"model_params": {
"topology": join(ROOT, "test.gexf"),
"agent_class": "CounterModel",
},
# 'states': [{'times': 10}, {'times': 20}],
"max_time": 2,
"dry_run": True,
"num_trials": 1,
}
s = simulation.from_config(cfg)
env = s.get_env()
assert len(env.G.nodes) == 2
assert len(env.G.edges) == 1
assert len(env.agents) == 2
assert env.agents[0].G == env.G
def test_agents_from_config(self):
"""We test that the known complete configuration produces
the right agents in the right groups"""
cfg = serialization.load_file(join(ROOT, "complete_converted.yml"))[0]
s = simulation.from_config(cfg)
env = s.get_env()
assert len(env.G.nodes) == 4
assert len(env.agents(group="network")) == 4
assert len(env.agents(group="environment")) == 1
def test_yaml(self):
"""
The YAML version of a newly created configuration should be equivalent
to the configuration file used.
Values not present in the original config file should have reasonable
defaults.
"""
with utils.timer("loading"):
config = serialization.load_file(join(EXAMPLES, "complete.yml"))[0]
s = simulation.from_config(config)
with utils.timer("serializing"):
serial = s.to_yaml()
with utils.timer("recovering"):
recovered = yaml.load(serial, Loader=yaml.FullLoader)
for (k, v) in config.items():
assert recovered[k] == v
def test_torvalds_config(self):
sim = simulation.from_config(os.path.join(ROOT, "test_config.yml"))
assert sim.interval == 2
envs = sim.run()
assert len(envs) == 1
env = envs[0]
assert env.interval == 2
assert env.count_agents() == 3
assert env.now == 20
def make_example_test(path, cfg):
@@ -116,7 +48,7 @@ def make_example_test(path, cfg):
s.num_trials = 1
if cfg.skip_test and not FORCE_TESTS:
self.skipTest('Example ignored.')
envs = s.run_simulation(dry_run=True)
envs = s.run_simulation(dump=False)
assert envs
for env in envs:
assert env

tests/test_config.yml (new file)
@@ -0,0 +1,5 @@
---
source_file: "../examples/torvalds_sim.py"
model: "TorvaldsEnv"
max_steps: 10
interval: 2
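A sketch of how a configuration like this is consumed (path illustrative): source_file is resolved relative to the config file and registered with serialization, so that the "TorvaldsEnv" model defined in torvalds_sim.py can be found when the simulation runs:

from soil import simulation

sims = simulation.from_file("tests/test_config.yml")   # a list with one Simulation
env = sims[0].run_simulation(dump=False)[0]
assert env.interval == 2 and env.now == 20             # as test_torvalds_config above checks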

@@ -1,9 +1,12 @@
from unittest import TestCase
from unittest.case import SkipTest
import os
from os.path import join
from glob import glob
from soil import simulation, config, do_not_run
from soil import simulation
ROOT = os.path.abspath(os.path.dirname(__file__))
EXAMPLES = join(ROOT, "..", "examples")
@@ -16,44 +19,54 @@ class TestExamples(TestCase):
pass
def get_test_for_sim(sim, path):
def get_test_for_sims(sims, path):
root = os.getcwd()
iterations = sim.max_steps * sim.num_trials
if iterations < 0 or iterations > 1000:
sim.max_steps = 100
sim.num_trials = 1
def wrapped(self):
envs = sim.run_simulation(dry_run=True)
assert envs
for env in envs:
assert env
try:
n = sim.model_params["network_params"]["n"]
assert len(list(env.network_agents)) == n
except KeyError:
pass
assert env.schedule.steps > 0 # It has run
assert env.schedule.steps <= sim.max_steps # But not further than allowed
run = False
for sim in sims:
if sim.skip_test and not FORCE_TESTS:
continue
run = True
iterations = sim.max_steps * sim.num_trials
if iterations < 0 or iterations > 1000:
sim.max_steps = 100
sim.num_trials = 1
envs = sim.run_simulation(dump=False)
assert envs
for env in envs:
assert env
assert env.now > 0
try:
n = sim.model_params["network_params"]["n"]
assert len(list(env.network_agents)) == n
except KeyError:
pass
assert env.schedule.steps > 0 # It has run
assert env.schedule.steps <= sim.max_steps # But not further than allowed
if not run:
raise SkipTest("Example ignored because all simulations are set up to be skipped.")
return wrapped
def add_example_tests():
sim_paths = []
sim_paths = {}
for path in glob(join(EXAMPLES, '**', '*.yml')):
if "soil_output" in path:
continue
if path not in sim_paths:
sim_paths[path] = []
for sim in simulation.iter_from_config(path):
sim_paths.append((sim, path))
sim_paths[path].append(sim)
for path in glob(join(EXAMPLES, '**', '*_sim.py')):
if path not in sim_paths:
sim_paths[path] = []
for sim in simulation.iter_from_py(path):
sim_paths.append((sim, path))
sim_paths[path].append(sim)
for (sim, path) in sim_paths:
if sim.skip_test and not FORCE_TESTS:
continue
test_case = get_test_for_sim(sim, path)
for (path, sims) in sim_paths.items():
test_case = get_test_for_sims(sims, path)
fname = os.path.basename(path)
test_case.__name__ = "test_example_file_%s" % fname
test_case.__doc__ = "%s should be a valid configuration" % fname

@@ -10,6 +10,8 @@ from soil import environment
from soil import simulation
from soil import agents
from mesa import Agent as MesaAgent
class Dummy(exporters.Exporter):
started = False
@@ -41,14 +43,15 @@ class Exporters(TestCase):
# ticks every step
class SimpleEnv(environment.Environment):
def init(self):
self.add_agent(agent_class=agents.BaseAgent)
self.add_agent(agent_class=MesaAgent)
num_trials = 5
max_time = 2
s = simulation.Simulation(num_trials=num_trials, max_time=max_time, name="exporter_sim", dry_run=True, model=SimpleEnv)
s = simulation.Simulation(num_trials=num_trials, max_time=max_time, name="exporter_sim",
dump=False, model=SimpleEnv)
for env in s.run_simulation(exporters=[Dummy], dry_run=True):
for env in s.run_simulation(exporters=[Dummy], dump=False):
assert len(env.agents) == 1
assert Dummy.started
@@ -60,18 +63,20 @@ class Exporters(TestCase):
assert Dummy.total_time == max_time * num_trials
def test_writing(self):
"""Try to write CSV, sqlite and YAML (without dry_run)"""
"""Try to write CSV, sqlite and YAML (without no_dump)"""
n_trials = 5
n_nodes = 4
max_time = 2
config = {
"name": "exporter_sim",
"model_params": {
"network_generator": "complete_graph",
"network_params": {"n": 4},
"network_params": {"n": n_nodes},
"agent_class": "CounterModel",
},
"max_time": 2,
"max_time": max_time,
"num_trials": n_trials,
"dry_run": False,
"dump": True,
}
output = io.StringIO()
s = simulation.from_config(config)
@@ -87,7 +92,7 @@ class Exporters(TestCase):
"constant": lambda x: 1,
},
},
dry_run=False,
dump=True,
outdir=tmpdir,
exporter_params={"copy_to": output},
)
@@ -100,12 +105,13 @@ class Exporters(TestCase):
try:
for e in envs:
db = sqlite3.connect(os.path.join(simdir, f"{s.name}.sqlite"))
dbpath = os.path.join(simdir, f"{s.name}.sqlite")
db = sqlite3.connect(dbpath)
cur = db.cursor()
agent_entries = cur.execute("SELECT * from agents").fetchall()
env_entries = cur.execute("SELECT * from env").fetchall()
assert len(agent_entries) > 0
assert len(env_entries) > 0
agent_entries = cur.execute("SELECT times FROM agents WHERE times > 0").fetchall()
env_entries = cur.execute("SELECT constant from env WHERE constant == 1").fetchall()
assert len(agent_entries) == n_nodes * n_trials * max_time
assert len(env_entries) == n_trials * max_time
with open(os.path.join(simdir, "{}.env.csv".format(e.id))) as f:
result = f.read()

@@ -6,9 +6,11 @@ import networkx as nx
from functools import partial
from os.path import join
from soil import simulation, Environment, agents, network, serialization, utils, config
from soil import simulation, Environment, agents, network, serialization, utils, config, from_file
from soil.time import Delta
from mesa import Agent as MesaAgent
ROOT = os.path.abspath(os.path.dirname(__file__))
EXAMPLES = join(ROOT, "..", "examples")
@@ -30,11 +32,12 @@ class TestMain(TestCase):
config = {
"model_params": {
"topology": join(ROOT, "test.gexf"),
"agent_class": "NetworkAgent",
}
"agent_class": MesaAgent,
},
"max_time": 1
}
s = simulation.from_config(config)
s.run_simulation(dry_run=True)
s.run_simulation(dump=False)
def test_network_agent(self):
"""
@@ -75,7 +78,6 @@ class TestMain(TestCase):
def test_init_and_count_agents(self):
"""Agents should be properly initialized and counting should filter them properly"""
# TODO: separate this test into two or more test cases
env = Environment(topology=join(ROOT, "test.gexf"))
env.populate_network([CustomAgent.w(weight=1), CustomAgent.w(weight=3)])
assert env.agents[0].weight == 1
@@ -91,7 +93,7 @@ class TestMain(TestCase):
try:
os.chdir(os.path.dirname(pyfile))
s = simulation.from_py(pyfile)
env = s.run_simulation(dry_run=True)[0]
env = s.run_simulation(dump=False)[0]
for a in env.network_agents:
skill_level = a["skill_level"]
if a.node_id == "Torvalds":
@@ -151,7 +153,6 @@ class TestMain(TestCase):
def step(self):
nonlocal n_runs
n_runs += 1
return super().step()
n_trials = 50
max_time = 2
@@ -160,7 +161,7 @@ class TestMain(TestCase):
num_trials=n_trials,
max_time=max_time,
)
runs = list(s.run_simulation(dry_run=True))
runs = list(s.run_simulation(dump=False))
over = list(x.now for x in runs if x.now > 2)
assert len(runs) == n_trials
assert len(over) == 0
@@ -203,3 +204,21 @@ class TestMain(TestCase):
assert when == 2
when = a.step()
assert when == Delta(a.interval)
def test_load_sim(self):
"""Make sure at least one of the examples can be loaded"""
sims = from_file(os.path.join(EXAMPLES, "newsspread", "newsspread_sim.py"))
assert len(sims) == 3*3*2
for sim in sims:
assert sim
assert sim.name == "newspread_sim"
assert sim.num_trials == 5
assert sim.max_steps == 300
assert not sim.dump
assert sim.model_params
assert "ratio_dumb" in sim.model_params
assert "ratio_herd" in sim.model_params
assert "ratio_wise" in sim.model_params
assert "network_generator" in sim.model_params
assert "network_params" in sim.model_params
assert "prob_neighbor_spread" in sim.model_params

@@ -79,8 +79,8 @@ class TestNetwork(TestCase):
env = environment.Environment(name="Test", topology=G)
env.populate_network(agents.NetworkAgent)
a2 = env.find_one(node_id=2)
a3 = env.find_one(node_id=3)
a2 = env.agent(node_id=2)
a3 = env.agent(node_id=3)
assert len(a2.subgraph(limit_neighbors=True)) == 2
assert len(a3.subgraph(limit_neighbors=True)) == 1
assert len(a3.subgraph(limit_neighbors=True, center=False)) == 0