=> Bootstrap dependency digest>=20211023: found digest-20220214
===> Skipping vulnerability checks.
WARNING: No /usr/pkg/pkgdb/pkg-vulnerabilities file found.
WARNING: To fix run: `/usr/sbin/pkg_admin -K /usr/pkg/pkgdb fetch-pkg-vulnerabilities'.
===> Building for py312-scrapy-2.12.0
* Building wheel...
running bdist_wheel
/usr/pkg/lib/python3.12/site-packages/setuptools/_distutils/cmd.py:124: SetuptoolsDeprecationWarning: bdist_wheel.universal is deprecated
!!

        ********************************************************************************
        With Python 2.7 end-of-life, support for building universal wheels
        (i.e., wheels that support both Python 2 and Python 3) is being obviated.

        Please discontinue using this option, or if you still need it, file an issue
        with pypa/setuptools describing your use case.

        By 2025-Aug-30, you need to update your project and remove deprecated calls
        or your builds will no longer be supported.
        ********************************************************************************

!!
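The `bdist_wheel.universal` deprecation above is normally triggered by a leftover Python 2/3 `universal` flag in the project's packaging metadata. A hedged sketch of the usual fix, assuming the flag lives in `setup.cfg` (it can equally be set from `setup.py` or a build front-end's config; the exact location in this project is not shown in the log):

```ini
; setup.cfg (illustrative): drop the py2+py3 "universal" wheel flag so that
; bdist_wheel emits a py3-only wheel and the deprecation warning goes away.
[bdist_wheel]
; universal = 1   <- remove this line (or the whole section, if now empty)
```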
  self.finalize_options()
running build
running build_py
creating build/lib/scrapy
copying scrapy/__init__.py -> build/lib/scrapy
copying scrapy/__main__.py -> build/lib/scrapy
copying scrapy/addons.py -> build/lib/scrapy
copying scrapy/cmdline.py -> build/lib/scrapy
copying scrapy/crawler.py -> build/lib/scrapy
copying scrapy/dupefilters.py -> build/lib/scrapy
copying scrapy/exceptions.py -> build/lib/scrapy
copying scrapy/exporters.py -> build/lib/scrapy
copying scrapy/extension.py -> build/lib/scrapy
copying scrapy/interfaces.py -> build/lib/scrapy
copying scrapy/item.py -> build/lib/scrapy
copying scrapy/link.py -> build/lib/scrapy
copying scrapy/logformatter.py -> build/lib/scrapy
copying scrapy/mail.py -> build/lib/scrapy
copying scrapy/middleware.py -> build/lib/scrapy
copying scrapy/pqueues.py -> build/lib/scrapy
copying scrapy/resolver.py -> build/lib/scrapy
copying scrapy/responsetypes.py -> build/lib/scrapy
copying scrapy/robotstxt.py -> build/lib/scrapy
copying scrapy/shell.py -> build/lib/scrapy
copying scrapy/signalmanager.py -> build/lib/scrapy
copying scrapy/signals.py -> build/lib/scrapy
copying scrapy/spiderloader.py -> build/lib/scrapy
copying scrapy/squeues.py -> build/lib/scrapy
copying scrapy/statscollectors.py -> build/lib/scrapy
creating build/lib/scrapy/commands
copying scrapy/commands/__init__.py -> build/lib/scrapy/commands
copying scrapy/commands/bench.py -> build/lib/scrapy/commands
copying scrapy/commands/check.py -> build/lib/scrapy/commands
copying scrapy/commands/crawl.py -> build/lib/scrapy/commands
copying scrapy/commands/edit.py -> build/lib/scrapy/commands
copying scrapy/commands/fetch.py -> build/lib/scrapy/commands
copying scrapy/commands/genspider.py -> build/lib/scrapy/commands
copying scrapy/commands/list.py -> build/lib/scrapy/commands
copying scrapy/commands/parse.py -> build/lib/scrapy/commands
copying scrapy/commands/runspider.py -> build/lib/scrapy/commands
copying scrapy/commands/settings.py -> build/lib/scrapy/commands
copying scrapy/commands/shell.py -> build/lib/scrapy/commands
copying scrapy/commands/startproject.py -> build/lib/scrapy/commands
copying scrapy/commands/version.py -> build/lib/scrapy/commands
copying scrapy/commands/view.py -> build/lib/scrapy/commands
creating build/lib/scrapy/contracts
copying scrapy/contracts/__init__.py -> build/lib/scrapy/contracts
copying scrapy/contracts/default.py -> build/lib/scrapy/contracts
creating build/lib/scrapy/core
copying scrapy/core/__init__.py -> build/lib/scrapy/core
copying scrapy/core/engine.py -> build/lib/scrapy/core
copying scrapy/core/scheduler.py -> build/lib/scrapy/core
copying scrapy/core/scraper.py -> build/lib/scrapy/core
copying scrapy/core/spidermw.py -> build/lib/scrapy/core
creating build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/__init__.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/ajaxcrawl.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/cookies.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/defaultheaders.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/downloadtimeout.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/httpauth.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/httpcache.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/httpcompression.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/httpproxy.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/offsite.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/redirect.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/retry.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/robotstxt.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/stats.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/useragent.py -> build/lib/scrapy/downloadermiddlewares
creating build/lib/scrapy/extensions
copying scrapy/extensions/__init__.py -> build/lib/scrapy/extensions
copying scrapy/extensions/closespider.py -> build/lib/scrapy/extensions
copying scrapy/extensions/corestats.py -> build/lib/scrapy/extensions
copying scrapy/extensions/debug.py -> build/lib/scrapy/extensions
copying scrapy/extensions/feedexport.py -> build/lib/scrapy/extensions
copying scrapy/extensions/httpcache.py -> build/lib/scrapy/extensions
copying scrapy/extensions/logstats.py -> build/lib/scrapy/extensions
copying scrapy/extensions/memdebug.py -> build/lib/scrapy/extensions
copying scrapy/extensions/memusage.py -> build/lib/scrapy/extensions
copying scrapy/extensions/periodic_log.py -> build/lib/scrapy/extensions
copying scrapy/extensions/postprocessing.py -> build/lib/scrapy/extensions
copying scrapy/extensions/spiderstate.py -> build/lib/scrapy/extensions
copying scrapy/extensions/statsmailer.py -> build/lib/scrapy/extensions
copying scrapy/extensions/telnet.py -> build/lib/scrapy/extensions
copying scrapy/extensions/throttle.py -> build/lib/scrapy/extensions
creating build/lib/scrapy/http
copying scrapy/http/__init__.py -> build/lib/scrapy/http
copying scrapy/http/cookies.py -> build/lib/scrapy/http
copying scrapy/http/headers.py -> build/lib/scrapy/http
creating build/lib/scrapy/linkextractors
copying scrapy/linkextractors/__init__.py -> build/lib/scrapy/linkextractors
copying scrapy/linkextractors/lxmlhtml.py -> build/lib/scrapy/linkextractors
creating build/lib/scrapy/loader
copying scrapy/loader/__init__.py -> build/lib/scrapy/loader
creating build/lib/scrapy/pipelines
copying scrapy/pipelines/__init__.py -> build/lib/scrapy/pipelines
copying scrapy/pipelines/files.py -> build/lib/scrapy/pipelines
copying scrapy/pipelines/images.py -> build/lib/scrapy/pipelines
copying scrapy/pipelines/media.py -> build/lib/scrapy/pipelines
creating build/lib/scrapy/selector
copying scrapy/selector/__init__.py -> build/lib/scrapy/selector
copying scrapy/selector/unified.py -> build/lib/scrapy/selector
creating build/lib/scrapy/settings
copying scrapy/settings/__init__.py -> build/lib/scrapy/settings
copying scrapy/settings/default_settings.py -> build/lib/scrapy/settings
creating build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/__init__.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/depth.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/httperror.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/offsite.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/referer.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/urllength.py -> build/lib/scrapy/spidermiddlewares
creating build/lib/scrapy/spiders
copying scrapy/spiders/__init__.py -> build/lib/scrapy/spiders
copying scrapy/spiders/crawl.py -> build/lib/scrapy/spiders
copying scrapy/spiders/feed.py -> build/lib/scrapy/spiders
copying scrapy/spiders/init.py -> build/lib/scrapy/spiders
copying scrapy/spiders/sitemap.py -> build/lib/scrapy/spiders
creating build/lib/scrapy/utils
copying scrapy/utils/__init__.py -> build/lib/scrapy/utils
copying scrapy/utils/_compression.py -> build/lib/scrapy/utils
copying scrapy/utils/asyncgen.py -> build/lib/scrapy/utils
copying scrapy/utils/benchserver.py -> build/lib/scrapy/utils
copying scrapy/utils/boto.py -> build/lib/scrapy/utils
copying scrapy/utils/conf.py -> build/lib/scrapy/utils
copying scrapy/utils/console.py -> build/lib/scrapy/utils
copying scrapy/utils/curl.py -> build/lib/scrapy/utils
copying scrapy/utils/datatypes.py -> build/lib/scrapy/utils
copying scrapy/utils/decorators.py -> build/lib/scrapy/utils
copying scrapy/utils/defer.py -> build/lib/scrapy/utils
copying scrapy/utils/deprecate.py -> build/lib/scrapy/utils
copying scrapy/utils/display.py -> build/lib/scrapy/utils
copying scrapy/utils/engine.py -> build/lib/scrapy/utils
copying scrapy/utils/ftp.py -> build/lib/scrapy/utils
copying scrapy/utils/gz.py -> build/lib/scrapy/utils
copying scrapy/utils/httpobj.py -> build/lib/scrapy/utils
copying scrapy/utils/iterators.py -> build/lib/scrapy/utils
copying scrapy/utils/job.py -> build/lib/scrapy/utils
copying scrapy/utils/log.py -> build/lib/scrapy/utils
copying scrapy/utils/misc.py -> build/lib/scrapy/utils
copying scrapy/utils/ossignal.py -> build/lib/scrapy/utils
copying scrapy/utils/project.py -> build/lib/scrapy/utils
copying scrapy/utils/python.py -> build/lib/scrapy/utils
copying scrapy/utils/reactor.py -> build/lib/scrapy/utils
copying scrapy/utils/request.py -> build/lib/scrapy/utils
copying scrapy/utils/response.py -> build/lib/scrapy/utils
copying scrapy/utils/serialize.py -> build/lib/scrapy/utils
copying scrapy/utils/signal.py -> build/lib/scrapy/utils
copying scrapy/utils/sitemap.py -> build/lib/scrapy/utils
copying scrapy/utils/spider.py -> build/lib/scrapy/utils
copying scrapy/utils/ssl.py -> build/lib/scrapy/utils
copying scrapy/utils/template.py -> build/lib/scrapy/utils
copying scrapy/utils/test.py -> build/lib/scrapy/utils
copying scrapy/utils/testproc.py -> build/lib/scrapy/utils
copying scrapy/utils/testsite.py -> build/lib/scrapy/utils
copying scrapy/utils/trackref.py -> build/lib/scrapy/utils
copying scrapy/utils/url.py -> build/lib/scrapy/utils
copying scrapy/utils/versions.py -> build/lib/scrapy/utils
creating build/lib/scrapy/core/downloader
copying scrapy/core/downloader/__init__.py -> build/lib/scrapy/core/downloader
copying scrapy/core/downloader/contextfactory.py -> build/lib/scrapy/core/downloader
copying scrapy/core/downloader/middleware.py -> build/lib/scrapy/core/downloader
copying scrapy/core/downloader/tls.py -> build/lib/scrapy/core/downloader
copying scrapy/core/downloader/webclient.py -> build/lib/scrapy/core/downloader
creating build/lib/scrapy/core/http2
copying scrapy/core/http2/__init__.py -> build/lib/scrapy/core/http2
copying scrapy/core/http2/agent.py -> build/lib/scrapy/core/http2
copying scrapy/core/http2/protocol.py -> build/lib/scrapy/core/http2
copying scrapy/core/http2/stream.py -> build/lib/scrapy/core/http2
creating build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/__init__.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/datauri.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/file.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/ftp.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/http.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/http10.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/http11.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/http2.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/s3.py -> build/lib/scrapy/core/downloader/handlers
creating build/lib/scrapy/http/request
copying scrapy/http/request/__init__.py -> build/lib/scrapy/http/request
copying scrapy/http/request/form.py -> build/lib/scrapy/http/request
copying scrapy/http/request/json_request.py -> build/lib/scrapy/http/request
copying scrapy/http/request/rpc.py -> build/lib/scrapy/http/request
creating build/lib/scrapy/http/response
copying scrapy/http/response/__init__.py -> build/lib/scrapy/http/response
copying scrapy/http/response/html.py -> build/lib/scrapy/http/response
copying scrapy/http/response/json.py -> build/lib/scrapy/http/response
copying scrapy/http/response/text.py -> build/lib/scrapy/http/response
copying scrapy/http/response/xml.py -> build/lib/scrapy/http/response
running egg_info
writing Scrapy.egg-info/PKG-INFO
writing dependency_links to Scrapy.egg-info/dependency_links.txt
writing entry points to Scrapy.egg-info/entry_points.txt
writing requirements to Scrapy.egg-info/requires.txt
writing top-level names to Scrapy.egg-info/top_level.txt
reading manifest file 'Scrapy.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
no previously-included directories found matching 'docs/build'
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS'
writing manifest file 'Scrapy.egg-info/SOURCES.txt'
/usr/pkg/lib/python3.12/site-packages/setuptools/command/build_py.py:212: _Warning: Package 'scrapy.templates.project' is absent from the `packages` configuration.
!!

        ********************************************************************************
        ############################
        # Package would be ignored #
        ############################
        Python recognizes 'scrapy.templates.project' as an importable package[^1],
        but it is absent from setuptools' `packages` configuration.

        This leads to an ambiguous overall configuration. If you want to distribute this
        package, please make sure that 'scrapy.templates.project' is explicitly added
        to the `packages` configuration field.

        Alternatively, you can also rely on setuptools' discovery methods
        (for example by using `find_namespace_packages(...)`/`find_namespace:`
        instead of `find_packages(...)`/`find:`).
        You can read more about "package discovery" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html

        If you don't want 'scrapy.templates.project' to be distributed and are
        already explicitly excluding 'scrapy.templates.project' via
        `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
        you can try to use `exclude_package_data`, or `include-package-data=False` in
        combination with a more fine grained `package-data` configuration.

        You can read more about "package data files" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/datafiles.html

        [^1]: For Python, any directory (with suitable naming) can be imported,
              even if it does not contain any `.py` files.
              On the other hand, currently there is no concept of package data
              directory, all directories are treated like packages.
        ********************************************************************************

!!
  check.warn(importable)
/usr/pkg/lib/python3.12/site-packages/setuptools/command/build_py.py:212: _Warning: Package 'scrapy.templates.project.module' is absent from the `packages` configuration.
!!

        ********************************************************************************
        ############################
        # Package would be ignored #
        ############################
        Python recognizes 'scrapy.templates.project.module' as an importable package[^1],
        but it is absent from setuptools' `packages` configuration.

        This leads to an ambiguous overall configuration. If you want to distribute this
        package, please make sure that 'scrapy.templates.project.module' is explicitly
        added to the `packages` configuration field.

        Alternatively, you can also rely on setuptools' discovery methods
        (for example by using `find_namespace_packages(...)`/`find_namespace:`
        instead of `find_packages(...)`/`find:`).
        You can read more about "package discovery" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html

        If you don't want 'scrapy.templates.project.module' to be distributed and are
        already explicitly excluding 'scrapy.templates.project.module' via
        `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
        you can try to use `exclude_package_data`, or `include-package-data=False` in
        combination with a more fine grained `package-data` configuration.

        You can read more about "package data files" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/datafiles.html

        [^1]: For Python, any directory (with suitable naming) can be imported,
              even if it does not contain any `.py` files.
              On the other hand, currently there is no concept of package data
              directory, all directories are treated like packages.
        ********************************************************************************

!!
  check.warn(importable)
/usr/pkg/lib/python3.12/site-packages/setuptools/command/build_py.py:212: _Warning: Package 'scrapy.templates.project.module.spiders' is absent from the `packages` configuration.
!!

        ********************************************************************************
        ############################
        # Package would be ignored #
        ############################
        Python recognizes 'scrapy.templates.project.module.spiders' as an importable
        package[^1], but it is absent from setuptools' `packages` configuration.

        This leads to an ambiguous overall configuration. If you want to distribute this
        package, please make sure that 'scrapy.templates.project.module.spiders' is
        explicitly added to the `packages` configuration field.

        Alternatively, you can also rely on setuptools' discovery methods
        (for example by using `find_namespace_packages(...)`/`find_namespace:`
        instead of `find_packages(...)`/`find:`).
        You can read more about "package discovery" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html

        If you don't want 'scrapy.templates.project.module.spiders' to be distributed
        and are already explicitly excluding 'scrapy.templates.project.module.spiders'
        via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
        you can try to use `exclude_package_data`, or `include-package-data=False` in
        combination with a more fine grained `package-data` configuration.

        You can read more about "package data files" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/datafiles.html

        [^1]: For Python, any directory (with suitable naming) can be imported,
              even if it does not contain any `.py` files.
              On the other hand, currently there is no concept of package data
              directory, all directories are treated like packages.
        ********************************************************************************

!!
  check.warn(importable)
/usr/pkg/lib/python3.12/site-packages/setuptools/command/build_py.py:212: _Warning: Package 'scrapy.templates.spiders' is absent from the `packages` configuration.
!!

        ********************************************************************************
        ############################
        # Package would be ignored #
        ############################
        Python recognizes 'scrapy.templates.spiders' as an importable package[^1],
        but it is absent from setuptools' `packages` configuration.

        This leads to an ambiguous overall configuration. If you want to distribute this
        package, please make sure that 'scrapy.templates.spiders' is explicitly added
        to the `packages` configuration field.

        Alternatively, you can also rely on setuptools' discovery methods
        (for example by using `find_namespace_packages(...)`/`find_namespace:`
        instead of `find_packages(...)`/`find:`).
        You can read more about "package discovery" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html

        If you don't want 'scrapy.templates.spiders' to be distributed and are
        already explicitly excluding 'scrapy.templates.spiders' via
        `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
        you can try to use `exclude_package_data`, or `include-package-data=False` in
        combination with a more fine grained `package-data` configuration.

        You can read more about "package data files" on setuptools documentation page:

        - https://setuptools.pypa.io/en/latest/userguide/datafiles.html

        [^1]: For Python, any directory (with suitable naming) can be imported,
              even if it does not contain any `.py` files.
              On the other hand, currently there is no concept of package data
              directory, all directories are treated like packages.
        ********************************************************************************

!!
  check.warn(importable)
copying scrapy/VERSION -> build/lib/scrapy
copying scrapy/mime.types -> build/lib/scrapy
copying scrapy/py.typed -> build/lib/scrapy
creating build/lib/scrapy/templates/project
copying scrapy/templates/project/scrapy.cfg -> build/lib/scrapy/templates/project
creating build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/__init__.py -> build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/items.py.tmpl -> build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/middlewares.py.tmpl -> build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/pipelines.py.tmpl -> build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/settings.py.tmpl -> build/lib/scrapy/templates/project/module
creating build/lib/scrapy/templates/project/module/spiders
copying scrapy/templates/project/module/spiders/__init__.py -> build/lib/scrapy/templates/project/module/spiders
creating build/lib/scrapy/templates/spiders
copying scrapy/templates/spiders/basic.tmpl -> build/lib/scrapy/templates/spiders
copying scrapy/templates/spiders/crawl.tmpl -> build/lib/scrapy/templates/spiders
copying scrapy/templates/spiders/csvfeed.tmpl -> build/lib/scrapy/templates/spiders
copying scrapy/templates/spiders/xmlfeed.tmpl -> build/lib/scrapy/templates/spiders
installing to build/bdist.netbsd-9.0-amd64/wheel
running install
running install_lib
creating build/bdist.netbsd-9.0-amd64/wheel
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy
copying build/lib/scrapy/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/__main__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/addons.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/cmdline.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/crawler.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/dupefilters.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/exceptions.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/exporters.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/extension.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/interfaces.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/item.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/link.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/logformatter.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/mail.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/middleware.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/pqueues.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/resolver.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/responsetypes.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/robotstxt.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/shell.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/signalmanager.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/signals.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/spiderloader.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/squeues.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
copying build/lib/scrapy/statscollectors.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/commands
copying build/lib/scrapy/commands/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/bench.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/check.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/crawl.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/edit.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/fetch.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/genspider.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/list.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/parse.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/runspider.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/settings.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/shell.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/startproject.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/version.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
copying build/lib/scrapy/commands/view.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/commands
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/contracts
copying build/lib/scrapy/contracts/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/contracts
copying build/lib/scrapy/contracts/default.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/contracts
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/core
copying build/lib/scrapy/core/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core
copying build/lib/scrapy/core/engine.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core
copying build/lib/scrapy/core/scheduler.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core
copying build/lib/scrapy/core/scraper.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core
copying build/lib/scrapy/core/spidermw.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/core/downloader
copying build/lib/scrapy/core/downloader/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader
copying build/lib/scrapy/core/downloader/contextfactory.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader
copying build/lib/scrapy/core/downloader/middleware.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader
copying build/lib/scrapy/core/downloader/tls.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader
copying build/lib/scrapy/core/downloader/webclient.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/datauri.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/file.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/ftp.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/http.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/http10.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/http11.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/http2.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
copying build/lib/scrapy/core/downloader/handlers/s3.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/downloader/handlers
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/core/http2
copying build/lib/scrapy/core/http2/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/http2
copying build/lib/scrapy/core/http2/agent.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/http2
copying build/lib/scrapy/core/http2/protocol.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/http2
copying build/lib/scrapy/core/http2/stream.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/core/http2
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/ajaxcrawl.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/cookies.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/defaultheaders.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/downloadtimeout.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/httpauth.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/httpcache.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/httpcompression.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/httpproxy.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/offsite.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/redirect.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/retry.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/robotstxt.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/stats.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
copying build/lib/scrapy/downloadermiddlewares/useragent.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/downloadermiddlewares
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/extensions
copying build/lib/scrapy/extensions/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/closespider.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/corestats.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/debug.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/feedexport.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/httpcache.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/logstats.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/memdebug.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/memusage.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/periodic_log.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/postprocessing.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/spiderstate.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/statsmailer.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/telnet.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
copying build/lib/scrapy/extensions/throttle.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/extensions
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/http
copying build/lib/scrapy/http/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http
copying build/lib/scrapy/http/cookies.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http
copying build/lib/scrapy/http/headers.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/http/request
copying build/lib/scrapy/http/request/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/request
copying build/lib/scrapy/http/request/form.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/request
copying build/lib/scrapy/http/request/json_request.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/request
copying build/lib/scrapy/http/request/rpc.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/request
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/http/response
copying build/lib/scrapy/http/response/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/response
copying build/lib/scrapy/http/response/html.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/response
copying build/lib/scrapy/http/response/json.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/response
copying build/lib/scrapy/http/response/text.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/response
copying build/lib/scrapy/http/response/xml.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/http/response
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/linkextractors
copying build/lib/scrapy/linkextractors/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/linkextractors
copying build/lib/scrapy/linkextractors/lxmlhtml.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/linkextractors
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/loader
copying build/lib/scrapy/loader/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/loader
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/pipelines
copying build/lib/scrapy/pipelines/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/pipelines
copying build/lib/scrapy/pipelines/files.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/pipelines
copying build/lib/scrapy/pipelines/images.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/pipelines
copying build/lib/scrapy/pipelines/media.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/pipelines
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/selector
copying build/lib/scrapy/selector/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/selector
copying build/lib/scrapy/selector/unified.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/selector
creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/settings
copying build/lib/scrapy/settings/__init__.py ->
build/bdist.netbsd-9.0-amd64/wheel/./scrapy/settings copying build/lib/scrapy/settings/default_settings.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/settings creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/depth.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/httperror.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/offsite.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/referer.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/urllength.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spidermiddlewares creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/spiders copying build/lib/scrapy/spiders/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spiders copying build/lib/scrapy/spiders/crawl.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spiders copying build/lib/scrapy/spiders/feed.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spiders copying build/lib/scrapy/spiders/init.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spiders copying build/lib/scrapy/spiders/sitemap.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/spiders creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/utils copying build/lib/scrapy/utils/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/_compression.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/asyncgen.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/benchserver.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/boto.py -> 
build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/conf.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/console.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/curl.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/datatypes.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/decorators.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/defer.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/deprecate.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/display.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/engine.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/ftp.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/gz.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/httpobj.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/iterators.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/job.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/log.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/misc.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/ossignal.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/project.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/python.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/reactor.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/request.py -> 
build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/response.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/serialize.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/signal.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/sitemap.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/spider.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/ssl.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/template.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/test.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/testproc.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/testsite.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/trackref.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/url.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/utils/versions.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/utils copying build/lib/scrapy/VERSION -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy copying build/lib/scrapy/mime.types -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy copying build/lib/scrapy/py.typed -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/templates creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/templates/project copying build/lib/scrapy/templates/project/scrapy.cfg -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/project creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/project/module copying 
build/lib/scrapy/templates/project/module/items.py.tmpl -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/middlewares.py.tmpl -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/pipelines.py.tmpl -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/settings.py.tmpl -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/project/module creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/templates/project/module/spiders copying build/lib/scrapy/templates/project/module/spiders/__init__.py -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/project/module/spiders creating build/bdist.netbsd-9.0-amd64/wheel/scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/basic.tmpl -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/crawl.tmpl -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/csvfeed.tmpl -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/xmlfeed.tmpl -> build/bdist.netbsd-9.0-amd64/wheel/./scrapy/templates/spiders running install_egg_info Copying Scrapy.egg-info to build/bdist.netbsd-9.0-amd64/wheel/./Scrapy-2.12.0-py3.12.egg-info running install_scripts creating build/bdist.netbsd-9.0-amd64/wheel/scrapy-2.12.0.dist-info/WHEEL creating '/pbulk/work/www/py-scrapy/work/scrapy-2.12.0/dist/.tmp-x2nnlnep/scrapy-2.12.0-py2.py3-none-any.whl' and adding 'build/bdist.netbsd-9.0-amd64/wheel' to it adding 'scrapy/VERSION' adding 'scrapy/__init__.py' adding 'scrapy/__main__.py' adding 'scrapy/addons.py' adding 'scrapy/cmdline.py' adding 'scrapy/crawler.py' adding 'scrapy/dupefilters.py' adding 'scrapy/exceptions.py' adding 'scrapy/exporters.py' adding 'scrapy/extension.py' adding 
'scrapy/interfaces.py' adding 'scrapy/item.py' adding 'scrapy/link.py' adding 'scrapy/logformatter.py' adding 'scrapy/mail.py' adding 'scrapy/middleware.py' adding 'scrapy/mime.types' adding 'scrapy/pqueues.py' adding 'scrapy/py.typed' adding 'scrapy/resolver.py' adding 'scrapy/responsetypes.py' adding 'scrapy/robotstxt.py' adding 'scrapy/shell.py' adding 'scrapy/signalmanager.py' adding 'scrapy/signals.py' adding 'scrapy/spiderloader.py' adding 'scrapy/squeues.py' adding 'scrapy/statscollectors.py' adding 'scrapy/commands/__init__.py' adding 'scrapy/commands/bench.py' adding 'scrapy/commands/check.py' adding 'scrapy/commands/crawl.py' adding 'scrapy/commands/edit.py' adding 'scrapy/commands/fetch.py' adding 'scrapy/commands/genspider.py' adding 'scrapy/commands/list.py' adding 'scrapy/commands/parse.py' adding 'scrapy/commands/runspider.py' adding 'scrapy/commands/settings.py' adding 'scrapy/commands/shell.py' adding 'scrapy/commands/startproject.py' adding 'scrapy/commands/version.py' adding 'scrapy/commands/view.py' adding 'scrapy/contracts/__init__.py' adding 'scrapy/contracts/default.py' adding 'scrapy/core/__init__.py' adding 'scrapy/core/engine.py' adding 'scrapy/core/scheduler.py' adding 'scrapy/core/scraper.py' adding 'scrapy/core/spidermw.py' adding 'scrapy/core/downloader/__init__.py' adding 'scrapy/core/downloader/contextfactory.py' adding 'scrapy/core/downloader/middleware.py' adding 'scrapy/core/downloader/tls.py' adding 'scrapy/core/downloader/webclient.py' adding 'scrapy/core/downloader/handlers/__init__.py' adding 'scrapy/core/downloader/handlers/datauri.py' adding 'scrapy/core/downloader/handlers/file.py' adding 'scrapy/core/downloader/handlers/ftp.py' adding 'scrapy/core/downloader/handlers/http.py' adding 'scrapy/core/downloader/handlers/http10.py' adding 'scrapy/core/downloader/handlers/http11.py' adding 'scrapy/core/downloader/handlers/http2.py' adding 'scrapy/core/downloader/handlers/s3.py' adding 'scrapy/core/http2/__init__.py' adding 
'scrapy/core/http2/agent.py' adding 'scrapy/core/http2/protocol.py' adding 'scrapy/core/http2/stream.py' adding 'scrapy/downloadermiddlewares/__init__.py' adding 'scrapy/downloadermiddlewares/ajaxcrawl.py' adding 'scrapy/downloadermiddlewares/cookies.py' adding 'scrapy/downloadermiddlewares/defaultheaders.py' adding 'scrapy/downloadermiddlewares/downloadtimeout.py' adding 'scrapy/downloadermiddlewares/httpauth.py' adding 'scrapy/downloadermiddlewares/httpcache.py' adding 'scrapy/downloadermiddlewares/httpcompression.py' adding 'scrapy/downloadermiddlewares/httpproxy.py' adding 'scrapy/downloadermiddlewares/offsite.py' adding 'scrapy/downloadermiddlewares/redirect.py' adding 'scrapy/downloadermiddlewares/retry.py' adding 'scrapy/downloadermiddlewares/robotstxt.py' adding 'scrapy/downloadermiddlewares/stats.py' adding 'scrapy/downloadermiddlewares/useragent.py' adding 'scrapy/extensions/__init__.py' adding 'scrapy/extensions/closespider.py' adding 'scrapy/extensions/corestats.py' adding 'scrapy/extensions/debug.py' adding 'scrapy/extensions/feedexport.py' adding 'scrapy/extensions/httpcache.py' adding 'scrapy/extensions/logstats.py' adding 'scrapy/extensions/memdebug.py' adding 'scrapy/extensions/memusage.py' adding 'scrapy/extensions/periodic_log.py' adding 'scrapy/extensions/postprocessing.py' adding 'scrapy/extensions/spiderstate.py' adding 'scrapy/extensions/statsmailer.py' adding 'scrapy/extensions/telnet.py' adding 'scrapy/extensions/throttle.py' adding 'scrapy/http/__init__.py' adding 'scrapy/http/cookies.py' adding 'scrapy/http/headers.py' adding 'scrapy/http/request/__init__.py' adding 'scrapy/http/request/form.py' adding 'scrapy/http/request/json_request.py' adding 'scrapy/http/request/rpc.py' adding 'scrapy/http/response/__init__.py' adding 'scrapy/http/response/html.py' adding 'scrapy/http/response/json.py' adding 'scrapy/http/response/text.py' adding 'scrapy/http/response/xml.py' adding 'scrapy/linkextractors/__init__.py' adding 
'scrapy/linkextractors/lxmlhtml.py' adding 'scrapy/loader/__init__.py' adding 'scrapy/pipelines/__init__.py' adding 'scrapy/pipelines/files.py' adding 'scrapy/pipelines/images.py' adding 'scrapy/pipelines/media.py' adding 'scrapy/selector/__init__.py' adding 'scrapy/selector/unified.py' adding 'scrapy/settings/__init__.py' adding 'scrapy/settings/default_settings.py' adding 'scrapy/spidermiddlewares/__init__.py' adding 'scrapy/spidermiddlewares/depth.py' adding 'scrapy/spidermiddlewares/httperror.py' adding 'scrapy/spidermiddlewares/offsite.py' adding 'scrapy/spidermiddlewares/referer.py' adding 'scrapy/spidermiddlewares/urllength.py' adding 'scrapy/spiders/__init__.py' adding 'scrapy/spiders/crawl.py' adding 'scrapy/spiders/feed.py' adding 'scrapy/spiders/init.py' adding 'scrapy/spiders/sitemap.py' adding 'scrapy/templates/project/scrapy.cfg' adding 'scrapy/templates/project/module/__init__.py' adding 'scrapy/templates/project/module/items.py.tmpl' adding 'scrapy/templates/project/module/middlewares.py.tmpl' adding 'scrapy/templates/project/module/pipelines.py.tmpl' adding 'scrapy/templates/project/module/settings.py.tmpl' adding 'scrapy/templates/project/module/spiders/__init__.py' adding 'scrapy/templates/spiders/basic.tmpl' adding 'scrapy/templates/spiders/crawl.tmpl' adding 'scrapy/templates/spiders/csvfeed.tmpl' adding 'scrapy/templates/spiders/xmlfeed.tmpl' adding 'scrapy/utils/__init__.py' adding 'scrapy/utils/_compression.py' adding 'scrapy/utils/asyncgen.py' adding 'scrapy/utils/benchserver.py' adding 'scrapy/utils/boto.py' adding 'scrapy/utils/conf.py' adding 'scrapy/utils/console.py' adding 'scrapy/utils/curl.py' adding 'scrapy/utils/datatypes.py' adding 'scrapy/utils/decorators.py' adding 'scrapy/utils/defer.py' adding 'scrapy/utils/deprecate.py' adding 'scrapy/utils/display.py' adding 'scrapy/utils/engine.py' adding 'scrapy/utils/ftp.py' adding 'scrapy/utils/gz.py' adding 'scrapy/utils/httpobj.py' adding 'scrapy/utils/iterators.py' adding 
'scrapy/utils/job.py' adding 'scrapy/utils/log.py' adding 'scrapy/utils/misc.py' adding 'scrapy/utils/ossignal.py' adding 'scrapy/utils/project.py' adding 'scrapy/utils/python.py' adding 'scrapy/utils/reactor.py' adding 'scrapy/utils/request.py' adding 'scrapy/utils/response.py' adding 'scrapy/utils/serialize.py' adding 'scrapy/utils/signal.py' adding 'scrapy/utils/sitemap.py' adding 'scrapy/utils/spider.py' adding 'scrapy/utils/ssl.py' adding 'scrapy/utils/template.py' adding 'scrapy/utils/test.py' adding 'scrapy/utils/testproc.py' adding 'scrapy/utils/testsite.py' adding 'scrapy/utils/trackref.py' adding 'scrapy/utils/url.py' adding 'scrapy/utils/versions.py' adding 'scrapy-2.12.0.dist-info/AUTHORS' adding 'scrapy-2.12.0.dist-info/LICENSE' adding 'scrapy-2.12.0.dist-info/METADATA' adding 'scrapy-2.12.0.dist-info/WHEEL' adding 'scrapy-2.12.0.dist-info/entry_points.txt' adding 'scrapy-2.12.0.dist-info/top_level.txt' adding 'scrapy-2.12.0.dist-info/RECORD' removing build/bdist.netbsd-9.0-amd64/wheel Successfully built scrapy-2.12.0-py2.py3-none-any.whl