# HG changeset patch
# User Goffi
# Date 1680954882 -7200
# Node ID 524856bd7b198ef9f38b9be12dc451d791168671
# Parent  c4464d7ae97b52dc47dcc28c58f2af6211f25781
massive refactoring to switch from camelCase to snake_case

Historically, Libervia (SàT at the time) used camelCase, as allowed by PEP8 for
code predating it, to keep the same coding style as Twisted. However,
snake_case is more readable and it is better to follow PEP8 best practices, so
it has been decided to move to full snake_case. Because Libervia has a huge
codebase, this ended up with an ugly mix of camelCase and snake_case. To fix
that, this patch does a big refactoring, renaming every function and method
(including bridge ones) that does not come from Twisted or Wokkel to full
snake_case. This is a massive change, and may result in some bugs.

diff -r c4464d7ae97b -r 524856bd7b19 doc/components.rst
--- a/doc/components.rst	Fri Apr 07 15:18:39 2023 +0200
+++ b/doc/components.rst	Sat Apr 08 13:54:42 2023 +0200
@@ -384,7 +384,7 @@

 The encoding is explained in the documentation of the following method:

-.. automethod:: sat.plugins.plugin_comp_ap_gateway.APGateway.getJIDAndNode
+.. automethod:: sat.plugins.plugin_comp_ap_gateway.APGateway.get_jid_and_node

 .. [#AP_chars] Most if not all AP implementations use webfinger `acct` URI as a de-facto
@@ -859,10 +859,10 @@
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 Publication of AP items can be tested using the following method (which can be accessed
-through the ``APSend`` bridge method, client is then replaced by the ``profile`` name, as
+through the ``ap_send`` bridge method, client is then replaced by the ``profile`` name, as
 last argument):

-.. automethod:: sat.plugins.plugin_comp_ap_gateway.APGateway.publishMessage
+.. automethod:: sat.plugins.plugin_comp_ap_gateway.APGateway.publish_message

 The method can be used either with CLI's :ref:`debug bridge method ` or
 with any D-Bus tool like ``qdbus`` or ``d-feet`` (only if you
@@ -875,7 +875,7 @@
 ``https://example.net/@pierre/106986412193109832``. To send a reply to this message,
 Louise can use the following command::

-   $ li debug bridge method -c APSend '"{\"node\": \"https://example.net/@pierre/106986412193109832\", \"content\": \"A lille hello from XMPP\"}","pierre\\40example.net@ap.example.org", "louise"'
+   $ li debug bridge method -c ap_send '"{\"node\": \"https://example.net/@pierre/106986412193109832\", \"content\": \"A lille hello from XMPP\"}","pierre\\40example.net@ap.example.org", "louise"'

 Note the double escaping, one for the shell argument, and the other to specify the JSON
 object.
diff -r c4464d7ae97b -r 524856bd7b19 doc/developer.rst
--- a/doc/developer.rst	Fri Apr 07 15:18:39 2023 +0200
+++ b/doc/developer.rst	Sat Apr 08 13:54:42 2023 +0200
@@ -60,9 +60,9 @@
 analysers are checked, and the first one matching is used to determine if the node must
 be synchronised or not.

-Analysers can be registered by any plugins using ``registerAnalyser`` method:
+Analysers can be registered by any plugins using ``register_analyser`` method:

-.. automethod:: sat.plugins.plugin_pubsub_cache.PubsubCache.registerAnalyser
+.. automethod:: sat.plugins.plugin_pubsub_cache.PubsubCache.register_analyser

 If no analyser is found, ``to_sync`` is false, or an error happens during the caching,
 the node won't be synchronised and the pubsub service will always be requested.
diff -r c4464d7ae97b -r 524856bd7b19 doc/libervia-cli/blog.rst
--- a/doc/libervia-cli/blog.rst	Fri Apr 07 15:18:39 2023 +0200
+++ b/doc/libervia-cli/blog.rst	Sat Apr 08 13:54:42 2023 +0200
@@ -333,11 +333,11 @@

   $ li blog import dotclear

-Import a Dotclear blog::
+import a Dotclear blog::

   $ li blog import dotclear /path/to/dotclear.dump

-Import a Dotclear blog without uploading images::
+import a Dotclear blog without uploading images::

   $ li blog import --no-images-upload dotclear /path/to/dotclear.dump
diff -r c4464d7ae97b -r 524856bd7b19 doc/libervia-cli/debug.rst
--- a/doc/libervia-cli/debug.rst	Fri Apr 07 15:18:39 2023 +0200
+++ b/doc/libervia-cli/debug.rst	Sat Apr 08 13:54:42 2023 +0200
@@ -17,7 +17,7 @@
 Your profile is automatically set if the method requires it (using the value of
 ``-p PROFILE, --profile PROFILE``), so you must not specify it as an extra argument.

-You can refer to `Bridge API documentation`_ to get core method signatures
+You can refer to `bridge API documentation`_ to get core method signatures

 .. _Bridge API documentation: https://wiki.goffi.org/wiki/Bridge_API

@@ -26,12 +26,12 @@
 --------

 Send a message using a single shell argument for all Python arguments. We
-use first the method name (``messageSend``), then the required arguments (see `Bridge
+use first the method name (``message_send``), then the required arguments (see `bridge
 API documentation`_ for details), without the profile as it is automatically set. We
 specify them as Python in one shell argument, so we use single quote (``'``) first for
 the shell string, and inside it we use double quote (``"``) for Python strings::

-   $ li debug bridge method messageSend '"louise@example.org", {"": "test message"}, {}, "auto", {}'
+   $ li debug bridge method message_send '"louise@example.org", {"": "test message"}, {}, "auto", {}'

 .. note::
@@ -39,7 +39,7 @@

 Get version string of Libervia::

-   $ li debug bridge method getVersion
+   $ li debug bridge method version_get

 bridge signal
@@ -56,7 +56,7 @@
 store the level, so we can easily change it if we want to use another level for tests.
 Note the use of quotes (to escape both for shell and Python)::

-   $ LEVEL='info'; li debug bridge signal -c actionNew '{"xmlui": '"'"'test message\non\nseveral\nlines'"'"'}' '""' -1
+   $ LEVEL='info'; li debug bridge signal -c action_new '{"xmlui": '"'"'test message\non\nseveral\nlines'"'"'}' '""' -1

 monitor
diff -r c4464d7ae97b -r 524856bd7b19 doc/libervia-cli/event.rst
--- a/doc/libervia-cli/event.rst	Fri Apr 07 15:18:39 2023 +0200
+++ b/doc/libervia-cli/event.rst	Sat Apr 08 13:54:42 2023 +0200
@@ -35,10 +35,10 @@
 If you organise an event, the ``--rsvp`` flag should be used: it will use the default
 RSVP form which asks for attendance. If you want to request more information from your
 guests, ``--rsvp_json JSON`` can be used: the JSON argument is a data dict as described in
-``dataDict2dataForm`` function where the ``namespace`` key is not necessary (it's set
+``data_dict_2_data_form`` function where the ``namespace`` key is not necessary (it's set
 automatically):

-.. autofunction:: sat.tools.xml_tools.dataDict2dataForm
+.. autofunction:: sat.tools.xml_tools.data_dict_2_data_form

 If the event links to another one, ``--external JID NODE ITEM`` can be used
diff -r c4464d7ae97b -r 524856bd7b19 doc/libervia-cli/list.rst
--- a/doc/libervia-cli/list.rst	Fri Apr 07 15:18:39 2023 +0200
+++ b/doc/libervia-cli/list.rst	Sat Apr 08 13:54:42 2023 +0200
@@ -68,7 +68,7 @@
 import
 ======

-Import lists from an external source. This works in the same way as
+import lists from an external source. This works in the same way as
 :ref:`libervia-cli_blog_import`: you need to specify an importer and a data location.
 If you leave both positional arguments empty, you'll get the list of importers; if you
 specify the importer but not the data location, you'll get a description of how the
 importer works.
@@ -99,7 +99,7 @@

   $ li list import bugzilla

-Import lists from a Bugzilla XML export file at ``~/bugzilla_export.xml`` to the
+import lists from a Bugzilla XML export file at ``~/bugzilla_export.xml`` to the
 ``pubsub.example.org`` PubSub service. We use the default lists node and want a
 progression bar::
diff -r c4464d7ae97b -r 524856bd7b19 doc/libervia-cli/merge-request.rst
--- a/doc/libervia-cli/merge-request.rst	Fri Apr 07 15:18:39 2023 +0200
+++ b/doc/libervia-cli/merge-request.rst	Sat Apr 08 13:54:42 2023 +0200
@@ -66,7 +66,7 @@
 import
 ======

-Import a merge request into your project. You mainly have to be in the project repository
+import a merge request into your project. You mainly have to be in the project repository
 (or specify it using ``-r PATH, --repository PATH``) and to specify the id of the patch
 to import (using ``-i ITEM, --item ITEM``). The behaviour depends on the type of the
 patch; for Mercurial, the patch will be imported as an `MQ`_ patch.
@@ -76,6 +76,6 @@
 example
 -------

-Import the merge request with id 321::
+import the merge request with id 321::

   $ li merge-request import -i 321
diff -r c4464d7ae97b -r 524856bd7b19 doc/libervia-cli/pubsub_node.rst
--- a/doc/libervia-cli/pubsub_node.rst	Fri Apr 07 15:18:39 2023 +0200
+++ b/doc/libervia-cli/pubsub_node.rst	Sat Apr 08 13:54:42 2023 +0200
@@ -93,7 +93,7 @@
 import
 ======

-Import a raw XML containing items to create in the node. The path to the XML file is used
+import a raw XML containing items to create in the node. The path to the XML file is used
 as positional argument. The XML file must contain a full ``<item>`` element for each item
 to import. The output of ``pubsub get`` can be used directly.
@@ -103,7 +103,7 @@
 example
 -------

-Import a node backup which has previously been saved using ``li blog get -M -1 -n
+import a node backup which has previously been saved using ``li blog get -M -1 -n
 some_node > some_node_backup.xml``::

   $ li pubsub node import -n some_node ~/some_node_backup.xml
diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/base_constructor.py
--- a/sat/bridge/bridge_constructor/base_constructor.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/bridge/bridge_constructor/base_constructor.py	Sat Apr 08 13:54:42 2023 +0200
@@ -74,7 +74,7 @@
             function[option] = value
         return function

-    def getDefault(self, name):
+    def get_default(self, name):
        """Return default values of a function in a dict

        @param name: Name of the function to get
        @return: dict, each key is the integer param number (no key if no default value)"""
@@ -106,7 +106,7 @@
                flags.append(option)
        return flags

-    def getArgumentsDoc(self, name):
+    def get_arguments_doc(self, name):
        """Return documentation of arguments

        @param name: Name of the function to get
        @return: dict, each key is the integer param number (no key if no argument doc), value is a tuple (name, doc)"""
@@ -131,7 +131,7 @@
            doc_dict[idx] = (value_match.group(1), value_match.group(2))
        return doc_dict

-    def getDoc(self, name):
+    def get_doc(self, name):
        """Return documentation of the method

        @param name: Name of the function to get
        @return: string documentation, or None"""
@@ -139,7 +139,7 @@
            return self.bridge_template.get(name, "doc")
        return None

-    def argumentsParser(self, signature):
+    def arguments_parser(self, signature):
        """Generator which return individual arguments signatures from a global signature"""
        start = 0
        i = 0
@@ -176,19 +176,19 @@
            yield signature[start:i]
            start = i

-    def getArguments(self, signature, name=None, default=None, unicode_protect=False):
+    def get_arguments(self, signature, name=None, default=None, unicode_protect=False):
        """Return arguments to user given a signature

        @param signature: signature in the short form (using s,a,i,b etc)
-        @param name: dictionary of arguments name like given by getArgumentsDoc
-        @param default: dictionary of default values, like given by getDefault
+        @param name: dictionary of arguments name like given by get_arguments_doc
+        @param default: dictionary of default values, like given by get_default
        @param unicode_protect: activate unicode protection on strings (return strings as unicode(str))
        @return (str): arguments that correspond to a signature (e.g.: "sss" return "arg1, arg2, arg3")
        """
        idx = 0
        attr_string = []

-        for arg in self.argumentsParser(signature):
+        for arg in self.arguments_parser(signature):
            attr_string.append(
                (
                    "str(%(name)s)%(default)s"
@@ -206,7 +206,7 @@

        return ", ".join(attr_string)

-    def getTemplatePath(self, template_file):
+    def get_template_path(self, template_file):
        """return template path corresponding to file name

        @param template_file(str): name of template file
@@ -232,12 +232,12 @@
    def generate(self, side):
        """generate bridge

-        call generateCoreSide or generateFrontendSide if they exists
+        call generate_core_side or generateFrontendSide if they exists
        else call generic self._generate method
        """
        try:
            if side == "core":
-                method = self.generateCoreSide
+                method = self.generate_core_side
            elif side == "frontend":
                if not self.FRONTEND_ACTIVATE:
                    print("This constructor only handle core, please use core side")
@@ -272,8 +272,8 @@
        for section in sections:
            function = self.getValues(section)
            print(("Adding %s %s" % (section, function["type"])))
-            default = self.getDefault(section)
-            arg_doc = self.getArgumentsDoc(section)
+            default = self.get_default(section)
+            arg_doc = self.get_arguments_doc(section)
            async_ = "async" in self.getFlags(section)
            completion = {
                "sig_in": function["sig_in"] or "",
@@ -281,10 +281,10 @@
                "category": "plugin" if function["category"] == "plugin" else "core",
                "name": section,
                # arguments with default values
-                "args": self.getArguments(
+                "args": self.get_arguments(
                    function["sig_in"], name=arg_doc, default=default
                ),
-                "args_no_default": self.getArguments(function["sig_in"], name=arg_doc),
+                "args_no_default": self.get_arguments(function["sig_in"], name=arg_doc),
            }

            extend_method = getattr(
@@ -305,7 +305,7 @@
            for env, v in os.environ.items()
            if env.startswith(C.ENV_OVERRIDE)
        }
-        template_path = self.getTemplatePath(TEMPLATE)
+        template_path = self.get_template_path(TEMPLATE)
        try:
            with open(template_path) as template:
                for line in template:
@@ -332,9 +332,9 @@
            sys.exit(1)

        # now we write to final file
-        self.finalWrite(DEST, bridge)
+        self.final_write(DEST, bridge)

-    def finalWrite(self, filename, file_buf):
+    def final_write(self, filename, file_buf):
        """Write the final generated file in [dest dir]/filename

        @param filename: name of the file to generate
diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/bridge_constructor.py
--- a/sat/bridge/bridge_constructor/bridge_constructor.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/bridge/bridge_constructor/bridge_constructor.py	Sat Apr 08 13:54:42 2023 +0200
@@ -32,7 +32,7 @@

 class BridgeConstructor(object):

-    def importConstructors(self):
+    def import_constructors(self):
        constructors_dir = os.path.dirname(constructors.__file__)
        self.protocoles = {}
        for dir_ in os.listdir(constructors_dir):
@@ -120,7 +120,7 @@
        return parser.parse_args()

    def go(self):
-        self.importConstructors()
+        self.import_constructors()
        args = self.parse_args()
        template_parser = Parser()
        try:
diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/bridge_template.ini
--- a/sat/bridge/bridge_constructor/bridge_template.ini	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/bridge/bridge_constructor/bridge_template.ini	Sat Apr 08 13:54:42 2023 +0200
@@ -20,7 +20,7 @@
 doc=Connection is finished or lost
 doc_param_0=%(doc_profile)s

-[newContact]
+[contact_new]
 type=signal
 category=core
 sig_in=sa{ss}ass
@@ -34,7 +34,7 @@
 doc_param_2=groups: Roster's groups where the contact is
 doc_param_3=%(doc_profile)s

-[messageNew]
+[message_new]
 type=signal
 category=core
 sig_in=sdssa{ss}a{ss}sss
@@ -54,7 +54,7 @@
     - info_type: subtype for info messages
 doc_param_8=%(doc_profile)s

-[messageEncryptionStarted]
+[message_encryption_started]
 type=signal
 category=core
 sig_in=sss
@@ -68,7 +68,7 @@
     - directed_devices: list or resource where session is encrypted
 doc_param_2=%(doc_profile_key)s

-[messageEncryptionStopped]
+[message_encryption_stopped]
 type=signal
 category=core
 sig_in=sa{ss}s
@@ -79,7 +79,7 @@
     - namespace: namespace of the encryption plugin
 doc_param_2=%(doc_profile_key)s

-[presenceUpdate]
+[presence_update]
 type=signal
 category=core
 sig_in=ssia{ss}s
@@ -99,7 +99,7 @@
 doc_param_1=entity_jid: JID from which the subscription is coming
 doc_param_2=%(doc_profile)s

-[paramUpdate]
+[param_update]
 type=signal
 category=core
 sig_in=ssss
@@ -109,7 +109,7 @@
 doc_param_2=category: Category of the updated parameter
 doc_param_3=%(doc_profile)s

-[contactDeleted]
+[contact_deleted]
 type=signal
 category=core
 sig_in=ss
@@ -117,7 +117,7 @@
 doc_param_0=entity_jid: JID of the contact removed from roster
 doc_param_1=%(doc_profile)s

-[actionNew]
+[action_new]
 type=signal
 category=core
 sig_in=a{ss}sis
@@ -137,7 +137,7 @@
 doc_param_2=%(doc_security_limit)s
 doc_param_3=%(doc_profile)s

-[entityDataUpdated]
+[entity_data_updated]
 type=signal
 category=core
 sig_in=ssss
@@ -147,7 +147,7 @@
 doc_param_2=value: New value
 doc_param_3=%(doc_profile)s

-[progressStarted]
+[progress_started]
 type=signal
 category=core
 sig_in=sa{ss}s
@@ -160,7 +160,7 @@
     C.META_TYPE_FILE: file transfer
 doc_param_2=%(doc_profile)s

-[progressFinished]
+[progress_finished]
 type=signal
 category=core
 sig_in=sa{ss}s
@@ -170,11 +170,11 @@
     - hash: value of the computed hash
     - hash_algo: algorithm used to compute hash
     - hash_verified: C.BOOL_TRUE if hash is verified and OK
-        C.BOOL_FALSE if hash was not received ([progressError] will be used if there is a mismatch)
+        C.BOOL_FALSE if hash was not received ([progress_error] will be used if there is a mismatch)
     - url: url linked to the progression (e.g. download url after a file upload)
 doc_param_2=%(doc_profile)s

-[progressError]
+[progress_error]
 type=signal
 category=core
 sig_in=sss
@@ -194,7 +194,7 @@

 ;methods

-[getReady]
+[ready_get]
 async=
 type=method
 category=core
@@ -202,14 +202,14 @@
 sig_in=
 sig_out=
 doc=Return when backend is initialised

-[getVersion]
+[version_get]
 type=method
 category=core
 sig_in=
 sig_out=s
 doc=Get "Salut à Toi" full version

-[getFeatures]
+[features_get]
 type=method
 category=core
 sig_in=s
@@ -221,7 +221,7 @@
     plugin import name is used as key, data is another dict managed by the plugin
 async=

-[profileNameGet]
+[profile_name_get]
 type=method
 category=core
 sig_in=s
@@ -231,7 +231,7 @@
 doc_param_0=%(doc_profile_key)s
 doc_return=Real profile name

-[profilesListGet]
+[profiles_list_get]
 type=method
 category=core
 sig_in=bb
@@ -242,7 +242,7 @@
 doc_param_1=components: get components profiles
 doc=Get list of profiles

-[profileSetDefault]
+[profile_set_default]
 type=method
 category=core
 sig_in=s
@@ -250,7 +250,7 @@
 doc_param_0=%(doc_profile)s
 doc=Set default profile

-[getEntityData]
+[entity_data_get]
 type=method
 category=core
 sig_in=sass
@@ -262,7 +262,7 @@
 doc_return=dictionary of asked key, if key doesn't exist, the resulting dictionary will not have the key

-[getEntitiesData]
+[entities_data_get]
 type=method
 category=core
 sig_in=asass
@@ -275,7 +275,7 @@
     values are serialised
     if key doesn't exist for a jid, the resulting dictionary will not have it

-[profileCreate]
+[profile_create]
 async=
 type=method
 category=core
@@ -293,7 +293,7 @@
     - CancelError: profile creation canceled
     - NotFound: component entry point is not available

-[asyncDeleteProfile]
+[profile_delete_async]
 async=
 type=method
 category=core
@@ -325,7 +325,7 @@
     - False if the XMPP connection has been initiated (it may still fail)
     - failure if the profile authentication failed

-[profileStartSession]
+[profile_start_session]
 async=
 type=method
 category=core
@@ -340,7 +340,7 @@
     - True if the profile session was already started
     - False else

-[profileIsSessionStarted]
+[profile_is_session_started]
 type=method
 category=core
 sig_in=s
@@ -359,7 +359,7 @@
 doc=Disconnect a profile
 doc_param_0=%(doc_profile_key)s

-[isConnected]
+[is_connected]
 type=method
 category=core
 sig_in=s
@@ -368,7 +368,7 @@
 doc=Tell if a profile is connected
 doc_param_0=%(doc_profile_key)s

-[contactGet]
+[contact_get]
 async=
 type=method
 category=core
@@ -378,10 +378,10 @@
 doc=Return informations in roster about a contact
 doc_param_1=%(doc_profile_key)s
 doc_return=tuple with the following values:
-    - list of attributes as in [newContact]
+    - list of attributes as in [contact_new]
     - groups where the contact is

-[getContacts]
+[contacts_get]
 async=
 type=method
 category=core
@@ -392,10 +392,10 @@
 doc_param_0=%(doc_profile_key)s
 doc_return=array of tuples with the following values:
     - JID of the contact
-    - list of attributes as in [newContact]
+    - list of attributes as in [contact_new]
     - groups where the contact is

-[getContactsFromGroup]
+[contacts_get_from_group]
 type=method
 category=core
 sig_in=ss
@@ -406,7 +406,7 @@
 doc_param_1=%(doc_profile_key)s
 doc_return=array of jids

-[getMainResource]
+[main_resource_get]
 type=method
 category=core
 sig_in=ss
@@ -417,7 +417,7 @@
 doc_param_1=%(doc_profile_key)s
 doc_return=the resource connected of the contact with highest priority, or ""

-[getPresenceStatuses]
+[presence_statuses_get]
 type=method
 category=core
 sig_in=s
@@ -426,9 +426,9 @@
 doc=Return presence information of all contacts
 doc_param_0=%(doc_profile_key)s
 doc_return=Dict of presence with bare JID of contact as key, and value as follows:
-    A dict where key is the resource and the value is a tuple with (show, priority, statuses) as for [presenceUpdate]
+    A dict where key is the resource and the value is a tuple with (show, priority, statuses) as for [presence_update]

-[getWaitingSub]
+[sub_waiting_get]
 type=method
 category=core
 sig_in=s
@@ -438,7 +438,7 @@
 doc_param_0=%(doc_profile_key)s
 doc_return=Dict where contact JID is the key, and value is the subscription type

-[messageSend]
+[message_send]
 async=
 type=method
 category=core
@@ -458,7 +458,7 @@
 doc_param_4=extra: (serialised) optional data that can be used by a plugin to build more specific messages
 doc_param_5=%(doc_profile_key)s

-[messageEncryptionStart]
+[message_encryption_start]
 async=
 type=method
 category=core
@@ -474,7 +474,7 @@
     else a ConflictError will be raised
 doc_param_3=%(doc_profile_key)s

-[messageEncryptionStop]
+[message_encryption_stop]
 async=
 type=method
 category=core
@@ -484,7 +484,7 @@
 doc_param_0=to_jid: JID of the recipient (full jid if encryption must be stopped for one device only)
 doc_param_1=%(doc_profile_key)s

-[messageEncryptionGet]
+[message_encryption_get]
 type=method
 category=core
 sig_in=ss
@@ -499,21 +499,21 @@
     following key can be present if suitable:
     - directed_devices: list or resource where session is encrypted

-[encryptionNamespaceGet]
+[encryption_namespace_get]
 type=method
 category=core
 sig_in=s
 sig_out=s
 doc=Get algorithm namespace from its name

-[encryptionPluginsGet]
+[encryption_plugins_get]
 type=method
 category=core
 sig_in=
 sig_out=s
 doc=Retrieve registered plugins for encryption

-[encryptionTrustUIGet]
+[encryption_trust_ui_get]
 async=
 type=method
 category=core
@@ -525,7 +525,7 @@
 doc_param_2=%(doc_profile_key)s
 doc_return=(XMLUI) UI of the trust management

-[setPresence]
+[presence_set]
 type=method
 category=core
 sig_in=ssa{ss}s
@@ -536,8 +536,8 @@
 param_3_default="@DEFAULT@"
 doc=Set presence information for the profile
 doc_param_0=to_jid: the JID to whom we send the presence data (empty string for broadcast)
-doc_param_1=show: as for [presenceUpdate]
-doc_param_2=statuses: as for [presenceUpdate]
+doc_param_1=show: as for [presence_update]
+doc_param_2=statuses: as for [presence_update]
 doc_param_3=%(doc_profile_key)s

 [subscription]
@@ -551,7 +551,7 @@
 doc_param_1=entity: as for [subscribe]
 doc_param_2=%(doc_profile_key)s

-[getConfig]
+[config_get]
 type=method
 category=core
 sig_in=ss
@@ -560,7 +560,7 @@
 doc_param_0=section: section of the configuration file (empty string for DEFAULT)
 doc_param_1=name: name of the option

-[setParam]
+[param_set]
 type=method
 category=core
 sig_in=sssis
@@ -574,7 +574,7 @@
 doc_param_3=%(doc_security_limit)s
 doc_param_4=%(doc_profile_key)s

-[getParamA]
+[param_get_a]
 type=method
 category=core
 sig_in=ssss
@@ -582,12 +582,12 @@
 param_2_default="value"
 param_3_default="@DEFAULT@"
 doc=Helper method to get a parameter's attribute *when profile is connected*
-doc_param_0=name: as for [setParam]
-doc_param_1=category: as for [setParam]
+doc_param_0=name: as for [param_set]
+doc_param_1=category: as for [param_set]
 doc_param_2=attribute: Name of the attribute
 doc_param_3=%(doc_profile_key)s

-[privateDataGet]
+[private_data_get]
 async=
 type=method
 category=core
@@ -599,7 +599,7 @@
 doc_param_2=%(doc_profile_key)s
 doc_return=serialised data

-[privateDataSet]
+[private_data_set]
 async=
 type=method
 category=core
@@ -611,7 +611,7 @@
 doc_param_2=data: serialised data
 doc_param_3=%(doc_profile_key)s

-[privateDataDelete]
+[private_data_delete]
 async=
 type=method
 category=core
@@ -622,7 +622,7 @@
 doc_param_1=key: key of the data to delete
 doc_param_3=%(doc_profile_key)s

-[asyncGetParamA]
+[param_get_a_async]
 async=
 type=method
 category=core
@@ -632,13 +632,13 @@
 param_3_default=-1
 param_4_default="@DEFAULT@"
 doc=Helper method to get a parameter's attribute
-doc_param_0=name: as for [setParam]
-doc_param_1=category: as for [setParam]
+doc_param_0=name: as for [param_set]
+doc_param_1=category: as for [param_set]
 doc_param_2=attribute: Name of the attribute
 doc_param_3=%(doc_security_limit)s
 doc_param_4=%(doc_profile_key)s

-[asyncGetParamsValuesFromCategory]
+[params_values_from_category_get_async]
 async=
 type=method
 category=code
@@ -649,13 +649,13 @@
 param_3_default=""
 param_4_default="@DEFAULT@"
 doc=Get "attribute" for all params of a category
-doc_param_0=category: as for [setParam]
+doc_param_0=category: as for [param_set]
 doc_param_1=%(doc_security_limit)s
 doc_param_2=app: name of the frontend requesting the parameters, or '' to get all parameters
 doc_param_3=extra: extra options/filters
 doc_param_4=%(doc_profile_key)s

-[getParamsUI]
+[param_ui_get]
 async=
 type=method
 category=core
@@ -671,7 +671,7 @@
 doc_param_2=extra: extra options/filters
 doc_param_3=%(doc_profile_key)s

-[getParamsCategories]
+[params_categories_get]
 type=method
 category=core
 sig_in=
@@ -679,7 +679,7 @@
 doc=Get all categories currently existing in parameters
 doc_return=list of categories

-[paramsRegisterApp]
+[params_register_app]
 type=method
 category=core
 sig_in=sis
@@ -691,7 +691,7 @@
 doc_param_1=%(doc_security_limit)s
 doc_param_2=app: name of the frontend registering the parameters

-[historyGet]
+[history_get]
 async=
 type=method
 category=core
@@ -712,9 +712,9 @@
     - not_types: type must not be one of those, values are separated by spaces
     - before_uid: check only messages received before message with given uid
 doc_param_5=%(doc_profile)s
-doc_return=Ordered list (by timestamp) of data as in [messageNew] (without final profile)
+doc_return=Ordered list (by timestamp) of data as in [message_new] (without final profile)

-[addContact]
+[contact_add]
 type=method
 category=core
 sig_in=ss
@@ -724,7 +724,7 @@
 doc_param_0=entity_jid: JID to add to roster
 doc_param_1=%(doc_profile_key)s

-[updateContact]
+[contact_update]
 type=method
 category=core
 sig_in=ssass
@@ -736,7 +736,7 @@
 doc_param_2=groups: list of group where the entity is
 doc_param_3=%(doc_profile_key)s

-[delContact]
+[contact_del]
 async=
 type=method
 category=core
@@ -747,7 +747,7 @@
 doc_param_0=entity_jid: JID to remove from roster
 doc_param_1=%(doc_profile_key)s

-[rosterResync]
+[roster_resync]
 async=
 type=method
 category=core
@@ -757,7 +757,7 @@
 doc=Do a full resynchronisation of roster with server
 doc_param_0=%(doc_profile_key)s

-[launchAction]
+[action_launch]
 async=
 type=method
 category=core
@@ -771,7 +771,7 @@
 doc_return=dict where key can be:
     - xmlui: a XMLUI need to be displayed

-[actionsGet]
+[actions_get]
 type=method
 category=core
 sig_in=s
@@ -779,9 +779,9 @@
 param_0_default="@DEFAULT@"
 doc=Get all not yet answered actions
 doc_param_0=%(doc_profile_key)s
-doc_return=list of data as for [actionNew] (without the profile)
+doc_return=list of data as for [action_new] (without the profile)

-[progressGet]
+[progress_get]
 type=method
 category=core
 sig_in=ss
@@ -794,7 +794,7 @@
     - size: end position (optional if not known)
     other metadata may be present

-[progressGetAllMetadata]
+[progress_get_all_metadata]
 type=method
 category=core
 sig_in=s
@@ -803,9 +803,9 @@
 doc_param_0=%(doc_profile)s or C.PROF_KEY_ALL for all profiles
 doc_return= a dict which maps profile to progress_dict
     progress_dict maps progress_id to progress_metadata
-    progress_metadata is the same dict as sent by [progressStarted]
+    progress_metadata is the same dict as sent by [progress_started]

-[progressGetAll]
+[progress_get_all]
 type=method
 category=core
 sig_in=s
@@ -814,9 +814,9 @@
 doc_param_0=%(doc_profile)s or C.PROF_KEY_ALL for all profiles
 doc_return= a dict which maps profile to progress_dict
     progress_dict maps progress_id to progress_data
-    progress_data is the same dict as returned by [progressGet]
+    progress_data is the same dict as returned by [progress_get]

-[menusGet]
+[menus_get]
 type=method
 category=core
 sig_in=si
@@ -832,7 +832,7 @@
     - menu_path_i18n: translated path of the menu
     - extra: extra data, like icon name

-[menuLaunch]
+[menu_launch]
 async=
 type=method
 category=core
@@ -847,7 +847,7 @@
 doc_return=dict where key can be:
     - xmlui: a XMLUI need to be displayed

-[menuHelpGet]
+[menu_help_get]
 type=method
 category=core
 sig_in=ss
@@ -858,7 +858,7 @@
 doc_param_1=language: language in which the menu should be translated (empty string for default)
 doc_return=Translated help string

-[discoInfos]
+[disco_infos]
 async=
 type=method
 category=core
@@ -884,7 +884,7 @@
         * desc
     - list of values

-[discoItems]
+[disco_items]
 async=
 type=method
 category=core
@@ -900,7 +900,7 @@
 doc_param_3=%(doc_profile_key)s
 doc_return=array of tuple (entity, node identifier, name)

-[discoFindByFeatures]
+[disco_find_by_features]
 async=
 type=method
 category=core
@@ -927,7 +927,7 @@
     - own entities (i.e. entities linked to profile's jid)
     - roster entities

-[saveParamsTemplate]
+[params_template_save]
 type=method
 category=core
 sig_in=s
@@ -936,7 +936,7 @@
 doc_param_0=filename: output filename
 doc_return=boolean (True in case of success)

-[loadParamsTemplate]
+[params_template_load]
 type=method
 category=core
 sig_in=s
@@ -945,7 +945,7 @@
 doc_param_0=filename: input filename
 doc_return=boolean (True in case of success)

-[sessionInfosGet]
+[session_infos_get]
 async=
 type=method
 category=core
@@ -957,7 +957,7 @@
     jid: current full jid
     started: date of creation of the session (Epoch time)

-[devicesInfosGet]
+[devices_infos_get]
 async=
 type=method
 category=core
@@ -970,7 +970,7 @@
 doc_return=list of known devices, where each item is a dict with at least the following keys:
     resource: device resource

-[namespacesGet]
+[namespaces_get]
 type=method
 category=core
 sig_in=
@@ -978,7 +978,7 @@
 doc=Get a dict mapping short name => whole namespace
 doc_return=namespaces mapping

-[imageCheck]
+[image_check]
 type=method
 category=core
 sig_in=s
@@ -986,7 +986,7 @@
 doc=Analyze an image and return a report
 doc_return=serialized report

-[imageResize]
+[image_resize]
 async=
 type=method
 category=core
@@ -999,7 +999,7 @@
 doc_return=path of the new image with desired size
     the image must be deleted once not needed anymore

-[imageGeneratePreview]
+[image_generate_preview]
 async=
 type=method
 category=core
@@ -1010,7 +1010,7 @@
 doc_param_1=%(doc_profile_key)s
 doc_return=path to the preview in cache

-[imageConvert]
+[image_convert]
 async=
 type=method
 category=core
diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/dbus-xml/constructor.py
--- a/sat/bridge/bridge_constructor/constructors/dbus-xml/constructor.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/bridge/bridge_constructor/constructors/dbus-xml/constructor.py	Sat Apr 08 13:54:42 2023 +0200
@@ -38,9 +38,9 @@
        "a{sa{s(sia{ss})}}": "PresenceStatusT",
    }

-    def generateCoreSide(self):
+    def generate_core_side(self):
        try:
-            doc = minidom.parse(self.getTemplatePath(self.template))
+            doc = minidom.parse(self.get_template_path(self.template))
            interface_elt = doc.getElementsByTagName("interface")[0]
        except IOError:
            print("Can't access template")
@@ -60,8 +60,8 @@
            new_elt.setAttribute("name", section)

            idx = 0
-            args_doc = self.getArgumentsDoc(section)
-            for arg in self.argumentsParser(function["sig_in"] or ""):
+            args_doc = self.get_arguments_doc(section)
+            for arg in self.arguments_parser(function["sig_in"] or ""):
                arg_elt = doc.createElement("arg")
                arg_elt.setAttribute(
                    "name", args_doc[idx][0] if idx in args_doc else "arg_%i" % idx
@@ -99,4 +99,4 @@
            interface_elt.appendChild(new_elt)

        # now we write to final file
-        self.finalWrite(self.core_dest, [doc.toprettyxml()])
+        self.final_write(self.core_dest, [doc.toprettyxml()])
diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/dbus/constructor.py
--- a/sat/bridge/bridge_constructor/constructors/dbus/constructor.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/bridge/bridge_constructor/constructors/dbus/constructor.py	Sat Apr 08 13:54:42 2023 +0200
@@ -81,7 +81,7 @@
            "debug": ""
            if not self.args.debug
            else 'log.debug ("%s")\n%s' % (completion["name"], 8 * " "),
-            "args_result": self.getArguments(function["sig_in"], name=arg_doc),
+            "args_result": self.get_arguments(function["sig_in"], name=arg_doc),
            "async_args": "callback=None, errback=None",
            "async_comma": ", " if function["sig_in"] else "",
            "error_handler": """if callback is None:
diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/dbus/dbus_core_template.py
--- a/sat/bridge/bridge_constructor/constructors/dbus/dbus_core_template.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/bridge/bridge_constructor/constructors/dbus/dbus_core_template.py	Sat Apr 08 13:54:42 2023 +0200
@@ -30,8 +30,8 @@
 log = getLogger(__name__)

 # Interface prefix
-const_INT_PREFIX = config.getConfig(
-    config.parseMainConf(),
+const_INT_PREFIX = config.config_get(
+    config.parse_main_conf(),
    "",
    "bridge_dbus_int_prefix",
    "org.libervia.Libervia")
@@ -118,13 +118,13 @@

 ##METHODS_PART##

-class Bridge:
+class bridge:

    def __init__(self):
        log.info("Init DBus...")
        self._obj = DBusObject(const_OBJ_PATH)

-    async def postInit(self):
+    async def post_init(self):
        try:
            conn = await client.connect(reactor)
        except error.DBusException as e:
@@ -145,13 +145,13 @@
        log.debug(f"registering DBus bridge method [{name}]")
        self._obj.register_method(name, callback)

-    def emitSignal(self, name, *args):
+    def emit_signal(self, name, *args):
        self._obj.emitSignal(name, *args)

-    def addMethod(
+    def add_method(
        self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={}
    ):
-        """Dynamically add a method to D-Bus Bridge"""
+        """Dynamically add a method to D-Bus bridge"""
        # FIXME: doc parameter is kept only temporary, the time to remove it from calls
        log.debug(f"Adding method {name!r} to D-Bus bridge")
        self._obj.plugin_iface.addMethod(
@@ -164,8 +164,8 @@
        setattr(self._obj, f"dbus_{name}", MethodType(caller, self._obj))
        self.register_method(name, method)

-    def addSignal(self, name, int_suffix, signature, doc={}):
-        """Dynamically add a signal to D-Bus Bridge"""
+    def add_signal(self, name, int_suffix, signature, doc={}):
+        """Dynamically add a signal to D-Bus bridge"""
        log.debug(f"Adding signal {name!r} to D-Bus bridge")
        self._obj.plugin_iface.addSignal(Signal(name, signature))
-        setattr(Bridge, name, partialmethod(Bridge.emitSignal, name))
+        setattr(bridge, name, partialmethod(bridge.emit_signal, name))
diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/dbus/dbus_frontend_template.py
--- a/sat/bridge/bridge_constructor/constructors/dbus/dbus_frontend_template.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/bridge/bridge_constructor/constructors/dbus/dbus_frontend_template.py	Sat Apr 08 13:54:42 2023 +0200
@@ -32,8 +32,8 @@

 # Interface prefix
-const_INT_PREFIX = config.getConfig(
-    config.parseMainConf(),
+const_INT_PREFIX = config.config_get(
+    config.parse_main_conf(),
    "",
    "bridge_dbus_int_prefix",
    "org.libervia.Libervia")
@@ -66,9 +66,9 @@
    return BridgeException(name, message, condition)

-class Bridge:
+class bridge:

-    def bridgeConnect(self, callback, errback):
+    def bridge_connect(self, callback, errback):
        try:
            self.sessions_bus = dbus.SessionBus()
            self.db_object = self.sessions_bus.get_object(const_INT_PREFIX,
@@ -105,7 +105,7 @@
        except AttributeError:
            # The attribute is not found, we try the plugin proxy to find the requested method

-            def getPluginMethod(*args, **kwargs):
+            def get_plugin_method(*args, **kwargs):
                # We first check if we have an async call.
We detect this in two ways: # - if we have the 'callback' and 'errback' keyword arguments # - or if the last two arguments are callable @@ -156,11 +156,11 @@ return self.db_plugin_iface.get_dbus_method(name)(*args, **kwargs) raise e - return getPluginMethod + return get_plugin_method ##METHODS_PART## -class AIOBridge(Bridge): +class AIOBridge(bridge): def register_signal(self, functionName, handler, iface="core"): loop = asyncio.get_running_loop() @@ -173,7 +173,7 @@ return object.__getattribute__(self, name) except AttributeError: # The attribute is not found, we try the plugin proxy to find the requested method - def getPluginMethod(*args, **kwargs): + def get_plugin_method(*args, **kwargs): loop = asyncio.get_running_loop() fut = loop.create_future() method = getattr(self.db_plugin_iface, name) @@ -191,7 +191,7 @@ ) except ValueError as e: if e.args[0].startswith("Unable to guess signature"): - # same hack as for Bridge.__getattribute__ + # same hack as for bridge.__getattribute__ log.warning("using hack to work around inspection issue") proxy = self.db_plugin_iface.proxy_object IN_PROGRESS = proxy.INTROSPECT_STATE_INTROSPECT_IN_PROGRESS @@ -209,12 +209,12 @@ raise e return fut - return getPluginMethod + return get_plugin_method - def bridgeConnect(self): + def bridge_connect(self): loop = asyncio.get_running_loop() fut = loop.create_future() - super().bridgeConnect( + super().bridge_connect( callback=lambda: loop.call_soon_threadsafe(fut.set_result, None), errback=lambda e: loop.call_soon_threadsafe(fut.set_exception, e) ) diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/embedded/constructor.py --- a/sat/bridge/bridge_constructor/constructors/embedded/constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/embedded/constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -51,7 +51,7 @@ "debug": "" if not self.args.debug else 'log.debug ("%s")\n%s' % (completion["name"], 8 * " "), - "args_result": 
self.getArguments(function["sig_in"], name=arg_doc), + "args_result": self.get_arguments(function["sig_in"], name=arg_doc), "args_comma": ", " if function["sig_in"] else "", } ) @@ -96,5 +96,5 @@ def core_completion_signal(self, completion, function, default, arg_doc, async_): completion.update( - {"args_result": self.getArguments(function["sig_in"], name=arg_doc)} + {"args_result": self.get_arguments(function["sig_in"], name=arg_doc)} ) diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/embedded/embedded_frontend_template.py --- a/sat/bridge/bridge_constructor/constructors/embedded/embedded_frontend_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/embedded/embedded_frontend_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -17,4 +17,4 @@ # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . -from sat.bridge.embedded import Bridge +from sat.bridge.embedded import bridge diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/embedded/embedded_template.py --- a/sat/bridge/bridge_constructor/constructors/embedded/embedded_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/embedded/embedded_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,7 +29,7 @@ self._methods_cbs = {} self._signals_cbs = {"core": {}, "plugin": {}} - def bridgeConnect(self, callback, errback): + def bridge_connect(self, callback, errback): callback() def register_method(self, name, callback): @@ -85,7 +85,7 @@ else: cb(*args, **kwargs) - def addMethod(self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={}): + def add_method(self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={}): # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug("Adding method [{}] to embedded bridge".format(name)) self.register_method(name, method) @@ -97,7 +97,7 @@ 
), ) - def addSignal(self, name, int_suffix, signature, doc={}): + def add_signal(self, name, int_suffix, signature, doc={}): setattr( self.__class__, name, @@ -116,7 +116,7 @@ bridge = None -def Bridge(): +def bridge(): global bridge if bridge is None: bridge = _Bridge() diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/mediawiki/constructor.py --- a/sat/bridge/bridge_constructor/constructors/mediawiki/constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/mediawiki/constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,7 +29,7 @@ self.core_template = "mediawiki_template.tpl" self.core_dest = "mediawiki.wiki" - def _addTextDecorations(self, text): + def _add_text_decorations(self, text): """Add text decorations like coloration or shortcuts""" def anchor_link(match): @@ -42,43 +42,43 @@ return re.sub(r"\[(\w+)\]", anchor_link, text) - def _wikiParameter(self, name, sig_in): + def _wiki_parameter(self, name, sig_in): """Format parameters with the wiki syntax @param name: name of the function @param sig_in: signature in @return: string of the formated parameters""" - arg_doc = self.getArgumentsDoc(name) - arg_default = self.getDefault(name) - args_str = self.getArguments(sig_in) + arg_doc = self.get_arguments_doc(name) + arg_default = self.get_default(name) + args_str = self.get_arguments(sig_in) args = args_str.split(", ") if args_str else [] # ugly but it works :) wiki = [] for i in range(len(args)): if i in arg_doc: name, doc = arg_doc[i] doc = "\n:".join(doc.rstrip("\n").split("\n")) - wiki.append("; %s: %s" % (name, self._addTextDecorations(doc))) + wiki.append("; %s: %s" % (name, self._add_text_decorations(doc))) else: wiki.append("; arg_%d: " % i) if i in arg_default: wiki.append(":''DEFAULT: %s''" % arg_default[i]) return "\n".join(wiki) - def _wikiReturn(self, name): + def _wiki_return(self, name): """Format return doc with the wiki syntax @param name: name of the function """ - arg_doc = 
self.getArgumentsDoc(name) + arg_doc = self.get_arguments_doc(name) wiki = [] if "return" in arg_doc: wiki.append("\n|-\n! scope=row | return value\n|") wiki.append( "
\n".join( - self._addTextDecorations(arg_doc["return"]).rstrip("\n").split("\n") + self._add_text_decorations(arg_doc["return"]).rstrip("\n").split("\n") ) ) return "\n".join(wiki) - def generateCoreSide(self): + def generate_core_side(self): signals_part = [] methods_part = [] sections = self.bridge_template.sections() @@ -114,13 +114,13 @@ "sig_out": function["sig_out"] or "", "category": function["category"], "name": section, - "doc": self.getDoc(section) or "FIXME: No description available", + "doc": self.get_doc(section) or "FIXME: No description available", "async": async_msg if "async" in self.getFlags(section) else "", "deprecated": deprecated_msg if "deprecated" in self.getFlags(section) else "", - "parameters": self._wikiParameter(section, function["sig_in"]), - "return": self._wikiReturn(section) + "parameters": self._wiki_parameter(section, function["sig_in"]), + "return": self._wiki_return(section) if function["type"] == "method" else "", } @@ -148,7 +148,7 @@ # at this point, signals_part, and methods_part should be filled, # we just have to place them in the right part of the template core_bridge = [] - template_path = self.getTemplatePath(self.core_template) + template_path = self.get_template_path(self.core_template) try: with open(template_path) as core_template: for line in core_template: @@ -165,4 +165,4 @@ sys.exit(1) # now we write to final file - self.finalWrite(self.core_dest, core_bridge) + self.final_write(self.core_dest, core_bridge) diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/pb/constructor.py --- a/sat/bridge/bridge_constructor/constructors/pb/constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/pb/constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -26,7 +26,7 @@ CORE_FORMATS = { "signals": """\ def {name}(self, {args}): - {debug}self.sendSignal("{name}", {args_no_def})\n""" + {debug}self.send_signal("{name}", {args_no_def})\n""" } FRONTEND_TEMPLATE = 
"pb_frontend_template.py" @@ -49,7 +49,7 @@ } def core_completion_signal(self, completion, function, default, arg_doc, async_): - completion["args_no_def"] = self.getArguments(function["sig_in"], name=arg_doc) + completion["args_no_def"] = self.get_arguments(function["sig_in"], name=arg_doc) completion["debug"] = ( "" if not self.args.debug @@ -60,7 +60,7 @@ completion.update( { "args_comma": ", " if function["sig_in"] else "", - "args_no_def": self.getArguments(function["sig_in"], name=arg_doc), + "args_no_def": self.get_arguments(function["sig_in"], name=arg_doc), "callback": "callback" if function["sig_out"] else "lambda __: callback()", diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/pb/pb_core_template.py --- a/sat/bridge/bridge_constructor/constructors/pb/pb_core_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/pb/pb_core_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,17 +55,17 @@ def __init__(self): self.signals_handlers = [] - def remote_initBridge(self, signals_handler): + def remote_init_bridge(self, signals_handler): self.signals_handlers.append(HandlerWrapper(signals_handler)) log.info("registered signal handler") - def sendSignalEb(self, failure_, signal_name): + def send_signal_eb(self, failure_, signal_name): if not failure_.check(pb.PBConnectionLost): log.error( f"Error while sending signal {signal_name}: {failure_}", ) - def sendSignal(self, name, args, kwargs): + def send_signal(self, name, args, kwargs): to_remove = [] for wrapper in self.signals_handlers: handler = wrapper.handler @@ -74,13 +74,13 @@ except pb.DeadReferenceError: to_remove.append(wrapper) else: - d.addErrback(self.sendSignalEb, name) + d.addErrback(self.send_signal_eb, name) if to_remove: for wrapper in to_remove: log.debug("Removing signal handler for dead frontend") self.signals_handlers.remove(wrapper) - def _bridgeDeactivateSignals(self): + def _bridge_deactivate_signals(self): if hasattr(self, 
"signals_paused"): log.warning("bridge signals already deactivated") if self.signals_handler: @@ -90,7 +90,7 @@ self.signals_handlers = [] log.debug("bridge signals have been deactivated") - def _bridgeReactivateSignals(self): + def _bridge_reactivate_signals(self): try: self.signals_handlers = self.signals_paused except AttributeError: @@ -102,31 +102,31 @@ ##METHODS_PART## -class Bridge(object): +class bridge(object): def __init__(self): log.info("Init Perspective Broker...") self.root = PBRoot() - conf = config.parseMainConf() - getConf = partial(config.getConf, conf, "bridge_pb", "") - conn_type = getConf("connection_type", "unix_socket") + conf = config.parse_main_conf() + get_conf = partial(config.get_conf, conf, "bridge_pb", "") + conn_type = get_conf("connection_type", "unix_socket") if conn_type == "unix_socket": - local_dir = Path(config.getConfig(conf, "", "local_dir")).resolve() + local_dir = Path(config.config_get(conf, "", "local_dir")).resolve() socket_path = local_dir / "bridge_pb" log.info(f"using UNIX Socket at {socket_path}") reactor.listenUNIX( str(socket_path), pb.PBServerFactory(self.root), mode=0o600 ) elif conn_type == "socket": - port = int(getConf("port", 8789)) + port = int(get_conf("port", 8789)) log.info(f"using TCP Socket at port {port}") reactor.listenTCP(port, pb.PBServerFactory(self.root)) else: raise ValueError(f"Unknown pb connection type: {conn_type!r}") - def sendSignal(self, name, *args, **kwargs): - self.root.sendSignal(name, args, kwargs) + def send_signal(self, name, *args, **kwargs): + self.root.send_signal(name, args, kwargs) - def remote_initBridge(self, signals_handler): + def remote_init_bridge(self, signals_handler): self.signals_handlers.append(signals_handler) log.info("registered signal handler") @@ -135,32 +135,32 @@ setattr(self.root, "remote_" + name, callback) #  self.root.register_method(name, callback) - def addMethod( + def add_method( self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={} ): 
- """Dynamically add a method to PB Bridge""" + """Dynamically add a method to PB bridge""" # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug("Adding method {name} to PB bridge".format(name=name)) self.register_method(name, method) - def addSignal(self, name, int_suffix, signature, doc={}): + def add_signal(self, name, int_suffix, signature, doc={}): log.debug("Adding signal {name} to PB bridge".format(name=name)) setattr( - self, name, lambda *args, **kwargs: self.sendSignal(name, *args, **kwargs) + self, name, lambda *args, **kwargs: self.send_signal(name, *args, **kwargs) ) - def bridgeDeactivateSignals(self): + def bridge_deactivate_signals(self): """Stop sending signals to bridge Mainly used for mobile frontends, when the frontend is paused """ - self.root._bridgeDeactivateSignals() + self.root._bridge_deactivate_signals() - def bridgeReactivateSignals(self): + def bridge_reactivate_signals(self): """Send again signals to bridge - Should only be used after bridgeDeactivateSignals has been called + Should only be used after bridge_deactivate_signals has been called """ - self.root._bridgeReactivateSignals() + self.root._bridge_reactivate_signals() ##SIGNALS_PART## diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/bridge_constructor/constructors/pb/pb_frontend_template.py --- a/sat/bridge/bridge_constructor/constructors/pb/pb_frontend_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/pb/pb_frontend_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,7 +55,7 @@ setattr(self, method_name, handler) -class Bridge(object): +class bridge(object): def __init__(self): self.signals_handler = SignalsHandler() @@ -75,7 +75,7 @@ ) ) - def remoteCallback(self, result, callback): + def remote_callback(self, result, callback): """call callback with argument or None if result is not None not argument is used, @@ -112,11 +112,11 @@ callback = args.pop() d = self.root.callRemote(name, *args, **kwargs) 
if callback is not None: - d.addCallback(self.remoteCallback, callback) + d.addCallback(self.remote_callback, callback) if errback is not None: d.addErrback(errback) - def _initBridgeEb(self, failure_): + def _init_bridge_eb(self, failure_): log.error("Can't init bridge: {msg}".format(msg=failure_)) return failure_ @@ -127,28 +127,28 @@ """ self.root = root d = root.callRemote("initBridge", self.signals_handler) - d.addErrback(self._initBridgeEb) + d.addErrback(self._init_bridge_eb) return d - def getRootObjectEb(self, failure_): + def get_root_object_eb(self, failure_): """Call errback with appropriate bridge error""" if failure_.check(ConnectionRefusedError, ConnectError): raise exceptions.BridgeExceptionNoService else: raise failure_ - def bridgeConnect(self, callback, errback): + def bridge_connect(self, callback, errback): factory = pb.PBClientFactory() - conf = config.parseMainConf() - getConf = partial(config.getConf, conf, "bridge_pb", "") - conn_type = getConf("connection_type", "unix_socket") + conf = config.parse_main_conf() + get_conf = partial(config.get_conf, conf, "bridge_pb", "") + conn_type = get_conf("connection_type", "unix_socket") if conn_type == "unix_socket": - local_dir = Path(config.getConfig(conf, "", "local_dir")).resolve() + local_dir = Path(config.config_get(conf, "", "local_dir")).resolve() socket_path = local_dir / "bridge_pb" reactor.connectUNIX(str(socket_path), factory) elif conn_type == "socket": - host = getConf("host", "localhost") - port = int(getConf("port", 8789)) + host = get_conf("host", "localhost") + port = int(get_conf("port", 8789)) reactor.connectTCP(host, port, factory) else: raise ValueError(f"Unknown pb connection type: {conn_type!r}") @@ -156,7 +156,7 @@ d.addCallback(self._set_root) if callback is not None: d.addCallback(lambda __: callback()) - d.addErrback(self.getRootObjectEb) + d.addErrback(self.get_root_object_eb) if errback is not None: d.addErrback(lambda failure_: errback(failure_.value)) return d @@ 
-175,7 +175,7 @@ return super().register_signal(name, async_handler, iface) -class AIOBridge(Bridge): +class AIOBridge(bridge): def __init__(self): self.signals_handler = AIOSignalsHandler() @@ -192,8 +192,8 @@ d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - async def bridgeConnect(self): - d = super().bridgeConnect(callback=None, errback=None) + async def bridge_connect(self): + d = super().bridge_connect(callback=None, errback=None) return await d.asFuture(asyncio.get_event_loop()) ##ASYNC_METHODS_PART## diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/dbus_bridge.py --- a/sat/bridge/dbus_bridge.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/dbus_bridge.py Sat Apr 08 13:54:42 2023 +0200 @@ -30,8 +30,8 @@ log = getLogger(__name__) # Interface prefix -const_INT_PREFIX = config.getConfig( - config.parseMainConf(), +const_INT_PREFIX = config.config_get( + config.parse_main_conf(), "", "bridge_dbus_int_prefix", "org.libervia.Libervia") @@ -88,87 +88,87 @@ core_iface = DBusInterface( const_INT_PREFIX + const_CORE_SUFFIX, - Method('actionsGet', arguments='s', returns='a(a{ss}si)'), - Method('addContact', arguments='ss', returns=''), - Method('asyncDeleteProfile', arguments='s', returns=''), - Method('asyncGetParamA', arguments='sssis', returns='s'), - Method('asyncGetParamsValuesFromCategory', arguments='sisss', returns='a{ss}'), + Method('action_launch', arguments='sa{ss}s', returns='a{ss}'), + Method('actions_get', arguments='s', returns='a(a{ss}si)'), + Method('config_get', arguments='ss', returns='s'), Method('connect', arguments='ssa{ss}', returns='b'), - Method('contactGet', arguments='ss', returns='(a{ss}as)'), - Method('delContact', arguments='ss', returns=''), - Method('devicesInfosGet', arguments='ss', returns='s'), - Method('discoFindByFeatures', arguments='asa(ss)bbbbbs', returns='(a{sa(sss)}a{sa(sss)}a{sa(sss)})'), - Method('discoInfos', arguments='ssbs', returns='(asa(sss)a{sa(a{ss}as)})'), - Method('discoItems', 
arguments='ssbs', returns='a(sss)'), + Method('contact_add', arguments='ss', returns=''), + Method('contact_del', arguments='ss', returns=''), + Method('contact_get', arguments='ss', returns='(a{ss}as)'), + Method('contact_update', arguments='ssass', returns=''), + Method('contacts_get', arguments='s', returns='a(sa{ss}as)'), + Method('contacts_get_from_group', arguments='ss', returns='as'), + Method('devices_infos_get', arguments='ss', returns='s'), + Method('disco_find_by_features', arguments='asa(ss)bbbbbs', returns='(a{sa(sss)}a{sa(sss)}a{sa(sss)})'), + Method('disco_infos', arguments='ssbs', returns='(asa(sss)a{sa(a{ss}as)})'), + Method('disco_items', arguments='ssbs', returns='a(sss)'), Method('disconnect', arguments='s', returns=''), - Method('encryptionNamespaceGet', arguments='s', returns='s'), - Method('encryptionPluginsGet', arguments='', returns='s'), - Method('encryptionTrustUIGet', arguments='sss', returns='s'), - Method('getConfig', arguments='ss', returns='s'), - Method('getContacts', arguments='s', returns='a(sa{ss}as)'), - Method('getContactsFromGroup', arguments='ss', returns='as'), - Method('getEntitiesData', arguments='asass', returns='a{sa{ss}}'), - Method('getEntityData', arguments='sass', returns='a{ss}'), - Method('getFeatures', arguments='s', returns='a{sa{ss}}'), - Method('getMainResource', arguments='ss', returns='s'), - Method('getParamA', arguments='ssss', returns='s'), - Method('getParamsCategories', arguments='', returns='as'), - Method('getParamsUI', arguments='isss', returns='s'), - Method('getPresenceStatuses', arguments='s', returns='a{sa{s(sia{ss})}}'), - Method('getReady', arguments='', returns=''), - Method('getVersion', arguments='', returns='s'), - Method('getWaitingSub', arguments='s', returns='a{ss}'), - Method('historyGet', arguments='ssiba{ss}s', returns='a(sdssa{ss}a{ss}ss)'), - Method('imageCheck', arguments='s', returns='s'), - Method('imageConvert', arguments='ssss', returns='s'), - Method('imageGeneratePreview', 
arguments='ss', returns='s'), - Method('imageResize', arguments='sii', returns='s'), - Method('isConnected', arguments='s', returns='b'), - Method('launchAction', arguments='sa{ss}s', returns='a{ss}'), - Method('loadParamsTemplate', arguments='s', returns='b'), - Method('menuHelpGet', arguments='ss', returns='s'), - Method('menuLaunch', arguments='sasa{ss}is', returns='a{ss}'), - Method('menusGet', arguments='si', returns='a(ssasasa{ss})'), - Method('messageEncryptionGet', arguments='ss', returns='s'), - Method('messageEncryptionStart', arguments='ssbs', returns=''), - Method('messageEncryptionStop', arguments='ss', returns=''), - Method('messageSend', arguments='sa{ss}a{ss}sss', returns=''), - Method('namespacesGet', arguments='', returns='a{ss}'), - Method('paramsRegisterApp', arguments='sis', returns=''), - Method('privateDataDelete', arguments='sss', returns=''), - Method('privateDataGet', arguments='sss', returns='s'), - Method('privateDataSet', arguments='ssss', returns=''), - Method('profileCreate', arguments='sss', returns=''), - Method('profileIsSessionStarted', arguments='s', returns='b'), - Method('profileNameGet', arguments='s', returns='s'), - Method('profileSetDefault', arguments='s', returns=''), - Method('profileStartSession', arguments='ss', returns='b'), - Method('profilesListGet', arguments='bb', returns='as'), - Method('progressGet', arguments='ss', returns='a{ss}'), - Method('progressGetAll', arguments='s', returns='a{sa{sa{ss}}}'), - Method('progressGetAllMetadata', arguments='s', returns='a{sa{sa{ss}}}'), - Method('rosterResync', arguments='s', returns=''), - Method('saveParamsTemplate', arguments='s', returns='b'), - Method('sessionInfosGet', arguments='s', returns='a{ss}'), - Method('setParam', arguments='sssis', returns=''), - Method('setPresence', arguments='ssa{ss}s', returns=''), + Method('encryption_namespace_get', arguments='s', returns='s'), + Method('encryption_plugins_get', arguments='', returns='s'), + 
Method('encryption_trust_ui_get', arguments='sss', returns='s'), + Method('entities_data_get', arguments='asass', returns='a{sa{ss}}'), + Method('entity_data_get', arguments='sass', returns='a{ss}'), + Method('features_get', arguments='s', returns='a{sa{ss}}'), + Method('history_get', arguments='ssiba{ss}s', returns='a(sdssa{ss}a{ss}ss)'), + Method('image_check', arguments='s', returns='s'), + Method('image_convert', arguments='ssss', returns='s'), + Method('image_generate_preview', arguments='ss', returns='s'), + Method('image_resize', arguments='sii', returns='s'), + Method('is_connected', arguments='s', returns='b'), + Method('main_resource_get', arguments='ss', returns='s'), + Method('menu_help_get', arguments='ss', returns='s'), + Method('menu_launch', arguments='sasa{ss}is', returns='a{ss}'), + Method('menus_get', arguments='si', returns='a(ssasasa{ss})'), + Method('message_encryption_get', arguments='ss', returns='s'), + Method('message_encryption_start', arguments='ssbs', returns=''), + Method('message_encryption_stop', arguments='ss', returns=''), + Method('message_send', arguments='sa{ss}a{ss}sss', returns=''), + Method('namespaces_get', arguments='', returns='a{ss}'), + Method('param_get_a', arguments='ssss', returns='s'), + Method('param_get_a_async', arguments='sssis', returns='s'), + Method('param_set', arguments='sssis', returns=''), + Method('param_ui_get', arguments='isss', returns='s'), + Method('params_categories_get', arguments='', returns='as'), + Method('params_register_app', arguments='sis', returns=''), + Method('params_template_load', arguments='s', returns='b'), + Method('params_template_save', arguments='s', returns='b'), + Method('params_values_from_category_get_async', arguments='sisss', returns='a{ss}'), + Method('presence_set', arguments='ssa{ss}s', returns=''), + Method('presence_statuses_get', arguments='s', returns='a{sa{s(sia{ss})}}'), + Method('private_data_delete', arguments='sss', returns=''), + Method('private_data_get', 
arguments='sss', returns='s'), + Method('private_data_set', arguments='ssss', returns=''), + Method('profile_create', arguments='sss', returns=''), + Method('profile_delete_async', arguments='s', returns=''), + Method('profile_is_session_started', arguments='s', returns='b'), + Method('profile_name_get', arguments='s', returns='s'), + Method('profile_set_default', arguments='s', returns=''), + Method('profile_start_session', arguments='ss', returns='b'), + Method('profiles_list_get', arguments='bb', returns='as'), + Method('progress_get', arguments='ss', returns='a{ss}'), + Method('progress_get_all', arguments='s', returns='a{sa{sa{ss}}}'), + Method('progress_get_all_metadata', arguments='s', returns='a{sa{sa{ss}}}'), + Method('ready_get', arguments='', returns=''), + Method('roster_resync', arguments='s', returns=''), + Method('session_infos_get', arguments='s', returns='a{ss}'), + Method('sub_waiting_get', arguments='s', returns='a{ss}'), Method('subscription', arguments='sss', returns=''), - Method('updateContact', arguments='ssass', returns=''), + Method('version_get', arguments='', returns='s'), Signal('_debug', 'sa{ss}s'), - Signal('actionNew', 'a{ss}sis'), + Signal('action_new', 'a{ss}sis'), Signal('connected', 'ss'), - Signal('contactDeleted', 'ss'), + Signal('contact_deleted', 'ss'), + Signal('contact_new', 'sa{ss}ass'), Signal('disconnected', 's'), - Signal('entityDataUpdated', 'ssss'), - Signal('messageEncryptionStarted', 'sss'), - Signal('messageEncryptionStopped', 'sa{ss}s'), - Signal('messageNew', 'sdssa{ss}a{ss}sss'), - Signal('newContact', 'sa{ss}ass'), - Signal('paramUpdate', 'ssss'), - Signal('presenceUpdate', 'ssia{ss}s'), - Signal('progressError', 'sss'), - Signal('progressFinished', 'sa{ss}s'), - Signal('progressStarted', 'sa{ss}s'), + Signal('entity_data_updated', 'ssss'), + Signal('message_encryption_started', 'sss'), + Signal('message_encryption_stopped', 'sa{ss}s'), + Signal('message_new', 'sdssa{ss}a{ss}sss'), + Signal('param_update', 
'ssss'), + Signal('presence_update', 'ssia{ss}s'), + Signal('progress_error', 'sss'), + Signal('progress_finished', 'sa{ss}s'), + Signal('progress_started', 'sa{ss}s'), Signal('subscribe', 'sss'), ) plugin_iface = DBusInterface( @@ -196,212 +196,212 @@ d.addErrback(GenericException.create_and_raise) return d - def dbus_actionsGet(self, profile_key="@DEFAULT@"): - return self._callback("actionsGet", profile_key) - - def dbus_addContact(self, entity_jid, profile_key="@DEFAULT@"): - return self._callback("addContact", entity_jid, profile_key) + def dbus_action_launch(self, callback_id, data, profile_key="@DEFAULT@"): + return self._callback("action_launch", callback_id, data, profile_key) - def dbus_asyncDeleteProfile(self, profile): - return self._callback("asyncDeleteProfile", profile) + def dbus_actions_get(self, profile_key="@DEFAULT@"): + return self._callback("actions_get", profile_key) - def dbus_asyncGetParamA(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@"): - return self._callback("asyncGetParamA", name, category, attribute, security_limit, profile_key) - - def dbus_asyncGetParamsValuesFromCategory(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@"): - return self._callback("asyncGetParamsValuesFromCategory", category, security_limit, app, extra, profile_key) + def dbus_config_get(self, section, name): + return self._callback("config_get", section, name) def dbus_connect(self, profile_key="@DEFAULT@", password='', options={}): return self._callback("connect", profile_key, password, options) - def dbus_contactGet(self, arg_0, profile_key="@DEFAULT@"): - return self._callback("contactGet", arg_0, profile_key) + def dbus_contact_add(self, entity_jid, profile_key="@DEFAULT@"): + return self._callback("contact_add", entity_jid, profile_key) + + def dbus_contact_del(self, entity_jid, profile_key="@DEFAULT@"): + return self._callback("contact_del", entity_jid, profile_key) - def dbus_delContact(self, 
entity_jid, profile_key="@DEFAULT@"): - return self._callback("delContact", entity_jid, profile_key) + def dbus_contact_get(self, arg_0, profile_key="@DEFAULT@"): + return self._callback("contact_get", arg_0, profile_key) - def dbus_devicesInfosGet(self, bare_jid, profile_key): - return self._callback("devicesInfosGet", bare_jid, profile_key) + def dbus_contact_update(self, entity_jid, name, groups, profile_key="@DEFAULT@"): + return self._callback("contact_update", entity_jid, name, groups, profile_key) + + def dbus_contacts_get(self, profile_key="@DEFAULT@"): + return self._callback("contacts_get", profile_key) - def dbus_discoFindByFeatures(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@"): - return self._callback("discoFindByFeatures", namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key) + def dbus_contacts_get_from_group(self, group, profile_key="@DEFAULT@"): + return self._callback("contacts_get_from_group", group, profile_key) + + def dbus_devices_infos_get(self, bare_jid, profile_key): + return self._callback("devices_infos_get", bare_jid, profile_key) - def dbus_discoInfos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): - return self._callback("discoInfos", entity_jid, node, use_cache, profile_key) + def dbus_disco_find_by_features(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@"): + return self._callback("disco_find_by_features", namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key) - def dbus_discoItems(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): - return self._callback("discoItems", entity_jid, node, use_cache, profile_key) + def dbus_disco_infos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + return self._callback("disco_infos", entity_jid, node, 
use_cache, profile_key) + + def dbus_disco_items(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + return self._callback("disco_items", entity_jid, node, use_cache, profile_key) def dbus_disconnect(self, profile_key="@DEFAULT@"): return self._callback("disconnect", profile_key) - def dbus_encryptionNamespaceGet(self, arg_0): - return self._callback("encryptionNamespaceGet", arg_0) + def dbus_encryption_namespace_get(self, arg_0): + return self._callback("encryption_namespace_get", arg_0) - def dbus_encryptionPluginsGet(self, ): - return self._callback("encryptionPluginsGet", ) + def dbus_encryption_plugins_get(self, ): + return self._callback("encryption_plugins_get", ) - def dbus_encryptionTrustUIGet(self, to_jid, namespace, profile_key): - return self._callback("encryptionTrustUIGet", to_jid, namespace, profile_key) + def dbus_encryption_trust_ui_get(self, to_jid, namespace, profile_key): + return self._callback("encryption_trust_ui_get", to_jid, namespace, profile_key) - def dbus_getConfig(self, section, name): - return self._callback("getConfig", section, name) + def dbus_entities_data_get(self, jids, keys, profile): + return self._callback("entities_data_get", jids, keys, profile) - def dbus_getContacts(self, profile_key="@DEFAULT@"): - return self._callback("getContacts", profile_key) + def dbus_entity_data_get(self, jid, keys, profile): + return self._callback("entity_data_get", jid, keys, profile) - def dbus_getContactsFromGroup(self, group, profile_key="@DEFAULT@"): - return self._callback("getContactsFromGroup", group, profile_key) + def dbus_features_get(self, profile_key): + return self._callback("features_get", profile_key) - def dbus_getEntitiesData(self, jids, keys, profile): - return self._callback("getEntitiesData", jids, keys, profile) + def dbus_history_get(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@"): + return self._callback("history_get", from_jid, to_jid, limit, between, filters, profile) - 
def dbus_getEntityData(self, jid, keys, profile): - return self._callback("getEntityData", jid, keys, profile) + def dbus_image_check(self, arg_0): + return self._callback("image_check", arg_0) - def dbus_getFeatures(self, profile_key): - return self._callback("getFeatures", profile_key) + def dbus_image_convert(self, source, dest, arg_2, extra): + return self._callback("image_convert", source, dest, arg_2, extra) - def dbus_getMainResource(self, contact_jid, profile_key="@DEFAULT@"): - return self._callback("getMainResource", contact_jid, profile_key) + def dbus_image_generate_preview(self, image_path, profile_key): + return self._callback("image_generate_preview", image_path, profile_key) - def dbus_getParamA(self, name, category, attribute="value", profile_key="@DEFAULT@"): - return self._callback("getParamA", name, category, attribute, profile_key) + def dbus_image_resize(self, image_path, width, height): + return self._callback("image_resize", image_path, width, height) - def dbus_getParamsCategories(self, ): - return self._callback("getParamsCategories", ) + def dbus_is_connected(self, profile_key="@DEFAULT@"): + return self._callback("is_connected", profile_key) - def dbus_getParamsUI(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@"): - return self._callback("getParamsUI", security_limit, app, extra, profile_key) + def dbus_main_resource_get(self, contact_jid, profile_key="@DEFAULT@"): + return self._callback("main_resource_get", contact_jid, profile_key) - def dbus_getPresenceStatuses(self, profile_key="@DEFAULT@"): - return self._callback("getPresenceStatuses", profile_key) + def dbus_menu_help_get(self, menu_id, language): + return self._callback("menu_help_get", menu_id, language) - def dbus_getReady(self, ): - return self._callback("getReady", ) + def dbus_menu_launch(self, menu_type, path, data, security_limit, profile_key): + return self._callback("menu_launch", menu_type, path, data, security_limit, profile_key) - def 
dbus_getVersion(self, ): - return self._callback("getVersion", ) + def dbus_menus_get(self, language, security_limit): + return self._callback("menus_get", language, security_limit) - def dbus_getWaitingSub(self, profile_key="@DEFAULT@"): - return self._callback("getWaitingSub", profile_key) + def dbus_message_encryption_get(self, to_jid, profile_key): + return self._callback("message_encryption_get", to_jid, profile_key) - def dbus_historyGet(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@"): - return self._callback("historyGet", from_jid, to_jid, limit, between, filters, profile) + def dbus_message_encryption_start(self, to_jid, namespace='', replace=False, profile_key="@NONE@"): + return self._callback("message_encryption_start", to_jid, namespace, replace, profile_key) - def dbus_imageCheck(self, arg_0): - return self._callback("imageCheck", arg_0) + def dbus_message_encryption_stop(self, to_jid, profile_key): + return self._callback("message_encryption_stop", to_jid, profile_key) - def dbus_imageConvert(self, source, dest, arg_2, extra): - return self._callback("imageConvert", source, dest, arg_2, extra) + def dbus_message_send(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@"): + return self._callback("message_send", to_jid, message, subject, mess_type, extra, profile_key) - def dbus_imageGeneratePreview(self, image_path, profile_key): - return self._callback("imageGeneratePreview", image_path, profile_key) + def dbus_namespaces_get(self, ): + return self._callback("namespaces_get", ) - def dbus_imageResize(self, image_path, width, height): - return self._callback("imageResize", image_path, width, height) - - def dbus_isConnected(self, profile_key="@DEFAULT@"): - return self._callback("isConnected", profile_key) + def dbus_param_get_a(self, name, category, attribute="value", profile_key="@DEFAULT@"): + return self._callback("param_get_a", name, category, attribute, profile_key) - def 
dbus_launchAction(self, callback_id, data, profile_key="@DEFAULT@"): - return self._callback("launchAction", callback_id, data, profile_key) + def dbus_param_get_a_async(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@"): + return self._callback("param_get_a_async", name, category, attribute, security_limit, profile_key) - def dbus_loadParamsTemplate(self, filename): - return self._callback("loadParamsTemplate", filename) + def dbus_param_set(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@"): + return self._callback("param_set", name, value, category, security_limit, profile_key) - def dbus_menuHelpGet(self, menu_id, language): - return self._callback("menuHelpGet", menu_id, language) + def dbus_param_ui_get(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@"): + return self._callback("param_ui_get", security_limit, app, extra, profile_key) - def dbus_menuLaunch(self, menu_type, path, data, security_limit, profile_key): - return self._callback("menuLaunch", menu_type, path, data, security_limit, profile_key) + def dbus_params_categories_get(self, ): + return self._callback("params_categories_get", ) - def dbus_menusGet(self, language, security_limit): - return self._callback("menusGet", language, security_limit) + def dbus_params_register_app(self, xml, security_limit=-1, app=''): + return self._callback("params_register_app", xml, security_limit, app) - def dbus_messageEncryptionGet(self, to_jid, profile_key): - return self._callback("messageEncryptionGet", to_jid, profile_key) + def dbus_params_template_load(self, filename): + return self._callback("params_template_load", filename) - def dbus_messageEncryptionStart(self, to_jid, namespace='', replace=False, profile_key="@NONE@"): - return self._callback("messageEncryptionStart", to_jid, namespace, replace, profile_key) + def dbus_params_template_save(self, filename): + return self._callback("params_template_save", filename) - def 
dbus_messageEncryptionStop(self, to_jid, profile_key): - return self._callback("messageEncryptionStop", to_jid, profile_key) + def dbus_params_values_from_category_get_async(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@"): + return self._callback("params_values_from_category_get_async", category, security_limit, app, extra, profile_key) - def dbus_messageSend(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@"): - return self._callback("messageSend", to_jid, message, subject, mess_type, extra, profile_key) + def dbus_presence_set(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@"): + return self._callback("presence_set", to_jid, show, statuses, profile_key) - def dbus_namespacesGet(self, ): - return self._callback("namespacesGet", ) + def dbus_presence_statuses_get(self, profile_key="@DEFAULT@"): + return self._callback("presence_statuses_get", profile_key) - def dbus_paramsRegisterApp(self, xml, security_limit=-1, app=''): - return self._callback("paramsRegisterApp", xml, security_limit, app) + def dbus_private_data_delete(self, namespace, key, arg_2): + return self._callback("private_data_delete", namespace, key, arg_2) - def dbus_privateDataDelete(self, namespace, key, arg_2): - return self._callback("privateDataDelete", namespace, key, arg_2) - - def dbus_privateDataGet(self, namespace, key, profile_key): - return self._callback("privateDataGet", namespace, key, profile_key) + def dbus_private_data_get(self, namespace, key, profile_key): + return self._callback("private_data_get", namespace, key, profile_key) - def dbus_privateDataSet(self, namespace, key, data, profile_key): - return self._callback("privateDataSet", namespace, key, data, profile_key) + def dbus_private_data_set(self, namespace, key, data, profile_key): + return self._callback("private_data_set", namespace, key, data, profile_key) - def dbus_profileCreate(self, profile, password='', component=''): - return 
self._callback("profileCreate", profile, password, component) + def dbus_profile_create(self, profile, password='', component=''): + return self._callback("profile_create", profile, password, component) - def dbus_profileIsSessionStarted(self, profile_key="@DEFAULT@"): - return self._callback("profileIsSessionStarted", profile_key) + def dbus_profile_delete_async(self, profile): + return self._callback("profile_delete_async", profile) - def dbus_profileNameGet(self, profile_key="@DEFAULT@"): - return self._callback("profileNameGet", profile_key) + def dbus_profile_is_session_started(self, profile_key="@DEFAULT@"): + return self._callback("profile_is_session_started", profile_key) - def dbus_profileSetDefault(self, profile): - return self._callback("profileSetDefault", profile) + def dbus_profile_name_get(self, profile_key="@DEFAULT@"): + return self._callback("profile_name_get", profile_key) - def dbus_profileStartSession(self, password='', profile_key="@DEFAULT@"): - return self._callback("profileStartSession", password, profile_key) + def dbus_profile_set_default(self, profile): + return self._callback("profile_set_default", profile) - def dbus_profilesListGet(self, clients=True, components=False): - return self._callback("profilesListGet", clients, components) + def dbus_profile_start_session(self, password='', profile_key="@DEFAULT@"): + return self._callback("profile_start_session", password, profile_key) - def dbus_progressGet(self, id, profile): - return self._callback("progressGet", id, profile) + def dbus_profiles_list_get(self, clients=True, components=False): + return self._callback("profiles_list_get", clients, components) - def dbus_progressGetAll(self, profile): - return self._callback("progressGetAll", profile) + def dbus_progress_get(self, id, profile): + return self._callback("progress_get", id, profile) - def dbus_progressGetAllMetadata(self, profile): - return self._callback("progressGetAllMetadata", profile) + def dbus_progress_get_all(self, 
profile): + return self._callback("progress_get_all", profile) - def dbus_rosterResync(self, profile_key="@DEFAULT@"): - return self._callback("rosterResync", profile_key) + def dbus_progress_get_all_metadata(self, profile): + return self._callback("progress_get_all_metadata", profile) - def dbus_saveParamsTemplate(self, filename): - return self._callback("saveParamsTemplate", filename) + def dbus_ready_get(self, ): + return self._callback("ready_get", ) - def dbus_sessionInfosGet(self, profile_key): - return self._callback("sessionInfosGet", profile_key) + def dbus_roster_resync(self, profile_key="@DEFAULT@"): + return self._callback("roster_resync", profile_key) - def dbus_setParam(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@"): - return self._callback("setParam", name, value, category, security_limit, profile_key) + def dbus_session_infos_get(self, profile_key): + return self._callback("session_infos_get", profile_key) - def dbus_setPresence(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@"): - return self._callback("setPresence", to_jid, show, statuses, profile_key) + def dbus_sub_waiting_get(self, profile_key="@DEFAULT@"): + return self._callback("sub_waiting_get", profile_key) def dbus_subscription(self, sub_type, entity, profile_key="@DEFAULT@"): return self._callback("subscription", sub_type, entity, profile_key) - def dbus_updateContact(self, entity_jid, name, groups, profile_key="@DEFAULT@"): - return self._callback("updateContact", entity_jid, name, groups, profile_key) + def dbus_version_get(self, ): + return self._callback("version_get", ) -class Bridge: +class bridge: def __init__(self): log.info("Init DBus...") self._obj = DBusObject(const_OBJ_PATH) - async def postInit(self): + async def post_init(self): try: conn = await client.connect(reactor) except error.DBusException as e: @@ -420,47 +420,47 @@ def _debug(self, action, params, profile): self._obj.emitSignal("_debug", action, params, profile) - def 
actionNew(self, action_data, id, security_limit, profile): - self._obj.emitSignal("actionNew", action_data, id, security_limit, profile) + def action_new(self, action_data, id, security_limit, profile): + self._obj.emitSignal("action_new", action_data, id, security_limit, profile) def connected(self, jid_s, profile): self._obj.emitSignal("connected", jid_s, profile) - def contactDeleted(self, entity_jid, profile): - self._obj.emitSignal("contactDeleted", entity_jid, profile) + def contact_deleted(self, entity_jid, profile): + self._obj.emitSignal("contact_deleted", entity_jid, profile) + + def contact_new(self, contact_jid, attributes, groups, profile): + self._obj.emitSignal("contact_new", contact_jid, attributes, groups, profile) def disconnected(self, profile): self._obj.emitSignal("disconnected", profile) - def entityDataUpdated(self, jid, name, value, profile): - self._obj.emitSignal("entityDataUpdated", jid, name, value, profile) + def entity_data_updated(self, jid, name, value, profile): + self._obj.emitSignal("entity_data_updated", jid, name, value, profile) - def messageEncryptionStarted(self, to_jid, encryption_data, profile_key): - self._obj.emitSignal("messageEncryptionStarted", to_jid, encryption_data, profile_key) + def message_encryption_started(self, to_jid, encryption_data, profile_key): + self._obj.emitSignal("message_encryption_started", to_jid, encryption_data, profile_key) - def messageEncryptionStopped(self, to_jid, encryption_data, profile_key): - self._obj.emitSignal("messageEncryptionStopped", to_jid, encryption_data, profile_key) + def message_encryption_stopped(self, to_jid, encryption_data, profile_key): + self._obj.emitSignal("message_encryption_stopped", to_jid, encryption_data, profile_key) - def messageNew(self, uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile): - self._obj.emitSignal("messageNew", uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile) - - def newContact(self, 
contact_jid, attributes, groups, profile): - self._obj.emitSignal("newContact", contact_jid, attributes, groups, profile) + def message_new(self, uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile): + self._obj.emitSignal("message_new", uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile) - def paramUpdate(self, name, value, category, profile): - self._obj.emitSignal("paramUpdate", name, value, category, profile) + def param_update(self, name, value, category, profile): + self._obj.emitSignal("param_update", name, value, category, profile) - def presenceUpdate(self, entity_jid, show, priority, statuses, profile): - self._obj.emitSignal("presenceUpdate", entity_jid, show, priority, statuses, profile) + def presence_update(self, entity_jid, show, priority, statuses, profile): + self._obj.emitSignal("presence_update", entity_jid, show, priority, statuses, profile) - def progressError(self, id, error, profile): - self._obj.emitSignal("progressError", id, error, profile) + def progress_error(self, id, error, profile): + self._obj.emitSignal("progress_error", id, error, profile) - def progressFinished(self, id, metadata, profile): - self._obj.emitSignal("progressFinished", id, metadata, profile) + def progress_finished(self, id, metadata, profile): + self._obj.emitSignal("progress_finished", id, metadata, profile) - def progressStarted(self, id, metadata, profile): - self._obj.emitSignal("progressStarted", id, metadata, profile) + def progress_started(self, id, metadata, profile): + self._obj.emitSignal("progress_started", id, metadata, profile) def subscribe(self, sub_type, entity_jid, profile): self._obj.emitSignal("subscribe", sub_type, entity_jid, profile) @@ -469,13 +469,13 @@ log.debug(f"registering DBus bridge method [{name}]") self._obj.register_method(name, callback) - def emitSignal(self, name, *args): + def emit_signal(self, name, *args): self._obj.emitSignal(name, *args) - def addMethod( + def add_method( 
self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={} ): - """Dynamically add a method to D-Bus Bridge""" + """Dynamically add a method to D-Bus bridge""" # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug(f"Adding method {name!r} to D-Bus bridge") self._obj.plugin_iface.addMethod( @@ -488,8 +488,8 @@ setattr(self._obj, f"dbus_{name}", MethodType(caller, self._obj)) self.register_method(name, method) - def addSignal(self, name, int_suffix, signature, doc={}): - """Dynamically add a signal to D-Bus Bridge""" + def add_signal(self, name, int_suffix, signature, doc={}): + """Dynamically add a signal to D-Bus bridge""" log.debug(f"Adding signal {name!r} to D-Bus bridge") self._obj.plugin_iface.addSignal(Signal(name, signature)) - setattr(Bridge, name, partialmethod(Bridge.emitSignal, name)) \ No newline at end of file + setattr(bridge, name, partialmethod(bridge.emit_signal, name)) \ No newline at end of file diff -r c4464d7ae97b -r 524856bd7b19 sat/bridge/pb.py --- a/sat/bridge/pb.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/pb.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,17 +55,17 @@ def __init__(self): self.signals_handlers = [] - def remote_initBridge(self, signals_handler): + def remote_init_bridge(self, signals_handler): self.signals_handlers.append(HandlerWrapper(signals_handler)) log.info("registered signal handler") - def sendSignalEb(self, failure_, signal_name): + def send_signal_eb(self, failure_, signal_name): if not failure_.check(pb.PBConnectionLost): log.error( f"Error while sending signal {signal_name}: {failure_}", ) - def sendSignal(self, name, args, kwargs): + def send_signal(self, name, args, kwargs): to_remove = [] for wrapper in self.signals_handlers: handler = wrapper.handler @@ -74,13 +74,13 @@ except pb.DeadReferenceError: to_remove.append(wrapper) else: - d.addErrback(self.sendSignalEb, name) + d.addErrback(self.send_signal_eb, name) if to_remove: for wrapper in to_remove: 
log.debug("Removing signal handler for dead frontend") self.signals_handlers.remove(wrapper) - def _bridgeDeactivateSignals(self): + def _bridge_deactivate_signals(self): if hasattr(self, "signals_paused"): log.warning("bridge signals already deactivated") if self.signals_handler: @@ -90,7 +90,7 @@ self.signals_handlers = [] log.debug("bridge signals have been deactivated") - def _bridgeReactivateSignals(self): + def _bridge_reactivate_signals(self): try: self.signals_handlers = self.signals_paused except AttributeError: @@ -102,31 +102,31 @@ ##METHODS_PART## -class Bridge(object): +class bridge(object): def __init__(self): log.info("Init Perspective Broker...") self.root = PBRoot() - conf = config.parseMainConf() - getConf = partial(config.getConf, conf, "bridge_pb", "") - conn_type = getConf("connection_type", "unix_socket") + conf = config.parse_main_conf() + get_conf = partial(config.get_conf, conf, "bridge_pb", "") + conn_type = get_conf("connection_type", "unix_socket") if conn_type == "unix_socket": - local_dir = Path(config.getConfig(conf, "", "local_dir")).resolve() + local_dir = Path(config.config_get(conf, "", "local_dir")).resolve() socket_path = local_dir / "bridge_pb" log.info(f"using UNIX Socket at {socket_path}") reactor.listenUNIX( str(socket_path), pb.PBServerFactory(self.root), mode=0o600 ) elif conn_type == "socket": - port = int(getConf("port", 8789)) + port = int(get_conf("port", 8789)) log.info(f"using TCP Socket at port {port}") reactor.listenTCP(port, pb.PBServerFactory(self.root)) else: raise ValueError(f"Unknown pb connection type: {conn_type!r}") - def sendSignal(self, name, *args, **kwargs): - self.root.sendSignal(name, args, kwargs) + def send_signal(self, name, *args, **kwargs): + self.root.send_signal(name, args, kwargs) - def remote_initBridge(self, signals_handler): + def remote_init_bridge(self, signals_handler): self.signals_handlers.append(signals_handler) log.info("registered signal handler") @@ -135,78 +135,78 @@ 
setattr(self.root, "remote_" + name, callback) #  self.root.register_method(name, callback) - def addMethod( + def add_method( self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={} ): - """Dynamically add a method to PB Bridge""" + """Dynamically add a method to PB bridge""" # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug("Adding method {name} to PB bridge".format(name=name)) self.register_method(name, method) - def addSignal(self, name, int_suffix, signature, doc={}): + def add_signal(self, name, int_suffix, signature, doc={}): log.debug("Adding signal {name} to PB bridge".format(name=name)) setattr( - self, name, lambda *args, **kwargs: self.sendSignal(name, *args, **kwargs) + self, name, lambda *args, **kwargs: self.send_signal(name, *args, **kwargs) ) - def bridgeDeactivateSignals(self): + def bridge_deactivate_signals(self): """Stop sending signals to bridge Mainly used for mobile frontends, when the frontend is paused """ - self.root._bridgeDeactivateSignals() + self.root._bridge_deactivate_signals() - def bridgeReactivateSignals(self): + def bridge_reactivate_signals(self): """Send again signals to bridge - Should only be used after bridgeDeactivateSignals has been called + Should only be used after bridge_deactivate_signals has been called """ - self.root._bridgeReactivateSignals() + self.root._bridge_reactivate_signals() def _debug(self, action, params, profile): - self.sendSignal("_debug", action, params, profile) + self.send_signal("_debug", action, params, profile) - def actionNew(self, action_data, id, security_limit, profile): - self.sendSignal("actionNew", action_data, id, security_limit, profile) + def action_new(self, action_data, id, security_limit, profile): + self.send_signal("action_new", action_data, id, security_limit, profile) def connected(self, jid_s, profile): - self.sendSignal("connected", jid_s, profile) + self.send_signal("connected", jid_s, profile) - def contactDeleted(self, 
entity_jid, profile): - self.sendSignal("contactDeleted", entity_jid, profile) + def contact_deleted(self, entity_jid, profile): + self.send_signal("contact_deleted", entity_jid, profile) + + def contact_new(self, contact_jid, attributes, groups, profile): + self.send_signal("contact_new", contact_jid, attributes, groups, profile) def disconnected(self, profile): - self.sendSignal("disconnected", profile) + self.send_signal("disconnected", profile) - def entityDataUpdated(self, jid, name, value, profile): - self.sendSignal("entityDataUpdated", jid, name, value, profile) + def entity_data_updated(self, jid, name, value, profile): + self.send_signal("entity_data_updated", jid, name, value, profile) - def messageEncryptionStarted(self, to_jid, encryption_data, profile_key): - self.sendSignal("messageEncryptionStarted", to_jid, encryption_data, profile_key) + def message_encryption_started(self, to_jid, encryption_data, profile_key): + self.send_signal("message_encryption_started", to_jid, encryption_data, profile_key) - def messageEncryptionStopped(self, to_jid, encryption_data, profile_key): - self.sendSignal("messageEncryptionStopped", to_jid, encryption_data, profile_key) + def message_encryption_stopped(self, to_jid, encryption_data, profile_key): + self.send_signal("message_encryption_stopped", to_jid, encryption_data, profile_key) - def messageNew(self, uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile): - self.sendSignal("messageNew", uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile) + def message_new(self, uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile): + self.send_signal("message_new", uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile) - def newContact(self, contact_jid, attributes, groups, profile): - self.sendSignal("newContact", contact_jid, attributes, groups, profile) + def param_update(self, name, value, category, profile): + 
self.send_signal("param_update", name, value, category, profile) - def paramUpdate(self, name, value, category, profile): - self.sendSignal("paramUpdate", name, value, category, profile) - - def presenceUpdate(self, entity_jid, show, priority, statuses, profile): - self.sendSignal("presenceUpdate", entity_jid, show, priority, statuses, profile) + def presence_update(self, entity_jid, show, priority, statuses, profile): + self.send_signal("presence_update", entity_jid, show, priority, statuses, profile) - def progressError(self, id, error, profile): - self.sendSignal("progressError", id, error, profile) + def progress_error(self, id, error, profile): + self.send_signal("progress_error", id, error, profile) - def progressFinished(self, id, metadata, profile): - self.sendSignal("progressFinished", id, metadata, profile) + def progress_finished(self, id, metadata, profile): + self.send_signal("progress_finished", id, metadata, profile) - def progressStarted(self, id, metadata, profile): - self.sendSignal("progressStarted", id, metadata, profile) + def progress_started(self, id, metadata, profile): + self.send_signal("progress_started", id, metadata, profile) def subscribe(self, sub_type, entity_jid, profile): - self.sendSignal("subscribe", sub_type, entity_jid, profile) + self.send_signal("subscribe", sub_type, entity_jid, profile) diff -r c4464d7ae97b -r 524856bd7b19 sat/core/constants.py --- a/sat/core/constants.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/constants.py Sat Apr 08 13:54:42 2023 +0200 @@ -435,7 +435,7 @@ return value.lower() in (cls.BOOL_TRUE, "1", "yes", "on") @classmethod - def boolConst(cls, value: bool) -> str: + def bool_const(cls, value: bool) -> str: """@return (str): constant associated to bool value""" assert isinstance(value, bool) return cls.BOOL_TRUE if value else cls.BOOL_FALSE diff -r c4464d7ae97b -r 524856bd7b19 sat/core/i18n.py --- a/sat/core/i18n.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/i18n.py Sat Apr 08 13:54:42 2023 
+0200 @@ -30,7 +30,7 @@ _ = gettext.translation("sat", "i18n", fallback=True).gettext _translators = {None: gettext.NullTranslations()} - def languageSwitch(lang=None): + def language_switch(lang=None): if not lang in _translators: _translators[lang] = gettext.translation( "sat", languages=[lang], fallback=True @@ -43,7 +43,7 @@ log.warning("gettext support disabled") _ = cast(Callable[[str], str], lambda msg: msg) # Libervia doesn't support gettext - def languageSwitch(lang=None): + def language_switch(lang=None): pass diff -r c4464d7ae97b -r 524856bd7b19 sat/core/log.py --- a/sat/core/log.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/log.py Sat Apr 08 13:54:42 2023 +0200 @@ -121,7 +121,7 @@ 'levelname': level, } try: - if not self.filter_name.dictFilter(record): + if not self.filter_name.dict_filter(record): raise Filtered except (AttributeError, TypeError): # XXX: TypeError is here because of a pyjamas bug which need to be fixed (TypeError is raised instead of AttributeError) if self.filter_name is not None: @@ -133,7 +133,7 @@ except KeyError as e: if e.args[0] == 'profile': # XXX: %(profile)s use some magic with introspection, for debugging purpose only *DO NOT* use in production - record['profile'] = configure_cls[backend].getProfile() + record['profile'] = configure_cls[backend].get_profile() return self.fmt % record else: raise e @@ -174,7 +174,7 @@ return 1 return 0 - def dictFilter(self, dict_record): + def dict_filter(self, dict_record): """Filter using a dictionary record @param dict_record: dictionary with at list a key "name" with logger name @@ -208,26 +208,26 @@ @param force_colors: if True ANSI colors are used even if stdout is not a tty """ self.backend_data = backend_data - self.preTreatment() - self.configureLevel(level) - self.configureFormat(fmt) - self.configureOutput(output) - self.configureLogger(logger) - self.configureColors(colors, force_colors, levels_taints_dict) - self.postTreatment() - self.updateCurrentLogger() + 
self.pre_treatment()
+        self.configure_level(level)
+        self.configure_format(fmt)
+        self.configure_output(output)
+        self.configure_logger(logger)
+        self.configure_colors(colors, force_colors, levels_taints_dict)
+        self.post_treatment()
+        self.update_current_logger()
 
-    def updateCurrentLogger(self):
+    def update_current_logger(self):
         """update existing logger to the class needed for this backend"""
         if self.LOGGER_CLASS is None:
             return
         for name, logger in list(_loggers.items()):
             _loggers[name] = self.LOGGER_CLASS(logger)
 
-    def preTreatment(self):
+    def pre_treatment(self):
         pass
 
-    def configureLevel(self, level):
+    def configure_level(self, level):
         if level is not None:
             # we deactivate methods below level
             level_idx = C.LOG_LEVELS.index(level)
@@ -236,7 +236,7 @@
             for _level in C.LOG_LEVELS[:level_idx]:
                 setattr(Logger, _level.lower(), dev_null)
 
-    def configureFormat(self, fmt):
+    def configure_format(self, fmt):
        if fmt is not None:
            if fmt != '%(message)s':  # %(message)s is the same as None
                Logger.fmt = fmt
@@ -246,17 +246,17 @@
                # color_start not followed by an end, we add it
                Logger.fmt += COLOR_END
 
-    def configureOutput(self, output):
+    def configure_output(self, output):
         if output is not None:
             if output != C.LOG_OPT_OUTPUT_SEP + C.LOG_OPT_OUTPUT_DEFAULT:
                 # TODO: manage other outputs
                 raise NotImplementedError("Basic backend only manage default output yet")
 
-    def configureLogger(self, logger):
+    def configure_logger(self, logger):
         if logger:
             Logger.filter_name = FilterName(logger)
 
-    def configureColors(self, colors, force_colors, levels_taints_dict):
+    def configure_colors(self, colors, force_colors, levels_taints_dict):
         if colors:
             # if color are used, we need to handle levels_taints_dict
             for level in list(levels_taints_dict.keys()):
@@ -280,10 +280,10 @@
                 ansi_list.append(ansi)
             taints[level] = ''.join(ansi_list)
 
-    def postTreatment(self):
+    def post_treatment(self):
         pass
 
-    def manageOutputs(self, outputs_raw):
+    def manage_outputs(self, outputs_raw):
         """ Parse output option in a backend agnostic way, and fill handlers consequently
 
         @param outputs_raw: output option as enterred in environment variable or in configuration
@@ -330,7 +330,7 @@
             raise ValueError("options [{options}] are not supported for {handler} output".format(options=options, handler=output))
 
     @staticmethod
-    def memoryGet(size=None):
+    def memory_get(size=None):
         """Return buffered logs
 
         @param size: number of logs to return
@@ -338,7 +338,7 @@
         raise NotImplementedError
 
     @classmethod
-    def ansiColors(cls, level, message):
+    def ansi_colors(cls, level, message):
         """Colorise message depending on level for terminals
 
         @param level: one of C.LOG_LEVELS
@@ -358,7 +358,7 @@
         return '%s%s%s' % (start, message, A.RESET)
 
     @staticmethod
-    def getProfile():
+    def get_profile():
         """Try to find profile value using introspection"""
         raise NotImplementedError
 
@@ -396,10 +396,10 @@
     else:
         configure_class(**options)
 
-def memoryGet(size=None):
+def memory_get(size=None):
     if not C.LOG_OPT_OUTPUT_MEMORY in handlers:
         raise ValueError('memory output is not used')
-    return configure_cls[backend].memoryGet(size)
+    return configure_cls[backend].memory_get(size)
 
 def getLogger(name=C.LOG_BASE_LOGGER) -> Logger:
     try:
diff -r c4464d7ae97b -r 524856bd7b19 sat/core/log_config.py
--- a/sat/core/log_config.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/core/log_config.py	Sat Apr 08 13:54:42 2023 +0200
@@ -47,8 +47,8 @@
 
 class ConfigureBasic(log.ConfigureBase):
-    def configureColors(self, colors, force_colors, levels_taints_dict):
-        super(ConfigureBasic, self).configureColors(
+    def configure_colors(self, colors, force_colors, levels_taints_dict):
+        super(ConfigureBasic, self).configure_colors(
             colors, force_colors, levels_taints_dict
         )
         if colors:
@@ -61,14 +61,14 @@
             # FIXME: isatty should be tested on each handler, not globaly
             if (force_colors or isatty):
                 # we need colors
-                log.Logger.post_treat = lambda logger, level, message: self.ansiColors(
+                log.Logger.post_treat = lambda logger, level, message: self.ansi_colors(
                     level, message
                 )
             elif force_colors:
                 raise ValueError("force_colors can't be used if colors is False")
 
     @staticmethod
-    def getProfile():
+    def get_profile():
         """Try to find profile value using introspection"""
         import inspect
@@ -107,7 +107,7 @@
 class ConfigureTwisted(ConfigureBasic):
     LOGGER_CLASS = TwistedLogger
 
-    def preTreatment(self):
+    def pre_treatment(self):
         from twisted import logger
         global logger
         self.level_map = {
@@ -119,17 +119,17 @@
         }
         self.LOGGER_CLASS.level_map = self.level_map
 
-    def configureLevel(self, level):
+    def configure_level(self, level):
         self.level = self.level_map[level]
 
-    def configureOutput(self, output):
+    def configure_output(self, output):
         import sys
         from twisted.python import logfile
         self.log_publisher = logger.LogPublisher()
         if output is None:
             output = C.LOG_OPT_OUTPUT_SEP + C.LOG_OPT_OUTPUT_DEFAULT
-        self.manageOutputs(output)
+        self.manage_outputs(output)
 
         if C.LOG_OPT_OUTPUT_DEFAULT in log.handlers:
             if self.backend_data is None:
@@ -139,11 +139,11 @@
             options = self.backend_data
             log_file = logfile.LogFile.fromFullPath(options['logfile'])
             self.log_publisher.addObserver(
-                logger.FileLogObserver(log_file, self.textFormatter))
+                logger.FileLogObserver(log_file, self.text_formatter))
             # we also want output to stdout if we are in debug or nodaemon mode
             if options.get("nodaemon", False) or options.get("debug", False):
                 self.log_publisher.addObserver(
-                    logger.FileLogObserver(sys.stdout, self.textFormatter))
+                    logger.FileLogObserver(sys.stdout, self.text_formatter))
 
         if C.LOG_OPT_OUTPUT_FILE in log.handlers:
@@ -152,15 +152,15 @@
                 sys.stdout if path == "-" else logfile.LogFile.fromFullPath(path)
             )
             self.log_publisher.addObserver(
-                logger.FileLogObserver(log_file, self.textFormatter))
+                logger.FileLogObserver(log_file, self.text_formatter))
 
         if C.LOG_OPT_OUTPUT_MEMORY in log.handlers:
             raise NotImplementedError(
                 "Memory observer is not implemented in Twisted backend"
             )
 
-    def configureColors(self, colors, force_colors, levels_taints_dict):
-        super(ConfigureTwisted, self).configureColors(
+    def configure_colors(self, colors, force_colors, levels_taints_dict):
+        super(ConfigureTwisted, self).configure_colors(
             colors, force_colors, levels_taints_dict
         )
         self.LOGGER_CLASS.colors = colors
@@ -168,7 +168,7 @@
         if force_colors and not colors:
             raise ValueError("colors must be True if force_colors is True")
 
-    def postTreatment(self):
+    def post_treatment(self):
         """Install twistedObserver which manage non SàT logs"""
         # from twisted import logger
         import sys
@@ -180,7 +180,7 @@
         )
         logger.globalLogBeginner.beginLoggingTo([filtering_obs])
 
-    def textFormatter(self, event):
+    def text_formatter(self, event):
         if event.get('sat_logged', False):
             timestamp = ''.join([logger.formatTime(event.get("log_time", None)), " "])
             return f"{timestamp}{event.get('log_format', '')}\n"
@@ -219,7 +219,7 @@
             backend_data,
         )
 
-    def preTreatment(self):
+    def pre_treatment(self):
         """We use logging methods directly, instead of using Logger"""
         import logging
 
@@ -230,13 +230,13 @@
         log.error = logging.error
         log.critical = logging.critical
 
-    def configureLevel(self, level):
+    def configure_level(self, level):
         if level is None:
             level = C.LOG_LVL_DEBUG
         self.level = level
 
-    def configureFormat(self, fmt):
-        super(ConfigureStandard, self).configureFormat(fmt)
+    def configure_format(self, fmt):
+        super(ConfigureStandard, self).configure_format(fmt)
         import logging
 
         class SatFormatter(logging.Formatter):
@@ -250,11 +250,11 @@
             def format(self, record):
                 if self._with_profile:
-                    record.profile = ConfigureStandard.getProfile()
+                    record.profile = ConfigureStandard.get_profile()
                 do_color = self.with_colors and (self.can_colors or self.force_colors)
                 if ConfigureStandard._color_location:
                     # we copy raw formatting strings for color_*
-                    # as formatting is handled in ansiColors in this case
+                    # as formatting is handled in ansi_colors in this case
                     if do_color:
                         record.color_start = log.COLOR_START
                         record.color_end = log.COLOR_END
@@ -262,19 +262,19 @@
                         record.color_start = record.color_end = ""
                 s = super(SatFormatter, self).format(record)
                 if do_color:
-                    s = ConfigureStandard.ansiColors(record.levelname, s)
+                    s = ConfigureStandard.ansi_colors(record.levelname, s)
                 return s
 
         self.formatterClass = SatFormatter
 
-    def configureOutput(self, output):
-        self.manageOutputs(output)
+    def configure_output(self, output):
+        self.manage_outputs(output)
 
-    def configureLogger(self, logger):
+    def configure_logger(self, logger):
         self.name_filter = log.FilterName(logger) if logger else None
 
-    def configureColors(self, colors, force_colors, levels_taints_dict):
-        super(ConfigureStandard, self).configureColors(
+    def configure_colors(self, colors, force_colors, levels_taints_dict):
+        super(ConfigureStandard, self).configure_colors(
             colors, force_colors, levels_taints_dict
         )
         self.formatterClass.with_colors = colors
@@ -282,14 +282,14 @@
         if not colors and force_colors:
             raise ValueError("force_colors can't be used if colors is False")
 
-    def _addHandler(self, root_logger, hdlr, can_colors=False):
+    def _add_handler(self, root_logger, hdlr, can_colors=False):
         hdlr.setFormatter(self.formatterClass(can_colors))
         root_logger.addHandler(hdlr)
         root_logger.setLevel(self.level)
         if self.name_filter is not None:
             hdlr.addFilter(self.name_filter)
 
-    def postTreatment(self):
+    def post_treatment(self):
         import logging
 
         root_logger = logging.getLogger()
@@ -301,7 +301,7 @@
                     can_colors = hdlr.stream.isatty()
                 except AttributeError:
                     can_colors = False
-                self._addHandler(root_logger, hdlr, can_colors=can_colors)
+                self._add_handler(root_logger, hdlr, can_colors=can_colors)
             elif handler == C.LOG_OPT_OUTPUT_MEMORY:
                 from logging.handlers import BufferingHandler
@@ -315,20 +315,20 @@
                 ] = (
                     hdlr
                 )  # we keep a reference to the handler to read the buffer later
-                self._addHandler(root_logger, hdlr, can_colors=False)
+                self._add_handler(root_logger, hdlr, can_colors=False)
             elif handler == C.LOG_OPT_OUTPUT_FILE:
                 import os.path
 
                 for path in options:
                     hdlr = logging.FileHandler(os.path.expanduser(path))
-                    self._addHandler(root_logger, hdlr, can_colors=False)
+                    self._add_handler(root_logger, hdlr, can_colors=False)
             else:
                 raise ValueError("Unknown handler type")
         else:
             root_logger.warning("Handlers already set on root logger")
 
     @staticmethod
-    def memoryGet(size=None):
+    def memory_get(size=None):
         """Return buffered logs
 
         @param size: number of logs to return
@@ -355,7 +355,7 @@
     return log.configure(backend, **options)
 
-def _parseOptions(options):
+def _parse_options(options):
     """Parse string options as given in conf or environment variable, and return expected python value
 
     @param options (dict): options with (key: name, value: string value)
@@ -378,7 +378,7 @@
     options[LEVEL] = level
 
-def satConfigure(backend=C.LOG_BACKEND_STANDARD, const=None, backend_data=None):
+def sat_configure(backend=C.LOG_BACKEND_STANDARD, const=None, backend_data=None):
     """Configure logging system for SàT, can be used by frontends
 
     logs conf is read in SàT conf, then in environment variables.
     It must be done before Memory init
@@ -396,16 +396,16 @@
     import os
 
     log_conf = {}
-    sat_conf = config.parseMainConf()
+    sat_conf = config.parse_main_conf()
     for opt_name, opt_default in C.LOG_OPTIONS():
         try:
             log_conf[opt_name] = os.environ[
                 "".join((C.ENV_PREFIX, C.LOG_OPT_PREFIX.upper(), opt_name.upper()))
             ]
         except KeyError:
-            log_conf[opt_name] = config.getConfig(
+            log_conf[opt_name] = config.config_get(
                 sat_conf, C.LOG_OPT_SECTION, C.LOG_OPT_PREFIX + opt_name, opt_default
             )
-    _parseOptions(log_conf)
+    _parse_options(log_conf)
     configure(backend, backend_data=backend_data, **log_conf)
diff -r c4464d7ae97b -r 524856bd7b19 sat/core/patches.py
--- a/sat/core/patches.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/core/patches.py	Sat Apr 08 13:54:42 2023 +0200
@@ -71,7 +71,7 @@
         self._onElementHooks = []
         self._sendHooks = []
 
-    def addHook(self, hook_type, callback):
+    def add_hook(self, hook_type, callback):
         """Add a send or receive hook"""
         conflict_msg = f"Hook conflict: can't add {hook_type} hook {callback}"
         if hook_type == C.STREAM_HOOK_RECEIVE:
diff -r c4464d7ae97b -r 524856bd7b19 sat/core/sat_main.py
--- a/sat/core/sat_main.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/core/sat_main.py	Sat Apr 08 13:54:42 2023 +0200
@@ -26,7 +26,7 @@
 from wokkel.data_form import Option
 
 import sat
-from sat.core.i18n import _, D_, languageSwitch
+from sat.core.i18n import _, D_, language_switch
 from sat.core import patches
 patches.apply()
 from twisted.application import service
@@ -69,7 +69,7 @@
         self.profiles = {}
         self.plugins = {}
         # map for short name to whole namespace,
-        # extended by plugins with registerNamespace
+        # extended by plugins with register_namespace
         self.ns_map = {
             "x-data": xmpp.NS_X_DATA,
             "disco#info": xmpp.NS_DISCO_INFO,
@@ -84,7 +84,7 @@
         bridge_name = (
             os.getenv("LIBERVIA_BRIDGE_NAME")
-            or self.memory.getConfig("", "bridge", "dbus")
+            or self.memory.config_get("", "bridge", "dbus")
         )
 
         bridge_module = dynamic_import.bridge(bridge_name)
@@ -93,9 +93,9 @@
             sys.exit(1)
         log.info(f"using {bridge_name} bridge")
         try:
-            self.bridge = bridge_module.Bridge()
+            self.bridge = bridge_module.bridge()
         except exceptions.BridgeInitError:
-            log.exception("Bridge can't be initialised, can't start Libervia Backend")
+            log.exception("bridge can't be initialised, can't start Libervia Backend")
             sys.exit(1)
         defer.ensureDeferred(self._post_init())
@@ -118,7 +118,7 @@
             return self._version_cache
         except AttributeError:
             self._version_cache = "{} « {} » ({})".format(
-                version, C.APP_RELEASE_NAME, utils.getRepositoryData(sat)
+                version, C.APP_RELEASE_NAME, utils.get_repository_data(sat)
             )
             return self._version_cache
         else:
@@ -130,7 +130,7 @@
 
     async def _post_init(self):
         try:
-            bridge_pi = self.bridge.postInit
+            bridge_pi = self.bridge.post_init
         except AttributeError:
             pass
         else:
@@ -142,84 +142,84 @@
             reactor.callLater(0, self.stop)
             return
 
-        self.bridge.register_method("getReady", lambda: self.initialised)
-        self.bridge.register_method("getVersion", lambda: self.full_version)
-        self.bridge.register_method("getFeatures", self.getFeatures)
-        self.bridge.register_method("profileNameGet", self.memory.getProfileName)
-        self.bridge.register_method("profilesListGet", self.memory.getProfilesList)
-        self.bridge.register_method("getEntityData", self.memory._getEntityData)
-        self.bridge.register_method("getEntitiesData", self.memory._getEntitiesData)
-        self.bridge.register_method("profileCreate", self.memory.createProfile)
-        self.bridge.register_method("asyncDeleteProfile", self.memory.asyncDeleteProfile)
-        self.bridge.register_method("profileStartSession", self.memory.startSession)
+        self.bridge.register_method("ready_get", lambda: self.initialised)
+        self.bridge.register_method("version_get", lambda: self.full_version)
+        self.bridge.register_method("features_get", self.features_get)
+        self.bridge.register_method("profile_name_get", self.memory.get_profile_name)
+        self.bridge.register_method("profiles_list_get", self.memory.get_profiles_list)
+        self.bridge.register_method("entity_data_get", self.memory._get_entity_data)
+        self.bridge.register_method("entities_data_get", self.memory._get_entities_data)
+        self.bridge.register_method("profile_create", self.memory.create_profile)
+        self.bridge.register_method("profile_delete_async", self.memory.profile_delete_async)
+        self.bridge.register_method("profile_start_session", self.memory.start_session)
         self.bridge.register_method(
-            "profileIsSessionStarted", self.memory._isSessionStarted
+            "profile_is_session_started", self.memory._is_session_started
         )
-        self.bridge.register_method("profileSetDefault", self.memory.profileSetDefault)
+        self.bridge.register_method("profile_set_default", self.memory.profile_set_default)
         self.bridge.register_method("connect", self._connect)
         self.bridge.register_method("disconnect", self.disconnect)
-        self.bridge.register_method("contactGet", self._contactGet)
-        self.bridge.register_method("getContacts", self.getContacts)
-        self.bridge.register_method("getContactsFromGroup", self.getContactsFromGroup)
-        self.bridge.register_method("getMainResource", self.memory._getMainResource)
+        self.bridge.register_method("contact_get", self._contact_get)
+        self.bridge.register_method("contacts_get", self.contacts_get)
+        self.bridge.register_method("contacts_get_from_group", self.contacts_get_from_group)
+        self.bridge.register_method("main_resource_get", self.memory._get_main_resource)
         self.bridge.register_method(
-            "getPresenceStatuses", self.memory._getPresenceStatuses
+            "presence_statuses_get", self.memory._get_presence_statuses
         )
-        self.bridge.register_method("getWaitingSub", self.memory.getWaitingSub)
-        self.bridge.register_method("messageSend", self._messageSend)
-        self.bridge.register_method("messageEncryptionStart",
-                                    self._messageEncryptionStart)
-        self.bridge.register_method("messageEncryptionStop",
-                                    self._messageEncryptionStop)
-        self.bridge.register_method("messageEncryptionGet",
-                                    self._messageEncryptionGet)
-        self.bridge.register_method("encryptionNamespaceGet",
-                                    self._encryptionNamespaceGet)
-        self.bridge.register_method("encryptionPluginsGet", self._encryptionPluginsGet)
-        self.bridge.register_method("encryptionTrustUIGet", self._encryptionTrustUIGet)
-        self.bridge.register_method("getConfig", self._getConfig)
-        self.bridge.register_method("setParam", self.setParam)
-        self.bridge.register_method("getParamA", self.memory.getStringParamA)
-        self.bridge.register_method("privateDataGet", self.memory._privateDataGet)
-        self.bridge.register_method("privateDataSet", self.memory._privateDataSet)
-        self.bridge.register_method("privateDataDelete", self.memory._privateDataDelete)
-        self.bridge.register_method("asyncGetParamA", self.memory.asyncGetStringParamA)
+        self.bridge.register_method("sub_waiting_get", self.memory.sub_waiting_get)
+        self.bridge.register_method("message_send", self._message_send)
+        self.bridge.register_method("message_encryption_start",
+                                    self._message_encryption_start)
+        self.bridge.register_method("message_encryption_stop",
+                                    self._message_encryption_stop)
+        self.bridge.register_method("message_encryption_get",
+                                    self._message_encryption_get)
+        self.bridge.register_method("encryption_namespace_get",
+                                    self._encryption_namespace_get)
+        self.bridge.register_method("encryption_plugins_get", self._encryption_plugins_get)
+        self.bridge.register_method("encryption_trust_ui_get", self._encryption_trust_ui_get)
+        self.bridge.register_method("config_get", self._get_config)
+        self.bridge.register_method("param_set", self.param_set)
+        self.bridge.register_method("param_get_a", self.memory.get_string_param_a)
+        self.bridge.register_method("private_data_get", self.memory._private_data_get)
+        self.bridge.register_method("private_data_set", self.memory._private_data_set)
+        self.bridge.register_method("private_data_delete", self.memory._private_data_delete)
+        self.bridge.register_method("param_get_a_async", self.memory.async_get_string_param_a)
         self.bridge.register_method(
-            "asyncGetParamsValuesFromCategory",
-            self.memory._getParamsValuesFromCategory,
+            "params_values_from_category_get_async",
+            self.memory._get_params_values_from_category,
         )
-        self.bridge.register_method("getParamsUI", self.memory._getParamsUI)
+        self.bridge.register_method("param_ui_get", self.memory._get_params_ui)
         self.bridge.register_method(
-            "getParamsCategories", self.memory.getParamsCategories
+            "params_categories_get", self.memory.params_categories_get
         )
-        self.bridge.register_method("paramsRegisterApp", self.memory.paramsRegisterApp)
-        self.bridge.register_method("historyGet", self.memory._historyGet)
-        self.bridge.register_method("setPresence", self._setPresence)
+        self.bridge.register_method("params_register_app", self.memory.params_register_app)
+        self.bridge.register_method("history_get", self.memory._history_get)
+        self.bridge.register_method("presence_set", self._set_presence)
         self.bridge.register_method("subscription", self.subscription)
-        self.bridge.register_method("addContact", self._addContact)
-        self.bridge.register_method("updateContact", self._updateContact)
-        self.bridge.register_method("delContact", self._delContact)
-        self.bridge.register_method("rosterResync", self._rosterResync)
-        self.bridge.register_method("isConnected", self.isConnected)
-        self.bridge.register_method("launchAction", self.launchCallback)
-        self.bridge.register_method("actionsGet", self.actionsGet)
-        self.bridge.register_method("progressGet", self._progressGet)
-        self.bridge.register_method("progressGetAll", self._progressGetAll)
-        self.bridge.register_method("menusGet", self.getMenus)
-        self.bridge.register_method("menuHelpGet", self.getMenuHelp)
-        self.bridge.register_method("menuLaunch", self._launchMenu)
-        self.bridge.register_method("discoInfos", self.memory.disco._discoInfos)
-        self.bridge.register_method("discoItems", self.memory.disco._discoItems)
-        self.bridge.register_method("discoFindByFeatures", self._findByFeatures)
-        self.bridge.register_method("saveParamsTemplate", self.memory.save_xml)
-        self.bridge.register_method("loadParamsTemplate", self.memory.load_xml)
-        self.bridge.register_method("sessionInfosGet", self.getSessionInfos)
-        self.bridge.register_method("devicesInfosGet", self._getDevicesInfos)
-        self.bridge.register_method("namespacesGet", self.getNamespaces)
-        self.bridge.register_method("imageCheck", self._imageCheck)
-        self.bridge.register_method("imageResize", self._imageResize)
-        self.bridge.register_method("imageGeneratePreview", self._imageGeneratePreview)
-        self.bridge.register_method("imageConvert", self._imageConvert)
+        self.bridge.register_method("contact_add", self._add_contact)
+        self.bridge.register_method("contact_update", self._update_contact)
+        self.bridge.register_method("contact_del", self._del_contact)
+        self.bridge.register_method("roster_resync", self._roster_resync)
+        self.bridge.register_method("is_connected", self.is_connected)
+        self.bridge.register_method("action_launch", self.launch_callback)
+        self.bridge.register_method("actions_get", self.actions_get)
+        self.bridge.register_method("progress_get", self._progress_get)
+        self.bridge.register_method("progress_get_all", self._progress_get_all)
+        self.bridge.register_method("menus_get", self.get_menus)
+        self.bridge.register_method("menu_help_get", self.get_menu_help)
+        self.bridge.register_method("menu_launch", self._launch_menu)
+        self.bridge.register_method("disco_infos", self.memory.disco._disco_infos)
+        self.bridge.register_method("disco_items", self.memory.disco._disco_items)
+        self.bridge.register_method("disco_find_by_features", self._find_by_features)
+        self.bridge.register_method("params_template_save", self.memory.save_xml)
+        self.bridge.register_method("params_template_load", self.memory.load_xml)
+        self.bridge.register_method("session_infos_get", self.get_session_infos)
+        self.bridge.register_method("devices_infos_get", self._get_devices_infos)
+        self.bridge.register_method("namespaces_get", self.get_namespaces)
+        self.bridge.register_method("image_check", self._image_check)
+        self.bridge.register_method("image_resize", self._image_resize)
+        self.bridge.register_method("image_generate_preview", self._image_generate_preview)
+        self.bridge.register_method("image_convert", self._image_convert)
 
         await self.memory.initialise()
@@ -232,14 +232,14 @@
         except Exception as e:
             log.error(f"Could not initialize backend: {e}")
             sys.exit(1)
-        self._addBaseMenus()
+        self._add_base_menus()
 
         self.initialised.callback(None)
         log.info(_("Backend is ready"))
 
         # profile autoconnection must be done after self.initialised is called because
-        # startSession waits for it.
-        autoconnect_dict = await self.memory.storage.getIndParamValues(
+        # start_session waits for it.
+        autoconnect_dict = await self.memory.storage.get_ind_param_values(
             category='Connection', name='autoconnect_backend',
         )
         profiles_autoconnect = [p for p, v in autoconnect_dict.items() if C.bool(v)]
@@ -264,9 +264,9 @@
                     reason = result)
             )
 
-    def _addBaseMenus(self):
+    def _add_base_menus(self):
         """Add base menus"""
-        encryption.EncryptionHandler._importMenus(self)
+        encryption.EncryptionHandler._import_menus(self)
 
     def _unimport_plugin(self, plugin_path):
         """remove a plugin from sys.modules if it is there"""
@@ -276,7 +276,7 @@
             pass
 
     def _import_plugins(self):
-        """Import all plugins found in plugins directory"""
+        """import all plugins found in plugins directory"""
         # FIXME: module imported but cancelled should be deleted
         # TODO: make this more generic and reusable in tools.common
         # FIXME: should use imp
@@ -446,7 +446,7 @@
         self.plugins[import_name]._info = plugin_info
         # TODO: test xmppclient presence and register handler parent
 
-    def pluginsUnload(self):
+    def plugins_unload(self):
         """Call unload method on every loaded plugin, if exists
 
         @return (D): A deferred which return None when all method have been called
@@ -461,11 +461,11 @@
             except AttributeError:
                 continue
             else:
-                defers_list.append(utils.asDeferred(unload))
+                defers_list.append(utils.as_deferred(unload))
         return defers_list
 
     def _connect(self, profile_key, password="", options=None):
-        profile = self.memory.getProfileName(profile_key)
+        profile = self.memory.get_profile_name(profile_key)
         return defer.ensureDeferred(self.connect(profile, password, options))
 
     async def connect(
@@ -487,16 +487,16 @@
         if options is None:
             options = {}
 
-        await self.memory.startSession(password, profile)
+        await self.memory.start_session(password, profile)
 
-        if self.isConnected(profile):
+        if self.is_connected(profile):
             log.info(_("already connected !"))
             return True
 
-        if self.memory.isComponent(profile):
-            await xmpp.SatXMPPComponent.startConnection(self, profile, max_retries)
+        if self.memory.is_component(profile):
+            await xmpp.SatXMPPComponent.start_connection(self, profile, max_retries)
         else:
-            await xmpp.SatXMPPClient.startConnection(self, profile, max_retries)
+            await xmpp.SatXMPPClient.start_connection(self, profile, max_retries)
         return False
@@ -504,15 +504,15 @@
         """disconnect from jabber server"""
         # FIXME: client should not be deleted if only disconnected
         # it shoud be deleted only when session is finished
-        if not self.isConnected(profile_key):
-            # isConnected is checked here and not on client
+        if not self.is_connected(profile_key):
+            # is_connected is checked here and not on client
             # because client is deleted when session is ended
             log.info(_("not connected !"))
             return defer.succeed(None)
-        client = self.getClient(profile_key)
-        return client.entityDisconnect()
+        client = self.get_client(profile_key)
+        return client.entity_disconnect()
 
-    def getFeatures(self, profile_key=C.PROF_KEY_NONE):
+    def features_get(self, profile_key=C.PROF_KEY_NONE):
         """Get available features
 
         Return list of activated plugins and plugin specific data
@@ -528,7 +528,7 @@
         try:
             # FIXME: there is no method yet to check profile session
            # as soon as one is implemented, it should be used here
-            self.getClient(profile_key)
+ self.get_client(profile_key) except KeyError: log.warning("Requesting features for a profile outside a session") profile_key = C.PROF_KEY_NONE @@ -538,14 +538,14 @@ features = [] for import_name, plugin in self.plugins.items(): try: - features_d = utils.asDeferred(plugin.getFeatures, profile_key) + features_d = utils.as_deferred(plugin.features_get, profile_key) except AttributeError: features_d = defer.succeed({}) features.append(features_d) d_list = defer.DeferredList(features) - def buildFeatures(result, import_names): + def build_features(result, import_names): assert len(result) == len(import_names) ret = {} for name, (success, data) in zip(import_names, result): @@ -560,30 +560,30 @@ ret[name] = {} return ret - d_list.addCallback(buildFeatures, list(self.plugins.keys())) + d_list.addCallback(build_features, list(self.plugins.keys())) return d_list - def _contactGet(self, entity_jid_s, profile_key): - client = self.getClient(profile_key) + def _contact_get(self, entity_jid_s, profile_key): + client = self.get_client(profile_key) entity_jid = jid.JID(entity_jid_s) - return defer.ensureDeferred(self.getContact(client, entity_jid)) + return defer.ensureDeferred(self.get_contact(client, entity_jid)) - async def getContact(self, client, entity_jid): + async def get_contact(self, client, entity_jid): # we want to be sure that roster has been received await client.roster.got_roster - item = client.roster.getItem(entity_jid) + item = client.roster.get_item(entity_jid) if item is None: raise exceptions.NotFound(f"{entity_jid} is not in roster!") - return (client.roster.getAttributes(item), list(item.groups)) + return (client.roster.get_attributes(item), list(item.groups)) - def getContacts(self, profile_key): - client = self.getClient(profile_key) + def contacts_get(self, profile_key): + client = self.get_client(profile_key) def got_roster(__): ret = [] - for item in client.roster.getItems(): # we get all items for client's roster + for item in 
client.roster.get_items(): # we get all items for client's roster # and convert them to expected format - attr = client.roster.getAttributes(item) + attr = client.roster.get_attributes(item) # we use full() and not userhost() because jid with resources are allowed # in roster, even if it's not common. ret.append([item.entity.full(), attr, list(item.groups)]) @@ -591,11 +591,11 @@ return client.roster.got_roster.addCallback(got_roster) - def getContactsFromGroup(self, group, profile_key): - client = self.getClient(profile_key) - return [jid_.full() for jid_ in client.roster.getJidsFromGroup(group)] + def contacts_get_from_group(self, group, profile_key): + client = self.get_client(profile_key) + return [jid_.full() for jid_ in client.roster.get_jids_from_group(group)] - def purgeEntity(self, profile): + def purge_entity(self, profile): """Remove reference to a profile client/component and purge cache the garbage collector can then free the memory @@ -605,7 +605,7 @@ except KeyError: log.error(_("Trying to remove reference to a client not referenced")) else: - self.memory.purgeProfileSession(profile) + self.memory.purge_profile_session(profile) def startService(self): self._init() @@ -613,7 +613,7 @@ def stopService(self): log.info("Salut aussi à Rantanplan") - return self.pluginsUnload() + return self.plugins_unload() def run(self): log.debug(_("running app")) @@ -625,16 +625,16 @@ ## Misc methods ## - def getJidNStream(self, profile_key): + def get_jid_n_stream(self, profile_key): """Convenient method to get jid and stream from profile key @return: tuple (jid, xmlstream) from profile, can be None""" - # TODO: deprecate this method (getClient is enough) - profile = self.memory.getProfileName(profile_key) - if not profile or not self.profiles[profile].isConnected(): + # TODO: deprecate this method (get_client is enough) + profile = self.memory.get_profile_name(profile_key) + if not profile or not self.profiles[profile].is_connected(): return (None, None) return 
(self.profiles[profile].jid, self.profiles[profile].xmlstream) - def getClient(self, profile_key: str) -> xmpp.SatXMPPClient: + def get_client(self, profile_key: str) -> xmpp.SatXMPPClient: """Convenient method to get client from profile key @return: the client @@ -642,7 +642,7 @@ @raise exceptions.NotFound: client is not available This happen if profile has not been used yet """ - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) if not profile: raise exceptions.ProfileKeyUnknown try: @@ -650,7 +650,7 @@ except KeyError: raise exceptions.NotFound(profile_key) - def getClients(self, profile_key): + def get_clients(self, profile_key): """Convenient method to get list of clients from profile key Manage list through profile_key like C.PROF_KEY_ALL @@ -660,7 +660,7 @@ if not profile_key: raise exceptions.DataError(_("profile_key must not be empty")) try: - profile = self.memory.getProfileName(profile_key, True) + profile = self.memory.get_profile_name(profile_key, True) except exceptions.ProfileUnknownError: return [] if profile == C.PROF_KEY_ALL: @@ -669,16 +669,16 @@ raise exceptions.ProfileKeyUnknown return [self.profiles[profile]] - def _getConfig(self, section, name): + def _get_config(self, section, name): """Get the main configuration option @param section: section of the config file (None or '' for DEFAULT) @param name: name of the option @return: unicode representation of the option """ - return str(self.memory.getConfig(section, name, "")) + return str(self.memory.config_get(section, name, "")) - def logErrback(self, failure_, msg=_("Unexpected error: {failure_}")): + def log_errback(self, failure_, msg=_("Unexpected error: {failure_}")): """Generic errback logging @param msg(unicode): error message ("failure_" key will be use for format) @@ -689,41 +689,41 @@ #  namespaces - def registerNamespace(self, short_name, namespace): + def register_namespace(self, short_name, namespace): """associate a namespace 
to a short name""" if short_name in self.ns_map: raise exceptions.ConflictError("this short name is already used") log.debug(f"registering namespace {short_name} => {namespace}") self.ns_map[short_name] = namespace - def getNamespaces(self): + def get_namespaces(self): return self.ns_map - def getNamespace(self, short_name): + def get_namespace(self, short_name): try: return self.ns_map[short_name] except KeyError: raise exceptions.NotFound("namespace {short_name} is not registered" .format(short_name=short_name)) - def getSessionInfos(self, profile_key): + def get_session_infos(self, profile_key): """compile interesting data on current profile session""" - client = self.getClient(profile_key) + client = self.get_client(profile_key) data = { "jid": client.jid.full(), "started": str(int(client.started)) } return defer.succeed(data) - def _getDevicesInfos(self, bare_jid, profile_key): - client = self.getClient(profile_key) + def _get_devices_infos(self, bare_jid, profile_key): + client = self.get_client(profile_key) if not bare_jid: bare_jid = None - d = defer.ensureDeferred(self.getDevicesInfos(client, bare_jid)) + d = defer.ensureDeferred(self.get_devices_infos(client, bare_jid)) d.addCallback(lambda data: data_format.serialise(data)) return d - async def getDevicesInfos(self, client, bare_jid=None): + async def get_devices_infos(self, client, bare_jid=None): """compile data on an entity devices @param bare_jid(jid.JID, None): bare jid of entity to check @@ -737,7 +737,7 @@ bare_jid = own_jid else: bare_jid = jid.JID(bare_jid) - resources = self.memory.getAllResources(client, bare_jid) + resources = self.memory.get_all_resources(client, bare_jid) if bare_jid == own_jid: # our own jid is not stored in memory's cache resources.add(client.jid.resource) @@ -745,7 +745,7 @@ for resource in resources: res_jid = copy.copy(bare_jid) res_jid.resource = resource - cache_data = self.memory.getEntityData(client, res_jid) + cache_data = self.memory.entity_data_get(client, 
res_jid) res_data = { "resource": resource, } @@ -760,7 +760,7 @@ "statuses": presence.statuses, } - disco = await self.getDiscoInfos(client, res_jid) + disco = await self.get_disco_infos(client, res_jid) for (category, type_), name in disco.identities.items(): identities = res_data.setdefault('identities', []) @@ -776,22 +776,22 @@ # images - def _imageCheck(self, path): + def _image_check(self, path): report = image.check(self, path) return data_format.serialise(report) - def _imageResize(self, path, width, height): + def _image_resize(self, path, width, height): d = image.resize(path, (width, height)) d.addCallback(lambda new_image_path: str(new_image_path)) return d - def _imageGeneratePreview(self, path, profile_key): - client = self.getClient(profile_key) - d = defer.ensureDeferred(self.imageGeneratePreview(client, Path(path))) + def _image_generate_preview(self, path, profile_key): + client = self.get_client(profile_key) + d = defer.ensureDeferred(self.image_generate_preview(client, Path(path))) d.addCallback(lambda preview_path: str(preview_path)) return d - async def imageGeneratePreview(self, client, path): + async def image_generate_preview(self, client, path): """Helper method to generate in cache a preview of an image @param path(Path): path to the image @@ -807,11 +807,11 @@ path_hash = hashlib.sha256(str(path).encode()).hexdigest() uid = f"{path.stem}_{path_hash}_preview" filename = f"{uid}{path.suffix.lower()}" - metadata = client.cache.getMetadata(uid=uid) + metadata = client.cache.get_metadata(uid=uid) if metadata is not None: preview_path = metadata['path'] else: - with client.cache.cacheData( + with client.cache.cache_data( source='HOST_PREVIEW', uid=uid, filename=filename) as cache_f: @@ -824,16 +824,16 @@ return preview_path - def _imageConvert(self, source, dest, extra, profile_key): - client = self.getClient(profile_key) if profile_key else None + def _image_convert(self, source, dest, extra, profile_key): + client = 
self.get_client(profile_key) if profile_key else None source = Path(source) dest = None if not dest else Path(dest) extra = data_format.deserialise(extra) - d = defer.ensureDeferred(self.imageConvert(client, source, dest, extra)) + d = defer.ensureDeferred(self.image_convert(client, source, dest, extra)) d.addCallback(lambda dest_path: str(dest_path)) return d - async def imageConvert(self, client, source, dest=None, extra=None): + async def image_convert(self, client, source, dest=None, extra=None): """Helper method to convert an image from one format to an other @param client(SatClient, None): client to use for caching @@ -861,12 +861,12 @@ cache = self.common_cache else: cache = client.cache - metadata = cache.getMetadata(uid=uid) + metadata = cache.get_metadata(uid=uid) if metadata is not None: # there is already a conversion for this image in cache return metadata['path'] else: - with cache.cacheData( + with cache.cache_data( source='HOST_IMAGE_CONVERT', uid=uid, filename=filename) as cache_f: @@ -900,69 +900,69 @@ @param component: if True, path will be prefixed with C.COMPONENTS_DIR @return: path """ - local_dir = self.memory.getConfig("", "local_dir") + local_dir = self.memory.config_get("", "local_dir") if not local_dir: raise exceptions.InternalError("local_dir must be set") path_elts = [] if component: path_elts.append(C.COMPONENTS_DIR) - path_elts.append(regex.pathEscape(dir_name)) + path_elts.append(regex.path_escape(dir_name)) if extra_path: - path_elts.extend([regex.pathEscape(p) for p in extra_path]) + path_elts.extend([regex.path_escape(p) for p in extra_path]) if client is not None: - path_elts.append(regex.pathEscape(client.profile)) + path_elts.append(regex.path_escape(client.profile)) local_path = Path(*path_elts) local_path.mkdir(0o700, parents=True, exist_ok=True) return local_path ## Client management ## - def setParam(self, name, value, category, security_limit, profile_key): + def param_set(self, name, value, category, security_limit, 
profile_key): """set wanted parameter and notice observers""" - self.memory.setParam(name, value, category, security_limit, profile_key) + self.memory.param_set(name, value, category, security_limit, profile_key) - def isConnected(self, profile_key): + def is_connected(self, profile_key): """Return connection status of profile @param profile_key: key_word or profile name to determine profile name @return: True if connected """ - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) if not profile: log.error(_("asking connection status for a non-existent profile")) raise exceptions.ProfileUnknownError(profile_key) if profile not in self.profiles: return False - return self.profiles[profile].isConnected() + return self.profiles[profile].is_connected() ## Encryption ## - def registerEncryptionPlugin(self, *args, **kwargs): - return encryption.EncryptionHandler.registerPlugin(*args, **kwargs) + def register_encryption_plugin(self, *args, **kwargs): + return encryption.EncryptionHandler.register_plugin(*args, **kwargs) - def _messageEncryptionStart(self, to_jid_s, namespace, replace=False, + def _message_encryption_start(self, to_jid_s, namespace, replace=False, profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) return defer.ensureDeferred( client.encryption.start(to_jid, namespace or None, replace)) - def _messageEncryptionStop(self, to_jid_s, profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) + def _message_encryption_stop(self, to_jid_s, profile_key=C.PROF_KEY_NONE): + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) return defer.ensureDeferred( client.encryption.stop(to_jid)) - def _messageEncryptionGet(self, to_jid_s, profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) + def _message_encryption_get(self, to_jid_s, profile_key=C.PROF_KEY_NONE): + client = self.get_client(profile_key)
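The ``_message_encryption_*`` bridge frontends above all follow the same pattern: resolve the profile to a client, parse the JID, then delegate to ``client.encryption``. The real session handling lives in ``EncryptionHandler``; the toy registry below only illustrates that start/stop/get-bridge-data flow, keyed by bare JID (the class name, error type, and returned dict shape are assumptions for illustration):

```python
class ToyEncryptionSessions:
    """Illustrative stand-in for client.encryption: one session per bare JID."""

    def __init__(self):
        self._sessions = {}  # bare jid (str) -> session dict

    def start(self, bare_jid, namespace, replace=False):
        # refuse to silently override an active algorithm unless asked to
        if bare_jid in self._sessions and not replace:
            raise ValueError(f"encryption already started with {bare_jid}")
        self._sessions[bare_jid] = {"namespace": namespace}

    def stop(self, bare_jid):
        # stopping a non-existent session is a no-op here
        self._sessions.pop(bare_jid, None)

    def get_bridge_data(self, bare_jid):
        # bridge methods serialise a plain dict; empty when nothing is active
        session = self._sessions.get(bare_jid)
        return {} if session is None else dict(session)
```

The ``replace`` flag mirrors the ``replace=False`` default of ``_message_encryption_start`` above: switching algorithms must be an explicit choice.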
to_jid = jid.JID(to_jid_s) session_data = client.encryption.getSession(to_jid) - return client.encryption.getBridgeData(session_data) + return client.encryption.get_bridge_data(session_data) - def _encryptionNamespaceGet(self, name): - return encryption.EncryptionHandler.getNSFromName(name) + def _encryption_namespace_get(self, name): + return encryption.EncryptionHandler.get_ns_from_name(name) - def _encryptionPluginsGet(self): + def _encryption_plugins_get(self): plugins = encryption.EncryptionHandler.getPlugins() ret = [] for p in plugins: @@ -974,20 +974,20 @@ }) return data_format.serialise(ret) - def _encryptionTrustUIGet(self, to_jid_s, namespace, profile_key): - client = self.getClient(profile_key) + def _encryption_trust_ui_get(self, to_jid_s, namespace, profile_key): + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) d = defer.ensureDeferred( - client.encryption.getTrustUI(to_jid, namespace=namespace or None)) + client.encryption.get_trust_ui(to_jid, namespace=namespace or None)) d.addCallback(lambda xmlui: xmlui.toXml()) return d ## XMPP methods ## - def _messageSend( + def _message_send( self, to_jid_s, message, subject=None, mess_type="auto", extra_s="", profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) return client.sendMessage( to_jid, @@ -997,25 +997,25 @@ data_format.deserialise(extra_s) ) - def _setPresence(self, to="", show="", statuses=None, profile_key=C.PROF_KEY_NONE): - return self.setPresence(jid.JID(to) if to else None, show, statuses, profile_key) + def _set_presence(self, to="", show="", statuses=None, profile_key=C.PROF_KEY_NONE): + return self.presence_set(jid.JID(to) if to else None, show, statuses, profile_key) - def setPresence(self, to_jid=None, show="", statuses=None, + def presence_set(self, to_jid=None, show="", statuses=None, profile_key=C.PROF_KEY_NONE): """Send our presence information""" if statuses is None: statuses = {} - 
profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) assert profile priority = int( - self.memory.getParamA("Priority", "Connection", profile_key=profile) + self.memory.param_get_a("Priority", "Connection", profile_key=profile) ) self.profiles[profile].presence.available(to_jid, show, statuses, priority) # XXX: FIXME: temporary fix to work around openfire 3.7.0 bug (presence is not # broadcasted to generating resource) if "" in statuses: statuses[C.PRESENCE_STATUSES_DEFAULT] = statuses.pop("") - self.bridge.presenceUpdate( + self.bridge.presence_update( self.profiles[profile].jid.full(), show, int(priority), statuses, profile ) @@ -1024,7 +1024,7 @@ @param subs_type: subscription type (cf RFC 3921) @param raw_jid: unicode entity's jid @param profile_key: profile""" - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) assert profile to_jid = jid.JID(raw_jid) log.debug( @@ -1040,22 +1040,22 @@ elif subs_type == "unsubscribed": self.profiles[profile].presence.unsubscribed(to_jid) - def _addContact(self, to_jid_s, profile_key): - return self.addContact(jid.JID(to_jid_s), profile_key) + def _add_contact(self, to_jid_s, profile_key): + return self.contact_add(jid.JID(to_jid_s), profile_key) - def addContact(self, to_jid, profile_key): + def contact_add(self, to_jid, profile_key): """Add a contact in roster list""" - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) assert profile # presence is sufficient, as a roster push will be sent according to # RFC 6121 §3.1.2 self.profiles[profile].presence.subscribe(to_jid) - def _updateContact(self, to_jid_s, name, groups, profile_key): - client = self.getClient(profile_key) - return self.updateContact(client, jid.JID(to_jid_s), name, groups) + def _update_contact(self, to_jid_s, name, groups, profile_key): + client = self.get_client(profile_key) + return
self.contact_update(client, jid.JID(to_jid_s), name, groups) - def updateContact(self, client, to_jid, name, groups): + def contact_update(self, client, to_jid, name, groups): """update a contact in roster list""" roster_item = RosterItem(to_jid) roster_item.name = name or u'' @@ -1064,18 +1064,18 @@ return return client.roster.setItem(roster_item) - def _delContact(self, to_jid_s, profile_key): - return self.delContact(jid.JID(to_jid_s), profile_key) + def _del_contact(self, to_jid_s, profile_key): + return self.contact_del(jid.JID(to_jid_s), profile_key) - def delContact(self, to_jid, profile_key): + def contact_del(self, to_jid, profile_key): """Remove contact from roster list""" - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) assert profile self.profiles[profile].presence.unsubscribe(to_jid) # is not asynchronous return self.profiles[profile].roster.removeItem(to_jid) - def _rosterResync(self, profile_key): - client = self.getClient(profile_key) + def _roster_resync(self, profile_key): + client = self.get_client(profile_key) return client.roster.resync() ## Discovery ## @@ -1085,39 +1085,39 @@ def hasFeature(self, *args, **kwargs): return self.memory.disco.hasFeature(*args, **kwargs) - def checkFeature(self, *args, **kwargs): - return self.memory.disco.checkFeature(*args, **kwargs) + def check_feature(self, *args, **kwargs): + return self.memory.disco.check_feature(*args, **kwargs) - def checkFeatures(self, *args, **kwargs): - return self.memory.disco.checkFeatures(*args, **kwargs) + def check_features(self, *args, **kwargs): + return self.memory.disco.check_features(*args, **kwargs) - def hasIdentity(self, *args, **kwargs): - return self.memory.disco.hasIdentity(*args, **kwargs) + def has_identity(self, *args, **kwargs): + return self.memory.disco.has_identity(*args, **kwargs) - def getDiscoInfos(self, *args, **kwargs): - return self.memory.disco.getInfos(*args, **kwargs) + def get_disco_infos(self, 
*args, **kwargs): + return self.memory.disco.get_infos(*args, **kwargs) def getDiscoItems(self, *args, **kwargs): - return self.memory.disco.getItems(*args, **kwargs) + return self.memory.disco.get_items(*args, **kwargs) - def findServiceEntity(self, *args, **kwargs): - return self.memory.disco.findServiceEntity(*args, **kwargs) + def find_service_entity(self, *args, **kwargs): + return self.memory.disco.find_service_entity(*args, **kwargs) - def findServiceEntities(self, *args, **kwargs): - return self.memory.disco.findServiceEntities(*args, **kwargs) + def find_service_entities(self, *args, **kwargs): + return self.memory.disco.find_service_entities(*args, **kwargs) - def findFeaturesSet(self, *args, **kwargs): - return self.memory.disco.findFeaturesSet(*args, **kwargs) + def find_features_set(self, *args, **kwargs): + return self.memory.disco.find_features_set(*args, **kwargs) - def _findByFeatures(self, namespaces, identities, bare_jids, service, roster, own_jid, + def _find_by_features(self, namespaces, identities, bare_jids, service, roster, own_jid, local_device, profile_key): - client = self.getClient(profile_key) + client = self.get_client(profile_key) identities = [tuple(i) for i in identities] if identities else None - return defer.ensureDeferred(self.findByFeatures( + return defer.ensureDeferred(self.find_by_features( client, namespaces, identities, bare_jids, service, roster, own_jid, local_device)) - async def findByFeatures( + async def find_by_features( self, client: xmpp.SatXMPPEntity, namespaces: List[str], @@ -1164,10 +1164,10 @@ found_own = {} found_roster = {} if service: - services_jids = await self.findFeaturesSet(client, namespaces) + services_jids = await self.find_features_set(client, namespaces) services_jids = list(services_jids) # we need a list to map results below services_infos = await defer.DeferredList( - [self.getDiscoInfos(client, service_jid) for service_jid in services_jids] + [self.get_disco_infos(client, service_jid) for 
service_jid in services_jids] ) for idx, (success, infos) in enumerate(services_infos): @@ -1190,7 +1190,7 @@ if own_jid: to_find.append((found_own, [client.jid.userhostJID()])) if roster: - to_find.append((found_roster, client.roster.getJids())) + to_find.append((found_roster, client.roster.get_jids())) for found, jids in to_find: full_jids = [] @@ -1206,7 +1206,7 @@ resources = [None] else: try: - resources = self.memory.getAvailableResources(client, jid_) + resources = self.memory.get_available_resources(client, jid_) except exceptions.UnknownEntityError: continue if not resources and jid_ == client.jid.userhostJID() and own_jid: @@ -1220,7 +1220,7 @@ continue full_jids.append(full_jid) - disco_defers.append(self.getDiscoInfos(client, full_jid)) + disco_defers.append(self.get_disco_infos(client, full_jid)) d_list = defer.DeferredList(disco_defers) # XXX: 10 seconds may be too low for slow connections (e.g. mobiles) @@ -1251,18 +1251,18 @@ ## Generic HMI ## - def _killAction(self, keep_id, client): + def _kill_action(self, keep_id, client): log.debug("Killing action {} for timeout".format(keep_id)) client.actions[keep_id] - def actionNew( + def action_new( self, action_data, security_limit=C.NO_SECURITY_LIMIT, keep_id=None, profile=C.PROF_KEY_NONE, ): - """Shortcut to bridge.actionNew which generate and id and keep for retrieval + """Shortcut to bridge.action_new which generates an id and keeps it for retrieval @param action_data(dict): action data (see bridge documentation) @param security_limit: %(doc_security_limit)s @@ -1273,44 +1273,44 @@ """ id_ = str(uuid.uuid4()) if keep_id is not None: - client = self.getClient(profile) - action_timer = reactor.callLater(60 * 30, self._killAction, keep_id, client) + client = self.get_client(profile) + action_timer = reactor.callLater(60 * 30, self._kill_action, keep_id, client) client.actions[keep_id] = (action_data, id_, security_limit, action_timer) - self.bridge.actionNew(action_data, id_, security_limit, profile) + 
self.bridge.action_new(action_data, id_, security_limit, profile) - def actionsGet(self, profile): + def actions_get(self, profile): """Return current unanswered actions @param profile: %(doc_profile)s """ - client = self.getClient(profile) + client = self.get_client(profile) return [action_tuple[:-1] for action_tuple in client.actions.values()] - def registerProgressCb( + def register_progress_cb( self, progress_id, callback, metadata=None, profile=C.PROF_KEY_NONE ): """Register a callback called when progress is requested for id""" if metadata is None: metadata = {} - client = self.getClient(profile) + client = self.get_client(profile) if progress_id in client._progress_cb: raise exceptions.ConflictError("Progress ID is not unique !") client._progress_cb[progress_id] = (callback, metadata) - def removeProgressCb(self, progress_id, profile): + def remove_progress_cb(self, progress_id, profile): """Remove a progress callback""" - client = self.getClient(profile) + client = self.get_client(profile) try: del client._progress_cb[progress_id] except KeyError: log.error(_("Trying to remove an unknown progress callback")) - def _progressGet(self, progress_id, profile): - data = self.progressGet(progress_id, profile) + def _progress_get(self, progress_id, profile): + data = self.progress_get(progress_id, profile) return {k: str(v) for k, v in data.items()} - def progressGet(self, progress_id, profile): + def progress_get(self, progress_id, profile): """Return a dict with progress information @param progress_id(unicode): unique id of the progressing element
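``register_progress_cb`` / ``progress_get`` above implement a small per-client registry: each progress id maps to a ``(callback, metadata)`` tuple, and querying an unknown (or already finished) id yields an empty dict. A self-contained sketch of that contract (``ValueError`` stands in for the project's ``exceptions.ConflictError``; the class name is illustrative):

```python
class ProgressRegistry:
    """Sketch of the per-client progress-callback mapping."""

    def __init__(self):
        self._progress_cb = {}  # progress id -> (callback, metadata)

    def register_progress_cb(self, progress_id, callback, metadata=None):
        if progress_id in self._progress_cb:
            # the real code raises exceptions.ConflictError here
            raise ValueError("Progress ID is not unique!")
        self._progress_cb[progress_id] = (callback, metadata or {})

    def remove_progress_cb(self, progress_id):
        self._progress_cb.pop(progress_id, None)

    def progress_get(self, progress_id, profile):
        # unknown id means a finished (or never started) progression
        try:
            return self._progress_cb[progress_id][0](progress_id, profile)
        except KeyError:
            return {}
```

The bridge-facing ``_progress_get`` then stringifies every value, which is why callbacks can return ints freely.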
profile_key): + progress_all = self.progress_get_all(profile_key) for profile, progress_dict in progress_all.items(): for progress_id, data in progress_dict.items(): for key, value in data.items(): data[key] = str(value) return progress_all - def progressGetAllMetadata(self, profile_key): + def progress_get_all_metadata(self, profile_key): """Return all progress metadata at once @param profile_key: %(doc_profile)s @@ -1344,9 +1344,9 @@ returned @return (dict[dict[dict]]): a dict which map profile to progress_dict progress_dict map progress_id to progress_data - progress_metadata is the same dict as sent by [progressStarted] + progress_metadata is the same dict as sent by [progress_started] """ - clients = self.getClients(profile_key) + clients = self.get_clients(profile_key) progress_all = {} for client in clients: profile = client.profile @@ -1359,16 +1359,16 @@ progress_dict[progress_id] = progress_metadata return progress_all - def progressGetAll(self, profile_key): + def progress_get_all(self, profile_key): """Return all progress status at once @param profile_key: %(doc_profile)s if C.PROF_KEY_ALL is used, all progress status from all profiles are returned @return (dict[dict[dict]]): a dict which map profile to progress_dict progress_dict map progress_id to progress_data - progress_data is the same dict as returned by [progressGet] + progress_data is the same dict as returned by [progress_get] """ - clients = self.getClients(profile_key) + clients = self.get_clients(profile_key) progress_all = {} for client in clients: profile = client.profile @@ -1378,7 +1378,7 @@ progress_dict[progress_id] = progress_cb(progress_id, profile) return progress_all - def registerCallback(self, callback, *args, **kwargs): + def register_callback(self, callback, *args, **kwargs): """Register a callback. 
@param callback(callable): method to call @@ -1399,23 +1399,23 @@ if "one_shot" in kwargs: # One Shot callback are removed after 30 min - def purgeCallback(): + def purge_callback(): try: self.removeCallback(callback_id) except KeyError: pass - reactor.callLater(1800, purgeCallback) + reactor.callLater(1800, purge_callback) return callback_id def removeCallback(self, callback_id): """ Remove a previously registered callback - @param callback_id: id returned by [registerCallback] """ + @param callback_id: id returned by [register_callback] """ log.debug("Removing callback [%s]" % callback_id) del self._cb_map[callback_id] - def launchCallback(self, callback_id, data=None, profile_key=C.PROF_KEY_NONE): + def launch_callback(self, callback_id, data=None, profile_key=C.PROF_KEY_NONE): """Launch a specific callback @param callback_id: id of the action (callback) to launch @@ -1430,10 +1430,10 @@ """ #  FIXME: security limit need to be checked here try: - client = self.getClient(profile_key) + client = self.get_client(profile_key) except exceptions.NotFound: # client is not available yet - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) if not profile: raise exceptions.ProfileUnknownError( _("trying to launch action with a non-existant profile") @@ -1468,11 +1468,11 @@ if kwargs.pop("one_shot", False): self.removeCallback(callback_id) - return utils.asDeferred(callback, *args, **kwargs) + return utils.as_deferred(callback, *args, **kwargs) # Menus management - def _getMenuCanonicalPath(self, path): + def _get_menu_canonical_path(self, path): """give canonical form of path canonical form is a tuple of the path were every element is stripped and lowercase @@ -1481,7 +1481,7 @@ """ return tuple((p.lower().strip() for p in path)) - def importMenu(self, path, callback, security_limit=C.NO_SECURITY_LIMIT, + def import_menu(self, path, callback, security_limit=C.NO_SECURITY_LIMIT, help_string="", type_=C.MENU_GLOBAL): 
r"""register a new menu for frontends @@ -1491,9 +1491,9 @@ untranslated/lower case path can be used to identify a menu, for this reason it must be unique independently of case. @param callback(callable): method to be called when menuitem is selected, callable - or a callback id (string) as returned by [registerCallback] + or a callback id (string) as returned by [register_callback] @param security_limit(int): %(doc_security_limit)s - /!\ security_limit MUST be added to data in launchCallback if used #TODO + /!\ security_limit MUST be added to data in launch_callback if used #TODO @param help_string(unicode): string used to indicate what the menu does (can be shown as a tooltip). /!\ use D_() instead of _() for translations @@ -1517,7 +1517,7 @@ """ if callable(callback): - callback_id = self.registerCallback(callback, with_data=True) + callback_id = self.register_callback(callback, with_data=True) elif isinstance(callback, str): # The callback is already registered callback_id = callback @@ -1535,7 +1535,7 @@ _("A menu with the same path and type already exists") ) - path_canonical = self._getMenuCanonicalPath(path) + path_canonical = self._get_menu_canonical_path(path) menu_key = (type_, path_canonical) if menu_key in self._menus_paths: @@ -1558,7 +1558,7 @@ return callback_id - def getMenus(self, language="", security_limit=C.NO_SECURITY_LIMIT): + def get_menus(self, language="", security_limit=C.NO_SECURITY_LIMIT): """Return all menus registered @param language: language used for translation, or empty string for default @@ -1582,20 +1582,20 @@ or menu_security_limit > security_limit ): continue - languageSwitch(language) + language_switch(language) path_i18n = [_(elt) for elt in path] - languageSwitch() + language_switch() extra = {} # TODO: manage extra data like icon ret.append((menu_id, type_, path, path_i18n, extra)) return ret - def _launchMenu(self, menu_type, path, data=None, security_limit=C.NO_SECURITY_LIMIT, + def _launch_menu(self, menu_type, path, 
data=None, security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) - return self.launchMenu(client, menu_type, path, data, security_limit) + client = self.get_client(profile_key) + return self.launch_menu(client, menu_type, path, data, security_limit) - def launchMenu(self, client, menu_type, path, data=None, + def launch_menu(self, client, menu_type, path, data=None, security_limit=C.NO_SECURITY_LIMIT): """launch a menu action @@ -1606,7 +1606,7 @@ """ # FIXME: manage security_limit here # default security limit should be high instead of C.NO_SECURITY_LIMIT - canonical_path = self._getMenuCanonicalPath(path) + canonical_path = self._get_menu_canonical_path(path) menu_key = (menu_type, canonical_path) try: callback_id = self._menus_paths[menu_key] @@ -1616,9 +1616,9 @@ path=canonical_path, menu_type=menu_type ) ) - return self.launchCallback(callback_id, data, client.profile) + return self.launch_callback(callback_id, data, client.profile) - def getMenuHelp(self, menu_id, language=""): + def get_menu_help(self, menu_id, language=""): """return the help string of the menu @param menu_id: id of the menu (same as callback_id) @@ -1630,7 +1630,7 @@ menu_data = self._menus[menu_id] except KeyError: raise exceptions.DataError("Trying to access an unknown menu") - languageSwitch(language) + language_switch(language) help_string = _(menu_data["help_string"]) - languageSwitch() + language_switch() return help_string diff -r c4464d7ae97b -r 524856bd7b19 sat/core/xmpp.py --- a/sat/core/xmpp.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/xmpp.py Sat Apr 08 13:54:42 2023 +0200 @@ -89,20 +89,20 @@ class SatXMPPEntity(core_types.SatXMPPEntity): """Common code for Client and Component""" - # profile is added there when startConnection begins and removed when it is finished + # profile is added there when start_connection begins and removed when it is finished profiles_connecting = set() def __init__(self, host_app, profile, 
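``_get_menu_canonical_path`` above is what makes menu paths unique independently of case: every element is stripped and lowercased, and the result is a tuple so it can be used as a dict key. The helper is small enough to reproduce standalone:

```python
def get_menu_canonical_path(path):
    """Canonical form of a menu path: every element stripped and lowercased,
    returned as a tuple so ("File ", "Export") and ("file", "export") resolve
    to the same menu key."""
    return tuple(p.lower().strip() for p in path)
```

Both registration (``import_menu``) and lookup (``launch_menu``) go through this helper, so a frontend cannot accidentally register two menus that differ only in case or whitespace.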
max_retries): factory = self.factory - # we monkey patch clientConnectionLost to handle networkEnabled/networkDisabled + # we monkey patch clientConnectionLost to handle network_enabled/network_disabled # and to allow plugins to tune reconnection mechanism clientConnectionFailed_ori = factory.clientConnectionFailed clientConnectionLost_ori = factory.clientConnectionLost factory.clientConnectionFailed = partial( - self.connectionTerminated, term_type="failed", cb=clientConnectionFailed_ori) + self.connection_terminated, term_type="failed", cb=clientConnectionFailed_ori) factory.clientConnectionLost = partial( - self.connectionTerminated, term_type="lost", cb=clientConnectionLost_ori) + self.connection_terminated, term_type="lost", cb=clientConnectionLost_ori) factory.maxRetries = max_retries factory.maxDelay = 30 @@ -129,41 +129,41 @@ ## initialisation ## - async def _callConnectionTriggers(self, connection_timer): + async def _call_connection_triggers(self, connection_timer): """Call connecting triggers and prepare connected triggers @param plugins(iterable): plugins to use @return (list[object, callable]): plugin to trigger tuples with: - plugin instance - - profileConnected* triggers (to call after connection) + - profile_connected* triggers (to call after connection) """ plugin_conn_cb = [] - for plugin in self._getPluginsList(): + for plugin in self._get_plugins_list(): # we check if plugin handles client mode if plugin.is_handler: - plugin.getHandler(self).setHandlerParent(self) + plugin.get_handler(self).setHandlerParent(self) - # profileConnecting/profileConnected methods handling + # profile_connecting/profile_connected methods handling timer = connection_timer[plugin] = { "total": 0 } # profile connecting is called right now (before actually starting client) - connecting_cb = getattr(plugin, "profileConnecting", None) + connecting_cb = getattr(plugin, "profile_connecting", None) if connecting_cb is not None: connecting_start = time.time() - await 
utils.asDeferred(connecting_cb, self) + await utils.as_deferred(connecting_cb, self) timer["connecting"] = time.time() - connecting_start timer["total"] += timer["connecting"] # profile connected is called after client is ready and roster is received - connected_cb = getattr(plugin, "profileConnected", None) + connected_cb = getattr(plugin, "profile_connected", None) if connected_cb is not None: plugin_conn_cb.append((plugin, connected_cb)) return plugin_conn_cb - def _getPluginsList(self): + def _get_plugins_list(self): """Return list of plugins to use needs to be implemented by subclasses @@ -172,10 +172,10 @@ """ raise NotImplementedError - def _createSubProtocols(self): + def _create_sub_protocols(self): return - def entityConnected(self): + def entity_connected(self): """Called once connection is done may return a Deferred, to perform initialisation tasks @@ -189,12 +189,12 @@ timer: Dict[str, float] ) -> None: connected_start = time.time() - await utils.asDeferred(callback, entity) + await utils.as_deferred(callback, entity) timer["connected"] = time.time() - connected_start timer["total"] += timer["connected"] @classmethod - async def startConnection(cls, host, profile, max_retries): + async def start_connection(cls, host, profile, max_retries): """instantiate the entity and start the connection""" # FIXME: reconnection doesn't seem to be handled correctly # (client is deleted then recreated from scratch) @@ -208,7 +208,7 @@ try: try: port = int( - host.memory.getParamA( + host.memory.param_get_a( C.FORCE_PORT_PARAM, "Connection", profile_key=profile ) ) @@ -218,11 +218,11 @@ None ) # will use default value 5222 or be retrieved from a DNS SRV record - password = await host.memory.asyncGetParamA( + password = await host.memory.param_get_a_async( "Password", "Connection", profile_key=profile ) - entity_jid_s = await host.memory.asyncGetParamA( + entity_jid_s = await host.memory.param_get_a_async( "JabberID", "Connection", profile_key=profile) entity_jid = 
jid.JID(entity_jid_s) @@ -231,13 +231,13 @@ # server returned one, as it will then stay stable in case of # reconnection. we only do that for client and if there is a user part, to # let server decide for anonymous login - resource_dict = await host.memory.storage.getPrivates( + resource_dict = await host.memory.storage.get_privates( "core:xmpp", ["resource"] , profile=profile) try: resource = resource_dict["resource"] except KeyError: resource = f"{C.APP_NAME_FILE}.{shortuuid.uuid()}" - await host.memory.storage.setPrivateValue( + await host.memory.storage.set_private_value( "core:xmpp", "resource", resource, profile=profile) log.info(_("We'll use the stable resource {resource}").format( @@ -245,7 +245,7 @@ entity_jid.resource = resource if profile in host.profiles: - if host.profiles[profile].isConnected(): + if host.profiles[profile].is_connected(): raise exceptions.InternalError( f"There is already a connected profile of name {profile!r} in " f"host") @@ -254,14 +254,14 @@ del host.profiles[profile] entity = host.profiles[profile] = cls( host, profile, entity_jid, password, - host.memory.getParamA(C.FORCE_SERVER_PARAM, "Connection", + host.memory.param_get_a(C.FORCE_SERVER_PARAM, "Connection", profile_key=profile) or None, port, max_retries, ) - await entity.encryption.loadSessions() + await entity.encryption.load_sessions() - entity._createSubProtocols() + entity._create_sub_protocols() entity.fallBack = SatFallbackHandler(host) entity.fallBack.setHandlerParent(entity) @@ -275,15 +275,15 @@ log.debug(_("setting plugins parents")) connection_timer: Dict[str, Dict[str, float]] = {} - plugin_conn_cb = await entity._callConnectionTriggers(connection_timer) + plugin_conn_cb = await entity._call_connection_triggers(connection_timer) entity.startService() await entity.conn_deferred - await defer.maybeDeferred(entity.entityConnected) + await defer.maybeDeferred(entity.entity_connected) - # Call profileConnected callback for all plugins, + # Call profile_connected 
callback for all plugins, # and print error message if any of them fails conn_cb_list = [] for plugin, callback in plugin_conn_cb: @@ -296,7 +296,7 @@ ) list_d = defer.DeferredList(conn_cb_list) - def logPluginResults(results): + def log_plugin_results(results): if not results: log.info("no plugin loaded") return @@ -347,22 +347,22 @@ ) await list_d.addCallback( - logPluginResults + log_plugin_results ) # FIXME: we should have a timeout here, and a way to know if a plugin freezes # TODO: measure launch time of each plugin finally: cls.profiles_connecting.remove(profile) - def _disconnectionCb(self, __): + def _disconnection_cb(self, __): self._connected_d = None - def _disconnectionEb(self, failure_): + def _disconnection_eb(self, failure_): log.error(_("Error while disconnecting: {}".format(failure_))) def _authd(self, xmlstream): super(SatXMPPEntity, self)._authd(xmlstream) log.debug(_("{profile} identified").format(profile=self.profile)) - self.streamInitialized() + self.stream_initialized() def _finish_connection(self, __): if self.conn_deferred.called: @@ -371,14 +371,14 @@ else: self.conn_deferred.callback(None) - def streamInitialized(self): + def stream_initialized(self): """Called after _authd""" log.debug(_("XML stream is initialized")) if not self.host_app.trigger.point("xml_init", self): return - self.postStreamInit() + self.post_stream_init() - def postStreamInit(self): + def post_stream_init(self): """Workflow after stream initialisation.""" log.info( _("********** [{profile}] CONNECTED **********").format(profile=self.profile) ) @@ -387,9 +387,9 @@ # the following Deferred is used to know when we are connected # so we need to set it to None when connection is lost self._connected_d = defer.Deferred() - self._connected_d.addCallback(self._cleanConnection) - self._connected_d.addCallback(self._disconnectionCb) - self._connected_d.addErrback(self._disconnectionEb) + self._connected_d.addCallback(self._clean_connection) + 
self._connected_d.addCallback(self._disconnection_cb) + self._connected_d.addErrback(self._disconnection_eb) # we send the signal to the clients self.host_app.bridge.connected(self.jid.full(), self.profile) @@ -421,7 +421,7 @@ ## connection ## - def connectionTerminated(self, connector, reason, term_type, cb): + def connection_terminated(self, connector, reason, term_type, cb): """Display disconnection reason, and call factory method This method is monkey patched to factory, allowing plugins to handle finely @@ -453,7 +453,7 @@ return return cb(connector, reason) - def networkDisabled(self): + def network_disabled(self): """Indicate that network has been completely disabled In other words, internet is not available anymore and transport must be stopped. @@ -466,7 +466,7 @@ if self.xmlstream is not None: self.xmlstream.transport.abortConnection() - def networkEnabled(self): + def network_enabled(self): """Indicate that network has been (re)enabled This happens when e.g. the user activates a WIFI connection. 
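``network_disabled`` / ``network_enabled`` above hinge on a saved connector: disabling stores it before aborting the transport, and enabling restarts only if something was saved (the ``AttributeError`` branch). The real methods manipulate a Twisted connector; this sketch only models that bookkeeping, with illustrative class and attribute names:

```python
class ConnectionState:
    """Models the network_disabled/network_enabled bookkeeping: disabling
    saves the connector so a later enable can restart it; enabling with
    nothing saved is a no-op (the connection was never stopped by us)."""

    def __init__(self, connector):
        self.connector = connector
        self._saved_connector = None

    def network_disabled(self):
        # save the connector, then drop it (stands in for abortConnection())
        self._saved_connector = self.connector
        self.connector = None

    def network_enabled(self):
        if self._saved_connector is None:
            # connection was not stopped by network_disabled: nothing to restart
            return False
        self.connector, self._saved_connector = self._saved_connector, None
        return True
```

Clearing the saved connector on restart matters: it is what prevents a second ``network_enabled`` from restarting the connection twice.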
@@ -475,7 +475,7 @@ connector = self._saved_connector network_disabled = self._network_disabled except AttributeError: - # connection has not been stopped by networkDisabled + # connection has not been stopped by network_disabled # we don't have to restart it log.debug(f"no connection to restart [{self.profile}]") return @@ -496,12 +496,12 @@ self.host_app.trigger.point( "stream_hooks", self, receive_hooks, send_hooks) for hook in receive_hooks: - xs.addHook(C.STREAM_HOOK_RECEIVE, hook) + xs.add_hook(C.STREAM_HOOK_RECEIVE, hook) for hook in send_hooks: - xs.addHook(C.STREAM_HOOK_SEND, hook) + xs.add_hook(C.STREAM_HOOK_SEND, hook) super(SatXMPPEntity, self)._connected(xs) - def disconnectProfile(self, reason): + def disconnect_profile(self, reason): if self._connected_d is not None: self.host_app.bridge.disconnected( self.profile @@ -516,7 +516,7 @@ log.debug("continueTrying not set, purging entity") self._connected_d.callback(None) # and we remove references to this client - self.host_app.purgeEntity(self.profile) + self.host_app.purge_entity(self.profile) if not self.conn_deferred.called: if reason is None: @@ -535,7 +535,7 @@ try: # with invalid certificate, we should not retry to connect # so we delete saved connector to avoid reconnection if - # networkEnabled is called. + # network_enabled is called. 
del self._saved_connector except AttributeError: pass @@ -547,21 +547,21 @@ super(SatXMPPEntity, self)._disconnected(reason) if not self.host_app.trigger.point("disconnected", self, reason): return - self.disconnectProfile(reason) + self.disconnect_profile(reason) @defer.inlineCallbacks - def _cleanConnection(self, __): + def _clean_connection(self, __): """method called on disconnection - used to call profileDisconnected* triggers + used to call profile_disconnected* triggers """ - trigger_name = "profileDisconnected" - for plugin in self._getPluginsList(): + trigger_name = "profile_disconnected" + for plugin in self._get_plugins_list(): disconnected_cb = getattr(plugin, trigger_name, None) if disconnected_cb is not None: yield disconnected_cb(self) - def isConnected(self): + def is_connected(self): """Return True if client is fully connected client is considered fully connected if transport is started and all plugins @@ -574,7 +574,7 @@ return self._connected_d is not None and transport_connected - def entityDisconnect(self): + def entity_disconnect(self): if not self.host_app.trigger.point("disconnecting", self): return log.info(_("Disconnecting...")) @@ -609,7 +609,7 @@ ).toResponse(iq_elt) self.xmlstream.send(iq_error_elt) - def generateMessageXML( + def generate_message_xml( self, data: core_types.MessageData, post_xml_treatments: Optional[defer.Deferred] = None @@ -666,9 +666,9 @@ @property def is_admin(self) -> bool: """True if a client is an administrator with extra privileges""" - return self.host_app.memory.isAdmin(self.profile) + return self.host_app.memory.is_admin(self.profile) - def addPostXmlCallbacks(self, post_xml_treatments): + def add_post_xml_callbacks(self, post_xml_treatments): """Used to add class level callbacks at the end of the workflow @param post_xml_treatments(D): the same Deferred as in sendMessage trigger @@ -685,20 +685,20 @@ # (out of band transmission for instance).
# e2e should have a priority of 0 here, and out of band transmission # a lower priority - if not (await self.host_app.trigger.asyncPoint("send", self, obj)): + if not (await self.host_app.trigger.async_point("send", self, obj)): return super().send(obj) def send(self, obj): defer.ensureDeferred(self.a_send(obj)) - async def sendMessageData(self, mess_data): + async def send_message_data(self, mess_data): """Convenient method to send message data to stream This method will send mess_data[u'xml'] to stream, but a trigger is there The trigger can't be cancelled, it's a good place for e2e encryption which doesn't handle full stanza encryption - This trigger can return a Deferred (it's an asyncPoint) + This trigger can return a Deferred (it's an async_point) @param mess_data(dict): message data as constructed by onMessage workflow @return (dict): mess_data (so it can be used in a deferred chain) """ @@ -707,7 +707,7 @@ # This is intended for e2e encryption which doesn't do full stanza # encryption (e.g. OTR) # This trigger point can't cancel the method - await self.host_app.trigger.asyncPoint("sendMessageData", self, mess_data, + await self.host_app.trigger.async_point("send_message_data", self, mess_data, triggers_no_cancel=True) await self.a_send(mess_data["xml"]) return mess_data @@ -762,7 +762,7 @@ elif not data["to"].resource: # we may have a groupchat message, we check if we know this jid try: - entity_type = self.host_app.memory.getEntityDatum( + entity_type = self.host_app.memory.get_entity_datum( self, data["to"], C.ENTITY_TYPE ) # FIXME: should entity_type manage resources ? @@ -783,7 +783,7 @@ if not no_trigger and not send_only: # is the session encrypted?
If so we indicate it in data - self.encryption.setEncryptionFlag(data) + self.encryption.set_encryption_flag(data) if not self.host_app.trigger.point( "sendMessage" + self.trigger_suffix, @@ -797,27 +797,27 @@ log.debug(_("Sending message (type {type}, to {to})") .format(type=data["type"], to=to_jid.full())) - pre_xml_treatments.addCallback(lambda __: self.generateMessageXML(data, post_xml_treatments)) + pre_xml_treatments.addCallback(lambda __: self.generate_message_xml(data, post_xml_treatments)) pre_xml_treatments.addCallback(lambda __: post_xml_treatments) - pre_xml_treatments.addErrback(self._cancelErrorTrap) + pre_xml_treatments.addErrback(self._cancel_error_trap) post_xml_treatments.addCallback( - lambda __: defer.ensureDeferred(self.sendMessageData(data)) + lambda __: defer.ensureDeferred(self.send_message_data(data)) ) if send_only: log.debug(_("Triggers, storage and echo have been inhibited by the " "'send_only' parameter")) else: - self.addPostXmlCallbacks(post_xml_treatments) - post_xml_treatments.addErrback(self._cancelErrorTrap) - post_xml_treatments.addErrback(self.host_app.logErrback) + self.add_post_xml_callbacks(post_xml_treatments) + post_xml_treatments.addErrback(self._cancel_error_trap) + post_xml_treatments.addErrback(self.host_app.log_errback) pre_xml_treatments.callback(data) return pre_xml_treatments - def _cancelErrorTrap(self, failure): + def _cancel_error_trap(self, failure): """A message sending can be cancelled by a plugin treatment""" failure.trap(exceptions.CancelError) - def isMessagePrintable(self, mess_data): + def is_message_printable(self, mess_data): """Return True if a message contains payload to show in frontends""" return ( mess_data["message"] or mess_data["subject"] @@ -825,7 +825,7 @@ or mess_data["type"] == C.MESS_TYPE_INFO ) - async def messageAddToHistory(self, data): + async def message_add_to_history(self, data): """Store message into database (for local history) @param data: message data dictionary @@ -836,22
+836,22 @@ # and they will be added then # we need a message to store - if self.isMessagePrintable(data): - await self.host_app.memory.addToHistory(self, data) + if self.is_message_printable(data): + await self.host_app.memory.add_to_history(self, data) else: log.warning( "No message found" ) # empty body should be managed by plugins before this point return data - def messageGetBridgeArgs(self, data): + def message_get_bridge_args(self, data): """Generate args to use with bridge from data dict""" return (data["uid"], data["timestamp"], data["from"].full(), data["to"].full(), data["message"], data["subject"], data["type"], data_format.serialise(data["extra"])) - def messageSendToBridge(self, data): + def message_send_to_bridge(self, data): """Send message to bridge, so frontends can display it @param data: message data dictionary @@ -862,11 +862,11 @@ # and they will be added then # we need a message to send something - if self.isMessagePrintable(data): + if self.is_message_printable(data): # We send back the message, so all frontends are aware of it - self.host_app.bridge.messageNew( - *self.messageGetBridgeArgs(data), + self.host_app.bridge.message_new( + *self.message_get_bridge_args(data), profile=self.profile ) else: @@ -913,7 +913,7 @@ # for now we consider Android devices to be always phones self.identities = [disco.DiscoIdentity("client", "phone", C.APP_NAME)] - hosts_map = host_app.memory.getConfig(None, "hosts_dict", {}) + hosts_map = host_app.memory.config_get(None, "hosts_dict", {}) if host is None and user_jid.host in hosts_map: host_data = hosts_map[user_jid.host] if isinstance(host_data, str): @@ -934,7 +934,7 @@ .format(host_ori=user_jid.host, host=host, port=port) ) - self.check_certificate = host_app.memory.getParamA( + self.check_certificate = host_app.memory.param_get_a( "check_certificate", "Connection", profile_key=profile) if self.check_certificate: @@ -954,19 +954,19 @@ "somebody may be spying on you.
If you have no good reason to disable " "certificate validation, please activate \"Check certificate\" in your " "settings in \"Connection\" tab.")) - xml_tools.quickNote(host_app, self, msg, _("Security notice"), + xml_tools.quick_note(host_app, self, msg, _("Security notice"), level = C.XMLUI_DATA_LVL_WARNING) @property def server_jid(self): return jid.JID(self.jid.host) - def _getPluginsList(self): + def _get_plugins_list(self): for p in self.host_app.plugins.values(): if C.PLUG_MODE_CLIENT in p._info["modes"]: yield p - def _createSubProtocols(self): + def _create_sub_protocols(self): self.messageProt = SatMessageProtocol(self.host_app) self.messageProt.setHandlerParent(self) @@ -977,26 +977,26 @@ self.presence.setHandlerParent(self) @classmethod - async def startConnection(cls, host, profile, max_retries): + async def start_connection(cls, host, profile, max_retries): try: - await super(SatXMPPClient, cls).startConnection(host, profile, max_retries) + await super(SatXMPPClient, cls).start_connection(host, profile, max_retries) except exceptions.CancelError as e: - log.warning(f"startConnection cancelled: {e}") + log.warning(f"start_connection cancelled: {e}") return entity = host.profiles[profile] # we finally send our presence entity.presence.available() - def entityConnected(self): + def entity_connected(self): # we want to be sure that we got the roster return self.roster.got_roster - def addPostXmlCallbacks(self, post_xml_treatments): - post_xml_treatments.addCallback(self.messageProt.completeAttachments) + def add_post_xml_callbacks(self, post_xml_treatments): + post_xml_treatments.addCallback(self.messageProt.complete_attachments) post_xml_treatments.addCallback( - lambda ret: defer.ensureDeferred(self.messageAddToHistory(ret)) + lambda ret: defer.ensureDeferred(self.message_add_to_history(ret)) ) - post_xml_treatments.addCallback(self.messageSendToBridge) + post_xml_treatments.addCallback(self.message_send_to_bridge) def feedback( self, @@ -1015,7 
+1015,7 @@ """ if extra is None: extra = {} - self.host_app.bridge.messageNew( + self.host_app.bridge.message_new( uid=str(uuid.uuid4()), timestamp=time.time(), from_jid=self.jid.full(), @@ -1028,7 +1028,7 @@ ) def _finish_connection(self, __): - d = self.roster.requestRoster() + d = self.roster.request_roster() d.addCallback(lambda __: super(SatXMPPClient, self)._finish_connection(__)) @@ -1057,7 +1057,7 @@ port = C.XMPP_COMPONENT_PORT ## entry point ## - entry_point = host_app.memory.getEntryPoint(profile) + entry_point = host_app.memory.get_entry_point(profile) try: self.entry_plugin = host_app.plugins[entry_point] except KeyError: @@ -1090,11 +1090,11 @@ def is_admin(self) -> bool: return False - def _createSubProtocols(self): + def _create_sub_protocols(self): self.messageProt = SatMessageProtocol(self.host_app) self.messageProt.setHandlerParent(self) - def _buildDependencies(self, current, plugins, required=True): + def _build_dependencies(self, current, plugins, required=True): """recursively build dependencies needed for a plugin this method builds the list of plugins needed for a component and raises @@ -1128,7 +1128,7 @@ # plugins are already loaded as dependencies # so we know they are in self.host_app.plugins dep = self.host_app.plugins[import_name] - self._buildDependencies(dep, plugins) + self._build_dependencies(dep, plugins) for import_name in current._info.get(C.PI_RECOMMENDATIONS, []): # here plugins are only recommendations, @@ -1137,21 +1137,21 @@ dep = self.host_app.plugins[import_name] except KeyError: continue - self._buildDependencies(dep, plugins, required=False) + self._build_dependencies(dep, plugins, required=False) if current not in plugins: # current can be required for several plugins and so # it can already be present in the list plugins.append(current) - def _getPluginsList(self): + def _get_plugins_list(self): # XXX: for component we don't launch all plugin triggers # but only the ones from which there is a dependency plugins = [] -
self._buildDependencies(self.entry_plugin, plugins) + self._build_dependencies(self.entry_plugin, plugins) return plugins - def entityConnected(self): + def entity_connected(self): # we can now launch entry point try: start_cb = self.entry_plugin.componentStart @@ -1160,13 +1160,13 @@ else: return start_cb(self) - def addPostXmlCallbacks(self, post_xml_treatments): + def add_post_xml_callbacks(self, post_xml_treatments): if self.sendHistory: post_xml_treatments.addCallback( - lambda ret: defer.ensureDeferred(self.messageAddToHistory(ret)) + lambda ret: defer.ensureDeferred(self.message_add_to_history(ret)) ) - def getOwnerFromJid(self, to_jid: jid.JID) -> jid.JID: + def get_owner_from_jid(self, to_jid: jid.JID) -> jid.JID: """Retrieve "owner" of a component resource from the destination jid of the request This method needs plugin XEP-0106 for unescaping, if you use it you must add the @@ -1187,7 +1187,7 @@ # only user part is specified, we use our own host to build the full jid return jid.JID(None, (user, self.host, None)) - def getOwnerAndPeer(self, iq_elt: domish.Element) -> Tuple[jid.JID, jid.JID]: + def get_owner_and_peer(self, iq_elt: domish.Element) -> Tuple[jid.JID, jid.JID]: """Retrieve owner of a component jid, and the jid of the requesting peer "owner" is found by either unescaping full jid from node, or by combining node @@ -1198,14 +1198,14 @@ """ to_jid = jid.JID(iq_elt['to']) if to_jid.user: - owner = self.getOwnerFromJid(to_jid) + owner = self.get_owner_from_jid(to_jid) else: owner = jid.JID(iq_elt["from"]).userhostJID() peer_jid = jid.JID(iq_elt["from"]) return peer_jid, owner - def getVirtualClient(self, jid_: jid.JID) -> SatXMPPEntity: + def get_virtual_client(self, jid_: jid.JID) -> SatXMPPEntity: """Get client for this component with a specified jid This is needed to perform operations with a virtual JID corresponding to a virtual @@ -1229,13 +1229,13 @@ def client(self): return self.parent - def normalizeNS(self, elt: domish.Element, namespace: 
Optional[str]) -> None: + def normalize_ns(self, elt: domish.Element, namespace: Optional[str]) -> None: if elt.uri == namespace: elt.defaultUri = elt.uri = C.NS_CLIENT for child in elt.elements(): - self.normalizeNS(child, namespace) + self.normalize_ns(child, namespace) - def parseMessage(self, message_elt): + def parse_message(self, message_elt): """Parse a message XML and return message_data @param message_elt(domish.Element): raw xml @@ -1245,13 +1245,13 @@ """ if message_elt.name != "message": log.warning(_( - "parseMessage used with a non stanza, ignoring: {xml}" + "parse_message used with a non stanza, ignoring: {xml}" .format(xml=message_elt.toXml()))) return {} if message_elt.uri == None: # xmlns may be None when wokkel element parsing strip out root namespace - self.normalizeNS(message_elt, None) + self.normalize_ns(message_elt, None) elif message_elt.uri != C.NS_CLIENT: log.warning(_( "received with a wrong namespace: {xml}" @@ -1297,7 +1297,7 @@ received_timestamp = message_elt._received_timestamp except AttributeError: # message_elt._received_timestamp should have been set in onMessage - # but if parseMessage is called directly, it can be missing + # but if parse_message is called directly, it can be missing log.debug("missing received timestamp for {message_elt}".format( message_elt=message_elt)) received_timestamp = time.time() @@ -1316,7 +1316,7 @@ self.host.trigger.point("message_parse", client, message_elt, data) return data - def _onMessageStartWorkflow(self, cont, client, message_elt, post_treat): + def _on_message_start_workflow(self, cont, client, message_elt, post_treat): """Parse message and do post treatments It is the first callback called after messageReceived trigger @@ -1327,16 +1327,16 @@ """ if not cont: return - data = self.parseMessage(message_elt) - post_treat.addCallback(self.completeAttachments) - post_treat.addCallback(self.skipEmptyMessage) + data = self.parse_message(message_elt) + 
post_treat.addCallback(self.complete_attachments) + post_treat.addCallback(self.skip_empty_message) if not client.is_component or client.receiveHistory: post_treat.addCallback( - lambda ret: defer.ensureDeferred(self.addToHistory(ret)) + lambda ret: defer.ensureDeferred(self.add_to_history(ret)) ) if not client.is_component: - post_treat.addCallback(self.bridgeSignal, data) - post_treat.addErrback(self.cancelErrorTrap) + post_treat.addCallback(self.bridge_signal, data) + post_treat.addErrback(self.cancel_error_trap) post_treat.callback(data) def onMessage(self, message_elt): @@ -1348,18 +1348,18 @@ log.debug(_("got message from: {from_}").format(from_=message_elt["from"])) if self.client.is_component and message_elt.uri == component.NS_COMPONENT_ACCEPT: # we use client namespace all the time to simplify parsing - self.normalizeNS(message_elt, component.NS_COMPONENT_ACCEPT) + self.normalize_ns(message_elt, component.NS_COMPONENT_ACCEPT) # plugin can add their treatments to this deferred post_treat = defer.Deferred() - d = self.host.trigger.asyncPoint( + d = self.host.trigger.async_point( "messageReceived", client, message_elt, post_treat ) - d.addCallback(self._onMessageStartWorkflow, client, message_elt, post_treat) + d.addCallback(self._on_message_start_workflow, client, message_elt, post_treat) - def completeAttachments(self, data): + def complete_attachments(self, data): """Complete missing metadata of attachments""" for attachment in data['extra'].get(C.KEY_ATTACHMENTS, []): if "name" not in attachment and "url" in attachment: @@ -1374,24 +1374,24 @@ return data - def skipEmptyMessage(self, data): + def skip_empty_message(self, data): if not data["message"] and not data["extra"] and not data["subject"]: raise failure.Failure(exceptions.CancelError("Cancelled empty message")) return data - async def addToHistory(self, data): + async def add_to_history(self, data): if data.pop("history", None) == C.HISTORY_SKIP: log.debug("history is skipped as requested") 
data["extra"]["history"] = C.HISTORY_SKIP else: # we need a message to store - if self.parent.isMessagePrintable(data): - return await self.host.memory.addToHistory(self.parent, data) + if self.parent.is_message_printable(data): + return await self.host.memory.add_to_history(self.parent, data) else: log.debug("not storing empty message to history: {data}" .format(data=data)) - def bridgeSignal(self, __, data): + def bridge_signal(self, __, data): try: data["extra"]["received_timestamp"] = str(data["received_timestamp"]) data["extra"]["delay_sender"] = data["delay_sender"] @@ -1400,8 +1400,8 @@ if self.client.encryption.isEncrypted(data): data["extra"]["encrypted"] = True if data is not None: - if self.parent.isMessagePrintable(data): - self.host.bridge.messageNew( + if self.parent.is_message_printable(data): + self.host.bridge.message_new( data["uid"], data["timestamp"], data["from"].full(), @@ -1417,7 +1417,7 @@ data=data)) return data - def cancelErrorTrap(self, failure_): + def cancel_error_trap(self, failure_): """A message sending can be cancelled by a plugin treatment""" failure_.trap(exceptions.CancelError) @@ -1433,7 +1433,7 @@ self._groups = {} # map from groups to jids: key=group value=set of jids def __contains__(self, entity_jid): - return self.isJidInRoster(entity_jid) + return self.is_jid_in_roster(entity_jid) @property def versioning(self): @@ -1449,7 +1449,7 @@ """ return persistent.PersistentDict(NS_ROSTER_VER, self.parent.profile) - def _registerItem(self, item): + def _register_item(self, item): """Register item in local cache item must be already registered in self._jids before this method is called @@ -1477,7 +1477,7 @@ self._groups.setdefault(group, set()).add(item.entity) @defer.inlineCallbacks - def _cacheRoster(self, version): + def _cache_roster(self, version): """Serialise local roster and save it to storage @param version(unicode): version of roster in local cache @@ -1501,10 +1501,10 @@ yield roster_cache.clear() self._jids.clear() 
self._groups.clear() - yield self.requestRoster() + yield self.request_roster() @defer.inlineCallbacks - def requestRoster(self): + def request_roster(self): """Ask the server for Roster list """ if self.versioning: log.info(_("our server support roster versioning, we use it")) @@ -1526,7 +1526,7 @@ roster_item_elt = generic.parseXml(roster_item_elt_s.encode('utf-8')) roster_item = xmppim.RosterItem.fromElement(roster_item_elt) self._jids[roster_jid] = roster_item - self._registerItem(roster_item) + self._register_item(roster_item) else: log.warning(_("our server doesn't support roster versioning")) version = None @@ -1553,8 +1553,8 @@ ) self.removeItem(item.entity) # FIXME: to be checked else: - self._registerItem(item) - yield self._cacheRoster(roster.version) + self._register_item(item) + yield self._cache_roster(roster.version) if not self.got_roster.called: # got_roster may already be called if we use resync() @@ -1567,7 +1567,7 @@ """ return xmppim.RosterClientProtocol.removeItem(self, to_jid) - def getAttributes(self, item): + def get_attributes(self, item): """Return dictionary of attributes as used in bridge from a RosterItem @param item: RosterItem @@ -1602,9 +1602,9 @@ except KeyError: pass # no previous item registration (or it's been cleared) self._jids[entity] = item - self._registerItem(item) - self.host.bridge.newContact( - entity.full(), self.getAttributes(item), list(item.groups), + self._register_item(item) + self.host.bridge.contact_new( + entity.full(), self.get_attributes(item), list(item.groups), self.parent.profile ) @@ -1645,13 +1645,13 @@ ) # then we send the bridge signal - self.host.bridge.contactDeleted(entity.full(), self.parent.profile) + self.host.bridge.contact_deleted(entity.full(), self.parent.profile) - def getGroups(self): + def get_groups(self): """Return a list of groups""" return list(self._groups.keys()) - def getItem(self, entity_jid): + def get_item(self, entity_jid): """Return RosterItem for a given jid @param 
entity_jid(jid.JID): jid of the contact @@ -1660,18 +1660,18 @@ """ return self._jids.get(entity_jid, None) - def getJids(self): + def get_jids(self): """Return all jids of the roster""" return list(self._jids.keys()) - def isJidInRoster(self, entity_jid): + def is_jid_in_roster(self, entity_jid): """Return True if jid is in roster""" if not isinstance(entity_jid, jid.JID): raise exceptions.InternalError( f"a JID is expected, not {type(entity_jid)}: {entity_jid!r}") return entity_jid in self._jids - def isSubscribedFrom(self, entity_jid: jid.JID) -> bool: + def is_subscribed_from(self, entity_jid: jid.JID) -> bool: """Return True if entity is authorised to see our presence""" try: item = self._jids[entity_jid.userhostJID()] @@ -1679,7 +1679,7 @@ return False return item.subscriptionFrom - def isSubscribedTo(self, entity_jid: jid.JID) -> bool: + def is_subscribed_to(self, entity_jid: jid.JID) -> bool: """Return True if we are subscribed to entity""" try: item = self._jids[entity_jid.userhostJID()] @@ -1687,17 +1687,17 @@ return False return item.subscriptionTo - def getItems(self): + def get_items(self): """Return all items of the roster""" return list(self._jids.values()) - def getJidsFromGroup(self, group): + def get_jids_from_group(self, group): try: return self._groups[group] except KeyError: raise exceptions.UnknownGroupError(group) - def getJidsSet(self, type_, groups=None): + def get_jids_set(self, type_, groups=None): """Helper method to get a set of jids @param type_(unicode): one of: @@ -1710,22 +1710,22 @@ raise ValueError("groups must not be set for {} type".format(C.ALL)) if type_ == C.ALL: - return set(self.getJids()) + return set(self.get_jids()) elif type_ == C.GROUP: jids = set() for group in groups: - jids.update(self.getJidsFromGroup(group)) + jids.update(self.get_jids_from_group(group)) return jids else: raise ValueError("Unexpected type_ {}".format(type_)) - def getNick(self, entity_jid): + def get_nick(self, entity_jid): """Return a nick name 
for an entity return nick chosen by user if available else return user part of entity_jid """ - item = self.getItem(entity_jid) + item = self.get_item(entity_jid) if item is None: return entity_jid.user else: @@ -1761,12 +1761,12 @@ ): return - self.host.memory.setPresenceStatus( + self.host.memory.set_presence_status( entity, show or "", int(priority), statuses, self.parent.profile ) # now it's time to notify frontends - self.host.bridge.presenceUpdate( + self.host.bridge.presence_update( entity.full(), show or "", int(priority), statuses, self.parent.profile ) @@ -1791,7 +1791,7 @@ # if the entity is not known yet in this session or is already unavailable, # there is no need to send an unavailable signal try: - presence = self.host.memory.getEntityDatum( + presence = self.host.memory.get_entity_datum( self.client, entity, "presence" ) except (KeyError, exceptions.UnknownEntityError): @@ -1799,7 +1799,7 @@ pass else: if presence.show != C.PRESENCE_UNAVAILABLE: - self.host.bridge.presenceUpdate( + self.host.bridge.presence_update( entity.full(), C.PRESENCE_UNAVAILABLE, 0, @@ -1807,7 +1807,7 @@ self.parent.profile, ) - self.host.memory.setPresenceStatus( + self.host.memory.set_presence_status( entity, C.PRESENCE_UNAVAILABLE, 0, statuses, self.parent.profile ) @@ -1822,7 +1822,7 @@ if priority is None: try: priority = int( - self.host.memory.getParamA( + self.host.memory.param_get_a( "Priority", "Connection", profile_key=self.parent.profile ) ) @@ -1851,8 +1851,8 @@ def subscribed(self, entity): yield self.parent.roster.got_roster xmppim.PresenceClientProtocol.subscribed(self, entity) - self.host.memory.delWaitingSub(entity.userhost(), self.parent.profile) - item = self.parent.roster.getItem(entity) + self.host.memory.del_waiting_sub(entity.userhost(), self.parent.profile) + item = self.parent.roster.get_item(entity) if ( not item or not item.subscriptionTo ): # we automatically subscribe to 'to' presence @@ -1861,7 +1861,7 @@ def unsubscribed(self, entity):
xmppim.PresenceClientProtocol.unsubscribed(self, entity) - self.host.memory.delWaitingSub(entity.userhost(), self.parent.profile) + self.host.memory.del_waiting_sub(entity.userhost(), self.parent.profile) def subscribedReceived(self, entity): log.debug(_("subscription approved for [%s]") % entity.userhost()) @@ -1875,14 +1875,14 @@ def subscribeReceived(self, entity): log.debug(_("subscription request from [%s]") % entity.userhost()) yield self.parent.roster.got_roster - item = self.parent.roster.getItem(entity) + item = self.parent.roster.get_item(entity) if item and item.subscriptionTo: # We automatically accept subscription if we are already subscribed to # contact presence log.debug(_("sending automatic subscription acceptance")) self.subscribed(entity) else: - self.host.memory.addWaitingSub( + self.host.memory.add_waiting_sub( "subscribe", entity.userhost(), self.parent.profile ) self.host.bridge.subscribe( @@ -1893,10 +1893,10 @@ def unsubscribeReceived(self, entity): log.debug(_("unsubscription asked for [%s]") % entity.userhost()) yield self.parent.roster.got_roster - item = self.parent.roster.getItem(entity) + item = self.parent.roster.get_item(entity) if item and item.subscriptionFrom: # we automatically remove contact log.debug(_("automatic contact deletion")) - self.host.delContact(entity, self.parent.profile) + self.host.contact_del(entity, self.parent.profile) self.host.bridge.subscribe("unsubscribe", entity.userhost(), self.parent.profile) diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/cache.py --- a/sat/memory/cache.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/cache.py Sat Apr 08 13:54:42 2023 +0200 @@ -45,9 +45,9 @@ if None, the cache will be common for all profiles """ self.profile = profile - path_elts = [host.memory.getConfig("", "local_dir"), C.CACHE_DIR] + path_elts = [host.memory.config_get("", "local_dir"), C.CACHE_DIR] if profile: - path_elts.extend(["profiles", regex.pathEscape(profile)]) + path_elts.extend(["profiles", 
regex.path_escape(profile)]) else: path_elts.append("common") self.cache_dir = Path(*path_elts) @@ -121,14 +121,14 @@ raise exceptions.DataError("Invalid char found") return self.cache_dir / filename - def getMetadata(self, uid: str, update_eol: bool = True) -> Optional[Dict[str, Any]]: + def get_metadata(self, uid: str, update_eol: bool = True) -> Optional[Dict[str, Any]]: """Retrieve metadata for cached data @param uid(unicode): unique identifier of file @param update_eol(bool): True if eol must extended if True, max_age will be added to eol (only if it is not already expired) @return (dict, None): metadata with following keys: - see [cacheData] for data details, an additional "path" key is the full path to + see [cache_data] for data details, an additional "path" key is the full path to cached file. None if file is not in cache (or cache is invalid) """ @@ -176,23 +176,23 @@ cache_data["path"] = self.getPath(cache_data["filename"]) return cache_data - def getFilePath(self, uid: str) -> Path: + def get_file_path(self, uid: str) -> Path: """Retrieve absolute path to file @param uid(unicode): unique identifier of file @return (unicode, None): absolute path to cached file None if file is not in cache (or cache is invalid) """ - metadata = self.getMetadata(uid) + metadata = self.get_metadata(uid) if metadata is not None: return metadata["path"] - def removeFromCache(self, uid, metadata=None): + def remove_from_cache(self, uid, metadata=None): """Remove data from cache @param uid(unicode): unique identifier cache file """ - cache_data = self.getMetadata(uid, update_eol=False) + cache_data = self.get_metadata(uid, update_eol=False) if cache_data is None: log.debug(f"cache with uid {uid!r} has already expired or been removed") return @@ -215,7 +215,7 @@ cache_file.unlink() log.debug(f"cache with uid {uid!r} has been removed") - def cacheData( + def cache_data( self, source: str, uid: str, diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/crypto.py --- 
a/sat/memory/crypto.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/crypto.py Sat Apr 08 13:54:42 2023 +0200 @@ -47,7 +47,7 @@ """ if leave_empty and text == "": return "" - iv = BlockCipher.getRandomKey() + iv = BlockCipher.get_random_key() key = key.encode() key = ( key[: BlockCipher.MAX_KEY_SIZE] @@ -91,7 +91,7 @@ return BlockCipher.unpad(decrypted) @staticmethod - def getRandomKey(size=None, base64=False): + def get_random_key(size=None, base64=False): """Return a random key suitable for block cipher encryption. Note: a good value for the key length is to make it as long as the block size. diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/disco.py --- a/sat/memory/disco.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/disco.py Sat Apr 08 13:54:42 2023 +0200 @@ -87,7 +87,7 @@ return self.hashes.__contains__(hash_) def load(self): - def fillHashes(hashes): + def fill_hashes(hashes): for hash_, xml in hashes.items(): element = xml_tools.ElementParser()(xml) disco_info = disco.DiscoInfo.fromElement(element) @@ -106,7 +106,7 @@ log.info("Disco hashes loaded") d = self.persistent.load() - d.addCallback(fillHashes) + d.addCallback(fill_hashes) return d @@ -131,11 +131,11 @@ @param node(unicode): optional node to use for disco request @return: a Deferred which fires a boolean (True if feature is available) """ - disco_infos = yield self.getInfos(client, jid_, node) + disco_infos = yield self.get_infos(client, jid_, node) defer.returnValue(feature in disco_infos.features) @defer.inlineCallbacks - def checkFeature(self, client, feature, jid_=None, node=""): + def check_feature(self, client, feature, jid_=None, node=""): """Like hasFeature, but raise an exception if feature is not found @param feature: feature namespace @@ -144,13 +144,13 @@ @raise: exceptions.FeatureNotFound """ - disco_infos = yield self.getInfos(client, jid_, node) + disco_infos = yield self.get_infos(client, jid_, node) if not feature in disco_infos.features: raise
failure.Failure(exceptions.FeatureNotFound()) @defer.inlineCallbacks - def checkFeatures(self, client, features, jid_=None, identity=None, node=""): - """Like checkFeature, but check several features at once, and check also identity + def check_features(self, client, features, jid_=None, identity=None, node=""): + """Like check_feature, but check several features at once, and check also identity @param features(iterable[unicode]): features to check @param jid_(jid.JID): jid of the target, or None for profile's server @@ -159,14 +159,14 @@ @raise: exceptions.FeatureNotFound """ - disco_infos = yield self.getInfos(client, jid_, node) + disco_infos = yield self.get_infos(client, jid_, node) if not set(features).issubset(disco_infos.features): raise failure.Failure(exceptions.FeatureNotFound()) if identity is not None and identity not in disco_infos.identities: raise failure.Failure(exceptions.FeatureNotFound()) - async def hasIdentity( + async def has_identity( self, client: SatXMPPEntity, category: str, @@ -182,10 +182,10 @@ @param node(unicode): optional node to use for disco request @return: True if the entity has the given identity """ - disco_infos = await self.getInfos(client, jid_, node) + disco_infos = await self.get_infos(client, jid_, node) return (category, type_) in disco_infos.identities - def getInfos(self, client, jid_=None, node="", use_cache=True): + def get_infos(self, client, jid_=None, node="", use_cache=True): """get disco infos from jid_, filling capability hash if needed @param jid_: jid of the target, or None for profile's server @@ -199,13 +199,13 @@ if not use_cache: # we ignore cache, so we pretend we haven't found it raise KeyError - cap_hash = self.host.memory.getEntityData( + cap_hash = self.host.memory.entity_data_get( client, jid_, [C.ENTITY_CAP_HASH] )[C.ENTITY_CAP_HASH] except (KeyError, exceptions.UnknownEntityError): # capability hash is not available, we'll compute one - def infosCb(disco_infos): - cap_hash = 
self.generateHash(disco_infos) + def infos_cb(disco_infos): + cap_hash = self.generate_hash(disco_infos) for ext_form in disco_infos.extensions.values(): # wokkel doesn't call typeCheck on reception, so we do it here # to avoid ending up with incorrect types. We have to do it after @@ -213,12 +213,12 @@ # hash) ext_form.typeCheck() self.hashes[cap_hash] = disco_infos - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, jid_, C.ENTITY_CAP_HASH, cap_hash ) return disco_infos - def infosEb(fail): + def infos_eb(fail): if fail.check(defer.CancelledError): reason = "request time-out" fail = failure.Failure(exceptions.TimeOutError(str(fail.value))) @@ -236,21 +236,21 @@ # XXX we set empty disco in cache, to avoid getting an error or waiting # for a timeout again the next time - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, jid_, C.ENTITY_CAP_HASH, CAP_HASH_ERROR ) raise fail d = client.disco.requestInfo(jid_, nodeIdentifier=node) - d.addCallback(infosCb) - d.addErrback(infosEb) + d.addCallback(infos_cb) + d.addErrback(infos_eb) return d else: disco_infos = self.hashes[cap_hash] return defer.succeed(disco_infos) @defer.inlineCallbacks - def getItems(self, client, jid_=None, node="", use_cache=True): + def get_items(self, client, jid_=None, node="", use_cache=True): """get disco items from jid_, cache them for our own server @param jid_(jid.JID): jid of the target, or None for profile's server @@ -264,7 +264,7 @@ if jid_ == client.server_jid and not node: # we cache items only for our own server and if node is not set try: - items = self.host.memory.getEntityData( + items = self.host.memory.entity_data_get( client, jid_, ["DISCO_ITEMS"] )["DISCO_ITEMS"] log.debug("[%s] disco items are in cache" % jid_.full()) @@ -274,7 +274,7 @@ except (KeyError, exceptions.UnknownEntityError): log.debug("Caching [%s] disco items" % jid_.full()) items = yield client.disco.requestItems(jid_, nodeIdentifier=node) - 
self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, jid_, "DISCO_ITEMS", items ) else: @@ -290,24 +290,24 @@ defer.returnValue(items) - def _infosEb(self, failure_, entity_jid): + def _infos_eb(self, failure_, entity_jid): failure_.trap(StanzaError) log.warning( _("Error while requesting [%(jid)s]: %(error)s") % {"jid": entity_jid.full(), "error": failure_.getErrorMessage()} ) - def findServiceEntity(self, client, category, type_, jid_=None): - """Helper method to find first available entity from findServiceEntities + def find_service_entity(self, client, category, type_, jid_=None): + """Helper method to find first available entity from find_service_entities - args are the same as for [findServiceEntities] + args are the same as for [find_service_entities] @return (jid.JID, None): found entity """ - d = self.host.findServiceEntities(client, category, type_) + d = self.host.find_service_entities(client, category, type_) d.addCallback(lambda entities: entities.pop() if entities else None) return d - def findServiceEntities(self, client, category, type_, jid_=None): + def find_service_entities(self, client, category, type_, jid_=None): """Return all available items of an entity which correspond to (category, type_) @param category: identity's category @@ -318,29 +318,29 @@ """ found_entities = set() - def infosCb(infos, entity_jid): + def infos_cb(infos, entity_jid): if (category, type_) in infos.identities: found_entities.add(entity_jid) - def gotItems(items): + def got_items(items): defers_list = [] for item in items: - info_d = self.getInfos(client, item.entity) + info_d = self.get_infos(client, item.entity) info_d.addCallbacks( - infosCb, self._infosEb, [item.entity], None, [item.entity] + infos_cb, self._infos_eb, [item.entity], None, [item.entity] ) defers_list.append(info_d) return defer.DeferredList(defers_list) - d = self.getItems(client, jid_) - d.addCallback(gotItems) + d = self.get_items(client, jid_) + 
d.addCallback(got_items) d.addCallback(lambda __: found_entities) reactor.callLater( TIMEOUT, d.cancel ) # FIXME: one bad service make a general timeout return d - def findFeaturesSet(self, client, features, identity=None, jid_=None): + def find_features_set(self, client, features, identity=None, jid_=None): """Return entities (including jid_ and its items) offering features @param features: iterable of features which must be present @@ -355,7 +355,7 @@ features = set(features) found_entities = set() - def infosCb(infos, entity): + def infos_cb(infos, entity): if entity is None: log.warning(_("received an item without jid")) return @@ -364,23 +364,23 @@ if features.issubset(infos.features): found_entities.add(entity) - def gotItems(items): + def got_items(items): defer_list = [] for entity in [jid_] + [item.entity for item in items]: - infos_d = self.getInfos(client, entity) - infos_d.addCallbacks(infosCb, self._infosEb, [entity], None, [entity]) + infos_d = self.get_infos(client, entity) + infos_d.addCallbacks(infos_cb, self._infos_eb, [entity], None, [entity]) defer_list.append(infos_d) return defer.DeferredList(defer_list) - d = self.getItems(client, jid_) - d.addCallback(gotItems) + d = self.get_items(client, jid_) + d.addCallback(got_items) d.addCallback(lambda __: found_entities) reactor.callLater( TIMEOUT, d.cancel ) # FIXME: one bad service make a general timeout return d - def generateHash(self, services): + def generate_hash(self, services): """ Generate a unique hash for given service hash algorithm is the one described in XEP-0115 @@ -433,7 +433,7 @@ return cap_hash @defer.inlineCallbacks - def _discoInfos( + def _disco_infos( self, entity_jid_s, node="", use_cache=True, profile_key=C.PROF_KEY_NONE ): """Discovery method for the bridge @@ -443,9 +443,9 @@ @return: list of tuples """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) entity = jid.JID(entity_jid_s) - disco_infos = yield self.getInfos(client, entity, 
node, use_cache) + disco_infos = yield self.get_infos(client, entity, node, use_cache) extensions = {} # FIXME: should extensions be serialised using tools.common.data_format? for form_type, form in list(disco_infos.extensions.items()): @@ -459,7 +459,7 @@ values = [field.value] if field.value is not None else field.values if field.fieldType == "boolean": - values = [C.boolConst(v) for v in values] + values = [C.bool_const(v) for v in values] fields.append((data, values)) extensions[form_type or ""] = fields @@ -483,7 +483,7 @@ yield (item.entity.full(), item.nodeIdentifier or "", item.name or "") @defer.inlineCallbacks - def _discoItems( + def _disco_items( self, entity_jid_s, node="", use_cache=True, profile_key=C.PROF_KEY_NONE ): """ Discovery method for the bridge @@ -492,8 +492,8 @@ @param node(unicode): optional node to use @param use_cache(bool): if True, use cached data if available @return: list of tuples""" - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) entity = jid.JID(entity_jid_s) - disco_items = yield self.getItems(client, entity, node, use_cache) + disco_items = yield self.get_items(client, entity, node, use_cache) ret = list(self.items2tuples(disco_items)) defer.returnValue(ret) diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/encryption.py --- a/sat/memory/encryption.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/encryption.py Sat Apr 08 13:54:42 2023 +0200 @@ -50,7 +50,7 @@ def host(self): return self.client.host_app - async def loadSessions(self): + async def load_sessions(self): """Load persistent sessions""" await self._stored_session.load() start_d_list = [] @@ -69,18 +69,18 @@ log.info(_("encryption sessions restored")) @classmethod - def registerPlugin(cls, plg_instance, name, namespace, priority=0, directed=False): + def register_plugin(cls, plg_instance, name, namespace, priority=0, directed=False): """Register a plugin handling an encryption algorithm @param plg_instance(object): instance of 
the plugin it must have the following methods: - - getTrustUI(entity): return a XMLUI for trust management + - get_trust_ui(entity): return a XMLUI for trust management entity(jid.JID): entity to manage The returned XMLUI must be a form it may have the following methods: - - startEncryption(entity): start encrypted session + - start_encryption(entity): start encrypted session entity(jid.JID): entity to start encrypted session with - - stopEncryption(entity): start encrypted session + - stop_encryption(entity): stop encrypted session entity(jid.JID): entity to stop encrypted session with if they don't exist, those 2 methods will be ignored. @@ -115,7 +115,7 @@ return cls.plugins @classmethod - def getPlugin(cls, namespace): + def get_plugin(cls, namespace): try: return next(p for p in cls.plugins if p.namespace == namespace) except StopIteration: @@ -124,12 +124,12 @@ namespace=namespace)) @classmethod - def getNamespaces(cls): + def get_namespaces(cls): """Get available plugin namespaces""" return {p.namespace for p in cls.getPlugins()} @classmethod - def getNSFromName(cls, name): + def get_ns_from_name(cls, name): """Retrieve plugin namespace from its name @param name(unicode): name of the plugin (case insensitive) @@ -143,7 +143,7 @@ "Can't find a plugin with the name \"{name}\".".format( name=name))) - def getBridgeData(self, session): + def get_bridge_data(self, session): """Retrieve session data serialized for bridge. @param session(dict): encryption session @@ -159,7 +159,7 @@ return data_format.serialise(bridge_data) - async def _startEncryption(self, plugin, entity): + async def _start_encryption(self, plugin, entity): """Start encryption with a plugin This method must be called just before adding a plugin session.
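The registration contract described above (a plugin instance exposing `get_trust_ui`, plus optional `start_encryption`/`stop_encryption` hooks that are silently skipped when absent, all looked up by namespace) can be sketched roughly as follows. This is an illustrative sketch only, not Libervia's actual encryption handler: the `EncryptionRegistry` class, the plugin names, and the namespaces below are invented, and the real code works with Twisted deferreds and `jid.JID` entities rather than plain strings.

```python
from dataclasses import dataclass
from typing import Any, List


@dataclass
class EncryptionPlugin:
    """One registered encryption algorithm (illustrative mirror of the contract above)."""
    instance: Any
    name: str
    namespace: str
    priority: int = 0
    directed: bool = False


class EncryptionRegistry:
    plugins: List[EncryptionPlugin] = []

    @classmethod
    def register_plugin(cls, instance, name, namespace, priority=0, directed=False):
        plugin = EncryptionPlugin(instance, name, namespace, priority, directed)
        cls.plugins.append(plugin)
        # keep highest priority first, so plugins[0] is the default algorithm
        cls.plugins.sort(key=lambda p: p.priority, reverse=True)
        return plugin

    @classmethod
    def get_plugin(cls, namespace: str) -> EncryptionPlugin:
        try:
            return next(p for p in cls.plugins if p.namespace == namespace)
        except StopIteration:
            raise KeyError(f"Can't find encryption plugin for namespace {namespace!r}")

    @classmethod
    def start(cls, namespace: str, entity: str) -> None:
        plugin = cls.get_plugin(namespace)
        # optional hook: ignored when the plugin doesn't implement it,
        # matching the "those 2 methods will be ignored" rule above
        start_encryption = getattr(plugin.instance, "start_encryption", None)
        if start_encryption is not None:
            start_encryption(entity)
```

The duck-typed `getattr` lookup is the point here: plugins only implement the hooks they need, and the registry degrades gracefully for the rest.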
@@ -168,14 +168,14 @@ if not plugin.directed: await self._stored_session.aset(entity.userhost(), plugin.namespace) try: - start_encryption = plugin.instance.startEncryption + start_encryption = plugin.instance.start_encryption except AttributeError: - log.debug(f"No startEncryption method found for {plugin.namespace}") + log.debug(f"No start_encryption method found for {plugin.namespace}") else: # we copy entity to avoid having the resource changed by stop_encryption - await utils.asDeferred(start_encryption, self.client, copy.copy(entity)) + await utils.as_deferred(start_encryption, self.client, copy.copy(entity)) - async def _stopEncryption(self, plugin, entity): + async def _stop_encryption(self, plugin, entity): """Stop encryption with a plugin This method must be called just before removing a plugin session. @@ -186,12 +186,12 @@ except KeyError: pass try: - stop_encryption = plugin.instance.stopEncryption + stop_encryption = plugin.instance.stop_encryption except AttributeError: - log.debug(f"No stopEncryption method found for {plugin.namespace}") + log.debug(f"No stop_encryption method found for {plugin.namespace}") else: # we copy entity to avoid having the resource changed by stop_encryption - return utils.asDeferred(stop_encryption, self.client, copy.copy(entity)) + return utils.as_deferred(stop_encryption, self.client, copy.copy(entity)) async def start(self, entity, namespace=None, replace=False): """Start an encryption session with an entity @@ -211,7 +211,7 @@ if namespace is None: plugin = self.plugins[0] else: - plugin = self.getPlugin(namespace) + plugin = self.get_plugin(namespace) bare_jid = entity.userhostJID() if bare_jid in self._sessions: @@ -227,7 +227,7 @@ # there is a conflict, but replacement is requested # so we stop previous encryption to use new one del self._sessions[bare_jid] - await self._stopEncryption(former_plugin, entity) + await self._stop_encryption(former_plugin, entity) else: msg = (_("Session with {bare_jid} is already 
encrypted with {name}. " "Please stop encryption session before changing algorithm.") @@ -238,7 +238,7 @@ data = {"plugin": plugin} if plugin.directed: if not entity.resource: - entity.resource = self.host.memory.getMainResource(self.client, entity) + entity.resource = self.host.memory.main_resource_get(self.client, entity) if not entity.resource: raise exceptions.NotFound( _("No resource found for {destinee}, can't encrypt with {name}") @@ -251,14 +251,14 @@ elif entity.resource: raise ValueError(_("{name} encryption must be used with bare jids.")) - await self._startEncryption(plugin, entity) + await self._start_encryption(plugin, entity) self._sessions[entity.userhostJID()] = data log.info(_("Encryption session has been set for {entity_jid} with " "{encryption_name}").format( entity_jid=entity.full(), encryption_name=plugin.name)) - self.host.bridge.messageEncryptionStarted( + self.host.bridge.message_encryption_started( entity.full(), - self.getBridgeData(data), + self.get_bridge_data(data), self.client.profile) msg = D_("Encryption session started: your messages with {destinee} are " "now end to end encrypted using {name} algorithm.").format( @@ -312,16 +312,16 @@ # we stop the whole session # see comment below for deleting session before stopping encryption del self._sessions[entity.userhostJID()] - await self._stopEncryption(plugin, entity) + await self._stop_encryption(plugin, entity) else: - # plugin's stopEncryption may call stop again (that's the case with OTR) - # so we need to remove plugin from session before calling self._stopEncryption + # plugin's stop_encryption may call stop again (that's the case with OTR) + # so we need to remove plugin from session before calling self._stop_encryption del self._sessions[entity.userhostJID()] - await self._stopEncryption(plugin, entity) + await self._stop_encryption(plugin, entity) log.info(_("encryption session stopped with entity {entity}").format( entity=entity.full())) - 
self.host.bridge.messageEncryptionStopped( + self.host.bridge.message_encryption_stopped( entity.full(), {'name': plugin.name, 'namespace': plugin.namespace, @@ -358,7 +358,7 @@ return None return session["plugin"].namespace - def getTrustUI(self, entity_jid, namespace=None): + def get_trust_ui(self, entity_jid, namespace=None): """Retrieve encryption UI @param entity_jid(jid.JID): get the UI for this entity @@ -379,53 +379,53 @@ .format(entity_jid=entity_jid.full())) plugin = session['plugin'] else: - plugin = self.getPlugin(namespace) + plugin = self.get_plugin(namespace) try: - get_trust_ui = plugin.instance.getTrustUI + get_trust_ui = plugin.instance.get_trust_ui except AttributeError: raise NotImplementedError( "Encryption plugin doesn't handle trust management UI") else: - return utils.asDeferred(get_trust_ui, self.client, entity_jid) + return utils.as_deferred(get_trust_ui, self.client, entity_jid) ## Menus ## @classmethod - def _importMenus(cls, host): - host.importMenu( + def _import_menus(cls, host): + host.import_menu( (D_("Encryption"), D_("unencrypted (plain text)")), - partial(cls._onMenuUnencrypted, host=host), + partial(cls._on_menu_unencrypted, host=host), security_limit=0, help_string=D_("End encrypted session"), type_=C.MENU_SINGLE, ) for plg in cls.getPlugins(): - host.importMenu( + host.import_menu( (D_("Encryption"), plg.name), - partial(cls._onMenuName, host=host, plg=plg), + partial(cls._on_menu_name, host=host, plg=plg), security_limit=0, help_string=D_("Start {name} session").format(name=plg.name), type_=C.MENU_SINGLE, ) - host.importMenu( + host.import_menu( (D_("Encryption"), D_("⛨ {name} trust").format(name=plg.name)), - partial(cls._onMenuTrust, host=host, plg=plg), + partial(cls._on_menu_trust, host=host, plg=plg), security_limit=0, help_string=D_("Manage {name} trust").format(name=plg.name), type_=C.MENU_SINGLE, ) @classmethod - def _onMenuUnencrypted(cls, data, host, profile): - client = host.getClient(profile) + def 
_on_menu_unencrypted(cls, data, host, profile): + client = host.get_client(profile) peer_jid = jid.JID(data['jid']).userhostJID() d = defer.ensureDeferred(client.encryption.stop(peer_jid)) d.addCallback(lambda __: {}) return d @classmethod - def _onMenuName(cls, data, host, plg, profile): - client = host.getClient(profile) + def _on_menu_name(cls, data, host, plg, profile): + client = host.get_client(profile) peer_jid = jid.JID(data['jid']) if not plg.directed: peer_jid = peer_jid.userhostJID() @@ -436,15 +436,15 @@ @classmethod @defer.inlineCallbacks - def _onMenuTrust(cls, data, host, plg, profile): - client = host.getClient(profile) + def _on_menu_trust(cls, data, host, plg, profile): + client = host.get_client(profile) peer_jid = jid.JID(data['jid']).userhostJID() - ui = yield client.encryption.getTrustUI(peer_jid, plg.namespace) + ui = yield client.encryption.get_trust_ui(peer_jid, plg.namespace) defer.returnValue({'xmlui': ui.toXml()}) ## Triggers ## - def setEncryptionFlag(self, mess_data): + def set_encryption_flag(self, mess_data): """Set "encryption" key in mess_data if session with destinee is encrypted""" to_jid = mess_data['to'] encryption = self._sessions.get(to_jid.userhostJID()) @@ -455,11 +455,11 @@ f"encryption flag must not be set for groupchat if encryption algorithm " f"({encryption['plugin'].name}) is directed!") mess_data[C.MESS_KEY_ENCRYPTION] = encryption - self.markAsEncrypted(mess_data, plugin.namespace) + self.mark_as_encrypted(mess_data, plugin.namespace) ## Misc ## - def markAsEncrypted(self, mess_data, namespace): + def mark_as_encrypted(self, mess_data, namespace): """Helper method to mark a message as having been e2e encrypted. 
This should be used in the post_treat workflow of messageReceived trigger of @@ -483,7 +483,7 @@ return mess_data - def isEncryptionRequested( + def is_encryption_requested( self, mess_data: MessageData, namespace: Optional[str] = None @@ -513,7 +513,7 @@ return mess_data['extra'].get(C.MESS_KEY_ENCRYPTED, False) - def markAsTrusted(self, mess_data): + def mark_as_trusted(self, mess_data): """Helper method to mark a message as sent from a trusted entity. This should be used in the post_treat workflow of messageReceived trigger of @@ -523,7 +523,7 @@ mess_data[C.MESS_KEY_TRUSTED] = True return mess_data - def markAsUntrusted(self, mess_data): + def mark_as_untrusted(self, mess_data): """Helper method to mark a message as sent from an untrusted entity. This should be used in the post_treat workflow of messageReceived trigger of diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/memory.py --- a/sat/memory/memory.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/memory.py Sat Apr 08 13:54:42 2023 +0200 @@ -65,13 +65,13 @@ self.timeout = timeout or Sessions.DEFAULT_TIMEOUT self.resettable_timeout = resettable_timeout - def newSession(self, session_data=None, session_id=None, profile=None): + def new_session(self, session_data=None, session_id=None, profile=None): """Create a new session @param session_data: mutable data to use, default to a dict @param session_id (str): force the session_id to the given string @param profile: if set, the session is owned by the profile, - and profileGet must be used instead of __getitem__ + and profile_get must be used instead of __getitem__ @return: session_id, session_data """ if session_id is None: @@ -80,7 +80,7 @@ raise exceptions.ConflictError( "Session id {} is already used".format(session_id) ) - timer = reactor.callLater(self.timeout, self._purgeSession, session_id) + timer = reactor.callLater(self.timeout, self._purge_session, session_id) if session_data is None: session_data = {} self._sessions[session_id] = (
) return session_id, session_data - def _purgeSession(self, session_id): + def _purge_session(self, session_id): try: timer, session_data, profile = self._sessions[session_id] except ValueError: @@ -113,7 +113,7 @@ def __contains__(self, session_id): return session_id in self._sessions - def profileGet(self, session_id, profile): + def profile_get(self, session_id, profile): try: timer, session_data, profile_set = self._sessions[session_id] except ValueError: @@ -133,7 +133,7 @@ timer, session_data = self._sessions[session_id] except ValueError: raise exceptions.InternalError( - "You need to use profileGet instead of __getitem__ when profile is set" + "You need to use profile_get instead of __getitem__ when profile is set" ) except KeyError: raise failure.Failure(KeyError(MSG_NO_SESSION)) @@ -142,11 +142,11 @@ return session_data def __setitem__(self, key, value): - raise NotImplementedError("You need do use newSession to create a session") + raise NotImplementedError("You need do use new_session to create a session") def __delitem__(self, session_id): """ delete the session data """ - self._purgeSession(session_id) + self._purge_session(session_id) def keys(self): return list(self._sessions.keys()) @@ -160,7 +160,7 @@ used as the key to retrieve data or delete a session (instead of session id). """ - def _profileGetAllIds(self, profile): + def _profile_get_all_ids(self, profile): """Return a list of the sessions ids that are associated to the given profile. @param profile: %(doc_profile)s @@ -176,7 +176,7 @@ ret.append(session_id) return ret - def profileGetUnique(self, profile): + def profile_get_unique(self, profile): """Return the data of the unique session that is associated to the given profile. 
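The `Sessions` hunks above show the intended semantics: `new_session` allocates an id and arms a purge timer, profile-owned sessions must be read with `profile_get` rather than `__getitem__`, and direct `__setitem__` is forbidden. A rough stand-alone approximation is below; it checks expiry lazily instead of scheduling `reactor.callLater` timers, so it is a simplified sketch and not the real Twisted-based implementation.

```python
import time
import uuid


class Sessions:
    """Simplified session store: expiry is checked lazily on access."""

    DEFAULT_TIMEOUT = 600

    def __init__(self, timeout=None):
        self._sessions = {}
        self.timeout = timeout or self.DEFAULT_TIMEOUT

    def new_session(self, session_data=None, session_id=None, profile=None):
        if session_id is None:
            session_id = str(uuid.uuid4())
        if session_id in self._sessions:
            raise ValueError(f"Session id {session_id} is already used")
        if session_data is None:
            session_data = {}
        # store an expiry timestamp instead of a reactor timer
        expire_at = time.monotonic() + self.timeout
        self._sessions[session_id] = (expire_at, session_data, profile)
        return session_id, session_data

    def _purge_session(self, session_id):
        self._sessions.pop(session_id, None)

    def profile_get(self, session_id, profile):
        expire_at, session_data, profile_set = self._sessions[session_id]
        if profile_set != profile or time.monotonic() > expire_at:
            raise KeyError("no session for this profile")
        return session_data

    def __getitem__(self, session_id):
        expire_at, session_data, profile = self._sessions[session_id]
        if profile is not None:
            raise RuntimeError(
                "You need to use profile_get instead of __getitem__ when profile is set")
        if time.monotonic() > expire_at:
            self._purge_session(session_id)
            raise KeyError("session expired")
        return session_data

    def __setitem__(self, key, value):
        raise NotImplementedError("You need to use new_session to create a session")

    def __delitem__(self, session_id):
        self._purge_session(session_id)
```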
@param profile: %(doc_profile)s @@ -185,25 +185,25 @@ - None if no session is associated to the profile - raise an error if more than one session are found """ - ids = self._profileGetAllIds(profile) + ids = self._profile_get_all_ids(profile) if len(ids) > 1: raise exceptions.InternalError( - "profileGetUnique has been used but more than one session has been found!" + "profile_get_unique has been used but more than one session has been found!" ) return ( - self.profileGet(ids[0], profile) if len(ids) == 1 else None + self.profile_get(ids[0], profile) if len(ids) == 1 else None ) # XXX: timeout might be reset - def profileDelUnique(self, profile): + def profile_del_unique(self, profile): """Delete the unique session that is associated to the given profile. @param profile: %(doc_profile)s @return: None, but raise an error if more than one session are found """ - ids = self._profileGetAllIds(profile) + ids = self._profile_get_all_ids(profile) if len(ids) > 1: raise exceptions.InternalError( - "profileDelUnique has been used but more than one session has been found!" + "profile_del_unique has been used but more than one session has been found!" 
) if len(ids) == 1: del self._sessions[ids[0]] @@ -217,7 +217,7 @@ def __init__(self, timeout=None): ProfileSessions.__init__(self, timeout, resettable_timeout=False) - def _purgeSession(self, session_id): + def _purge_session(self, session_id): log.debug( "FIXME: PasswordSessions should ask for the profile password after the session expired" ) @@ -237,9 +237,9 @@ self.subscriptions = {} self.auth_sessions = PasswordSessions() # remember the authenticated profiles self.disco = Discovery(host) - self.config = tools_config.parseMainConf(log_filenames=True) - self._cache_path = Path(self.getConfig("", "local_dir"), C.CACHE_DIR) - self.admins = self.getConfig("", "admins_list", []) + self.config = tools_config.parse_main_conf(log_filenames=True) + self._cache_path = Path(self.config_get("", "local_dir"), C.CACHE_DIR) + self.admins = self.config_get("", "admins_list", []) self.admin_jids = set() @@ -256,7 +256,7 @@ await self.disco.load() for admin in self.admins: try: - admin_jid_s = await self.asyncGetParamA( + admin_jid_s = await self.param_get_a_async( "JabberID", "Connection", profile_key=admin ) except Exception as e: @@ -273,7 +273,7 @@ ## Configuration ## - def getConfig(self, section, name, default=None): + def config_get(self, section, name, default=None): """Get the main configuration option @param section: section of the config file (None or '' for DEFAULT) @@ -281,7 +281,7 @@ @param default: value to use if not found @return: str, list or dict """ - return tools_config.getConfig(self.config, section, name, default) + return tools_config.config_get(self.config, section, name, default) def load_xml(self, filename): """Load parameters template from xml file @@ -322,16 +322,16 @@ def load(self): """Load parameters and all memory things from db""" # parameters data - return self.params.loadGenParams() + return self.params.load_gen_params() - def loadIndividualParams(self, profile): + def load_individual_params(self, profile): """Load individual parameters for a 
profile @param profile: %(doc_profile)s""" - return self.params.loadIndParams(profile) + return self.params.load_ind_params(profile) ## Profiles/Sessions management ## - def startSession(self, password, profile): + def start_session(self, password, profile): """Initialise session for a profile @param password(unicode): profile session password @@ -340,59 +340,59 @@ @raise exceptions.ProfileUnknownError if profile doesn't exist @raise exceptions.PasswordError: the password does not match """ - profile = self.getProfileName(profile) + profile = self.get_profile_name(profile) - def createSession(__): + def create_session(__): """Called once params are loaded.""" self._entities_cache[profile] = {} log.info("[{}] Profile session started".format(profile)) return False - def backendInitialised(__): - def doStartSession(__=None): - if self.isSessionStarted(profile): + def backend_initialised(__): + def do_start_session(__=None): + if self.is_session_started(profile): log.info("Session already started!") return True try: # if there is a value at this point in self._entities_cache, - # it is the loadIndividualParams Deferred, the session is starting + # it is the load_individual_params Deferred, the session is starting session_d = self._entities_cache[profile] except KeyError: # else we do request the params - session_d = self._entities_cache[profile] = self.loadIndividualParams( + session_d = self._entities_cache[profile] = self.load_individual_params( profile ) - session_d.addCallback(createSession) + session_d.addCallback(create_session) finally: return session_d - auth_d = defer.ensureDeferred(self.profileAuthenticate(password, profile)) - auth_d.addCallback(doStartSession) + auth_d = defer.ensureDeferred(self.profile_authenticate(password, profile)) + auth_d.addCallback(do_start_session) return auth_d if self.host.initialised.called: - return defer.succeed(None).addCallback(backendInitialised) + return defer.succeed(None).addCallback(backend_initialised) else: -
return self.host.initialised.addCallback(backendInitialised) + return self.host.initialised.addCallback(backend_initialised) - def stopSession(self, profile): + def stop_session(self, profile): """Delete a profile session @param profile: %(doc_profile)s """ - if self.host.isConnected(profile): + if self.host.is_connected(profile): log.debug("Disconnecting profile because of session stop") self.host.disconnect(profile) - self.auth_sessions.profileDelUnique(profile) + self.auth_sessions.profile_del_unique(profile) try: self._entities_cache[profile] except KeyError: log.warning("Profile was not in cache") - def _isSessionStarted(self, profile_key): - return self.isSessionStarted(self.getProfileName(profile_key)) + def _is_session_started(self, profile_key): + return self.is_session_started(self.get_profile_name(profile_key)) - def isSessionStarted(self, profile): + def is_session_started(self, profile): try: # XXX: if the value in self._entities_cache is a Deferred, # the session is starting but not started yet @@ -400,20 +400,20 @@ except KeyError: return False - async def profileAuthenticate(self, password, profile): + async def profile_authenticate(self, password, profile): """Authenticate the profile. @param password (unicode): the SàT profile password @return: None in case of success (an exception is raised otherwise) @raise exceptions.PasswordError: the password does not match """ - if not password and self.auth_sessions.profileGetUnique(profile): + if not password and self.auth_sessions.profile_get_unique(profile): # XXX: this allows any frontend to connect with the empty password as soon as # the profile has been authenticated at least once before. It is OK as long as # submitting a form with empty passwords is restricted to local frontends. 
return - sat_cipher = await self.asyncGetParamA( + sat_cipher = await self.param_get_a_async( C.PROFILE_PASS_PATH[1], C.PROFILE_PASS_PATH[0], profile_key=profile ) valid = PasswordHasher.verify(password, sat_cipher) @@ -421,9 +421,9 @@ log.warning(_("Authentication failure of profile {profile}").format( profile=profile)) raise exceptions.PasswordError("The provided profile password doesn't match.") - return await self.newAuthSession(password, profile) + return await self.new_auth_session(password, profile) - async def newAuthSession(self, key, profile): + async def new_auth_session(self, key, profile): """Start a new session for the authenticated profile. If there is already an existing session, no new one is created @@ -435,18 +435,18 @@ data = await PersistentDict(C.MEMORY_CRYPTO_NAMESPACE, profile).load() personal_key = BlockCipher.decrypt(key, data[C.MEMORY_CRYPTO_KEY]) # Create the session for this profile and store the personal key - session_data = self.auth_sessions.profileGetUnique(profile) + session_data = self.auth_sessions.profile_get_unique(profile) if not session_data: - self.auth_sessions.newSession( + self.auth_sessions.new_session( {C.MEMORY_CRYPTO_KEY: personal_key}, profile=profile ) log.debug("auth session created for profile %s" % profile) - def purgeProfileSession(self, profile): + def purge_profile_session(self, profile): """Delete cache of data of profile @param profile: %(doc_profile)s""" log.info(_("[%s] Profile session purge" % profile)) - self.params.purgeProfile(profile) + self.params.purge_profile(profile) try: del self._entities_cache[profile] except KeyError: @@ -457,7 +457,7 @@ % profile ) - def getProfilesList(self, clients=True, components=False): + def get_profiles_list(self, clients=True, components=False): """retrieve profiles list @param clients(bool): if True return clients profiles @@ -467,18 +467,18 @@ if not clients and not components: log.warning(_("requesting no profiles at all")) return [] - profiles = 
self.storage.getProfilesList() + profiles = self.storage.get_profiles_list() if clients and components: return sorted(profiles) - isComponent = self.storage.profileIsComponent + is_component = self.storage.profile_is_component if clients: - p_filter = lambda p: not isComponent(p) + p_filter = lambda p: not is_component(p) else: - p_filter = lambda p: isComponent(p) + p_filter = lambda p: is_component(p) return sorted(p for p in profiles if p_filter(p)) - def getProfileName(self, profile_key, return_profile_keys=False): + def get_profile_name(self, profile_key, return_profile_keys=False): """Return name of profile from keyword @param profile_key: can be the profile name or a keyword (like @DEFAULT@) @@ -486,19 +486,19 @@ @return: requested profile name @raise exceptions.ProfileUnknownError if profile doesn't exists """ - return self.params.getProfileName(profile_key, return_profile_keys) + return self.params.get_profile_name(profile_key, return_profile_keys) - def profileSetDefault(self, profile): + def profile_set_default(self, profile): """Set default profile @param profile: %(doc_profile)s """ # we want to be sure that the profile exists - profile = self.getProfileName(profile) + profile = self.get_profile_name(profile) self.memory_data["Profile_default"] = profile - def createProfile(self, name, password, component=None): + def create_profile(self, name, password, component=None): """Create a new profile @param name(unicode): profile name @@ -532,40 +532,40 @@ #   raise ValueError(_(u"Plugin {component} is not an entry point !".format( #   component = component))) - d = self.params.createProfile(name, component) + d = self.params.create_profile(name, component) - def initPersonalKey(__): + def init_personal_key(__): # be sure to call this after checking that the profile doesn't exist yet # generated once for all and saved in a PersistentDict - personal_key = BlockCipher.getRandomKey( + personal_key = BlockCipher.get_random_key( base64=True ).decode('utf-8') - 
self.auth_sessions.newSession( + self.auth_sessions.new_session( {C.MEMORY_CRYPTO_KEY: personal_key}, profile=name - ) # will be encrypted by setParam + ) # will be encrypted by param_set - def startFakeSession(__): - # avoid ProfileNotConnected exception in setParam + def start_fake_session(__): + # avoid ProfileNotConnected exception in param_set self._entities_cache[name] = None - self.params.loadIndParams(name) + self.params.load_ind_params(name) - def stopFakeSession(__): + def stop_fake_session(__): del self._entities_cache[name] - self.params.purgeProfile(name) + self.params.purge_profile(name) - d.addCallback(initPersonalKey) - d.addCallback(startFakeSession) + d.addCallback(init_personal_key) + d.addCallback(start_fake_session) d.addCallback( - lambda __: self.setParam( + lambda __: self.param_set( C.PROFILE_PASS_PATH[1], password, C.PROFILE_PASS_PATH[0], profile_key=name ) ) - d.addCallback(stopFakeSession) - d.addCallback(lambda __: self.auth_sessions.profileDelUnique(name)) + d.addCallback(stop_fake_session) + d.addCallback(lambda __: self.auth_sessions.profile_del_unique(name)) return d - def asyncDeleteProfile(self, name, force=False): + def profile_delete_async(self, name, force=False): """Delete an existing profile @param name: Name of the profile @@ -574,55 +574,55 @@ @return: a Deferred instance """ - def cleanMemory(__): - self.auth_sessions.profileDelUnique(name) + def clean_memory(__): + self.auth_sessions.profile_del_unique(name) try: del self._entities_cache[name] except KeyError: pass - d = self.params.asyncDeleteProfile(name, force) - d.addCallback(cleanMemory) + d = self.params.profile_delete_async(name, force) + d.addCallback(clean_memory) return d - def isComponent(self, profile_name): + def is_component(self, profile_name): """Tell if a profile is a component @param profile_name(unicode): name of the profile @return (bool): True if profile is a component @raise exceptions.NotFound: profile doesn't exist """ - return 
self.storage.profileIsComponent(profile_name) + return self.storage.profile_is_component(profile_name) - def getEntryPoint(self, profile_name): + def get_entry_point(self, profile_name): """Get a component entry point @param profile_name(unicode): name of the profile @return: entry point of the component @raise exceptions.NotFound: profile doesn't exist """ - return self.storage.getEntryPoint(profile_name) + return self.storage.get_entry_point(profile_name) ## History ## - def addToHistory(self, client, data): - return self.storage.addToHistory(data, client.profile) + def add_to_history(self, client, data): + return self.storage.add_to_history(data, client.profile) - def _historyGetSerialise(self, history_data): + def _history_get_serialise(self, history_data): return [ (uid, timestamp, from_jid, to_jid, message, subject, mess_type, data_format.serialise(extra)) for uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra in history_data ] - def _historyGet(self, from_jid_s, to_jid_s, limit=C.HISTORY_LIMIT_NONE, between=True, + def _history_get(self, from_jid_s, to_jid_s, limit=C.HISTORY_LIMIT_NONE, between=True, filters=None, profile=C.PROF_KEY_NONE): - d = self.historyGet(jid.JID(from_jid_s), jid.JID(to_jid_s), limit, between, + d = self.history_get(jid.JID(from_jid_s), jid.JID(to_jid_s), limit, between, filters, profile) - d.addCallback(self._historyGetSerialise) + d.addCallback(self._history_get_serialise) return d - def historyGet(self, from_jid, to_jid, limit=C.HISTORY_LIMIT_NONE, between=True, + def history_get(self, from_jid, to_jid, limit=C.HISTORY_LIMIT_NONE, between=True, filters=None, profile=C.PROF_KEY_NONE): """Retrieve messages in history @@ -636,31 +636,31 @@ @param filters (dict[unicode, unicode]): pattern to filter the history results (see bridge API for details) @param profile (str): %(doc_profile)s - @return (D(list)): list of message data as in [messageNew] + @return (D(list)): list of message data as in [message_new] """ 
assert profile != C.PROF_KEY_NONE if limit == C.HISTORY_LIMIT_DEFAULT: - limit = int(self.getParamA(C.HISTORY_LIMIT, "General", profile_key=profile)) + limit = int(self.param_get_a(C.HISTORY_LIMIT, "General", profile_key=profile)) elif limit == C.HISTORY_LIMIT_NONE: limit = None if limit == 0: return defer.succeed([]) - return self.storage.historyGet(from_jid, to_jid, limit, between, filters, profile) + return self.storage.history_get(from_jid, to_jid, limit, between, filters, profile) ## Statuses ## - def _getPresenceStatuses(self, profile_key): - ret = self.getPresenceStatuses(profile_key) + def _get_presence_statuses(self, profile_key): + ret = self.presence_statuses_get(profile_key) return {entity.full(): data for entity, data in ret.items()} - def getPresenceStatuses(self, profile_key): + def presence_statuses_get(self, profile_key): """Get all the presence statuses of a profile @param profile_key: %(doc_profile_key)s @return: presence data: key=entity JID, value=presence data for this entity """ - client = self.host.getClient(profile_key) - profile_cache = self._getProfileCache(client) + client = self.host.get_client(profile_key) + profile_cache = self._get_profile_cache(client) entities_presence = {} for entity_jid, entity_data in profile_cache.items(): @@ -668,7 +668,7 @@ full_jid = copy.copy(entity_jid) full_jid.resource = resource try: - presence_data = self.getEntityDatum(client, full_jid, "presence") + presence_data = self.get_entity_datum(client, full_jid, "presence") except KeyError: continue entities_presence.setdefault(entity_jid, {})[ @@ -677,7 +677,7 @@ return entities_presence - def setPresenceStatus(self, entity_jid, show, priority, statuses, profile_key): + def set_presence_status(self, entity_jid, show, priority, statuses, profile_key): """Change the presence status of an entity @param entity_jid: jid.JID of the entity @@ -686,26 +686,26 @@ @param statuses: dictionary of statuses @param profile_key: %(doc_profile_key)s """ - client = 
self.host.getClient(profile_key) + client = self.host.get_client(profile_key) presence_data = PresenceTuple(show, priority, statuses) - self.updateEntityData( + self.update_entity_data( client, entity_jid, "presence", presence_data ) if entity_jid.resource and show != C.PRESENCE_UNAVAILABLE: # If a resource is available, bare jid should not have presence information try: - self.delEntityDatum(client, entity_jid.userhostJID(), "presence") + self.del_entity_datum(client, entity_jid.userhostJID(), "presence") except (KeyError, exceptions.UnknownEntityError): pass ## Resources ## - def _getAllResource(self, jid_s, profile_key): - client = self.host.getClient(profile_key) + def _get_all_resource(self, jid_s, profile_key): + client = self.host.get_client(profile_key) jid_ = jid.JID(jid_s) - return self.getAllResources(client, jid_) + return self.get_all_resources(client, jid_) - def getAllResources(self, client, entity_jid): + def get_all_resources(self, client, entity_jid): """Return all resource from jid for which we have had data in this session @param entity_jid: bare jid of the entity @@ -717,9 +717,9 @@ # FIXME: is there a need to keep cache data for resources which are not connected anymore? 
if entity_jid.resource: raise ValueError( - "getAllResources must be used with a bare jid (got {})".format(entity_jid) + "get_all_resources must be used with a bare jid (got {})".format(entity_jid) ) - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) try: entity_data = profile_cache[entity_jid.userhostJID()] except KeyError: @@ -730,21 +730,21 @@ resources.discard(None) return resources - def getAvailableResources(self, client, entity_jid): + def get_available_resources(self, client, entity_jid): """Return available resource for entity_jid - This method differs from getAllResources by returning only available resources + This method differs from get_all_resources by returning only available resources @param entity_jid: bare jid of the entity return (list[unicode]): list of available resources @raise exceptions.UnknownEntityError: if entity is not in cache """ available = [] - for resource in self.getAllResources(client, entity_jid): + for resource in self.get_all_resources(client, entity_jid): full_jid = copy.copy(entity_jid) full_jid.resource = resource try: - presence_data = self.getEntityDatum(client, full_jid, "presence") + presence_data = self.get_entity_datum(client, full_jid, "presence") except KeyError: log.debug("Can't get presence data for {}".format(full_jid)) else: @@ -752,12 +752,12 @@ available.append(resource) return available - def _getMainResource(self, jid_s, profile_key): - client = self.host.getClient(profile_key) + def _get_main_resource(self, jid_s, profile_key): + client = self.host.get_client(profile_key) jid_ = jid.JID(jid_s) - return self.getMainResource(client, jid_) or "" + return self.main_resource_get(client, jid_) or "" - def getMainResource(self, client, entity_jid): + def main_resource_get(self, client, entity_jid): """Return the main resource used by an entity @param entity_jid: bare entity jid @@ -765,15 +765,15 @@ """ if entity_jid.resource: raise ValueError( - "getMainResource must 
be used with a bare jid (got {})".format(entity_jid) + "main_resource_get must be used with a bare jid (got {})".format(entity_jid) ) try: - if self.host.plugins["XEP-0045"].isJoinedRoom(client, entity_jid): + if self.host.plugins["XEP-0045"].is_joined_room(client, entity_jid): return None # MUC rooms have no main resource except KeyError: # plugin not found pass try: - resources = self.getAllResources(client, entity_jid) + resources = self.get_all_resources(client, entity_jid) except exceptions.UnknownEntityError: log.warning("Entity is not in cache, we can't find any resource") return None @@ -782,7 +782,7 @@ full_jid = copy.copy(entity_jid) full_jid.resource = resource try: - presence_data = self.getEntityDatum(client, full_jid, "presence") + presence_data = self.get_entity_datum(client, full_jid, "presence") except KeyError: log.debug("No presence information for {}".format(full_jid)) continue @@ -795,7 +795,7 @@ ## Entities data ## - def _getProfileCache(self, client): + def _get_profile_cache(self, client): """Check profile validity and return its cache @param client: SatXMPPClient @@ -803,7 +803,7 @@ """ return self._entities_cache[client.profile] - def setSignalOnUpdate(self, key, signal=True): + def set_signal_on_update(self, key, signal=True): """Set a signal flag on the key When the key will be updated, a signal will be sent to frontends @@ -815,13 +815,13 @@ else: self._key_signals.discard(key) - def getAllEntitiesIter(self, client, with_bare=False): + def get_all_entities_iter(self, client, with_bare=False): """Return an iterator of full jids of all entities in cache @param with_bare: if True, include bare jids @return (list[unicode]): list of jids """ - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) # we construct a list of all known full jids (bare jid of entities x resources) for bare_jid, entity_data in profile_cache.items(): for resource in entity_data.keys(): @@ -831,22 +831,22 @@ full_jid.resource 
= resource yield full_jid - def updateEntityData( + def update_entity_data( self, client, entity_jid, key, value, silent=False ): """Set a misc data for an entity - If key was registered with setSignalOnUpdate, a signal will be sent to frontends + If key was registered with set_signal_on_update, a signal will be sent to frontends @param entity_jid: JID of the entity, C.ENTITY_ALL_RESOURCES for all resources of all entities, C.ENTITY_ALL for all entities (all resources + bare jids) @param key: key to set (eg: C.ENTITY_TYPE) @param value: value for this key (eg: C.ENTITY_TYPE_MUC) @param silent(bool): if True, doesn't send signal to frontend, even if there is a - signal flag (see setSignalOnUpdate) + signal flag (see set_signal_on_update) """ - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) if entity_jid in (C.ENTITY_ALL_RESOURCES, C.ENTITY_ALL): - entities = self.getAllEntitiesIter(client, entity_jid == C.ENTITY_ALL) + entities = self.get_all_entities_iter(client, entity_jid == C.ENTITY_ALL) else: entities = (entity_jid,) @@ -857,14 +857,14 @@ entity_data[key] = value if key in self._key_signals and not silent: - self.host.bridge.entityDataUpdated( + self.host.bridge.entity_data_updated( jid_.full(), key, data_format.serialise(value), client.profile ) - def delEntityDatum(self, client, entity_jid, key): + def del_entity_datum(self, client, entity_jid, key): """Delete a data for an entity @param entity_jid: JID of the entity, C.ENTITY_ALL_RESOURCES for all resources of all entities, @@ -874,9 +874,9 @@ @raise exceptions.UnknownEntityError: if entity is not in cache @raise KeyError: key is not in cache """ - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) if entity_jid in (C.ENTITY_ALL_RESOURCES, C.ENTITY_ALL): - entities = self.getAllEntitiesIter(client, entity_jid == C.ENTITY_ALL) + entities = self.get_all_entities_iter(client, entity_jid == C.ENTITY_ALL) else: entities 
= (entity_jid,) @@ -895,9 +895,9 @@ else: raise e - def _getEntitiesData(self, entities_jids, keys_list, profile_key): - client = self.host.getClient(profile_key) - ret = self.getEntitiesData( + def _get_entities_data(self, entities_jids, keys_list, profile_key): + client = self.host.get_client(profile_key) + ret = self.entities_data_get( client, [jid.JID(jid_) for jid_ in entities_jids], keys_list ) return { @@ -905,7 +905,7 @@ for jid_, data in ret.items() } - def getEntitiesData(self, client, entities_jids, keys_list=None): + def entities_data_get(self, client, entities_jids, keys_list=None): """Get a list of cached values for several entities at once @param entities_jids: jids of the entities, or empty list for all entities in cache @@ -920,7 +920,7 @@ @raise exceptions.UnknownEntityError: if entity is not in cache """ - def fillEntityData(entity_cache_data): + def fill_entity_data(entity_cache_data): entity_data = {} if keys_list is None: entity_data = entity_cache_data @@ -932,7 +932,7 @@ continue return entity_data - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) ret_data = {} if entities_jids: for entity in entities_jids: @@ -942,21 +942,21 @@ ] except KeyError: continue - ret_data[entity.full()] = fillEntityData(entity_cache_data, keys_list) + ret_data[entity.full()] = fill_entity_data(entity_cache_data, keys_list) else: for bare_jid, data in profile_cache.items(): for resource, entity_cache_data in data.items(): full_jid = copy.copy(bare_jid) full_jid.resource = resource - ret_data[full_jid] = fillEntityData(entity_cache_data) + ret_data[full_jid] = fill_entity_data(entity_cache_data) return ret_data - def _getEntityData(self, entity_jid_s, keys_list=None, profile=C.PROF_KEY_NONE): - return self.getEntityData( - self.host.getClient(profile), jid.JID(entity_jid_s), keys_list) + def _get_entity_data(self, entity_jid_s, keys_list=None, profile=C.PROF_KEY_NONE): + return self.entity_data_get( + 
self.host.get_client(profile), jid.JID(entity_jid_s), keys_list) - def getEntityData(self, client, entity_jid, keys_list=None): + def entity_data_get(self, client, entity_jid, keys_list=None): """Get a list of cached values for entity @param entity_jid: JID of the entity @@ -968,7 +968,7 @@ @raise exceptions.UnknownEntityError: if entity is not in cache """ - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) try: entity_data = profile_cache[entity_jid.userhostJID()][entity_jid.resource] except KeyError: @@ -982,7 +982,7 @@ return {key: entity_data[key] for key in keys_list if key in entity_data} - def getEntityDatum(self, client, entity_jid, key): + def get_entity_datum(self, client, entity_jid, key): """Get a datum from entity @param entity_jid: JID of the entity @@ -992,9 +992,9 @@ @raise exceptions.UnknownEntityError: if entity is not in cache @raise KeyError: if there is no value for this key and this entity """ - return self.getEntityData(client, entity_jid, (key,))[key] + return self.entity_data_get(client, entity_jid, (key,))[key] - def delEntityCache( + def del_entity_cache( self, entity_jid, delete_all_resources=True, profile_key=C.PROF_KEY_NONE ): """Remove all cached data for entity @@ -1005,8 +1005,8 @@ @raise exceptions.UnknownEntityError: if entity is not in cache """ - client = self.host.getClient(profile_key) - profile_cache = self._getProfileCache(client) + client = self.host.get_client(profile_key) + profile_cache = self._get_profile_cache(client) if delete_all_resources: if entity_jid.resource: @@ -1027,7 +1027,7 @@ ## Encryption ## - def encryptValue(self, value, profile): + def encrypt_value(self, value, profile): """Encrypt a value for the given profile. The personal key must be loaded already in the profile session, that should be the case if the profile is already authenticated. 
@@ -1037,7 +1037,7 @@ @return: the deferred encrypted value """ try: - personal_key = self.auth_sessions.profileGetUnique(profile)[ + personal_key = self.auth_sessions.profile_get_unique(profile)[ C.MEMORY_CRYPTO_KEY ] except TypeError: @@ -1047,7 +1047,7 @@ ) return BlockCipher.encrypt(personal_key, value) - def decryptValue(self, value, profile): + def decrypt_value(self, value, profile): """Decrypt a value for the given profile. The personal key must be loaded already in the profile session, that should be the case if the profile is already authenticated. @@ -1057,7 +1057,7 @@ @return: the deferred decrypted value """ try: - personal_key = self.auth_sessions.profileGetUnique(profile)[ + personal_key = self.auth_sessions.profile_get_unique(profile)[ C.MEMORY_CRYPTO_KEY ] except TypeError: @@ -1067,7 +1067,7 @@ ) return BlockCipher.decrypt(personal_key, value) - def encryptPersonalData(self, data_key, data_value, crypto_key, profile): + def encrypt_personal_data(self, data_key, data_value, crypto_key, profile): """Re-encrypt a personal data (saved to a PersistentDict). 
@param data_key: key for the individual PersistentDict instance @@ -1077,7 +1077,7 @@ @return: a deferred None value """ - def gotIndMemory(data): + def got_ind_memory(data): data[data_key] = BlockCipher.encrypt(crypto_key, data_value) return data.force(data_key) @@ -1088,28 +1088,28 @@ ) d = PersistentDict(C.MEMORY_CRYPTO_NAMESPACE, profile).load() - return d.addCallback(gotIndMemory).addCallback(done) + return d.addCallback(got_ind_memory).addCallback(done) ## Subscription requests ## - def addWaitingSub(self, type_, entity_jid, profile_key): + def add_waiting_sub(self, type_, entity_jid, profile_key): """Called when a subscription request is received""" - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) assert profile if profile not in self.subscriptions: self.subscriptions[profile] = {} self.subscriptions[profile][entity_jid] = type_ - def delWaitingSub(self, entity_jid, profile_key): + def del_waiting_sub(self, entity_jid, profile_key): """Called when a subscription request is finished""" - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) assert profile if profile in self.subscriptions and entity_jid in self.subscriptions[profile]: del self.subscriptions[profile][entity_jid] - def getWaitingSub(self, profile_key): + def sub_waiting_get(self, profile_key): """Called to get a list of currently waiting subscription requests""" - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) if not profile: log.error(_("Asking waiting subscriptions for a non-existent profile")) return {} @@ -1120,13 +1120,13 @@ ## Parameters ## - def getStringParamA(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): - return self.params.getStringParamA(name, category, attr, profile_key) + def get_string_param_a(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): + return self.params.get_string_param_a(name, category, attr, profile_key) - def 
getParamA(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): - return self.params.getParamA(name, category, attr, profile_key=profile_key) + def param_get_a(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): + return self.params.param_get_a(name, category, attr, profile_key=profile_key) - def asyncGetParamA( + def param_get_a_async( self, name, category, @@ -1134,33 +1134,33 @@ security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE, ): - return self.params.asyncGetParamA( + return self.params.param_get_a_async( name, category, attr, security_limit, profile_key ) - def _getParamsValuesFromCategory( + def _get_params_values_from_category( self, category, security_limit, app, extra_s, profile_key ): - return self.params._getParamsValuesFromCategory( + return self.params._get_params_values_from_category( category, security_limit, app, extra_s, profile_key ) - def asyncGetStringParamA( + def async_get_string_param_a( self, name, category, attribute="value", security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE): - profile = self.getProfileName(profile_key) - return defer.ensureDeferred(self.params.asyncGetStringParamA( + profile = self.get_profile_name(profile_key) + return defer.ensureDeferred(self.params.async_get_string_param_a( name, category, attribute, security_limit, profile )) - def _getParamsUI(self, security_limit, app, extra_s, profile_key): - return self.params._getParamsUI(security_limit, app, extra_s, profile_key) + def _get_params_ui(self, security_limit, app, extra_s, profile_key): + return self.params._get_params_ui(security_limit, app, extra_s, profile_key) - def getParamsCategories(self): - return self.params.getParamsCategories() + def params_categories_get(self): + return self.params.params_categories_get() - def setParam( + def param_set( self, name, value, @@ -1168,43 +1168,43 @@ security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE, ): - return self.params.setParam(name, value, category, 
security_limit, profile_key) + return self.params.param_set(name, value, category, security_limit, profile_key) - def updateParams(self, xml): - return self.params.updateParams(xml) + def update_params(self, xml): + return self.params.update_params(xml) - def paramsRegisterApp(self, xml, security_limit=C.NO_SECURITY_LIMIT, app=""): - return self.params.paramsRegisterApp(xml, security_limit, app) + def params_register_app(self, xml, security_limit=C.NO_SECURITY_LIMIT, app=""): + return self.params.params_register_app(xml, security_limit, app) - def setDefault(self, name, category, callback, errback=None): - return self.params.setDefault(name, category, callback, errback) + def set_default(self, name, category, callback, errback=None): + return self.params.set_default(name, category, callback, errback) ## Private Data ## - def _privateDataSet(self, namespace, key, data_s, profile_key): - client = self.host.getClient(profile_key) + def _private_data_set(self, namespace, key, data_s, profile_key): + client = self.host.get_client(profile_key) # we accept any type data = data_format.deserialise(data_s, type_check=None) - return defer.ensureDeferred(self.storage.setPrivateValue( + return defer.ensureDeferred(self.storage.set_private_value( namespace, key, data, binary=True, profile=client.profile)) - def _privateDataGet(self, namespace, key, profile_key): - client = self.host.getClient(profile_key) + def _private_data_get(self, namespace, key, profile_key): + client = self.host.get_client(profile_key) d = defer.ensureDeferred( - self.storage.getPrivates( + self.storage.get_privates( namespace, [key], binary=True, profile=client.profile) ) d.addCallback(lambda data_dict: data_format.serialise(data_dict.get(key))) return d - def _privateDataDelete(self, namespace, key, profile_key): - client = self.host.getClient(profile_key) - return defer.ensureDeferred(self.storage.delPrivateValue( + def _private_data_delete(self, namespace, key, profile_key): + client = 
self.host.get_client(profile_key) + return defer.ensureDeferred(self.storage.del_private_value( namespace, key, binary=True, profile=client.profile)) ## Files ## - def checkFilePermission( + def check_file_permission( self, file_data: dict, peer_jid: Optional[jid.JID], @@ -1213,7 +1213,7 @@ ) -> None: """Check that an entity has the right permission on a file - @param file_data: data of one file, as returned by getFiles + @param file_data: data of one file, as returned by get_files @param peer_jid: entity trying to access the file @param perms_to_check: permissions to check tuple of C.ACCESS_PERM_* @@ -1268,15 +1268,15 @@ _("unknown access type: {type}").format(type=perm_type) ) - async def checkPermissionToRoot(self, client, file_data, peer_jid, perms_to_check): - """do checkFilePermission on file_data and all its parents until root""" + async def check_permission_to_root(self, client, file_data, peer_jid, perms_to_check): + """do check_file_permission on file_data and all its parents until root""" current = file_data while True: - self.checkFilePermission(current, peer_jid, perms_to_check) + self.check_file_permission(current, peer_jid, perms_to_check) parent = current["parent"] if not parent: break - files_data = await self.getFiles( + files_data = await self.get_files( client, peer_jid=None, file_id=parent, perms_to_check=None ) try: @@ -1284,7 +1284,7 @@ except IndexError: raise exceptions.DataError("Missing parent") - async def _getParentDir( + async def _get_parent_dir( self, client, path, parent, namespace, owner, peer_jid, perms_to_check ): """Retrieve parent node from a path, or last existing directory @@ -1308,7 +1308,7 @@ # non existing directories will be created parent = "" for idx, path_elt in enumerate(path_elts): - directories = await self.storage.getFiles( + directories = await self.storage.get_files( client, parent=parent, type_=C.FILE_TYPE_DIRECTORY, @@ -1325,11 +1325,11 @@ ) else: directory = directories[0] - self.checkFilePermission(directory, 
peer_jid, perms_to_check) + self.check_file_permission(directory, peer_jid, perms_to_check) parent = directory["id"] return (parent, []) - def getFileAffiliations(self, file_data: dict) -> Dict[jid.JID, str]: + def get_file_affiliations(self, file_data: dict) -> Dict[jid.JID, str]: """Convert file access to pubsub like affiliations""" affiliations = {} access_data = file_data['access'] @@ -1352,7 +1352,7 @@ return affiliations - def _setFileAffiliationsUpdate( + def _set_file_affiliations_update( self, access: dict, file_data: dict, @@ -1401,7 +1401,7 @@ else: raise ValueError(f"unknown affiliation: {affiliation!r}") - async def setFileAffiliations( + async def set_file_affiliations( self, client, file_data: dict, @@ -1417,17 +1417,17 @@ - "none" removes both read and write permissions """ file_id = file_data['id'] - await self.fileUpdate( + await self.file_update( file_id, 'access', update_cb=partial( - self._setFileAffiliationsUpdate, + self._set_file_affiliations_update, file_data=file_data, affiliations=affiliations ), ) - def _setFileAccessModelUpdate( + def _set_file_access_model_update( self, access: dict, file_data: dict, @@ -1445,7 +1445,7 @@ if requested_type == C.ACCESS_TYPE_WHITELIST and 'jids' not in read_data: read_data['jids'] = [] - async def setFileAccessModel( + async def set_file_access_model( self, client, file_data: dict, @@ -1458,17 +1458,17 @@ - "whitelist": set whitelist to file/dir """ file_id = file_data['id'] - await self.fileUpdate( + await self.file_update( file_id, 'access', update_cb=partial( - self._setFileAccessModelUpdate, + self._set_file_access_model_update, file_data=file_data, access_model=access_model ), ) - def getFilesOwner( + def get_files_owner( self, client, owner: Optional[jid.JID], @@ -1499,7 +1499,7 @@ ) return peer_jid.userhostJID() - async def getFiles( + async def get_files( self, client, peer_jid, file_id=None, version=None, parent=None, path=None, type_=None, file_hash=None, hash_algo=None, name=None, 
namespace=None, mime_type=None, public_id=None, owner=None, access=None, projection=None, @@ -1526,7 +1526,7 @@ @param mime_type(unicode, None): filter on this mime type @param public_id(unicode, None): filter on this public id @param owner(jid.JID, None): if not None, only get files from this owner - @param access(dict, None): get file with given access (see [setFile]) + @param access(dict, None): get file with given access (see [set_file]) @param projection(list[unicode], None): name of columns to retrieve None to retrieve all @param unique(bool): if True will remove duplicates @@ -1534,7 +1534,7 @@ must be a tuple of C.ACCESS_PERM_* or None if None, permission will not be checked (peer_jid must be None too in this case) - other params are the same as for [setFile] + other params are the same as for [set_file] @return (list[dict]): files corresponding to filters @raise exceptions.NotFound: parent directory not found (when path is specified) @raise exceptions.PermissionError: peer_jid can't use perms_to_check for one of @@ -1546,11 +1546,11 @@ "if you want to disable permission check, both peer_jid and " "perms_to_check must be None" ) - owner = self.getFilesOwner(client, owner, peer_jid, file_id, parent) + owner = self.get_files_owner(client, owner, peer_jid, file_id, parent) if path is not None: path = str(path) - # permission are checked by _getParentDir - parent, remaining_path_elts = await self._getParentDir( + # permissions are checked by _get_parent_dir + parent, remaining_path_elts = await self._get_parent_dir( client, path, parent, namespace, owner, peer_jid, perms_to_check ) if remaining_path_elts: @@ -1560,16 +1560,16 @@ if parent and peer_jid: # if parent is given directly and permission check is requested, # we need to check all the parents - parent_data = await self.storage.getFiles(client, file_id=parent) + parent_data = await self.storage.get_files(client, file_id=parent) try: parent_data = parent_data[0] except IndexError: raise 
exceptions.DataError("missing parent") - await self.checkPermissionToRoot( + await self.check_permission_to_root( client, parent_data, peer_jid, perms_to_check ) - files = await self.storage.getFiles( + files = await self.storage.get_files( client, file_id=file_id, version=version, @@ -1592,7 +1592,7 @@ to_remove = [] for file_data in files: try: - self.checkFilePermission( + self.check_file_permission( file_data, peer_jid, perms_to_check, set_affiliation=True ) except exceptions.PermissionError: @@ -1601,7 +1601,7 @@ files.remove(file_data) return files - async def setFile( + async def set_file( self, client, name, file_id=None, version="", parent=None, path=None, type_=C.FILE_TYPE_FILE, file_hash=None, hash_algo=None, size=None, namespace=None, mime_type=None, public_id=None, created=None, modified=None, @@ -1678,18 +1678,18 @@ raise ValueError( "version, file_hash, size and mime_type can't be set for a directory" ) - owner = self.getFilesOwner(client, owner, peer_jid, file_id, parent) + owner = self.get_files_owner(client, owner, peer_jid, file_id, parent) if path is not None: path = str(path) - # _getParentDir will check permissions if peer_jid is set, so we use owner - parent, remaining_path_elts = await self._getParentDir( + # _get_parent_dir will check permissions if peer_jid is set, so we use owner + parent, remaining_path_elts = await self._get_parent_dir( client, path, parent, namespace, owner, owner, perms_to_check ) # if remaining directories don't exist, we have to create them for new_dir in remaining_path_elts: new_dir_id = shortuuid.uuid() - await self.storage.setFile( + await self.storage.set_file( client, name=new_dir, file_id=new_dir_id, @@ -1706,7 +1706,7 @@ elif parent is None: parent = "" - await self.storage.setFile( + await self.storage.set_file( client, file_id=file_id, version=version, @@ -1726,7 +1726,7 @@ extra=extra, ) - async def fileGetUsedSpace( + async def file_get_used_space( self, client, peer_jid: jid.JID, @@ -1736,15 +1736,15 @@ 
@param peer_jid: entity requesting the size @param owner: entity owning the file to check. If None, will be determined by - getFilesOwner + get_files_owner @return: size of total space used by files of this owner """ - owner = self.getFilesOwner(client, owner, peer_jid) + owner = self.get_files_owner(client, owner, peer_jid) if peer_jid.userhostJID() != owner and client.profile not in self.admins: raise exceptions.PermissionError("You are not allowed to check this size") - return await self.storage.fileGetUsedSpace(client, owner) + return await self.storage.file_get_used_space(client, owner) - def fileUpdate(self, file_id, column, update_cb): + def file_update(self, file_id, column, update_cb): """Update a file column taking care of race condition access is NOT checked in this method, it must be checked beforehand @@ -1754,10 +1754,10 @@ the method will take older value as argument, and must update it in place Note that the callable must be thread-safe """ - return self.storage.fileUpdate(file_id, column, update_cb) + return self.storage.file_update(file_id, column, update_cb) @defer.inlineCallbacks - def _deleteFile( + def _delete_file( self, client, peer_jid: jid.JID, @@ -1778,7 +1778,7 @@ "file {file_name} can't be deleted, {peer_jid} is not the owner" .format(file_name=file_data['name'], peer_jid=peer_jid.full())) if file_data['type'] == C.FILE_TYPE_DIRECTORY: - sub_files = yield self.getFiles(client, peer_jid, parent=file_data['id']) + sub_files = yield self.get_files(client, peer_jid, parent=file_data['id']) if sub_files and not recursive: raise exceptions.DataError(_("Can't delete directory, it is not empty")) # we first delete the sub-files @@ -1787,15 +1787,15 @@ sub_file_path = files_path / sub_file_data['name'] else: sub_file_path = files_path - yield self._deleteFile( + yield self._delete_file( client, peer_jid, recursive, sub_file_path, sub_file_data) # then the directory itself - yield self.storage.fileDelete(file_data['id']) + yield 
self.storage.file_delete(file_data['id']) elif file_data['type'] == C.FILE_TYPE_FILE: log.info(_("deleting file {name} with hash {file_hash}").format( name=file_data['name'], file_hash=file_data['file_hash'])) - yield self.storage.fileDelete(file_data['id']) - references = yield self.getFiles( + yield self.storage.file_delete(file_data['id']) + references = yield self.get_files( client, peer_jid, file_hash=file_data['file_hash']) if references: log.debug("there are still references to the file, we keep it") @@ -1811,7 +1811,7 @@ raise exceptions.InternalError('Unexpected file type: {file_type}' .format(file_type=file_data['type'])) - async def fileDelete(self, client, peer_jid, file_id, recursive=False): + async def file_delete(self, client, peer_jid, file_id, recursive=False): """Delete a single file or a directory and all its sub-files @param file_id(unicode): id of the file to delete @@ -1821,7 +1821,7 @@ """ # FIXME: we only allow owner of file to delete files for now, but WRITE access # should be checked too - files_data = await self.getFiles(client, peer_jid, file_id) + files_data = await self.get_files(client, peer_jid, file_id) if not files_data: raise exceptions.NotFound("Can't find the file with id {file_id}".format( file_id=file_id)) @@ -1829,11 +1829,11 @@ if file_data["type"] != C.FILE_TYPE_DIRECTORY and recursive: raise ValueError("recursive can only be set for directories") files_path = self.host.get_local_path(None, C.FILES_DIR) - await self._deleteFile(client, peer_jid, recursive, files_path, file_data) + await self._delete_file(client, peer_jid, recursive, files_path, file_data) ## Cache ## - def getCachePath(self, namespace: str, *args: str) -> Path: + def get_cache_path(self, namespace: str, *args: str) -> Path: """Get path to use to get a common path for a namespace This can be used by plugins to manage permanent data. 
It's the responsibility @@ -1844,13 +1844,13 @@ namespace = namespace.strip().lower() return Path( self._cache_path, - regex.pathEscape(namespace), - *(regex.pathEscape(a) for a in args) + regex.path_escape(namespace), + *(regex.path_escape(a) for a in args) ) ## Misc ## - def isEntityAvailable(self, client, entity_jid): + def is_entity_available(self, client, entity_jid): """Tell from the presence information if the given entity is available. @param entity_jid (JID): the entity to check (if bare jid is used, all resources are tested) @@ -1858,20 +1858,20 @@ """ if not entity_jid.resource: return bool( - self.getAvailableResources(client, entity_jid) + self.get_available_resources(client, entity_jid) ) # if any resource is available, entity is available try: - presence_data = self.getEntityDatum(client, entity_jid, "presence") + presence_data = self.get_entity_datum(client, entity_jid, "presence") except KeyError: log.debug("No presence information for {}".format(entity_jid)) return False return presence_data.show != C.PRESENCE_UNAVAILABLE - def isAdmin(self, profile: str) -> bool: + def is_admin(self, profile: str) -> bool: """Tell if given profile has administrator privileges""" return profile in self.admins - def isAdminJID(self, entity: jid.JID) -> bool: + def is_admin_jid(self, entity: jid.JID) -> bool: """Tells if an entity jid corresponds to an admin one It is sometimes not possible to use the profile alone to check if an entity is an diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/migration/env.py --- a/sat/memory/migration/env.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/migration/env.py Sat Apr 08 13:54:42 2023 +0200 @@ -38,7 +38,7 @@ script output. """ - db_config = sqla_config.getDbConfig() + db_config = sqla_config.get_db_config() context.configure( url=db_config["url"], target_metadata=target_metadata, @@ -76,7 +76,7 @@ and associate a connection with the context.
""" - db_config = sqla_config.getDbConfig() + db_config = sqla_config.get_db_config() engine = create_async_engine( db_config["url"], poolclass=pool.NullPool, diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/params.py --- a/sat/memory/params.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/params.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,7 +29,7 @@ from twisted.python.failure import Failure from twisted.words.xish import domish from twisted.words.protocols.jabber import jid -from sat.tools.xml_tools import paramsXML2XMLUI, getText +from sat.tools.xml_tools import params_xml_2_xmlui, get_text from sat.tools.common import data_format from xml.sax.saxutils import quoteattr @@ -38,7 +38,7 @@ # this needs an overall simplification to make maintenance easier -def createJidElts(jids): +def create_jid_elts(jids): """Generator which returns elements from jids @param jids(iterable[jid.JID]): jids to use @@ -101,18 +101,18 @@ def load_default_params(self): self.dom = minidom.parseString(Params.default_xml.encode("utf-8")) - def _mergeParams(self, source_node, dest_node): + def _merge_params(self, source_node, dest_node): """Look for every node in source_node and recursively copy them to dest if they don't exist""" - def getNodesMap(children): + def get_nodes_map(children): ret = {} for child in children: if child.nodeType == child.ELEMENT_NODE: ret[(child.tagName, child.getAttribute("name"))] = child return ret - source_map = getNodesMap(source_node.childNodes) - dest_map = getNodesMap(dest_node.childNodes) + source_map = get_nodes_map(source_node.childNodes) + dest_map = get_nodes_map(dest_node.childNodes) source_set = set(source_map.keys()) dest_set = set(dest_map.keys()) to_add = source_set.difference(dest_set) @@ -122,22 +122,22 @@ to_recurse = source_set - to_add for node_key in to_recurse: - self._mergeParams(source_map[node_key], dest_map[node_key]) + self._merge_params(source_map[node_key], dest_map[node_key]) def load_xml(self, xml_file): """Load parameters template
from xml file""" self.dom = minidom.parse(xml_file) default_dom = minidom.parseString(Params.default_xml.encode("utf-8")) - self._mergeParams(default_dom.documentElement, self.dom.documentElement) + self._merge_params(default_dom.documentElement, self.dom.documentElement) - def loadGenParams(self): + def load_gen_params(self): """Load general parameters data from storage @return: deferred triggered once params are loaded """ - return self.storage.loadGenParams(self.params_gen) + return self.storage.load_gen_params(self.params_gen) - def loadIndParams(self, profile, cache=None): + def load_ind_params(self, profile, cache=None): """Load individual parameters set self.params cache or a temporary cache @@ -147,11 +147,11 @@ """ if cache is None: self.params[profile] = {} - return self.storage.loadIndParams( + return self.storage.load_ind_params( self.params[profile] if cache is None else cache, profile ) - def purgeProfile(self, profile): + def purge_profile(self, profile): """Remove cache data of a profile @param profile: %(doc_profile)s @@ -176,7 +176,7 @@ self.params = {} self.params_gen = {} - def createProfile(self, profile, component): + def create_profile(self, profile, component): """Create a new profile @param profile(unicode): name of the profile @@ -184,14 +184,14 @@ @param callback: called when the profile actually exists in database and memory @return: a Deferred instance """ - if self.storage.hasProfile(profile): + if self.storage.has_profile(profile): log.info(_("The profile name already exists")) return defer.fail(exceptions.ConflictError()) if not self.host.trigger.point("ProfileCreation", profile): return defer.fail(exceptions.CancelError()) - return self.storage.createProfile(profile, component or None) + return self.storage.create_profile(profile, component or None) - def asyncDeleteProfile(self, profile, force=False): + def profile_delete_async(self, profile, force=False): """Delete an existing profile @param profile: name of the profile @@ -199,18 
+199,18 @@ To be used for direct calls only (not through the bridge). @return: a Deferred instance """ - if not self.storage.hasProfile(profile): + if not self.storage.has_profile(profile): log.info(_("Trying to delete an unknown profile")) return defer.fail(Failure(exceptions.ProfileUnknownError(profile))) - if self.host.isConnected(profile): + if self.host.is_connected(profile): if force: self.host.disconnect(profile) else: log.info(_("Trying to delete a connected profile")) return defer.fail(Failure(exceptions.ProfileConnected)) - return self.storage.deleteProfile(profile) + return self.storage.delete_profile(profile) - def getProfileName(self, profile_key, return_profile_keys=False): + def get_profile_name(self, profile_key, return_profile_keys=False): """return profile according to profile_key @param profile_key: profile name or key which can be @@ -229,7 +229,7 @@ try: default = self.host.memory.memory_data[ "Profile_default" - ] = self.storage.getProfilesList()[0] + ] = self.storage.get_profiles_list()[0] except IndexError: log.info(_("No profile exists yet")) raise exceptions.ProfileUnknownError(profile_key) @@ -240,7 +240,7 @@ raise exceptions.ProfileNotSetError elif return_profile_keys and profile_key in [C.PROF_KEY_ALL]: return profile_key # this value must be managed by the caller - if not self.storage.hasProfile(profile_key): + if not self.storage.has_profile(profile_key): log.error(_("Trying to access an unknown profile (%s)") % profile_key) raise exceptions.ProfileUnknownError(profile_key) return profile_key @@ -260,7 +260,7 @@ # the node is new return None - def updateParams(self, xml, security_limit=C.NO_SECURITY_LIMIT, app=""): + def update_params(self, xml, security_limit=C.NO_SECURITY_LIMIT, app=""): """import xml in parameters, update if the param already exists If security_limit is specified and greater than -1, the parameters @@ -287,7 +287,7 @@ 0 ) # count the params to be removed from current category for node in cat_node.childNodes: - if
node.nodeName != "param" or not self.checkSecurityLimit( + if node.nodeName != "param" or not self.check_security_limit( node, security_limit ): to_remove.append(node) @@ -324,7 +324,7 @@ pre_process_app_node(src_parent, security_limit, app) import_node(self.dom.documentElement, src_parent) - def paramsRegisterApp(self, xml, security_limit, app): + def params_register_app(self, xml, security_limit, app): """Register frontend's specific parameters If security_limit is specified and greater than -1, the parameters @@ -351,12 +351,12 @@ ) return self.frontends_cache.append(app) - self.updateParams(xml, security_limit, app) + self.update_params(xml, security_limit, app) log.debug("Frontends parameters registered for %(app)s" % {"app": app}) def __default_ok(self, value, name, category): # FIXME: will not work with individual parameters - self.setParam(name, value, category) + self.param_set(name, value, category) def __default_ko(self, failure, name, category): log.error( @@ -364,7 +364,7 @@ % {"category": category, "name": name, "reason": str(failure.value)} ) - def setDefault(self, name, category, callback, errback=None): + def set_default(self, name, category, callback, errback=None): """Set default value of parameter 'default_cb' attribute of parameter must be set to 'yes' @@ -376,10 +376,10 @@ # TODO: send signal param update if value changed # TODO: manage individual parameters log.debug( - "setDefault called for %(category)s/%(name)s" + "set_default called for %(category)s/%(name)s" % {"category": category, "name": name} ) - node = self._getParamNode(name, category, "@ALL@") + node = self._get_param_node(name, category, "@ALL@") if not node: log.error( _( @@ -390,15 +390,15 @@ return if node[1].getAttribute("default_cb") == "yes": # del node[1].attributes['default_cb'] # default_cb is not used anymore as a flag to know if we have to set the default value, - # and we can still use it later e.g.
to call a generic setDefault method - value = self._getParam(category, name, C.GENERAL) + # and we can still use it later e.g. to call a generic set_default method + value = self._get_param(category, name, C.GENERAL) if value is None: # no value set by the user: we have the default value log.debug("Default value to set, using callback") d = defer.maybeDeferred(callback) d.addCallback(self.__default_ok, name, category) d.addErrback(errback or self.__default_ko, name, category) - def _getAttr_internal(self, node, attr, value): + def _get_attr_internal(self, node, attr, value): """Get attribute value. /!\ This method would return encrypted password values. @@ -464,7 +464,7 @@ "\t" ) # FIXME: it's not good to use tabs as separator ! else: # no user defined value, take default value from the XML - jids = [getText(jid_) for jid_ in node.getElementsByTagName("jid")] + jids = [get_text(jid_) for jid_ in node.getElementsByTagName("jid")] to_delete = [] for idx, value in enumerate(jids): try: @@ -480,7 +480,7 @@ return value_to_use return node.getAttribute(attr) - def _getAttr(self, node, attr, value): + def _get_attr(self, node, attr, value): """Get attribute value (synchronous). /!\ This method can not be used to retrieve password values. @@ -491,11 +491,11 @@ """ if attr == "value" and node.getAttribute("type") == "password": raise exceptions.InternalError( - "To retrieve password values, use _asyncGetAttr instead of _getAttr" + "To retrieve password values, use _async_get_attr instead of _get_attr" ) - return self._getAttr_internal(node, attr, value) + return self._get_attr_internal(node, attr, value) - def _asyncGetAttr(self, node, attr, value, profile=None): + def _async_get_attr(self, node, attr, value, profile=None): """Get attribute value. 
Profile passwords are returned hashed (if not empty), @@ -506,7 +506,7 @@ @param profile: %(doc_profile)s @return (unicode, bool, int, list): Deferred value to retrieve """ - value = self._getAttr_internal(node, attr, value) + value = self._get_attr_internal(node, attr, value) if attr != "value" or node.getAttribute("type") != "password": return defer.succeed(value) param_cat = node.parentNode.getAttribute("name") @@ -519,7 +519,7 @@ raise exceptions.ProfileNotSetError( "The profile is needed to decrypt a password" ) - password = self.host.memory.decryptValue(value, profile) + password = self.host.memory.decrypt_value(value, profile) if password is None: raise exceptions.InternalError("password should never be None") @@ -528,25 +528,25 @@ def _type_to_str(self, result): """Convert result to string, according to its type """ if isinstance(result, bool): - return C.boolConst(result) + return C.bool_const(result) elif isinstance(result, (list, set, tuple)): return ', '.join(self._type_to_str(r) for r in result) else: return str(result) - def getStringParamA(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): - """ Same as getParamA but for bridge: convert non string value to string """ + def get_string_param_a(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): + """ Same as param_get_a but for bridge: convert non string value to string """ return self._type_to_str( - self.getParamA(name, category, attr, profile_key=profile_key) + self.param_get_a(name, category, attr, profile_key=profile_key) ) - def getParamA( + def param_get_a( self, name, category, attr="value", use_default=True, profile_key=C.PROF_KEY_NONE ): """Helper method to get a specific attribute. /!\ This method would return encrypted password values, - to get the plain values you have to use asyncGetParamA. + to get the plain values you have to use param_get_a_async. 
@param name: name of the parameter @param category: category of the parameter @param attr: name of the attribute (default: "value") @@ -557,7 +557,7 @@ """ # FIXME: looks really dirty and buggy, needs to be reviewed/refactored # FIXME: security_limit is not managed here ! - node = self._getParamNode(name, category) + node = self._get_param_node(name, category) if not node: log.error( _( @@ -569,18 +569,18 @@ if attr == "value" and node[1].getAttribute("type") == "password": raise exceptions.InternalError( - "To retrieve password values, use asyncGetParamA instead of getParamA" + "To retrieve password values, use param_get_a_async instead of param_get_a" ) if node[0] == C.GENERAL: - value = self._getParam(category, name, C.GENERAL) + value = self._get_param(category, name, C.GENERAL) if value is None and attr == "value" and not use_default: return value - return self._getAttr(node[1], attr, value) + return self._get_attr(node[1], attr, value) assert node[0] == C.INDIVIDUAL - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) if not profile: log.error(_("Requesting a param for a non-existent profile")) raise exceptions.ProfileUnknownError(profile_key) @@ -590,19 +590,19 @@ if attr == "value": - value = self._getParam(category, name, profile=profile) + value = self._get_param(category, name, profile=profile) if value is None and attr == "value" and not use_default: return value - return self._getAttr(node[1], attr, value) + return self._get_attr(node[1], attr, value) - async def asyncGetStringParamA( + async def async_get_string_param_a( self, name, category, attr="value", security_limit=C.NO_SECURITY_LIMIT, profile=C.PROF_KEY_NONE): - value = await self.asyncGetParamA( + value = await self.param_get_a_async( name, category, attr, security_limit, profile_key=profile) return self._type_to_str(value) - def asyncGetParamA( + def param_get_a_async( self, name, category, @@ -618,7 +618,7
@@ @param profile: owner of the param (@ALL@ for everyone) @return (defer.Deferred): parameter value, with corresponding type (bool, int, list, etc) """ - node = self._getParamNode(name, category) + node = self._get_param_node(name, category) if not node: log.error( _( @@ -628,7 +628,7 @@ ) raise ValueError("Requested param doesn't exist") - if not self.checkSecurityLimit(node[1], security_limit): + if not self.check_security_limit(node[1], security_limit): log.warning( _( "Trying to get parameter '%(param)s' in category '%(cat)s' without authorization!!!" @@ -638,12 +638,12 @@ ) raise exceptions.PermissionError if node[0] == C.GENERAL: - value = self._getParam(category, name, C.GENERAL) - return self._asyncGetAttr(node[1], attr, value) + value = self._get_param(category, name, C.GENERAL) + return self._async_get_attr(node[1], attr, value) assert node[0] == C.INDIVIDUAL - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) if not profile: raise exceptions.InternalError( _("Requesting a param for a non-existent profile") @@ -652,23 +652,23 @@ if attr != "value": return defer.succeed(node[1].getAttribute(attr)) try: - value = self._getParam(category, name, profile=profile) - return self._asyncGetAttr(node[1], attr, value, profile) + value = self._get_param(category, name, profile=profile) + return self._async_get_attr(node[1], attr, value, profile) except exceptions.ProfileNotInCacheError: # We have to ask data to the storage manager - d = self.storage.getIndParam(category, name, profile) + d = self.storage.get_ind_param(category, name, profile) return d.addCallback( - lambda value: self._asyncGetAttr(node[1], attr, value, profile) + lambda value: self._async_get_attr(node[1], attr, value, profile) ) - def _getParamsValuesFromCategory( + def _get_params_values_from_category( self, category, security_limit, app, extra_s, profile_key): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) extra = 
data_format.deserialise(extra_s) - return defer.ensureDeferred(self.getParamsValuesFromCategory( + return defer.ensureDeferred(self.get_params_values_from_category( client, category, security_limit, app, extra)) - async def getParamsValuesFromCategory( + async def get_params_values_from_category( self, client, category, security_limit, app='', extra=None): """Get all parameters "attribute" for a category @@ -676,14 +676,14 @@ @param security_limit(int): NO_SECURITY_LIMIT (-1) to return all the params. Otherwise only the params which have a security level defined *and* lower or equal to the specified value are returned. - @param app(str): see [getParams] - @param extra(dict): see [getParams] + @param app(str): see [get_params] + @param extra(dict): see [get_params] @return (dict): key: param name, value: param value (converted to string if needed) """ # TODO: manage category of general type (without existent profile) if extra is None: extra = {} - prof_xml = await self._constructProfileXml(client, security_limit, app, extra) + prof_xml = await self._construct_profile_xml(client, security_limit, app, extra) ret = {} for category_node in prof_xml.getElementsByTagName("category"): if category_node.getAttribute("name") == category: @@ -696,7 +696,7 @@ ) ) continue - value = await self.asyncGetStringParamA( + value = await self.async_get_string_param_a( name, category, security_limit=security_limit, profile=client.profile) @@ -706,7 +706,7 @@ prof_xml.unlink() return ret - def _getParam( + def _get_param( self, category, name, type_=C.INDIVIDUAL, cache=None, profile=C.PROF_KEY_NONE ): """Return the param, or None if it doesn't exist @@ -736,7 +736,7 @@ return None return cache[(category, name)] - async def _constructProfileXml(self, client, security_limit, app, extra): + async def _construct_profile_xml(self, client, security_limit, app, extra): """Construct xml for asked profile, filling values when needed /!\ as noticed in doc, don't forget to unlink the
minidom.Document @@ -749,18 +749,18 @@ """ profile = client.profile - def checkNode(node): + def check_node(node): """Check the node against security_limit, app and extra""" - return (self.checkSecurityLimit(node, security_limit) - and self.checkApp(node, app) - and self.checkExtra(node, extra)) + return (self.check_security_limit(node, security_limit) + and self.check_app(node, app) + and self.check_extra(node, extra)) if profile in self.params: profile_cache = self.params[profile] else: # profile is not in cache, we load values in a short time cache profile_cache = {} - await self.loadIndParams(profile, profile_cache) + await self.load_ind_params(profile, profile_cache) # init the result document prof_xml = minidom.parseString("") @@ -782,7 +782,7 @@ for node in dest_cat.childNodes: if node.nodeName != "param": continue - if not checkNode(node): + if not check_node(node): to_remove.append(node) continue dest_params[node.getAttribute("name")] = node @@ -799,14 +799,14 @@ # we have to merge new params (we are parsing individual parameters, we have to add them # to the previously parsed general ones) name = param_node.getAttribute("name") - if not checkNode(param_node): + if not check_node(param_node): continue if name not in dest_params: # this is reached when a previous category exists dest_params[name] = param_node.cloneNode(True) dest_cat.appendChild(dest_params[name]) - profile_value = self._getParam( + profile_value = self._get_param( category, name, type_node.nodeName, @@ -867,12 +867,12 @@ return prof_xml - def _getParamsUI(self, security_limit, app, extra_s, profile_key): - client = self.host.getClient(profile_key) + def _get_params_ui(self, security_limit, app, extra_s, profile_key): + client = self.host.get_client(profile_key) extra = data_format.deserialise(extra_s) - return defer.ensureDeferred(self.getParamsUI(client, security_limit, app, extra)) + return defer.ensureDeferred(self.param_ui_get(client, security_limit, app, extra)) - async def 
getParamsUI(self, client, security_limit, app, extra=None): + async def param_ui_get(self, client, security_limit, app, extra=None): """Get XMLUI to handle parameters @param security_limit: NO_SECURITY_LIMIT (-1) to return all the params. @@ -883,10 +883,10 @@ - ignore: list of (category/name) values to remove from parameters @return(str): a SàT XMLUI for parameters """ - param_xml = await self.getParams(client, security_limit, app, extra) - return paramsXML2XMLUI(param_xml) + param_xml = await self.get_params(client, security_limit, app, extra) + return params_xml_2_xmlui(param_xml) - async def getParams(self, client, security_limit, app, extra=None): + async def get_params(self, client, security_limit, app, extra=None): """Construct xml for asked profile, take params xml as skeleton @param security_limit: NO_SECURITY_LIMIT (-1) to return all the params. @@ -900,12 +900,12 @@ """ if extra is None: extra = {} - prof_xml = await self._constructProfileXml(client, security_limit, app, extra) + prof_xml = await self._construct_profile_xml(client, security_limit, app, extra) return_xml = prof_xml.toxml() prof_xml.unlink() return "\n".join((line for line in return_xml.split("\n") if line)) - def _getParamNode(self, name, category, type_="@ALL@"): # FIXME: is type_ useful ? + def _get_param_node(self, name, category, type_="@ALL@"): # FIXME: is type_ useful ? 
"""Return a node from the param_xml @param name: name of the node @param category: category of the node @@ -931,7 +931,7 @@ return (type_node.nodeName, param) return None - def getParamsCategories(self): + def params_categories_get(self): """return the available categories""" categories = [] for cat in self.dom.getElementsByTagName("category"): @@ -940,7 +940,7 @@ categories.append(cat.getAttribute("name")) return categories - def setParam(self, name, value, category, security_limit=C.NO_SECURITY_LIMIT, + def param_set(self, name, value, category, security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE): """Set a parameter, return None if the parameter is not in param xml. @@ -955,14 +955,14 @@ @param profile_key (str): %(doc_profile_key)s @return: a deferred None value when everything is done """ - # FIXME: setParam should accept the right type for value, not only str ! + # FIXME: param_set should accept the right type for value, not only str ! if profile_key != C.PROF_KEY_NONE: - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) if not profile: log.error(_("Trying to set parameter for an unknown profile")) raise exceptions.ProfileUnknownError(profile_key) - node = self._getParamNode(name, category, "@ALL@") + node = self._get_param_node(name, category, "@ALL@") if not node: log.error( _("Requesting an unknown parameter (%(category)s/%(name)s)") @@ -970,7 +970,7 @@ ) return defer.succeed(None) - if not self.checkSecurityLimit(node[1], security_limit): + if not self.check_security_limit(node[1], security_limit): msg = _( "{profile!r} is trying to set parameter {name!r} in category " "{category!r} without authorization!!!").format( @@ -1018,12 +1018,12 @@ if node[0] == C.GENERAL: self.params_gen[(category, name)] = value - self.storage.setGenParam(category, name, value) - for profile in self.storage.getProfilesList(): - if self.host.memory.isSessionStarted(profile): - self.host.bridge.paramUpdate(name, value, 
category, profile) + self.storage.set_gen_param(category, name, value) + for profile in self.storage.get_profiles_list(): + if self.host.memory.is_session_started(profile): + self.host.bridge.param_update(name, value, category, profile) self.host.trigger.point( - "paramUpdateTrigger", name, value, category, node[0], profile + "param_update_trigger", name, value, category, node[0], profile ) return defer.succeed(None) @@ -1035,7 +1035,7 @@ return defer.succeed(None) elif type_ == "password": try: - personal_key = self.host.memory.auth_sessions.profileGetUnique(profile)[ + personal_key = self.host.memory.auth_sessions.profile_get_unique(profile)[ C.MEMORY_CRYPTO_KEY ] except TypeError: @@ -1044,7 +1044,7 @@ ) if (category, name) == C.PROFILE_PASS_PATH: # using 'value' as the encryption key to encrypt another encryption key... could be confusing! - d = self.host.memory.encryptPersonalData( + d = self.host.memory.encrypt_personal_data( data_key=C.MEMORY_CRYPTO_KEY, data_value=personal_key, crypto_key=value, @@ -1060,21 +1060,21 @@ else: d = defer.succeed(value) - def gotFinalValue(value): - if self.host.memory.isSessionStarted(profile): + def got_final_value(value): + if self.host.memory.is_session_started(profile): self.params[profile][(category, name)] = value - self.host.bridge.paramUpdate(name, value, category, profile) + self.host.bridge.param_update(name, value, category, profile) self.host.trigger.point( - "paramUpdateTrigger", name, value, category, node[0], profile + "param_update_trigger", name, value, category, node[0], profile ) - return self.storage.setIndParam(category, name, value, profile) + return self.storage.set_ind_param(category, name, value, profile) else: raise exceptions.ProfileNotConnected - d.addCallback(gotFinalValue) + d.addCallback(got_final_value) return d - def _getNodesOfTypes(self, attr_type, node_type="@ALL@"): + def _get_nodes_of_types(self, attr_type, node_type="@ALL@"): """Return all the nodes matching the given types. 
TODO: used during the dev but not anymore... remove if not needed @@ -1105,7 +1105,7 @@ ret[(cat, param.getAttribute("name"))] = param return ret - def checkSecurityLimit(self, node, security_limit): + def check_security_limit(self, node, security_limit): """Check the given node against the given security limit. The value NO_SECURITY_LIMIT (-1) means that everything is allowed. @return: True if this node can be accessed with the given security limit. @@ -1117,7 +1117,7 @@ return True return False - def checkApp(self, node, app): + def check_app(self, node, app): """Check the given node against the given app. @param node: parameter node @@ -1128,7 +1128,7 @@ return True return node.getAttribute("app") == app - def checkExtra(self, node, extra): + def check_extra(self, node, extra): """Check the given node against the extra filters. @param node: parameter node @@ -1147,7 +1147,7 @@ return True -def makeOptions(options, selected=None): +def make_options(options, selected=None): """Create option XML from dictionary @param options(dict): option's name => option's label map diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/persistent.py --- a/sat/memory/persistent.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/persistent.py Sat Apr 08 13:54:42 2023 +0200 @@ -51,7 +51,7 @@ self.namespace = namespace self.profile = profile - def _setCache(self, data): + def _set_cache(self, data): self._cache = data def load(self): @@ -60,10 +60,10 @@ need to be called before any other operation @return: defers the PersistentDict instance itself """ - d = defer.ensureDeferred(self.storage.getPrivates( + d = defer.ensureDeferred(self.storage.get_privates( self.namespace, binary=self.binary, profile=self.profile )) - d.addCallback(self._setCache) + d.addCallback(self._set_cache) d.addCallback(lambda __: self) return d @@ -117,20 +117,20 @@ def __setitem__(self, key, value): defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, key, value, 
self.binary, self.profile ) ) return self._cache.__setitem__(key, value) def __delitem__(self, key): - self.storage.delPrivateValue(self.namespace, key, self.binary, self.profile) + self.storage.del_private_value(self.namespace, key, self.binary, self.profile) return self._cache.__delitem__(key) def clear(self): """Delete all values from this namespace""" self._cache.clear() - return self.storage.delPrivateNamespace(self.namespace, self.binary, self.profile) + return self.storage.del_private_namespace(self.namespace, self.binary, self.profile) def get(self, key, default=None): return self._cache.get(key, default) @@ -139,7 +139,7 @@ """Async set, return a Deferred fired when value is actually stored""" self._cache.__setitem__(key, value) return defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, key, value, self.binary, self.profile ) ) @@ -147,7 +147,7 @@ def adel(self, key): """Async del, return a Deferred fired when value is actually deleted""" self._cache.__delitem__(key) - return self.storage.delPrivateValue( + return self.storage.del_private_value( self.namespace, key, self.binary, self.profile) def setdefault(self, key, default): @@ -163,7 +163,7 @@ @return: deferred fired when data is actually saved """ return defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, name, self._cache[name], self.binary, self.profile ) ) @@ -192,14 +192,14 @@ raise NotImplementedError def items(self): - d = defer.ensureDeferred(self.storage.getPrivates( + d = defer.ensureDeferred(self.storage.get_privates( self.namespace, binary=self.binary, profile=self.profile )) d.addCallback(lambda data_dict: data_dict.items()) return d def all(self): - return defer.ensureDeferred(self.storage.getPrivates( + return defer.ensureDeferred(self.storage.get_privates( self.namespace, binary=self.binary, profile=self.profile )) @@ -252,7 +252,7 @@ def __getitem__(self, key): """get the value as a 
Deferred""" - d = defer.ensureDeferred(self.storage.getPrivates( + d = defer.ensureDeferred(self.storage.get_privates( self.namespace, keys=[key], binary=self.binary, profile=self.profile )) d.addCallback(self._data2value, key) @@ -260,21 +260,21 @@ def __setitem__(self, key, value): defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, key, value, self.binary, self.profile ) ) def __delitem__(self, key): - self.storage.delPrivateValue(self.namespace, key, self.binary, self.profile) + self.storage.del_private_value(self.namespace, key, self.binary, self.profile) - def _defaultOrException(self, failure_, default): + def _default_or_exception(self, failure_, default): failure_.trap(KeyError) return default def get(self, key, default=None): d = self.__getitem__(key) - d.addErrback(self._defaultOrException, default=default) + d.addErrback(self._default_or_exception, default=default) return d def aset(self, key, value): @@ -282,7 +282,7 @@ # FIXME: redundant with force, force must be removed # XXX: similar as PersistentDict.aset, but doesn't use cache return defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, key, value, self.binary, self.profile ) ) @@ -290,7 +290,7 @@ def adel(self, key): """Async del, return a Deferred fired when value is actually deleted""" # XXX: similar as PersistentDict.adel, but doesn't use cache - return self.storage.delPrivateValue( + return self.storage.del_private_value( self.namespace, key, self.binary, self.profile) def setdefault(self, key, default): @@ -303,7 +303,7 @@ @return: deferred fired when data is actually saved """ return defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, name, value, self.binary, self.profile ) ) @@ -314,4 +314,4 @@ @param key(unicode): key to delete @return (D): A deferred fired when delete is done """ - return self.storage.delPrivateValue(self.namespace, 
key, self.binary, self.profile) + return self.storage.del_private_value(self.namespace, key, self.binary, self.profile) diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/sqla.py --- a/sat/memory/sqla.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/sqla.py Sat Apr 08 13:54:42 2023 +0200 @@ -122,10 +122,10 @@ # profile id to component entry point self.components: Dict[int, str] = {} - def getProfileById(self, profile_id): + def get_profile_by_id(self, profile_id): return self.profiles.get(profile_id) - async def migrateApply(self, *args: str, log_output: bool = False) -> None: + async def migrate_apply(self, *args: str, log_output: bool = False) -> None: """Do a migration command Commands are applied by running Alembic in a subprocess. @@ -167,7 +167,7 @@ await conn.run_sync(Base.metadata.create_all) log.debug("stamping the database") - await self.migrateApply("stamp", "head") + await self.migrate_apply("stamp", "head") log.debug("stamping done") def _check_db_is_up_to_date(self, conn: Connection) -> bool: @@ -193,14 +193,14 @@ else: log.info("Database needs to be updated") log.info("updating…") - await self.migrateApply("upgrade", "head", log_output=True) + await self.migrate_apply("upgrade", "head", log_output=True) log.info("Database is now up-to-date") @aio async def initialise(self) -> None: log.info(_("Connecting database")) - db_config = sqla_config.getDbConfig() + db_config = sqla_config.get_db_config() engine = create_async_engine( db_config["url"], future=True, @@ -288,31 +288,31 @@ ## Profiles - def getProfilesList(self) -> List[str]: + def get_profiles_list(self) -> List[str]: """"Return list of all registered profiles""" return list(self.profiles.keys()) - def hasProfile(self, profile_name: str) -> bool: + def has_profile(self, profile_name: str) -> bool: """return True if profile_name exists @param profile_name: name of the profile to check """ return profile_name in self.profiles - def profileIsComponent(self, profile_name: str) -> bool: + def 
profile_is_component(self, profile_name: str) -> bool: try: return self.profiles[profile_name] in self.components except KeyError: raise exceptions.NotFound("the requested profile doesn't exists") - def getEntryPoint(self, profile_name: str) -> str: + def get_entry_point(self, profile_name: str) -> str: try: return self.components[self.profiles[profile_name]] except KeyError: raise exceptions.NotFound("the requested profile doesn't exists or is not a component") @aio - async def createProfile(self, name: str, component_ep: Optional[str] = None) -> None: + async def create_profile(self, name: str, component_ep: Optional[str] = None) -> None: """Create a new profile @param name: name of the profile @@ -331,7 +331,7 @@ return profile @aio - async def deleteProfile(self, name: str) -> None: + async def delete_profile(self, name: str) -> None: """Delete profile @param name: name of the profile @@ -349,7 +349,7 @@ ## Params @aio - async def loadGenParams(self, params_gen: dict) -> None: + async def load_gen_params(self, params_gen: dict) -> None: """Load general parameters @param params_gen: dictionary to fill @@ -361,7 +361,7 @@ params_gen[(p.category, p.name)] = p.value @aio - async def loadIndParams(self, params_ind: dict, profile: str) -> None: + async def load_ind_params(self, params_ind: dict, profile: str) -> None: """Load individual parameters @param params_ind: dictionary to fill @@ -376,7 +376,7 @@ params_ind[(p.category, p.name)] = p.value @aio - async def getIndParam(self, category: str, name: str, profile: str) -> Optional[str]: + async def get_ind_param(self, category: str, name: str, profile: str) -> Optional[str]: """Ask database for the value of one specific individual parameter @param category: category of the parameter @@ -395,7 +395,7 @@ return result.scalar_one_or_none() @aio - async def getIndParamValues(self, category: str, name: str) -> Dict[str, str]: + async def get_ind_param_values(self, category: str, name: str) -> Dict[str, str]: """Ask 
database for the individual values of a parameter for all profiles @param category: category of the parameter @@ -414,7 +414,7 @@ return {param.profile.name: param.value for param in result.scalars()} @aio - async def setGenParam(self, category: str, name: str, value: Optional[str]) -> None: + async def set_gen_param(self, category: str, name: str, value: Optional[str]) -> None: """Save the general parameters in database @param category: category of the parameter @@ -436,7 +436,7 @@ await session.commit() @aio - async def setIndParam( + async def set_ind_param( self, category:str, name: str, @@ -489,7 +489,7 @@ return History.source == jid_.userhost() @aio - async def historyGet( + async def history_get( self, from_jid: Optional[jid.JID], to_jid: Optional[jid.JID], @@ -509,7 +509,7 @@ - None for unlimited @param between: confound source and dest (ignore the direction) @param filters: pattern to filter the history results - @return: list of messages as in [messageNew], minus the profile which is already + @return: list of messages as in [message_new], minus the profile which is already known. 
""" # we have to set a default value to profile because it's last argument @@ -634,7 +634,7 @@ return [h.as_tuple() for h in result] @aio - async def addToHistory(self, data: dict, profile: str) -> None: + async def add_to_history(self, data: dict, profile: str) -> None: """Store a new message in history @param data: message data as build by SatMessageProtocol.onMessage @@ -682,7 +682,7 @@ ## Private values - def _getPrivateClass(self, binary, profile): + def _get_private_class(self, binary, profile): """Get ORM class to use for private values""" if profile is None: return PrivateGenBin if binary else PrivateGen @@ -691,7 +691,7 @@ @aio - async def getPrivates( + async def get_privates( self, namespace:str, keys: Optional[Iterable[str]] = None, @@ -714,7 +714,7 @@ f"{' binary' if binary else ''} private values from database for namespace " f"{namespace}{f' with keys {keys!r}' if keys is not None else ''}" ) - cls = self._getPrivateClass(binary, profile) + cls = self._get_private_class(binary, profile) stmt = select(cls).filter_by(namespace=namespace) if keys: stmt = stmt.where(cls.key.in_(list(keys))) @@ -725,7 +725,7 @@ return {p.key: p.value for p in result.scalars()} @aio - async def setPrivateValue( + async def set_private_value( self, namespace: str, key:str, @@ -743,7 +743,7 @@ @param profile: profile to use for individual value if None, it's a general value """ - cls = self._getPrivateClass(binary, profile) + cls = self._get_private_class(binary, profile) values = { "namespace": namespace, @@ -768,7 +768,7 @@ await session.commit() @aio - async def delPrivateValue( + async def del_private_value( self, namespace: str, key: str, @@ -783,7 +783,7 @@ @param profile: profile to use for individual value if None, it's a general value """ - cls = self._getPrivateClass(binary, profile) + cls = self._get_private_class(binary, profile) stmt = delete(cls).filter_by(namespace=namespace, key=key) @@ -795,7 +795,7 @@ await session.commit() @aio - async def 
delPrivateNamespace( + async def del_private_namespace( self, namespace: str, binary: bool = False, @@ -805,9 +805,9 @@ Be really cautious when you use this method, as all data with given namespace are removed. - Params are the same as for delPrivateValue + Params are the same as for del_private_value """ - cls = self._getPrivateClass(binary, profile) + cls = self._get_private_class(binary, profile) stmt = delete(cls).filter_by(namespace=namespace) @@ -821,7 +821,7 @@ ## Files @aio - async def getFiles( + async def get_files( self, client: Optional[SatXMPPEntity], file_id: Optional[str] = None, @@ -852,7 +852,7 @@ @param projection: name of columns to retrieve None to retrieve all @param unique: if True will remove duplicates - other params are the same as for [setFile] + other params are the same as for [set_file] @return: files corresponding to filters """ if projection is None: @@ -910,7 +910,7 @@ return [dict(r) for r in result] @aio - async def setFile( + async def set_file( self, client: SatXMPPEntity, name: str, @@ -987,7 +987,7 @@ )) @aio - async def fileGetUsedSpace(self, client: SatXMPPEntity, owner: jid.JID) -> int: + async def file_get_used_space(self, client: SatXMPPEntity, owner: jid.JID) -> int: async with self.session() as session: result = await session.execute( select(sum_(File.size)).filter_by( @@ -998,7 +998,7 @@ return result.scalar_one_or_none() or 0 @aio - async def fileDelete(self, file_id: str) -> None: + async def file_delete(self, file_id: str) -> None: """Delete file metadata from the database @param file_id: id of the file to delete @@ -1010,7 +1010,7 @@ await session.commit() @aio - async def fileUpdate( + async def file_update( self, file_id: str, column: str, @@ -1068,7 +1068,7 @@ ) @aio - async def getPubsubNode( + async def get_pubsub_node( self, client: SatXMPPEntity, service: jid.JID, @@ -1085,7 +1085,7 @@ @param with_items: retrieve items in the same query @param with_subscriptions: retrieve subscriptions in the same query 
@param create: if the node doesn't exist in DB, create it - @param create_kwargs: keyword arguments to use with ``setPubsubNode`` if the node + @param create_kwargs: keyword arguments to use with ``set_pubsub_node`` if the node needs to be created. """ async with self.session() as session: @@ -1112,15 +1112,15 @@ if create_kwargs is None: create_kwargs = {} try: - return await as_future(self.setPubsubNode( + return await as_future(self.set_pubsub_node( client, service, name, **create_kwargs )) except IntegrityError as e: if "unique" in str(e.orig).lower(): # the node may already exist, if it has been created just after - # getPubsubNode above + # get_pubsub_node above log.debug("ignoring UNIQUE constraint error") - cached_node = await as_future(self.getPubsubNode( + cached_node = await as_future(self.get_pubsub_node( client, service, name, @@ -1133,7 +1133,7 @@ return ret @aio - async def setPubsubNode( + async def set_pubsub_node( self, client: SatXMPPEntity, service: jid.JID, @@ -1159,7 +1159,7 @@ return node @aio - async def updatePubsubNodeSyncState( + async def update_pubsub_node_sync_state( self, node: PubsubNode, state: SyncState @@ -1176,7 +1176,7 @@ ) @aio - async def deletePubsubNode( + async def delete_pubsub_node( self, profiles: Optional[List[str]], services: Optional[List[jid.JID]], @@ -1207,7 +1207,7 @@ await session.commit() @aio - async def cachePubsubItems( + async def cache_pubsub_items( self, client: SatXMPPEntity, node: PubsubNode, @@ -1240,7 +1240,7 @@ await session.commit() @aio - async def deletePubsubItems( + async def delete_pubsub_items( self, node: PubsubNode, items_names: Optional[List[str]] = None @@ -1264,7 +1264,7 @@ await session.commit() @aio - async def purgePubsubItems( + async def purge_pubsub_items( self, services: Optional[List[jid.JID]] = None, names: Optional[List[str]] = None, @@ -1313,7 +1313,7 @@ await session.commit() @aio - async def getItems( + async def get_items( self, node: PubsubNode, max_items: Optional[int] = 
None, @@ -1352,7 +1352,7 @@ metadata = { "service": node.service, "node": node.name, - "uri": uri.buildXMPPUri( + "uri": uri.build_xmpp_uri( "pubsub", path=node.service.full(), node=node.name, @@ -1487,7 +1487,7 @@ result.reverse() return result, metadata - def _getSqlitePath( + def _get_sqlite_path( self, path: List[Union[str, int]] ) -> str: @@ -1495,7 +1495,7 @@ return f"${''.join(f'[{p}]' if isinstance(p, int) else f'.{p}' for p in path)}" @aio - async def searchPubsubItems( + async def search_pubsub_items( self, query: dict, ) -> Tuple[List[PubsubItem]]: @@ -1626,7 +1626,7 @@ op_attr = OP_MAP[operator] except KeyError: raise ValueError(f"invalid operator: {operator!r}") - sqlite_path = self._getSqlitePath(path) + sqlite_path = self._get_sqlite_path(path) if operator in ("overlap", "ioverlap", "disjoint", "idisjoint"): col = literal_column("json_each.value") if operator[0] == "i": @@ -1683,7 +1683,7 @@ raise NotImplementedError(f"Unknown {order!r} order") else: # we have a JSON path - # sqlite_path = self._getSqlitePath(path) + # sqlite_path = self._get_sqlite_path(path) col = PubsubItem.parsed[path] direction = order_data.get("direction", "ASC").lower() if not direction in ("asc", "desc"): diff -r c4464d7ae97b -r 524856bd7b19 sat/memory/sqla_config.py --- a/sat/memory/sqla_config.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/sqla_config.py Sat Apr 08 13:54:42 2023 +0200 @@ -22,15 +22,15 @@ from sat.tools import config -def getDbConfig() -> dict: +def get_db_config() -> dict: """Get configuration for database @return: dict with following keys: - type: only "sqlite" for now - path: path to the sqlite DB """ - main_conf = config.parseMainConf() - local_dir = Path(config.getConfig(main_conf, "", "local_dir")) + main_conf = config.parse_main_conf() + local_dir = Path(config.config_get(main_conf, "", "local_dir")) database_path = (local_dir / C.SAVEFILE_DATABASE).expanduser() url = f"sqlite+aiosqlite:///{quote(str(database_path))}?timeout=30" return { diff -r 
c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_adhoc_dbus.py --- a/sat/plugins/plugin_adhoc_dbus.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_adhoc_dbus.py Sat Apr 08 13:54:42 2023 +0200 @@ -98,39 +98,39 @@ log.info(_("plugin Ad-Hoc D-Bus initialization")) self.host = host if etree is not None: - host.bridge.addMethod( - "adHocDBusAddAuto", + host.bridge.add_method( + "ad_hoc_dbus_add_auto", ".plugin", in_sign="sasasasasasass", out_sign="(sa(sss))", - method=self._adHocDBusAddAuto, + method=self._ad_hoc_dbus_add_auto, async_=True, ) - host.bridge.addMethod( - "adHocRemotesGet", + host.bridge.add_method( + "ad_hoc_remotes_get", ".plugin", in_sign="s", out_sign="a(sss)", - method=self._adHocRemotesGet, + method=self._ad_hoc_remotes_get, async_=True, ) self._c = host.plugins["XEP-0050"] - host.registerNamespace("mediaplayer", NS_MEDIA_PLAYER) + host.register_namespace("mediaplayer", NS_MEDIA_PLAYER) if dbus is not None: self.session_bus = dbus.SessionBus() self.fd_object = self.session_bus.get_object( FD_NAME, FD_PATH, introspect=False) - def profileConnected(self, client): + def profile_connected(self, client): if dbus is not None: - self._c.addAdHocCommand( - client, self.localMediaCb, D_("Media Players"), + self._c.add_ad_hoc_command( + client, self.local_media_cb, D_("Media Players"), node=NS_MEDIA_PLAYER, timeout=60*60*6 # 6 hours timeout, to avoid breaking remote # in the middle of a movie ) - def _DBusAsyncCall(self, proxy, method, *args, **kwargs): + def _dbus_async_call(self, proxy, method, *args, **kwargs): """ Call a DBus method asynchronously and return a deferred @param proxy: DBus object proxy, as returner by get_object @@ -149,18 +149,18 @@ proxy.get_dbus_method(method, dbus_interface=interface)(*args, **kwargs) return d - def _DBusGetProperty(self, proxy, interface, name): - return self._DBusAsyncCall( + def _dbus_get_property(self, proxy, interface, name): + return self._dbus_async_call( proxy, "Get", interface, name, 
interface="org.freedesktop.DBus.Properties") - def _DBusListNames(self): - return self._DBusAsyncCall(self.fd_object, "ListNames") + def _dbus_list_names(self): + return self._dbus_async_call(self.fd_object, "ListNames") - def _DBusIntrospect(self, proxy): - return self._DBusAsyncCall(proxy, INTROSPECT_METHOD, interface=INTROSPECT_IFACE) + def _dbus_introspect(self, proxy): + return self._dbus_async_call(proxy, INTROSPECT_METHOD, interface=INTROSPECT_IFACE) - def _acceptMethod(self, method): + def _accept_method(self, method): """ Return True if we accept the method for a command @param method: etree.Element @return: True if the method is acceptable @@ -175,7 +175,7 @@ @defer.inlineCallbacks def _introspect(self, methods, bus_name, proxy): log.debug("introspecting path [%s]" % proxy.object_path) - introspect_xml = yield self._DBusIntrospect(proxy) + introspect_xml = yield self._dbus_introspect(proxy) el = etree.fromstring(introspect_xml) for node in el.iterchildren("node", "interface"): if node.tag == "node": @@ -191,23 +191,23 @@ continue log.debug("introspecting interface [%s]" % name) for method in node.iterchildren("method"): - if self._acceptMethod(method): + if self._accept_method(method): method_name = method.get("name") log.debug("method accepted: [%s]" % method_name) methods.add((proxy.object_path, name, method_name)) - def _adHocDBusAddAuto(self, prog_name, allowed_jids, allowed_groups, allowed_magics, + def _ad_hoc_dbus_add_auto(self, prog_name, allowed_jids, allowed_groups, allowed_magics, forbidden_jids, forbidden_groups, flags, profile_key): - client = self.host.getClient(profile_key) - return self.adHocDBusAddAuto( + client = self.host.get_client(profile_key) + return self.ad_hoc_dbus_add_auto( client, prog_name, allowed_jids, allowed_groups, allowed_magics, forbidden_jids, forbidden_groups, flags) @defer.inlineCallbacks - def adHocDBusAddAuto(self, client, prog_name, allowed_jids=None, allowed_groups=None, + def ad_hoc_dbus_add_auto(self, client, 
prog_name, allowed_jids=None, allowed_groups=None, allowed_magics=None, forbidden_jids=None, forbidden_groups=None, flags=None): - bus_names = yield self._DBusListNames() + bus_names = yield self._dbus_list_names() bus_names = [bus_name for bus_name in bus_names if "." + prog_name in bus_name] if not bus_names: log.info("Can't find any bus for [%s]" % prog_name) @@ -223,7 +223,7 @@ yield self._introspect(methods, bus_name, proxy) if methods: - self._addCommand( + self._add_command( client, prog_name, bus_name, @@ -238,13 +238,13 @@ defer.returnValue((str(bus_name), methods)) - def _addCommand(self, client, adhoc_name, bus_name, methods, allowed_jids=None, + def _add_command(self, client, adhoc_name, bus_name, methods, allowed_jids=None, allowed_groups=None, allowed_magics=None, forbidden_jids=None, forbidden_groups=None, flags=None): if flags is None: flags = set() - def DBusCallback(client, command_elt, session_data, action, node): + def d_bus_callback(client, command_elt, session_data, action, node): actions = session_data.setdefault("actions", []) names_map = session_data.setdefault("names_map", {}) actions.append(action) @@ -283,7 +283,7 @@ path, iface, command = names_map[command] proxy = self.session_bus.get_object(bus_name, path) - self._DBusAsyncCall(proxy, command, interface=iface) + self._dbus_async_call(proxy, command, interface=iface) # job done, we can end the session, except if we have FLAG_LOOP if FLAG_LOOP in flags: @@ -292,7 +292,7 @@ # is OK) del actions[:] names_map.clear() - return DBusCallback( + return d_bus_callback( client, None, session_data, self._c.ACTION.EXECUTE, node ) form = data_form.Form("form", title=_("Updated")) @@ -305,9 +305,9 @@ return (payload, status, None, note) - self._c.addAdHocCommand( + self._c.add_ad_hoc_command( client, - DBusCallback, + d_bus_callback, adhoc_name, allowed_jids=allowed_jids, allowed_groups=allowed_groups, @@ -318,18 +318,18 @@ ## Local media ## - def _adHocRemotesGet(self, profile): - return 
self.adHocRemotesGet(self.host.getClient(profile)) + def _ad_hoc_remotes_get(self, profile): + return self.ad_hoc_remotes_get(self.host.get_client(profile)) @defer.inlineCallbacks - def adHocRemotesGet(self, client): + def ad_hoc_remotes_get(self, client): """Retrieve available remote media controlers in our devices @return (list[tuple[unicode, unicode, unicode]]): list of devices with: - entity full jid - device name - device label """ - found_data = yield defer.ensureDeferred(self.host.findByFeatures( + found_data = yield defer.ensureDeferred(self.host.find_by_features( client, [self.host.ns_map['commands']], service=False, roster=False, own_jid=True, local_device=True)) @@ -344,7 +344,7 @@ try: result_elt = yield self._c.do(client, device_jid, NS_MEDIA_PLAYER, timeout=5) - command_elt = self._c.getCommandElt(result_elt) + command_elt = self._c.get_command_elt(result_elt) form = data_form.findForm(command_elt, NS_MEDIA_PLAYER) if form is None: continue @@ -368,7 +368,7 @@ break defer.returnValue(remotes) - def doMPRISCommand(self, proxy, command): + def do_mpris_command(self, proxy, command): iface, command = command.rsplit(".", 1) if command == CMD_GO_BACK: command = 'Seek' @@ -378,9 +378,9 @@ args = [SEEK_OFFSET] else: args = [] - return self._DBusAsyncCall(proxy, command, *args, interface=iface) + return self._dbus_async_call(proxy, command, *args, interface=iface) - def addMPRISMetadata(self, form, metadata): + def add_mpris_metadata(self, form, metadata): """Serialise MRPIS Metadata according to MPRIS_METADATA_MAP""" for mpris_key, name in MPRIS_METADATA_MAP.items(): if mpris_key in metadata: @@ -390,7 +390,7 @@ value=value)) @defer.inlineCallbacks - def localMediaCb(self, client, command_elt, session_data, action, node): + def local_media_cb(self, client, command_elt, session_data, action, node): try: x_elt = next(command_elt.elements(data_form.NS_X_DATA, "x")) command_form = data_form.Form.fromElement(x_elt) @@ -399,7 +399,7 @@ if command_form is None or 
len(command_form.fields) == 0: # root request, we looks for media players - bus_names = yield self._DBusListNames() + bus_names = yield self._dbus_list_names() bus_names = [b for b in bus_names if b.startswith(MPRIS_PREFIX)] if len(bus_names) == 0: note = (self._c.NOTE.INFO, D_("No media player found.")) @@ -445,7 +445,7 @@ except KeyError: pass else: - yield self.doMPRISCommand(proxy, command) + yield self.do_mpris_command(proxy, command) # we construct the remote control form form = data_form.Form("form", title=D_("Media Player Selection")) @@ -455,13 +455,13 @@ for iface, properties_names in MPRIS_PROPERTIES.items(): for name in properties_names: try: - value = yield self._DBusGetProperty(proxy, iface, name) + value = yield self._dbus_get_property(proxy, iface, name) except Exception as e: log.warning(_("Can't retrieve attribute {name}: {reason}") .format(name=name, reason=e)) continue if name == MPRIS_METADATA_KEY: - self.addMPRISMetadata(form, value) + self.add_mpris_metadata(form, value) else: form.addField(data_form.Field(fieldType="fixed", var=name, diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_blog_import.py --- a/sat/plugins/plugin_blog_import.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_blog_import.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,7 +61,7 @@ OPT_DEFAULTS = {OPT_UPLOAD_IMAGES: True, OPT_IGNORE_TLS: False} def __init__(self, host): log.info(_("plugin Blog Import initialization")) self.host = host self._u = host.plugins["UPLOAD"] self._p = host.plugins["XEP-0060"] @@ -69,10 +69,10 @@ self._s = self.host.plugins["TEXT_SYNTAXES"] host.plugins["IMPORT"].initialize(self, "blog") - def importItem( + def import_item( self, client, item_import_data, session, options, return_data, service, node ): - """importItem specialized for blog import + """import_item specialized for blog import @param item_import_data(dict): * mandatory keys: @@ -116,7 +116,7 @@ except KeyError: pass else: -
new_uri = return_data[URL_REDIRECT_PREFIX + old_uri] = self._p.getNodeURI( + new_uri = return_data[URL_REDIRECT_PREFIX + old_uri] = self._p.get_node_uri( service if service is not None else client.jid.userhostJID(), node or self._m.namespace, item_id, @@ -126,14 +126,14 @@ return mb_data @defer.inlineCallbacks - def importSubItems(self, client, item_import_data, mb_data, session, options): + def import_sub_items(self, client, item_import_data, mb_data, session, options): # comments data if len(item_import_data["comments"]) != 1: raise NotImplementedError("can't manage multiple comment links") allow_comments = C.bool(mb_data.get("allow_comments", C.BOOL_FALSE)) if allow_comments: - comments_service = yield self._m.getCommentsService(client) - comments_node = self._m.getCommentsNode(mb_data["id"]) + comments_service = yield self._m.get_comments_service(client) + comments_node = self._m.get_comments_node(mb_data["id"]) mb_data["comments_service"] = comments_service.full() mb_data["comments_node"] = comments_node recurse_kwargs = { @@ -149,7 +149,7 @@ ) defer.returnValue(None) - def publishItem(self, client, mb_data, service, node, session): + def publish_item(self, client, mb_data, service, node, session): log.debug( "uploading item [{id}]: {title}".format( id=mb_data["id"], title=mb_data.get("title", "") @@ -158,7 +158,7 @@ return self._m.send(client, mb_data, service, node) @defer.inlineCallbacks - def itemFilters(self, client, mb_data, session, options): + def item_filters(self, client, mb_data, session, options): """Apply filters according to options modify mb_data in place @@ -188,7 +188,7 @@ ) # we convert rich syntax to XHTML here, so we can handle filters easily converted = yield self._s.convert( - rich, self._s.getCurrentSyntax(client.profile), safe=False + rich, self._s.get_current_syntax(client.profile), safe=False ) mb_data["{}_xhtml".format(prefix)] = converted del mb_data["{}_rich".format(prefix)] @@ -220,7 +220,7 @@ ) except domish.ParserError: # we 
clean the xml and try again our luck - cleaned = yield self._s.cleanXHTML(mb_data["content_xhtml"]) + cleaned = yield self._s.clean_xhtml(mb_data["content_xhtml"]) top_elt = xml_tools.ElementParser()(cleaned, namespace=C.NS_XHTML) opt_host = options.get(OPT_HOST) if opt_host: @@ -239,8 +239,8 @@ tmp_dir = tempfile.mkdtemp() try: # TODO: would be nice to also update the hyperlinks to these images, e.g. when you have - for img_elt in xml_tools.findAll(top_elt, names=["img"]): - yield self.imgFilters(client, img_elt, options, opt_host, tmp_dir) + for img_elt in xml_tools.find_all(top_elt, names=["img"]): + yield self.img_filters(client, img_elt, options, opt_host, tmp_dir) finally: os.rmdir(tmp_dir) # XXX: tmp_dir should be empty, or something went wrong @@ -248,7 +248,7 @@ mb_data["content_xhtml"] = top_elt.toXml() @defer.inlineCallbacks - def imgFilters(self, client, img_elt, options, opt_host, tmp_dir): + def img_filters(self, client, img_elt, options, opt_host, tmp_dir): """Filters handling images url without host are fixed (if possible) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_blog_import_dokuwiki.py --- a/sat/plugins/plugin_blog_import_dokuwiki.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_blog_import_dokuwiki.py Sat Apr 08 13:54:42 2023 +0200 @@ -123,7 +123,7 @@ self.limit = limit self.posts_data = OrderedDict() - def getPostId(self, post): + def get_post_id(self, post): """Return a unique and constant post id @param post(dict): parsed post data @@ -131,7 +131,7 @@ """ return str(post["id"]) - def getPostUpdated(self, post): + def get_post_updated(self, post): """Return the update date. @param post(dict): parsed post data @@ -139,7 +139,7 @@ """ return str(post["mtime"]) - def getPostPublished(self, post): + def get_post_published(self, post): """Try to parse the date from the message ID, else use "mtime". 
The date can be extracted if the message ID looks like one of: @@ -162,16 +162,16 @@ return default return str(calendar.timegm(time_struct)) - def processPost(self, post, profile_jid): + def process_post(self, post, profile_jid): """Process a single page. @param post (dict): parsed post data @param profile_jid """ # get main information - id_ = self.getPostId(post) - updated = self.getPostUpdated(post) - published = self.getPostPublished(post) + id_ = self.get_post_id(post) + updated = self.get_post_updated(post) + published = self.get_post_published(post) # manage links backlinks = self.pages.backlinks(id_) @@ -182,7 +182,7 @@ backlinks.append(page[1:] if page.startswith(":") else page) self.pages.get(id_) - content_xhtml = self.processContent(self.pages.html(id_), backlinks, profile_jid) + content_xhtml = self.process_content(self.pages.html(id_), backlinks, profile_jid) # XXX: title is already in content_xhtml and difficult to remove, so leave it # title = content.split("\n")[0].strip(u"\ufeff= ") @@ -230,14 +230,14 @@ count = 0 for page in pages_list: - self.processPost(page, profile_jid) + self.process_post(page, profile_jid) count += 1 if count >= self.limit: break return (iter(self.posts_data.values()), len(self.posts_data)) - def processContent(self, text, backlinks, profile_jid): + def process_content(self, text, backlinks, profile_jid): """Do text substitutions and file copy. @param text (unicode): message content @@ -259,7 +259,7 @@ if re.match(r"^\w*://", link): # absolute URL to link directly continue if self.media_repo: - self.moveMedia(link, subs) + self.move_media(link, subs) elif link not in subs: subs[link] = urllib.parse.urljoin(self.url, link) @@ -267,7 +267,7 @@ text = text.replace(url, new_url) return text - def moveMedia(self, link, subs): + def move_media(self, link, subs): """Move a media from the DokuWiki host to the new repository. This also updates the hyperlinks to internal media files. 
@@ -304,17 +304,17 @@
             return
         filepath = os.path.join(self.temp_dir, filename)
-        self.downloadMedia(url, filepath)
+        self.download_media(url, filepath)
         if thumb_width:
             filename = os.path.join("thumbs", thumb_width, filename)
             thumbnail = os.path.join(self.temp_dir, filename)
-            self.createThumbnail(filepath, thumbnail, thumb_width)
+            self.create_thumbnail(filepath, thumbnail, thumb_width)
         new_url = os.path.join(self.media_repo, filename)
         subs[link] = new_url
 
-    def downloadMedia(self, source, dest):
+    def download_media(self, source, dest):
         """Copy media to localhost.
 
         @param source (unicode): source url
@@ -327,7 +327,7 @@
         urllib.request.urlretrieve(source, dest)
         log.debug("DokuWiki media file copied to %s" % dest)
 
-    def createThumbnail(self, source, dest, width):
+    def create_thumbnail(self, source, dest, width):
         """Create a thumbnail.
 
         @param source (unicode): source file path
@@ -348,13 +348,13 @@
 
 class DokuwikiImport(object):
 
     def __init__(self, host):
         log.info(_("plugin Dokuwiki Import initialization"))
         self.host = host
         self._blog_import = host.plugins["BLOG_IMPORT"]
-        self._blog_import.register("dokuwiki", self.DkImport, SHORT_DESC, LONG_DESC)
+        self._blog_import.register("dokuwiki", self.dk_import, SHORT_DESC, LONG_DESC)
 
-    def DkImport(self, client, location, options=None):
+    def dk_import(self, client, location, options=None):
         """Import from DokuWiki to PubSub
 
         @param location (unicode): DokuWiki site URL
         @param options (dict, None): DokuWiki import parameters
@@ -407,7 +407,7 @@
         info_msg = info_msg.format(
             temp_dir=dk_importer.temp_dir, media_repo=media_repo, location=location
         )
-        self.host.actionNew(
+        self.host.action_new(
             {"xmlui": xml_tools.note(info_msg).toXml()},
             profile=client.profile
         )
         d = threads.deferToThread(dk_importer.process, client, namespace)
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_blog_import_dotclear.py ---
a/sat/plugins/plugin_blog_import_dotclear.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_blog_import_dotclear.py Sat Apr 08 13:54:42 2023 +0200
@@ -50,7 +50,7 @@
 To use it, you'll need to export your blog to a flat file. You must go in your admin
 interface and select Plugins/Maintenance then Backup. Export only one blog if you have
 many, i.e. select "Download database of current blog"
-Depending on your configuration, your may need to use Import/Export plugin and export as a flat file.
+Depending on your configuration, you may need to use the Import/Export plugin and export as a flat file.
 location: you must use the absolute path to your backup for the location parameter
 """
@@ -77,7 +77,7 @@
         self.posts_data = OrderedDict()
         self.tags = {}
 
-    def getPostId(self, post):
+    def get_post_id(self, post):
         """Return a unique and constant post id
 
         @param post(dict): parsed post data
@@ -91,7 +91,7 @@
             post["post_url"],
         )
 
-    def getCommentId(self, comment):
+    def get_comment_id(self, comment):
         """Return a unique and constant comment id
 
         @param comment(dict): parsed comment
@@ -110,7 +110,7 @@
         """
         return time.mktime(time.strptime(data[key], "%Y-%m-%d %H:%M:%S"))
 
-    def readFields(self, fields_data):
+    def read_fields(self, fields_data):
         buf = []
         idx = 0
         while True:
@@ -148,13 +148,13 @@
                 buf.append(char)
 
     def parseFields(self, headers, data):
-        return dict(zip(headers, self.readFields(data)))
+        return dict(zip(headers, self.read_fields(data)))
 
-    def postHandler(self, headers, data, index):
+    def post_handler(self, headers, data, index):
         post = self.parseFields(headers, data)
         log.debug("({}) post found: {}".format(index, post["post_title"]))
         mb_data = {
-            "id": self.getPostId(post),
+            "id": self.get_post_id(post),
             "published": self.getTime(post, "post_creadt"),
             "updated": self.getTime(post, "post_upddt"),
             "author": post["user_id"],  # there use info are not in the archive
@@ -163,7 +163,7 @@
                 post["post_content_xhtml"], post["post_excerpt_xhtml"]
             ),
             "title": post["post_title"],
- "allow_comments": C.boolConst(bool(int(post["post_open_comment"]))), + "allow_comments": C.bool_const(bool(int(post["post_open_comment"]))), } self.posts_data[post["post_id"]] = { "blog": mb_data, @@ -171,18 +171,18 @@ "url": "/post/{}".format(post["post_url"]), } - def metaHandler(self, headers, data, index): + def meta_handler(self, headers, data, index): meta = self.parseFields(headers, data) if meta["meta_type"] == "tag": tags = self.tags.setdefault(meta["post_id"], set()) tags.add(meta["meta_id"]) - def metaFinishedHandler(self): + def meta_finished_handler(self): for post_id, tags in self.tags.items(): data_format.iter2dict("tag", tags, self.posts_data[post_id]["blog"]) del self.tags - def commentHandler(self, headers, data, index): + def comment_handler(self, headers, data, index): comment = self.parseFields(headers, data) if comment["comment_site"]: # we don't use atom:uri because it's used for jid in XMPP @@ -193,7 +193,7 @@ else: content = comment["comment_content"] mb_data = { - "id": self.getCommentId(comment), + "id": self.get_comment_id(comment), "published": self.getTime(comment, "comment_dt"), "updated": self.getTime(comment, "comment_upddt"), "author": comment["comment_author"], @@ -263,13 +263,13 @@ class DotclearImport(object): def __init__(self, host): - log.info(_("plugin Dotclear Import initialization")) + log.info(_("plugin Dotclear import initialization")) self.host = host host.plugins["BLOG_IMPORT"].register( - "dotclear", self.DcImport, SHORT_DESC, LONG_DESC + "dotclear", self.dc_import, SHORT_DESC, LONG_DESC ) - def DcImport(self, client, location, options=None): + def dc_import(self, client, location, options=None): if not os.path.isabs(location): raise exceptions.DataError( "An absolute path to backup data need to be given as location" diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_comp_ap_gateway/__init__.py --- a/sat/plugins/plugin_comp_ap_gateway/__init__.py Fri Apr 07 15:18:39 2023 +0200 +++ 
b/sat/plugins/plugin_comp_ap_gateway/__init__.py Sat Apr 08 13:54:42 2023 +0200 @@ -151,9 +151,9 @@ self._t = host.plugins["TEXT_SYNTAXES"] self._i = host.plugins["IDENTITY"] self._events = host.plugins["XEP-0471"] - self._p.addManagedNode( + self._p.add_managed_node( "", - items_cb=self._itemsReceived, + items_cb=self._items_received, # we want to be sure that the callbacks are launched before pubsub cache's # one, as we need to inspect items before they are actually removed from cache # or updated @@ -162,20 +162,20 @@ self.pubsub_service = APPubsubService(self) self.ad_hoc = APAdHocService(self) self.ap_events = APEvents(self) - host.trigger.add("messageReceived", self._messageReceivedTrigger, priority=-1000) - host.trigger.add("XEP-0424_retractReceived", self._onMessageRetract) - host.trigger.add("XEP-0372_ref_received", self._onReferenceReceived) + host.trigger.add("messageReceived", self._message_received_trigger, priority=-1000) + host.trigger.add("XEP-0424_retractReceived", self._on_message_retract) + host.trigger.add("XEP-0372_ref_received", self._on_reference_received) - host.bridge.addMethod( - "APSend", + host.bridge.add_method( + "ap_send", ".plugin", in_sign="sss", out_sign="", - method=self._publishMessage, + method=self._publish_message, async_=True, ) - def getHandler(self, __): + def get_handler(self, __): return self.pubsub_service async def init(self, client): @@ -186,7 +186,7 @@ log.info(_("ActivityPub Gateway initialization")) # RSA keys - stored_data = await self.host.memory.storage.getPrivates( + stored_data = await self.host.memory.storage.get_privates( IMPORT_NAME, ["rsa_key"], profile=client.profile ) private_key_pem = stored_data.get("rsa_key") @@ -201,7 +201,7 @@ format=serialization.PrivateFormat.PKCS8, encryption_algorithm=serialization.NoEncryption() ).decode() - await self.host.memory.storage.setPrivateValue( + await self.host.memory.storage.set_private_value( IMPORT_NAME, "rsa_key", private_key_pem, profile=client.profile ) else: 
@@ -217,9 +217,9 @@ # params # URL and port - self.public_url = self.host.memory.getConfig( + self.public_url = self.host.memory.config_get( CONF_SECTION, "public_url" - ) or self.host.memory.getConfig( + ) or self.host.memory.config_get( CONF_SECTION, "xmpp_domain" ) if self.public_url is None: @@ -235,37 +235,37 @@ "\"public_url\" configuration option. ActivityPub Gateway won't be run." ) return - self.http_port = int(self.host.memory.getConfig( + self.http_port = int(self.host.memory.config_get( CONF_SECTION, 'http_port', 8123)) - connection_type = self.host.memory.getConfig( + connection_type = self.host.memory.config_get( CONF_SECTION, 'http_connection_type', 'https') if connection_type not in ('http', 'https'): raise exceptions.ConfigError( 'bad ap-gateay http_connection_type, you must use one of "http" or ' '"https"' ) - self.max_items = int(self.host.memory.getConfig( + self.max_items = int(self.host.memory.config_get( CONF_SECTION, 'new_node_max_items', 50 )) - self.comments_max_depth = int(self.host.memory.getConfig( + self.comments_max_depth = int(self.host.memory.config_get( CONF_SECTION, 'comments_max_depth', 0 )) - self.ap_path = self.host.memory.getConfig(CONF_SECTION, 'ap_path', '_ap') + self.ap_path = self.host.memory.config_get(CONF_SECTION, 'ap_path', '_ap') self.base_ap_url = parse.urljoin(f"https://{self.public_url}", f"{self.ap_path}/") # True (default) if we provide gateway only to entities/services from our server self.local_only = C.bool( - self.host.memory.getConfig(CONF_SECTION, 'local_only', C.BOOL_TRUE) + self.host.memory.config_get(CONF_SECTION, 'local_only', C.BOOL_TRUE) ) # if True (default), mention will be parsed in non-private content coming from # XMPP. This is necessary as XEP-0372 are coming separately from item where the # mention is done, which is hard to impossible to translate to ActivityPub (where # mention specified inside the item directly). See documentation for details. 
self.auto_mentions = C.bool( - self.host.memory.getConfig(CONF_SECTION, "auto_mentions", C.BOOL_TRUE) + self.host.memory.config_get(CONF_SECTION, "auto_mentions", C.BOOL_TRUE) ) - html_redirect: Dict[str, Union[str, dict]] = self.host.memory.getConfig( + html_redirect: Dict[str, Union[str, dict]] = self.host.memory.config_get( CONF_SECTION, 'html_redirect_dict', {} ) self.html_redirect: Dict[str, List[dict]] = {} @@ -291,13 +291,13 @@ if connection_type == 'http': reactor.listenTCP(self.http_port, self.server) else: - options = tls.getOptionsFromConfig( + options = tls.get_options_from_config( self.host.memory.config, CONF_SECTION) - tls.TLSOptionsCheck(options) - context_factory = tls.getTLSContextFactory(options) + tls.tls_options_check(options) + context_factory = tls.get_tls_context_factory(options) reactor.listenSSL(self.http_port, self.server, context_factory) - async def profileConnecting(self, client): + async def profile_connecting(self, client): self.client = client client.sendHistory = True client._ap_storage = persistent.LazyPersistentBinaryDict( @@ -306,10 +306,10 @@ ) await self.init(client) - def profileConnected(self, client): + def profile_connected(self, client): self.ad_hoc.init(client) - async def _itemsReceived( + async def _items_received( self, client: SatXMPPEntity, itemsEvent: pubsub.ItemsEvent @@ -326,7 +326,7 @@ return # we need recipient as JID and not gateway own JID to be able to use methods such # as "subscribe" - client = self.client.getVirtualClient(itemsEvent.sender) + client = self.client.get_virtual_client(itemsEvent.sender) recipient = itemsEvent.recipient if not recipient.user: log.debug("ignoring items event without local part specified") @@ -334,18 +334,18 @@ ap_account = self._e.unescape(recipient.user) - if self._pa.isAttachmentNode(itemsEvent.nodeIdentifier): - await self.convertAndPostAttachments( + if self._pa.is_attachment_node(itemsEvent.nodeIdentifier): + await self.convert_and_post_attachments( client, ap_account, 
itemsEvent.sender, itemsEvent.nodeIdentifier, itemsEvent.items ) else: - await self.convertAndPostItems( + await self.convert_and_post_items( client, ap_account, itemsEvent.sender, itemsEvent.nodeIdentifier, itemsEvent.items ) - async def getVirtualClient(self, actor_id: str) -> SatXMPPEntity: + async def get_virtual_client(self, actor_id: str) -> SatXMPPEntity: """Get client for this component with a specified jid This is needed to perform operations with the virtual JID corresponding to the AP @@ -353,8 +353,8 @@ @param actor_id: ID of the actor @return: virtual client """ - local_jid = await self.getJIDFromId(actor_id) - return self.client.getVirtualClient(local_jid) + local_jid = await self.get_jid_from_id(actor_id) + return self.client.get_virtual_client(local_jid) def is_activity(self, data: dict) -> bool: """Return True if the data has an activity type""" @@ -363,7 +363,7 @@ except (KeyError, TypeError): return False - async def apGet(self, url: str) -> dict: + async def ap_get(self, url: str) -> dict: """Retrieve AP JSON from given URL @raise error.StanzaError: "service-unavailable" is sent when something went wrong @@ -392,16 +392,16 @@ ) @overload - async def apGetObject(self, data: dict, key: str) -> Optional[dict]: + async def ap_get_object(self, data: dict, key: str) -> Optional[dict]: ... @overload - async def apGetObject( + async def ap_get_object( self, data: Union[str, dict], key: None = None ) -> dict: ... 
-    async def apGetObject(self, data, key = None):
+    async def ap_get_object(self, data, key = None):
         """Retrieve an AP object, dereferencing when necessary
 
         This method is to be used with attributes marked as "Functional" in
@@ -416,21 +416,21 @@
             value = data
         if value is None:
             if key is None:
-                raise ValueError("None can't be used with apGetObject is key is None")
+                raise ValueError("None can't be used with ap_get_object if key is None")
             return None
         elif isinstance(value, dict):
             return value
         elif isinstance(value, str):
-            if self.isLocalURL(value):
-                return await self.apGetLocalObject(value)
+            if self.is_local_url(value):
+                return await self.ap_get_local_object(value)
             else:
-                return await self.apGet(value)
+                return await self.ap_get(value)
         else:
             raise NotImplementedError(
                 "was expecting a string or a dict, got {type(value)}: {value!r}}"
             )
 
-    async def apGetLocalObject(
+    async def ap_get_local_object(
         self,
         url: str
     ) -> dict:
@@ -438,23 +438,23 @@
         for now, only handle XMPP items to convert to AP
         """
-        url_type, url_args = self.parseAPURL(url)
+        url_type, url_args = self.parse_apurl(url)
         if url_type == TYPE_ITEM:
             try:
                 account, item_id = url_args
             except ValueError:
                 raise ValueError(f"invalid URL: {url}")
-            author_jid, node = await self.getJIDAndNode(account)
+            author_jid, node = await self.get_jid_and_node(account)
             if node is None:
                 node = self._m.namespace
-            cached_node = await self.host.memory.storage.getPubsubNode(
+            cached_node = await self.host.memory.storage.get_pubsub_node(
                 self.client, author_jid, node
             )
             if not cached_node:
                 log.debug(f"node {node!r} at {author_jid} is not found in cache")
                 found_item = None
             else:
-                cached_items, __ = await self.host.memory.storage.getItems(
+                cached_items, __ = await self.host.memory.storage.get_items(
                     cached_node, item_ids=[item_id]
                 )
                 if not cached_items:
@@ -468,8 +468,8 @@
         if found_item is None:
             # the node is not in cache, we have to make a request to retrieve the item
-            # If the node doesn't exist, getItems will raise a
NotFound exception - found_items, __ = await self._p.getItems( + # If the node doesn't exist, get_items will raise a NotFound exception + found_items, __ = await self._p.get_items( self.client, author_jid, node, item_ids=[item_id] ) try: @@ -499,7 +499,7 @@ 'only object from "item" URLs can be retrieved for now' ) - async def apGetList( + async def ap_get_list( self, data: dict, key: str, @@ -507,7 +507,7 @@ ) -> Optional[List[Dict[str, Any]]]: """Retrieve a list of objects from AP data, dereferencing when necessary - This method is to be used with non functional vocabularies. Use ``apGetObject`` + This method is to be used with non functional vocabularies. Use ``ap_get_object`` otherwise. If the value is a dictionary, it will be wrapped in a list @param data: AP object where a list of objects is looked for @@ -519,10 +519,10 @@ if value is None: return None elif isinstance(value, str): - if self.isLocalURL(value): - value = await self.apGetLocalObject(value) + if self.is_local_url(value): + value = await self.ap_get_local_object(value) else: - value = await self.apGet(value) + value = await self.ap_get(value) if isinstance(value, dict): return [value] if not isinstance(value, list): @@ -533,9 +533,9 @@ for v in value ] else: - return [await self.apGetObject(i) for i in value] + return [await self.ap_get_object(i) for i in value] - async def apGetActors( + async def ap_get_actors( self, data: dict, key: str, @@ -575,11 +575,11 @@ f"list of actors is empty" ) if as_account: - return [await self.getAPAccountFromId(actor_id) for actor_id in value] + return [await self.get_ap_account_from_id(actor_id) for actor_id in value] else: return value - async def apGetSenderActor( + async def ap_get_sender_actor( self, data: dict, ) -> str: @@ -592,12 +592,12 @@ @raise exceptions.NotFound: no actor has been found in data """ try: - actors = await self.apGetActors(data, "actor", as_account=False) + actors = await self.ap_get_actors(data, "actor", as_account=False) except 
exceptions.DataError: actors = None if not actors: try: - actors = await self.apGetActors(data, "attributedTo", as_account=False) + actors = await self.ap_get_actors(data, "attributedTo", as_account=False) except exceptions.DataError: raise exceptions.NotFound( 'actor not specified in "actor" or "attributedTo"' @@ -607,7 +607,7 @@ except IndexError: raise exceptions.NotFound("list of actors is empty") - def mustEncode(self, text: str) -> bool: + def must_encode(self, text: str) -> bool: """Indicate if a text must be period encoded""" return ( not RE_ALLOWED_UNQUOTED.match(text) @@ -615,10 +615,10 @@ or "---" in text ) - def periodEncode(self, text: str) -> str: + def period_encode(self, text: str) -> str: """Period encode a text - see [getJIDAndNode] for reasons of period encoding + see [get_jid_and_node] for reasons of period encoding """ return ( parse.quote(text, safe="") @@ -629,7 +629,7 @@ .replace("%", ".") ) - async def getAPAccountFromJidAndNode( + async def get_ap_account_from_jid_and_node( self, jid_: jid.JID, node: Optional[str] @@ -644,28 +644,28 @@ if self.client is None: raise exceptions.InternalError("Client is not set yet") - if self.isVirtualJID(jid_): + if self.is_virtual_jid(jid_): # this is an proxy JID to an AP Actor return self._e.unescape(jid_.user) - if node and not jid_.user and not self.mustEncode(node): - is_pubsub = await self.isPubsub(jid_) + if node and not jid_.user and not self.must_encode(node): + is_pubsub = await self.is_pubsub(jid_) # when we have a pubsub service, the user part can be used to set the node # this produces more user-friendly AP accounts if is_pubsub: jid_.user = node node = None - is_local = self.isLocal(jid_) + is_local = self.is_local(jid_) user = jid_.user if is_local else jid_.userhost() if user is None: user = "" account_elts = [] - if node and self.mustEncode(node) or self.mustEncode(user): + if node and self.must_encode(node) or self.must_encode(user): account_elts = ["___"] if node: - node = 
self.periodEncode(node) - user = self.periodEncode(user) + node = self.period_encode(node) + user = self.period_encode(user) if not user: raise exceptions.InternalError("there should be a user part") @@ -678,21 +678,21 @@ )) return "".join(account_elts) - def isLocal(self, jid_: jid.JID) -> bool: + def is_local(self, jid_: jid.JID) -> bool: """Returns True if jid_ use a domain or subdomain of gateway's host""" local_host = self.client.host.split(".") assert local_host return jid_.host.split(".")[-len(local_host):] == local_host - async def isPubsub(self, jid_: jid.JID) -> bool: + async def is_pubsub(self, jid_: jid.JID) -> bool: """Indicate if a JID is a Pubsub service""" - host_disco = await self.host.getDiscoInfos(self.client, jid_) + host_disco = await self.host.get_disco_infos(self.client, jid_) return ( ("pubsub", "service") in host_disco.identities and not ("pubsub", "pep") in host_disco.identities ) - async def getJIDAndNode(self, ap_account: str) -> Tuple[jid.JID, Optional[str]]: + async def get_jid_and_node(self, ap_account: str) -> Tuple[jid.JID, Optional[str]]: """Decode raw AP account handle to get XMPP JID and Pubsub Node Username are case insensitive. 
@@ -767,7 +767,7 @@ # we need to check host disco, because disco request to user may be # blocked for privacy reason (see # https://xmpp.org/extensions/xep-0030.html#security) - is_pubsub = await self.isPubsub(jid.JID(domain)) + is_pubsub = await self.is_pubsub(jid.JID(domain)) if is_pubsub: # if the host is a pubsub service and not a PEP, we consider that username @@ -781,14 +781,14 @@ except RuntimeError: raise ValueError(f"Invalid jid: {jid_s!r}") - if self.local_only and not self.isLocal(jid_): + if self.local_only and not self.is_local(jid_): raise exceptions.PermissionError( "This gateway is configured to map only local entities and services" ) return jid_, node - def getLocalJIDFromAccount(self, account: str) -> jid.JID: + def get_local_jid_from_account(self, account: str) -> jid.JID: """Compute JID linking to an AP account The local jid is computer by escaping AP actor handle and using it as local part @@ -803,7 +803,7 @@ ) ) - async def getJIDFromId(self, actor_id: str) -> jid.JID: + async def get_jid_from_id(self, actor_id: str) -> jid.JID: """Compute JID linking to an AP Actor ID The local jid is computer by escaping AP actor handle and using it as local part @@ -811,17 +811,17 @@ If the actor_id comes from local server (checked with self.public_url), it means that we have an XMPP entity, and the original JID is returned """ - if self.isLocalURL(actor_id): - request_type, extra_args = self.parseAPURL(actor_id) + if self.is_local_url(actor_id): + request_type, extra_args = self.parse_apurl(actor_id) if request_type != TYPE_ACTOR or len(extra_args) != 1: raise ValueError(f"invalid actor id: {actor_id!r}") - actor_jid, __ = await self.getJIDAndNode(extra_args[0]) + actor_jid, __ = await self.get_jid_and_node(extra_args[0]) return actor_jid - account = await self.getAPAccountFromId(actor_id) - return self.getLocalJIDFromAccount(account) + account = await self.get_ap_account_from_id(actor_id) + return self.get_local_jid_from_account(account) - def 
parseAPURL(self, url: str) -> Tuple[str, List[str]]: + def parse_apurl(self, url: str) -> Tuple[str, List[str]]: """Parse an URL leading to an AP endpoint @param url: URL to parse (schema is not mandatory) @@ -831,7 +831,7 @@ type_, *extra_args = path[len(self.ap_path):].lstrip("/").split("/") return type_, [parse.unquote(a) for a in extra_args] - def buildAPURL(self, type_:str , *args: str) -> str: + def build_apurl(self, type_:str , *args: str) -> str: """Build an AP endpoint URL @param type_: type of AP endpoing @@ -842,18 +842,18 @@ str(Path(type_).joinpath(*(parse.quote_plus(a, safe="@") for a in args))) ) - def isLocalURL(self, url: str) -> bool: + def is_local_url(self, url: str) -> bool: """Tells if an URL link to this component ``public_url`` and ``ap_path`` are used to check the URL """ return url.startswith(self.base_ap_url) - def isVirtualJID(self, jid_: jid.JID) -> bool: + def is_virtual_jid(self, jid_: jid.JID) -> bool: """Tell if a JID is an AP actor mapped through this gateway""" return jid_.host == self.client.jid.userhost() - def buildSignatureHeader(self, values: Dict[str, str]) -> str: + def build_signature_header(self, values: Dict[str, str]) -> str: """Build key="" signature header from signature data""" fields = [] for key, value in values.items(): @@ -868,7 +868,7 @@ return ",".join(fields) - def getDigest(self, body: bytes, algo="SHA-256") -> Tuple[str, str]: + def get_digest(self, body: bytes, algo="SHA-256") -> Tuple[str, str]: """Get digest data to use in header and signature @param body: body of the request @@ -879,12 +879,12 @@ return algo, base64.b64encode(hashlib.sha256(body).digest()).decode() @async_lru(maxsize=LRU_MAX_SIZE) - async def getActorData(self, actor_id) -> dict: + async def get_actor_data(self, actor_id) -> dict: """Retrieve actor data with LRU cache""" - return await self.apGet(actor_id) + return await self.ap_get(actor_id) @async_lru(maxsize=LRU_MAX_SIZE) - async def getActorPubKeyData( + async def 
get_actor_pub_key_data( self, actor_id: str ) -> Tuple[str, str, rsa.RSAPublicKey]: @@ -894,7 +894,7 @@ @return: key_id, owner and public_key @raise KeyError: publicKey is missing from actor data """ - actor_data = await self.getActorData(actor_id) + actor_data = await self.get_actor_data(actor_id) pub_key_data = actor_data["publicKey"] key_id = pub_key_data["id"] owner = pub_key_data["owner"] @@ -947,11 +947,11 @@ return data - def getKeyId(self, actor_id: str) -> str: + def get_key_id(self, actor_id: str) -> str: """Get local key ID from actor ID""" return f"{actor_id}#main-key" - async def checkSignature( + async def check_signature( self, signature: str, key_id: str, @@ -971,11 +971,11 @@ to_sign = "\n".join(f"{k.lower()}: {v}" for k,v in headers.items()) if key_id.startswith("acct:"): actor = key_id[5:] - actor_id = await self.getAPActorIdFromAccount(actor) + actor_id = await self.get_ap_actor_id_from_account(actor) else: actor_id = key_id.split("#", 1)[0] - pub_key_id, pub_key_owner, pub_key = await self.getActorPubKeyData(actor_id) + pub_key_id, pub_key_owner, pub_key = await self.get_actor_pub_key_data(actor_id) if pub_key_id != key_id or pub_key_owner != actor_id: raise exceptions.EncryptionError("Public Key mismatch") @@ -994,7 +994,7 @@ return actor_id - def getSignatureData( + def get_signature_data( self, key_id: str, headers: Dict[str, str] @@ -1028,10 +1028,10 @@ "signature": signature } new_headers = {k: v for k,v in headers.items() if not k.startswith("(")} - new_headers["Signature"] = self.buildSignatureHeader(sign_data) + new_headers["Signature"] = self.build_signature_header(sign_data) return new_headers, sign_data - async def convertAndPostItems( + async def convert_and_post_items( self, client: SatXMPPEntity, ap_account: str, @@ -1049,11 +1049,11 @@ @param subscribe_extra_nodes: if True, extra data nodes will be automatically subscribed, that is comment nodes if present and attachments nodes. 
""" - actor_id = await self.getAPActorIdFromAccount(ap_account) - inbox = await self.getAPInboxFromId(actor_id) + actor_id = await self.get_ap_actor_id_from_account(ap_account) + inbox = await self.get_ap_inbox_from_id(actor_id) for item in items: if item.name == "item": - cached_item = await self.host.memory.storage.searchPubsubItems({ + cached_item = await self.host.memory.storage.search_pubsub_items({ "profiles": [self.client.profile], "services": [service], "nodes": [node], @@ -1070,10 +1070,10 @@ while root_elt.parent is not None: root_elt = root_elt.parent author_jid = jid.JID(root_elt["from"]).userhostJID() - if subscribe_extra_nodes and not self.isVirtualJID(author_jid): + if subscribe_extra_nodes and not self.is_virtual_jid(author_jid): # we subscribe automatically to comment nodes if any - recipient_jid = self.getLocalJIDFromAccount(ap_account) - recipient_client = self.client.getVirtualClient(recipient_jid) + recipient_jid = self.get_local_jid_from_account(ap_account) + recipient_client = self.client.get_virtual_client(recipient_jid) comments_data = event_data.get("comments") if comments_data: comment_service = jid.JID(comments_data["jid"]) @@ -1097,13 +1097,13 @@ # blog item mb_data = await self._m.item_2_mb_data(client, item, service, node) author_jid = jid.JID(mb_data["author_jid"]) - if subscribe_extra_nodes and not self.isVirtualJID(author_jid): + if subscribe_extra_nodes and not self.is_virtual_jid(author_jid): # we subscribe automatically to comment nodes if any - recipient_jid = self.getLocalJIDFromAccount(ap_account) - recipient_client = self.client.getVirtualClient(recipient_jid) + recipient_jid = self.get_local_jid_from_account(ap_account) + recipient_client = self.client.get_virtual_client(recipient_jid) for comment_data in mb_data.get("comments", []): comment_service = jid.JID(comment_data["service"]) - if self.isVirtualJID(comment_service): + if self.is_virtual_jid(comment_service): log.debug( f"ignoring virtual comment service: 
                {comment_data}"
            )
@@ -1125,14 +1125,14 @@
             url_actor = ap_item["actor"]
         elif item.name == "retract":
-            url_actor, ap_item = await self.apDeleteItem(
+            url_actor, ap_item = await self.ap_delete_item(
                 client.jid, node, item["id"]
             )
         else:
             raise exceptions.InternalError(f"unexpected element: {item.toXml()}")
 
-        await self.signAndPost(inbox, url_actor, ap_item)
+        await self.sign_and_post(inbox, url_actor, ap_item)
 
-    async def convertAndPostAttachments(
+    async def convert_and_post_attachments(
         self,
         client: SatXMPPEntity,
         ap_account: str,
@@ -1162,8 +1162,8 @@
                 f"{len(items)})"
             )
 
-        actor_id = await self.getAPActorIdFromAccount(ap_account)
-        inbox = await self.getAPInboxFromId(actor_id)
+        actor_id = await self.get_ap_actor_id_from_account(ap_account)
+        inbox = await self.get_ap_inbox_from_id(actor_id)
 
         item_elt = items[0]
         item_id = item_elt["id"]
@@ -1179,16 +1179,16 @@
             )
             return
-        if self.isVirtualJID(publisher):
+        if self.is_virtual_jid(publisher):
             log.debug(f"ignoring item coming from local virtual JID {publisher}")
             return
 
         if publisher is not None:
             item_elt["publisher"] = publisher.userhost()
 
-        item_service, item_node, item_id = self._pa.attachmentNode2Item(node)
-        item_account = await self.getAPAccountFromJidAndNode(item_service, item_node)
-        if self.isVirtualJID(item_service):
+        item_service, item_node, item_id = self._pa.attachment_node_2_item(node)
+        item_account = await self.get_ap_account_from_jid_and_node(item_service, item_node)
+        if self.is_virtual_jid(item_service):
             # it's a virtual JID mapping to an external AP actor, we can use the
             # item_id directly
             item_url = item_id
@@ -1199,9 +1199,9 @@
                 )
                 return
         else:
-            item_url = self.buildAPURL(TYPE_ITEM, item_account, item_id)
+            item_url = self.build_apurl(TYPE_ITEM, item_account, item_id)
 
-        old_attachment_pubsub_items = await self.host.memory.storage.searchPubsubItems({
+        old_attachment_pubsub_items = await self.host.memory.storage.search_pubsub_items({
            "profiles": [self.client.profile],
            "services": [service],
            "nodes": [node],
@@ -1211,19 +1211,19 @@
             old_attachment = {}
         else:
             old_attachment_items = [i.data for i in old_attachment_pubsub_items]
-            old_attachments = self._pa.items2attachmentData(client, old_attachment_items)
+            old_attachments = self._pa.items_2_attachment_data(client, old_attachment_items)
             try:
                 old_attachment = old_attachments[0]
             except IndexError:
                 # no known element was present in attachments
                 old_attachment = {}
-        publisher_account = await self.getAPAccountFromJidAndNode(
+        publisher_account = await self.get_ap_account_from_jid_and_node(
             publisher,
             None
         )
-        publisher_actor_id = self.buildAPURL(TYPE_ACTOR, publisher_account)
+        publisher_actor_id = self.build_apurl(TYPE_ACTOR, publisher_account)
         try:
-            attachments = self._pa.items2attachmentData(client, [item_elt])[0]
+            attachments = self._pa.items_2_attachment_data(client, [item_elt])[0]
         except IndexError:
             # no known element was present in attachments
             attachments = {}
@@ -1232,24 +1232,24 @@
         if "noticed" in attachments:
             if not "noticed" in old_attachment:
                 # new "noticed" attachment, we translate to "Like" activity
-                activity_id = self.buildAPURL("like", item_account, item_id)
+                activity_id = self.build_apurl("like", item_account, item_id)
                 activity = self.create_activity(
                     TYPE_LIKE, publisher_actor_id, item_url, activity_id=activity_id
                 )
                 activity["to"] = [ap_account]
                 activity["cc"] = [NS_AP_PUBLIC]
-                await self.signAndPost(inbox, publisher_actor_id, activity)
+                await self.sign_and_post(inbox, publisher_actor_id, activity)
         else:
             if "noticed" in old_attachment:
                 # "noticed" attachment has been removed, we undo the "Like" activity
-                activity_id = self.buildAPURL("like", item_account, item_id)
+                activity_id = self.build_apurl("like", item_account, item_id)
                 activity = self.create_activity(
                     TYPE_LIKE, publisher_actor_id, item_url, activity_id=activity_id
                 )
                 activity["to"] = [ap_account]
                 activity["cc"] = [NS_AP_PUBLIC]
                 undo = self.create_activity("Undo", publisher_actor_id, activity)
-                await self.signAndPost(inbox, publisher_actor_id, undo)
+                await self.sign_and_post(inbox, publisher_actor_id, undo)
 
         # reactions
         new_reactions = set(attachments.get("reactions", {}).get("reactions", []))
@@ -1258,7 +1258,7 @@
         reactions_add = new_reactions - old_reactions
         for reactions, undo in ((reactions_remove, True), (reactions_add, False)):
             for reaction in reactions:
-                activity_id = self.buildAPURL(
+                activity_id = self.build_apurl(
                     "reaction", item_account, item_id, reaction.encode().hex()
                 )
                 reaction_activity = self.create_activity(
@@ -1274,7 +1274,7 @@
                     )
                 else:
                     activy = reaction_activity
-                await self.signAndPost(inbox, publisher_actor_id, activy)
+                await self.sign_and_post(inbox, publisher_actor_id, activy)
 
         # RSVP
         if "rsvp" in attachments:
@@ -1282,39 +1282,39 @@
             old_attending = old_attachment.get("rsvp", {}).get("attending", "no")
             if attending != old_attending:
                 activity_type = TYPE_JOIN if attending == "yes" else TYPE_LEAVE
-                activity_id = self.buildAPURL(activity_type.lower(), item_account, item_id)
+                activity_id = self.build_apurl(activity_type.lower(), item_account, item_id)
                 activity = self.create_activity(
                     activity_type, publisher_actor_id, item_url, activity_id=activity_id
                 )
                 activity["to"] = [ap_account]
                 activity["cc"] = [NS_AP_PUBLIC]
-                await self.signAndPost(inbox, publisher_actor_id, activity)
+                await self.sign_and_post(inbox, publisher_actor_id, activity)
         else:
             if "rsvp" in old_attachment:
                 old_attending = old_attachment.get("rsvp", {}).get("attending", "no")
                 if old_attending == "yes":
-                    activity_id = self.buildAPURL(TYPE_LEAVE.lower(), item_account, item_id)
+                    activity_id = self.build_apurl(TYPE_LEAVE.lower(), item_account, item_id)
                     activity = self.create_activity(
                         TYPE_LEAVE, publisher_actor_id, item_url, activity_id=activity_id
                     )
                     activity["to"] = [ap_account]
                     activity["cc"] = [NS_AP_PUBLIC]
-                    await self.signAndPost(inbox, publisher_actor_id, activity)
+                    await self.sign_and_post(inbox, publisher_actor_id, activity)
 
-        if service.user and self.isVirtualJID(service):
+        if service.user and self.is_virtual_jid(service):
             # the item is on a virtual service, we need to store it in cache
             log.debug("storing attachments item in cache")
-            cached_node = await self.host.memory.storage.getPubsubNode(
+            cached_node = await self.host.memory.storage.get_pubsub_node(
                 client, service, node, with_subscriptions=True, create=True
             )
-            await self.host.memory.storage.cachePubsubItems(
+            await self.host.memory.storage.cache_pubsub_items(
                 self.client, cached_node, [item_elt], [attachments]
            )
 
-    async def signAndPost(self, url: str, actor_id: str, doc: dict) -> TReqResponse:
+    async def sign_and_post(self, url: str, actor_id: str, doc: dict) -> TReqResponse:
         """Sign a documentent and post it to AP server
 
         @param url: AP server endpoint
@@ -1322,7 +1322,7 @@
         @param doc: document to send
         """
         if self.verbose:
-            __, actor_args = self.parseAPURL(actor_id)
+            __, actor_args = self.parse_apurl(actor_id)
             actor_account = actor_args[0]
             to_log = [
                 "",
@@ -1331,7 +1331,7 @@
         p_url = parse.urlparse(url)
         body = json.dumps(doc).encode()
-        digest_algo, digest_hash = self.getDigest(body)
+        digest_algo, digest_hash = self.get_digest(body)
         digest = f"{digest_algo}={digest_hash}"
 
         headers = {
@@ -1343,7 +1343,7 @@
         headers["Content-Type"] = (
             'application/activity+json'
         )
-        headers, __ = self.getSignatureData(self.getKeyId(actor_id), headers)
+        headers, __ = self.get_signature_data(self.get_key_id(actor_id), headers)
 
         if self.verbose:
             if self.verbose>=3:
@@ -1364,19 +1364,19 @@
             log.info(f"==> response code: {resp.code}")
         return resp
 
-    def _publishMessage(self, mess_data_s: str, service_s: str, profile: str):
+    def _publish_message(self, mess_data_s: str, service_s: str, profile: str):
         mess_data: dict = data_format.deserialise(mess_data_s)  # type: ignore
         service = jid.JID(service_s)
-        client = self.host.getClient(profile)
-        return defer.ensureDeferred(self.publishMessage(client, mess_data, service))
+        client = self.host.get_client(profile)
+        return defer.ensureDeferred(self.publish_message(client, mess_data, service))
 
     @async_lru(maxsize=LRU_MAX_SIZE)
-    async def getAPActorIdFromAccount(self, account: str) -> str:
+    async def get_ap_actor_id_from_account(self, account: str) -> str:
         """Retrieve account ID from it's handle using WebFinger
 
         Don't use this method to get local actor id from a local account
         derivated for JID: in this case, the actor ID is retrieve with
-        ``self.buildAPURL(TYPE_ACTOR, ap_account)``
+        ``self.build_apurl(TYPE_ACTOR, ap_account)``
 
         @param account: AP handle (user@domain.tld)
         @return: Actor ID (which is an URL)
@@ -1408,21 +1408,21 @@
             )
         return href
 
-    async def getAPActorDataFromAccount(self, account: str) -> dict:
+    async def get_ap_actor_data_from_account(self, account: str) -> dict:
         """Retrieve ActivityPub Actor data
 
         @param account: ActivityPub Actor identifier
         """
-        href = await self.getAPActorIdFromAccount(account)
-        return await self.apGet(href)
+        href = await self.get_ap_actor_id_from_account(account)
+        return await self.ap_get(href)
 
-    async def getAPInboxFromId(self, actor_id: str, use_shared: bool = True) -> str:
+    async def get_ap_inbox_from_id(self, actor_id: str, use_shared: bool = True) -> str:
         """Retrieve inbox of an actor_id
 
         @param use_shared: if True, and a shared inbox exists, it will be used
             instead of the user inbox
         """
-        data = await self.getActorData(actor_id)
+        data = await self.get_actor_data(actor_id)
         if use_shared:
             try:
                 return data["endpoints"]["sharedInbox"]
@@ -1431,15 +1431,15 @@
         return data["inbox"]
 
     @async_lru(maxsize=LRU_MAX_SIZE)
-    async def getAPAccountFromId(self, actor_id: str) -> str:
+    async def get_ap_account_from_id(self, actor_id: str) -> str:
         """Retrieve AP account from the ID URL
 
         Works with external or local actor IDs.
         @param actor_id: AP ID of the actor (URL to the actor data)
         @return: AP handle
         """
-        if self.isLocalURL(actor_id):
-            url_type, url_args = self.parseAPURL(actor_id)
+        if self.is_local_url(actor_id):
+            url_type, url_args = self.parse_apurl(actor_id)
             if url_type != "actor" or not url_args:
                 raise exceptions.DataError(
                     f"invalid local actor ID: {actor_id}"
                 )
@@ -1458,7 +1458,7 @@
             return account
 
         url_parsed = parse.urlparse(actor_id)
-        actor_data = await self.getActorData(actor_id)
+        actor_data = await self.get_actor_data(actor_id)
         username = actor_data.get("preferredUsername")
         if not username:
             raise exceptions.DataError(
@@ -1466,7 +1466,7 @@
             )
         account = f"{username}@{url_parsed.hostname}"
         # we try to retrieve the actor ID from the account to check it
-        found_id = await self.getAPActorIdFromAccount(account)
+        found_id = await self.get_ap_actor_id_from_account(account)
         if found_id != actor_id:
             # cf. https://socialhub.activitypub.rocks/t/how-to-retrieve-user-server-tld-handle-from-actors-url/2196
             msg = (
@@ -1478,7 +1478,7 @@
             raise exceptions.DataError(msg)
         return account
 
-    async def getAPItems(
+    async def get_ap_items(
         self,
         collection: dict,
         max_items: Optional[int] = None,
@@ -1552,7 +1552,7 @@
             retrieved_items = 0
             current_page = collection["last"]
             while retrieved_items < count:
-                page_data, items = await self.parseAPPage(
+                page_data, items = await self.parse_ap_page(
                     current_page, parser, only_ids
                 )
                 if not items:
@@ -1588,7 +1588,7 @@
             found_after_id = False
             while retrieved_items < count:
-                __, page_items = await self.parseAPPage(page, parser, only_ids)
+                __, page_items = await self.parse_ap_page(page, parser, only_ids)
                 if not page_items:
                     break
                 retrieved_items += len(page_items)
@@ -1661,7 +1661,7 @@
         __, item_elt = await self.ap_item_2_mb_data_and_elt(ap_item)
         return item_elt
 
-    async def parseAPPage(
+    async def parse_ap_page(
         self,
         page: Union[str, dict],
         parser: Callable[[dict], Awaitable[domish.Element]],
@@ -1674,13 +1674,13 @@
         @param only_ids: if True, only retrieve items IDs
         @return: page data, pubsub items
         """
-        page_data = await self.apGetObject(page)
+        page_data = await self.ap_get_object(page)
         if page_data is None:
             log.warning('No data found in collection')
             return {}, []
-        ap_items = await self.apGetList(page_data, "orderedItems", only_ids=only_ids)
+        ap_items = await self.ap_get_list(page_data, "orderedItems", only_ids=only_ids)
         if ap_items is None:
-            ap_items = await self.apGetList(page_data, "items", only_ids=only_ids)
+            ap_items = await self.ap_get_list(page_data, "items", only_ids=only_ids)
             if not ap_items:
                 log.warning(f'No item field found in collection: {page_data!r}')
                 return page_data, []
@@ -1699,7 +1699,7 @@
 
         return page_data, items
 
-    async def getCommentsNodes(
+    async def get_comments_nodes(
         self,
         item_id: str,
         parent_id: Optional[str]
@@ -1719,13 +1719,13 @@
         """
         if parent_id is None or not self.comments_max_depth:
             return (
-                self._m.getCommentsNode(parent_id) if parent_id is not None else None,
-                self._m.getCommentsNode(item_id)
+                self._m.get_comments_node(parent_id) if parent_id is not None else None,
+                self._m.get_comments_node(item_id)
             )
 
         parent_url = parent_id
         parents = []
         for __ in range(COMMENTS_MAX_PARENTS):
-            parent_item = await self.apGet(parent_url)
+            parent_item = await self.ap_get(parent_url)
             parents.insert(0, parent_item)
             parent_url = parent_item.get("inReplyTo")
             if parent_url is None:
@@ -1733,13 +1733,13 @@
         parent_limit = self.comments_max_depth-1
         if len(parents) <= parent_limit:
             return (
-                self._m.getCommentsNode(parents[-1]["id"]),
-                self._m.getCommentsNode(item_id)
+                self._m.get_comments_node(parents[-1]["id"]),
+                self._m.get_comments_node(item_id)
             )
         else:
             last_level_item = parents[parent_limit]
             return (
-                self._m.getCommentsNode(last_level_item["id"]),
+                self._m.get_comments_node(last_level_item["id"]),
                 None
             )
@@ -1755,7 +1755,7 @@
         """
         is_activity = self.is_activity(ap_item)
         if is_activity:
-            ap_object = await self.apGetObject(ap_item, "object")
+            ap_object = await self.ap_get_object(ap_item, "object")
             if not ap_object:
                 log.warning(f'No "object" found in AP item {ap_item!r}')
                 raise exceptions.DataError
@@ -1815,16 +1815,16 @@
 
         # author
         if is_activity:
-            authors = await self.apGetActors(ap_item, "actor")
+            authors = await self.ap_get_actors(ap_item, "actor")
         else:
-            authors = await self.apGetActors(ap_object, "attributedTo")
+            authors = await self.ap_get_actors(ap_object, "attributedTo")
         if len(authors) > 1:
             # we only keep first item as author
             # TODO: handle multiple actors
             log.warning("multiple actors are not managed")
 
         account = authors[0]
-        author_jid = self.getLocalJIDFromAccount(account).full()
+        author_jid = self.get_local_jid_from_account(account).full()
 
         mb_data["author"] = account.split("@", 1)[0]
         mb_data["author_jid"] = author_jid
@@ -1848,12 +1848,12 @@
         # comments
         in_reply_to = ap_object.get("inReplyTo")
-        __, comments_node = await self.getCommentsNodes(item_id, in_reply_to)
+        __, comments_node = await self.get_comments_nodes(item_id, in_reply_to)
         if comments_node is not None:
             comments_data = {
                 "service": author_jid,
                 "node": comments_node,
-                "uri": uri.buildXMPPUri(
+                "uri": uri.build_xmpp_uri(
                     "pubsub",
                     path=author_jid,
                     node=comments_node
@@ -1863,7 +1863,7 @@
 
         return mb_data
 
-    async def getReplyToIdFromXMPPNode(
+    async def get_reply_to_id_from_xmpp_node(
         self,
         client: SatXMPPEntity,
         ap_account: str,
@@ -1885,7 +1885,7 @@
         """
         # FIXME: propose a protoXEP to properly get parent item, node and service
-        found_items = await self.host.memory.storage.searchPubsubItems({
+        found_items = await self.host.memory.storage.search_pubsub_items({
             "profiles": [client.profile],
             "names": [parent_item]
         })
@@ -1894,7 +1894,7 @@
             parent_ap_account = ap_account
         elif len(found_items) == 1:
             cached_node = found_items[0].node
-            parent_ap_account = await self.getAPAccountFromJidAndNode(
+            parent_ap_account = await self.get_ap_account_from_jid_and_node(
                 cached_node.service,
                 cached_node.name
             )
@@ -1917,12 +1917,12 @@
             parent_ap_account = ap_account
         else:
             cached_node = cached_item.node
-            parent_ap_account = await self.getAPAccountFromJidAndNode(
+            parent_ap_account = await self.get_ap_account_from_jid_and_node(
                 cached_node.service,
                 cached_node.name
             )
 
-        return self.buildAPURL(
+        return self.build_apurl(
             TYPE_ITEM,
             parent_ap_account,
             parent_item
         )
@@ -1937,11 +1937,11 @@
         """
         repeated = mb_data["extra"]["repeated"]
         repeater = jid.JID(repeated["by"])
-        repeater_account = await self.getAPAccountFromJidAndNode(
+        repeater_account = await self.get_ap_account_from_jid_and_node(
             repeater,
             None
         )
-        repeater_id = self.buildAPURL(TYPE_ACTOR, repeater_account)
+        repeater_id = self.build_apurl(TYPE_ACTOR, repeater_account)
         repeated_uri = repeated["uri"]
 
         if not repeated_uri.startswith("xmpp:"):
@@ -1950,7 +1950,7 @@
                 f"item {mb_data}"
             )
             raise NotImplementedError
-        parsed_url = uri.parseXMPPUri(repeated_uri)
+        parsed_url = uri.parse_xmpp_uri(repeated_uri)
         if parsed_url["type"] != "pubsub":
             log.warning(
                 "Only pubsub URL are handled for repeated item at the moment, ignoring "
@@ -1959,9 +1959,9 @@
             raise NotImplementedError
         rep_service = jid.JID(parsed_url["path"])
         rep_item = parsed_url["item"]
-        activity_id = self.buildAPURL("item", repeater.userhost(), mb_data["id"])
+        activity_id = self.build_apurl("item", repeater.userhost(), mb_data["id"])
 
-        if self.isVirtualJID(rep_service):
+        if self.is_virtual_jid(rep_service):
             # it's an AP actor linked through this gateway
             # in this case we can simply use the item ID
             if not rep_item.startswith("https:"):
@@ -1974,18 +1974,18 @@
         else:
             # the repeated item is an XMPP publication, we build the corresponding ID
             rep_node = parsed_url["node"]
-            repeated_account = await self.getAPAccountFromJidAndNode(
+            repeated_account = await self.get_ap_account_from_jid_and_node(
                 rep_service,
                 rep_node
            )
-            announced_uri = self.buildAPURL("item", repeated_account, rep_item)
+            announced_uri = self.build_apurl("item", repeated_account, rep_item)
 
         announce = self.create_activity(
             "Announce",
             repeater_id,
             announced_uri,
             activity_id=activity_id
         )
         announce["to"] = [NS_AP_PUBLIC]
         announce["cc"] = [
-            self.buildAPURL(TYPE_FOLLOWERS, repeater_account),
-            await self.getAPActorIdFromAccount(repeated_account)
+            self.build_apurl(TYPE_FOLLOWERS, repeater_account),
+            await self.get_ap_actor_id_from_account(repeated_account)
         ]
         return announce
@@ -2020,12 +2020,12 @@
             mb_data["id"] = shortuuid.uuid()
         if not mb_data.get("author_jid"):
             mb_data["author_jid"] = client.jid.userhost()
-        ap_account = await self.getAPAccountFromJidAndNode(
+        ap_account = await self.get_ap_account_from_jid_and_node(
             jid.JID(mb_data["author_jid"]),
             None
         )
-        url_actor = self.buildAPURL(TYPE_ACTOR, ap_account)
-        url_item = self.buildAPURL(TYPE_ITEM, ap_account, mb_data["id"])
+        url_actor = self.build_apurl(TYPE_ACTOR, ap_account)
+        url_item = self.build_apurl(TYPE_ITEM, ap_account, mb_data["id"])
         ap_object = {
             "id": url_item,
             "type": "Note",
@@ -2076,7 +2076,7 @@
                 # references
                 continue
             try:
-                mentioned_id = await self.getAPActorIdFromAccount(mentioned)
+                mentioned_id = await self.get_ap_actor_id_from_account(mentioned)
             except Exception as e:
                 log.warning(f"Can't add mention to {mentioned!r}: {e}")
             else:
@@ -2094,27 +2094,27 @@
                 raise exceptions.InternalError(
                     "node or service is missing in mb_data"
                 )
-            target_ap_account = await self.getAPAccountFromJidAndNode(
+            target_ap_account = await self.get_ap_account_from_jid_and_node(
                 service, node
             )
-            if self.isVirtualJID(service):
+            if self.is_virtual_jid(service):
                 # service is a proxy JID for AP account
-                actor_data = await self.getAPActorDataFromAccount(target_ap_account)
+                actor_data = await self.get_ap_actor_data_from_account(target_ap_account)
                 followers = actor_data.get("followers")
             else:
                 # service is a real XMPP entity
-                followers = self.buildAPURL(TYPE_FOLLOWERS, target_ap_account)
+                followers = self.build_apurl(TYPE_FOLLOWERS, target_ap_account)
             if followers:
                 ap_object["cc"] = [followers]
-            if self._m.isCommentNode(node):
-                parent_item = self._m.getParentItem(node)
-                if self.isVirtualJID(service):
+            if self._m.is_comment_node(node):
+                parent_item = self._m.get_parent_item(node)
+                if self.is_virtual_jid(service):
                     # the publication is on a virtual node (i.e. an XMPP node managed by
                     # this gateway and linking to an ActivityPub actor)
                     ap_object["inReplyTo"] = parent_item
                 else:
                     # the publication is from a followed real XMPP node
-                    ap_object["inReplyTo"] = await self.getReplyToIdFromXMPPNode(
+                    ap_object["inReplyTo"] = await self.get_reply_to_id_from_xmpp_node(
                         client,
                         ap_account,
                         parent_item,
@@ -2125,7 +2125,7 @@
             "Create" if is_new else "Update", url_actor, ap_object, activity_id=url_item
         )
 
-    async def publishMessage(
+    async def publish_message(
         self,
         client: SatXMPPEntity,
         mess_data: dict,
@@ -2151,7 +2151,7 @@
         if not service.user:
             raise ValueError("service must have a local part")
         account = self._e.unescape(service.user)
-        ap_actor_data = await self.getAPActorDataFromAccount(account)
+        ap_actor_data = await self.get_ap_actor_data_from_account(account)
 
         try:
             inbox_url = ap_actor_data["endpoints"]["sharedInbox"]
@@ -2160,9 +2160,9 @@
         item_data = await self.mb_data_2_ap_item(client, mess_data)
         url_actor = item_data["actor"]
-        resp = await self.signAndPost(inbox_url, url_actor, item_data)
+        resp = await self.sign_and_post(inbox_url, url_actor, item_data)
 
-    async def apDeleteItem(
+    async def ap_delete_item(
         self,
         jid_: jid.JID,
         node: Optional[str],
@@ -2182,10 +2182,10 @@
         if node is None:
             node = self._m.namespace
 
-        author_account = await self.getAPAccountFromJidAndNode(jid_, node)
-        author_actor_id = self.buildAPURL(TYPE_ACTOR, author_account)
+        author_account = await self.get_ap_account_from_jid_and_node(jid_, node)
+        author_actor_id = self.build_apurl(TYPE_ACTOR, author_account)
 
-        items = await self.host.memory.storage.searchPubsubItems({
+        items = await self.host.memory.storage.search_pubsub_items({
             "profiles": [self.client.profile],
             "services": [jid_],
             "names": [item_id]
@@ -2210,7 +2210,7 @@
                 f"{items[0].toXml()}"
             )
 
-        url_item = self.buildAPURL(TYPE_ITEM, author_account, item_id)
+        url_item = self.build_apurl(TYPE_ITEM, author_account, item_id)
         ap_item = self.create_activity(
             "Delete",
             author_actor_id,
@@ -2223,7 +2223,7 @@
             ap_item["to"] = [NS_AP_PUBLIC]
         return author_actor_id, ap_item
 
-    def _messageReceivedTrigger(
+    def _message_received_trigger(
         self,
         client: SatXMPPEntity,
         message_elt: domish.Element,
@@ -2248,7 +2248,7 @@
         if mess_data["type"] not in ("chat", "normal"):
             log.warning(f"ignoring message with unexpected type: {mess_data}")
             return mess_data
-        if not self.isLocal(mess_data["from"]):
+        if not self.is_local(mess_data["from"]):
             log.warning(f"ignoring non local message: {mess_data}")
             return mess_data
         if not mess_data["to"].user:
@@ -2258,8 +2258,8 @@
             return mess_data
 
         actor_account = self._e.unescape(mess_data["to"].user)
-        actor_id = await self.getAPActorIdFromAccount(actor_account)
-        inbox = await self.getAPInboxFromId(actor_id, use_shared=False)
+        actor_id = await self.get_ap_actor_id_from_account(actor_account)
+        inbox = await self.get_ap_inbox_from_id(actor_id, use_shared=False)
 
         try:
             language, message = next(iter(mess_data["message"].items()))
@@ -2282,7 +2282,7 @@
             C.KEY_ATTACHMENTS: attachments
         }
 
-        client = self.client.getVirtualClient(mess_data["from"])
+        client = self.client.get_virtual_client(mess_data["from"])
         ap_item = await self.mb_data_2_ap_item(client, mb_data, public=False)
         ap_object = ap_item["object"]
         ap_object["to"] = ap_item["to"] = [actor_id]
@@ -2294,10 +2294,10 @@
             "name": f"@{actor_account}",
         })
 
-        await self.signAndPost(inbox, ap_item["actor"], ap_item)
+        await self.sign_and_post(inbox, ap_item["actor"], ap_item)
         return mess_data
 
-    async def _onMessageRetract(
+    async def _on_message_retract(
         self,
         client: SatXMPPEntity,
         message_elt: domish.Element,
@@ -2307,7 +2307,7 @@
         if client != self.client:
             return True
         from_jid = jid.JID(message_elt["from"])
-        if not self.isLocal(from_jid):
+        if not self.is_local(from_jid):
             log.debug(
                 f"ignoring retract request from non local jid {from_jid}"
             )
@@ -2319,15 +2319,15 @@
                 f"Invalid destinee's JID: {to_jid.full()}"
             )
         ap_account = self._e.unescape(to_jid.user)
-        actor_id = await self.getAPActorIdFromAccount(ap_account)
-        inbox = await self.getAPInboxFromId(actor_id, use_shared=False)
-        url_actor, ap_item = await self.apDeleteItem(
+        actor_id = await self.get_ap_actor_id_from_account(ap_account)
+        inbox = await self.get_ap_inbox_from_id(actor_id, use_shared=False)
+        url_actor, ap_item = await self.ap_delete_item(
             from_jid.userhostJID(), None, fastened_elts.id, public=False
         )
-        resp = await self.signAndPost(inbox, url_actor, ap_item)
+        resp = await self.sign_and_post(inbox, url_actor, ap_item)
         return False
 
-    async def _onReferenceReceived(
+    async def _on_reference_received(
         self,
         client: SatXMPPEntity,
         message_elt: domish.Element,
@@ -2352,7 +2352,7 @@
             return False
 
         ap_account = self._e.unescape(mentioned.user)
-        actor_id = await self.getAPActorIdFromAccount(ap_account)
+        actor_id = await self.get_ap_actor_id_from_account(ap_account)
 
         parsed_anchor: dict = reference_data.get("parsed_anchor")
         if not parsed_anchor:
@@ -2380,14 +2380,14 @@
             log.warning(f"missing pubsub item in anchor: {reference_data['anchor']}")
             return False
 
-        cached_node = await self.host.memory.storage.getPubsubNode(
+        cached_node = await self.host.memory.storage.get_pubsub_node(
            client, pubsub_service, pubsub_node
        )
         if not cached_node:
             log.warning(f"Anchored node not found in cache: {reference_data['anchor']}")
             return False
 
-        cached_items, __ = await self.host.memory.storage.getItems(
+        cached_items, __ = await self.host.memory.storage.get_items(
             cached_node, item_ids=[pubsub_item]
         )
         if not cached_items:
@@ -2410,13 +2410,13 @@
             "name": ap_account,
         })
 
-        inbox = await self.getAPInboxFromId(actor_id, use_shared=False)
+        inbox = await self.get_ap_inbox_from_id(actor_id, use_shared=False)
 
-        resp = await self.signAndPost(inbox, ap_item["actor"], ap_item)
+        resp = await self.sign_and_post(inbox, ap_item["actor"], ap_item)
         return False
 
-    async def newReplyToXMPPItem(
+    async def new_reply_to_xmpp_item(
         self,
         client: SatXMPPEntity,
         ap_item: dict,
@@ -2425,7 +2425,7 @@
     ) -> None:
         """We got an AP item which is a reply to an XMPP item"""
         in_reply_to = ap_item["inReplyTo"]
-        url_type, url_args = self.parseAPURL(in_reply_to)
+        url_type, url_args = self.parse_apurl(in_reply_to)
         if url_type != "item":
             log.warning(
                 "Ignoring AP item replying to an XMPP item with an unexpected URL "
@@ -2440,12 +2440,12 @@
                 f"({in_reply_to!r}):\n{pformat(ap_item)}"
             )
             return
-        parent_item_service, parent_item_node = await self.getJIDAndNode(
+        parent_item_service, parent_item_node = await self.get_jid_and_node(
             parent_item_account
         )
         if parent_item_node is None:
             parent_item_node = self._m.namespace
-        items, __ = await self._p.getItems(
+        items, __ = await self._p.get_items(
             client, parent_item_service, parent_item_node, item_ids=[parent_item_id]
         )
         try:
@@ -2463,17 +2463,17 @@
             comment_node = parent_item_parsed["comments"][0]["node"]
         except (KeyError, IndexError):
             # we don't have a comment node set for this item
-            from sat.tools.xml_tools import ppElt
-            log.info(f"{ppElt(parent_item_elt.toXml())}")
+            from sat.tools.xml_tools import pp_elt
+            log.info(f"{pp_elt(parent_item_elt.toXml())}")
             raise NotImplementedError()
         else:
             __, item_elt = await self.ap_item_2_mb_data_and_elt(ap_item)
             await self._p.publish(client, comment_service, comment_node, [item_elt])
-            await self.notifyMentions(
+            await self.notify_mentions(
                 targets, mentions, comment_service, comment_node, item_elt["id"]
            )
 
-    def getAPItemTargets(
+    def get_ap_item_targets(
         self,
         item: Dict[str, Any]
     ) -> Tuple[bool, Dict[str, Set[str]], List[Dict[str, str]]]:
@@ -2499,9 +2499,9 @@
                 continue
             if not value:
                 continue
-            if not self.isLocalURL(value):
+            if not self.is_local_url(value):
                 continue
-            target_type = self.parseAPURL(value)[0]
+            target_type = self.parse_apurl(value)[0]
             if target_type != TYPE_ACTOR:
                 log.debug(f"ignoring non actor type as a target: {href}")
             else:
@@ -2517,9 +2517,9 @@
             if not href:
                 log.warning('Missing "href" field from mention object: {tag!r}')
                 continue
-            if not self.isLocalURL(href):
+            if not self.is_local_url(href):
                 continue
-            uri_type = self.parseAPURL(href)[0]
+            uri_type = self.parse_apurl(href)[0]
             if uri_type != TYPE_ACTOR:
                 log.debug(f"ignoring non actor URI as a target: {href}")
                 continue
@@ -2531,7 +2531,7 @@
 
         return is_public, targets, mentions
 
-    async def newAPItem(
+    async def new_ap_item(
         self,
         client: SatXMPPEntity,
         destinee: Optional[jid.JID],
@@ -2544,14 +2544,14 @@
         @param node: XMPP pubsub node
         @param item: AP object payload
         """
-        is_public, targets, mentions = self.getAPItemTargets(item)
+        is_public, targets, mentions = self.get_ap_item_targets(item)
         if not is_public and targets.keys() == {TYPE_ACTOR}:
             # this is a direct message
             await self.handle_message_ap_item(
                 client, targets, mentions, destinee, item
             )
         else:
-            await self.handlePubsubAPItem(
+            await self.handle_pubsub_ap_item(
                 client, targets, mentions, destinee, node, item, is_public
             )
@@ -2570,7 +2570,7 @@
         @param item: AP object payload
         """
         targets_jids = {
-            await self.getJIDFromId(t)
+            await self.get_jid_from_id(t)
             for t_set in targets.values()
             for t in t_set
         }
@@ -2596,7 +2596,7 @@
             )
         await defer.DeferredList(defer_l)
 
-    async def notifyMentions(
+    async def notify_mentions(
         self,
         targets: Dict[str, Set[str]],
         mentions: List[Dict[str, str]],
@@ -2612,14 +2612,14 @@
         https://www.w3.org/TR/activitystreams-vocabulary/#microsyntaxes).
         """
-        anchor = uri.buildXMPPUri("pubsub", path=service.full(), node=node, item=item_id)
+        anchor = uri.build_xmpp_uri("pubsub", path=service.full(), node=node, item=item_id)
         seen = set()
         # we start with explicit mentions because mentions' content will be used in the
         # future to fill "begin" and "end" reference attributes (we can't do it at the
         # moment as there is no way to specify the XML element to use in the blog item).
         for mention in mentions:
-            mentioned_jid = await self.getJIDFromId(mention["uri"])
-            self._refs.sendReference(
+            mentioned_jid = await self.get_jid_from_id(mention["uri"])
+            self._refs.send_reference(
                 self.client,
                 to_jid=mentioned_jid,
                 anchor=anchor
@@ -2627,18 +2627,18 @@
             seen.add(mentioned_jid)
 
         remaining = {
-            await self.getJIDFromId(t)
+            await self.get_jid_from_id(t)
             for t_set in targets.values()
             for t in t_set
         } - seen
         for target in remaining:
-            self._refs.sendReference(
+            self._refs.send_reference(
                 self.client,
                 to_jid=target,
                 anchor=anchor
            )
 
-    async def handlePubsubAPItem(
+    async def handle_pubsub_ap_item(
         self,
         client: SatXMPPEntity,
         targets: Dict[str, Set[str]],
@@ -2663,23 +2663,23 @@
         if in_reply_to and isinstance(in_reply_to, list):
             in_reply_to = in_reply_to[0]
         if in_reply_to and isinstance(in_reply_to, str):
-            if self.isLocalURL(in_reply_to):
+            if self.is_local_url(in_reply_to):
                 # this is a reply to an XMPP item
-                await self.newReplyToXMPPItem(client, item, targets, mentions)
+                await self.new_reply_to_xmpp_item(client, item, targets, mentions)
                 return
             # this item is a reply to an AP item, we use or create a corresponding node
             # for comments
-            parent_node, __ = await self.getCommentsNodes(item["id"], in_reply_to)
+            parent_node, __ = await self.get_comments_nodes(item["id"], in_reply_to)
             node = parent_node or node
-            cached_node = await self.host.memory.storage.getPubsubNode(
+            cached_node = await self.host.memory.storage.get_pubsub_node(
                 client, service, node, with_subscriptions=True, create=True,
                 create_kwargs={"subscribed": True}
             )
         else:
             # it is a root item (i.e. not a reply to an other item)
             create = node == self._events.namespace
-            cached_node = await self.host.memory.storage.getPubsubNode(
+            cached_node = await self.host.memory.storage.get_pubsub_node(
                 client, service, node, with_subscriptions=True, create=create
             )
             if cached_node is None:
@@ -2693,7 +2693,7 @@
             data, item_elt = await self.ap_events.ap_item_2_event_data_and_elt(item)
         else:
             data, item_elt = await self.ap_item_2_mb_data_and_elt(item)
-        await self.host.memory.storage.cachePubsubItems(
+        await self.host.memory.storage.cache_pubsub_items(
             client,
             cached_node,
             [item_elt],
@@ -2709,9 +2709,9 @@
                 [(subscription.subscriber, None, [item_elt])]
             )
 
-        await self.notifyMentions(targets, mentions, service, node, item_elt["id"])
+        await self.notify_mentions(targets, mentions, service, node, item_elt["id"])
 
-    async def newAPDeleteItem(
+    async def new_ap_delete_item(
         self,
         client: SatXMPPEntity,
         destinee: Optional[jid.JID],
@@ -2731,7 +2731,7 @@
             raise exceptions.DataError('"id" attribute is missing in item')
         if not item_id.startswith("http"):
             raise exceptions.DataError(f"invalid id: {item_id!r}")
-        if self.isLocalURL(item_id):
+        if self.is_local_url(item_id):
             raise ValueError("Local IDs should not be used")
 
         # we have no way to know if a deleted item is a direct one (thus a message) or one
@@ -2755,10 +2755,10 @@
                 )
                 raise exceptions.PermissionError("forbidden")
 
-            await self._r.retractByHistory(client, history)
+            await self._r.retract_by_history(client, history)
         else:
             # no history in cache with this ID, it's probably a pubsub item
-            cached_node = await self.host.memory.storage.getPubsubNode(
+            cached_node = await self.host.memory.storage.get_pubsub_node(
                 client, client.jid, node, with_subscriptions=True
            )
            if cached_node is None:
@@ -2767,7 +2767,7 @@
                     "which is not cached"
                 )
                 raise exceptions.NotFound
-            await self.host.memory.storage.deletePubsubItems(cached_node, [item_id])
+            await self.host.memory.storage.delete_pubsub_items(cached_node, [item_id])
             # notifyRetract is expecting domish.Element instances
             item_elt = domish.Element((None, "item"))
             item_elt["id"] = item_id
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_comp_ap_gateway/ad_hoc.py
--- a/sat/plugins/plugin_comp_ap_gateway/ad_hoc.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_comp_ap_gateway/ad_hoc.py Sat Apr 08 13:54:42 2023 +0200
@@ -38,7 +38,7 @@
         self._c = self.host.plugins["XEP-0050"]
 
     def init(self, client: SatXMPPEntity) -> None:
-        self._c.addAdHocCommand(
+        self._c.add_ad_hoc_command(
             client,
             self.xmpp_jid_node_2_ap_actor,
             "Convert XMPP JID/Node to AP actor",
@@ -82,7 +82,7 @@
         else:
             xmpp_jid = jid.JID(command_form["jid"])
             xmpp_node = command_form.get("node")
-            actor = await self.apg.getAPAccountFromJidAndNode(xmpp_jid, xmpp_node)
+            actor = await self.apg.get_ap_account_from_jid_and_node(xmpp_jid, xmpp_node)
             note = (self._c.NOTE.INFO, actor)
             status = self._c.STATUS.COMPLETED
             payload = None
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_comp_ap_gateway/events.py
--- a/sat/plugins/plugin_comp_ap_gateway/events.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_comp_ap_gateway/events.py Sat Apr 08 13:54:42 2023 +0200
@@ -113,12 +113,12 @@
         """
         if not event_data.get("id"):
             event_data["id"] = shortuuid.uuid()
-        ap_account = await self.apg.getAPAccountFromJidAndNode(
+        ap_account = await self.apg.get_ap_account_from_jid_and_node(
             author_jid,
             self._events.namespace
         )
-        url_actor = self.apg.buildAPURL(TYPE_ACTOR, ap_account)
-        url_item = self.apg.buildAPURL(TYPE_ITEM, ap_account, event_data["id"])
+        url_actor = self.apg.build_apurl(TYPE_ACTOR, ap_account)
+        url_item = self.apg.build_apurl(TYPE_ITEM, ap_account, event_data["id"])
         ap_object = {
             "actor": url_actor,
             "attributedTo": url_actor,
@@ -246,7 +246,7 @@
         """
         is_activity = self.apg.is_activity(ap_item)
         if is_activity:
-            ap_object = await self.apg.apGetObject(ap_item, "object")
+            ap_object = await self.apg.ap_get_object(ap_item, "object")
             if not ap_object:
                 log.warning(f'No "object" found in AP item {ap_item!r}')
                 raise exceptions.DataError
@@ -257,7 +257,7 @@
         if "_repeated" in ap_item:
             # if the event is repeated, we use the original one ID
             repeated_uri = ap_item["_repeated"]["uri"]
-            parsed_uri = uri.parseXMPPUri(repeated_uri)
+            parsed_uri = uri.parse_xmpp_uri(repeated_uri)
             object_id = parsed_uri["item"]
         else:
             object_id = ap_object.get("id")
@@ -268,10 +268,10 @@
             raise exceptions.DataError("AP Object is not an event")
 
         # author
-        actor = await self.apg.apGetSenderActor(ap_object)
+        actor = await self.apg.ap_get_sender_actor(ap_object)
 
-        account = await self.apg.getAPAccountFromId(actor)
-        author_jid = self.apg.getLocalJIDFromAccount(account).full()
+        account = await self.apg.get_ap_account_from_id(actor)
+        author_jid = self.apg.get_local_jid_from_account(account).full()
 
         # name, start, end
         event_data = {
@@ -370,7 +370,7 @@
         # comments
         if ap_object.get("commentsEnabled"):
-            __, comments_node = await self.apg.getCommentsNodes(object_id, None)
+            __, comments_node = await self.apg.get_comments_nodes(object_id, None)
             event_data["comments"] = {
                 "service": author_jid,
                 "node": comments_node,
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_comp_ap_gateway/http_server.py
--- a/sat/plugins/plugin_comp_ap_gateway/http_server.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_comp_ap_gateway/http_server.py Sat Apr 08 13:54:42 2023 +0200
@@ -66,7 +66,7 @@
         self._seen_digest = deque(maxlen=50)
         super().__init__()
 
-    def responseCode(
+    def response_code(
         self,
         request: "HTTPRequest",
         http_code: int,
@@ -77,17 +77,17 @@
             log.warning(msg)
         request.setResponseCode(http_code, None if msg is None else msg.encode())
 
-    def _onRequestError(self, failure_: failure.Failure, request: "HTTPRequest") -> None:
+    def _on_request_error(self, failure_: failure.Failure, request: "HTTPRequest") -> None:
         exc = failure_.value
         if isinstance(exc, exceptions.NotFound):
-            self.responseCode(
+            self.response_code(
                 request,
                 http.NOT_FOUND,
                 str(exc)
             )
         else:
             log.exception(f"Internal error: {failure_.value}")
-            self.responseCode(
+            self.response_code(
                 request,
                 http.INTERNAL_SERVER_ERROR,
                 f"internal error: {failure_.value}"
@@ -107,7 +107,7 @@
                 http.BAD_REQUEST,
                 "Bad Request" ,
                 "Invalid webfinger resource"
             ).render(request)
 
-        actor_url = self.apg.buildAPURL(TYPE_ACTOR, account)
+        actor_url = self.apg.build_apurl(TYPE_ACTOR, account)
 
         resp = {
             "aliases": [actor_url],
@@ -124,7 +124,7 @@
         request.write(json.dumps(resp).encode())
         request.finish()
 
-    async def handleUndoActivity(
+    async def handle_undo_activity(
         self,
         request: "HTTPRequest",
         data: dict,
@@ -136,7 +136,7 @@
     ) -> None:
         if node is None:
             node = self.apg._m.namespace
-        client = await self.apg.getVirtualClient(signing_actor)
+        client = await self.apg.get_virtual_client(signing_actor)
         object_ = data.get("object")
         if isinstance(object_, str):
             # we check first if it's not a cached object
@@ -149,10 +149,10 @@
             # because we'll undo the activity, we can remove it from cache
             await self.apg.client._ap_storage.remove(ap_cache_key)
         else:
-            objects = await self.apg.apGetList(data, "object")
+            objects = await self.apg.ap_get_list(data, "object")
             for obj in objects:
                 type_ = obj.get("type")
-                actor = await self.apg.apGetSenderActor(obj)
+                actor = await self.apg.ap_get_sender_actor(obj)
                 if actor != signing_actor:
                     log.warning(f"ignoring object not attributed to signing actor: {data}")
                     continue
@@ -163,7 +163,7 @@
                     except KeyError:
                         log.warning(f'ignoring invalid object, missing "object" key: {data}')
                         continue
-                    if not self.apg.isLocalURL(target_account):
+                    if not self.apg.is_local_url(target_account):
                         log.warning(f"ignoring unfollow request to non local actor: {data}")
                         continue
                     await self.apg._p.unsubscribe(
@@ -175,17 +175,17 @@
                 elif type_ == "Announce":
                     # we can use directly the Announce object, as only the "id" field is
                     # needed
-                    await self.apg.newAPDeleteItem(client, None, node, obj)
+                    await self.apg.new_ap_delete_item(client, None, node, obj)
                 elif type_ == TYPE_LIKE:
-                    await self.handleAttachmentItem(client, obj, {"noticed": False})
+                    await self.handle_attachment_item(client, obj, {"noticed": False})
                 elif type_ == TYPE_REACTION:
-                    await self.handleAttachmentItem(client, obj, {
+                    await self.handle_attachment_item(client, obj, {
                         "reactions": {"operation": "update", "remove": [obj["content"]]}
                     })
                 else:
                     log.warning(f"Unmanaged undo type: {type_!r}")
 
-    async def handleFollowActivity(
+    async def handle_follow_activity(
         self,
         request: "HTTPRequest",
         data: dict,
@@ -197,14 +197,14 @@
     ) -> None:
         if node is None:
             node = self.apg._m.namespace
-        client = await self.apg.getVirtualClient(signing_actor)
+        client = await self.apg.get_virtual_client(signing_actor)
         try:
             subscription = await self.apg._p.subscribe(
                 client,
                 account_jid,
                 node,
                 # subscriptions from AP are always public
-                options=self.apg._pps.setPublicOpt()
+                options=self.apg._pps.set_public_opt()
             )
         except pubsub.SubscriptionPending:
             log.info(f"subscription to node {node!r} of {account_jid} is pending")
@@ -213,15 +213,15 @@
         if subscription.state != "subscribed":
             # other states should raise an Exception
             raise exceptions.InternalError('"subscribed" state was expected')
-        inbox = await self.apg.getAPInboxFromId(signing_actor, use_shared=False)
-        actor_id = self.apg.buildAPURL(TYPE_ACTOR, ap_account)
+        inbox = await self.apg.get_ap_inbox_from_id(signing_actor, use_shared=False)
+        actor_id = self.apg.build_apurl(TYPE_ACTOR, ap_account)
         accept_data = self.apg.create_activity(
             "Accept", actor_id, object_=data
         )
-        await self.apg.signAndPost(inbox, actor_id, accept_data)
+        await self.apg.sign_and_post(inbox, actor_id, accept_data)
         await self.apg._c.synchronise(client, account_jid, node, resync=False)
 
-    async def handleAcceptActivity(
+    async def handle_accept_activity(
         self,
         request: "HTTPRequest",
         data: dict,
@@ -233,12 +233,12 @@
     ) -> None:
         if node is None:
             node = self.apg._m.namespace
-        client = await self.apg.getVirtualClient(signing_actor)
-        objects = await self.apg.apGetList(data, "object")
+        client = await
self.apg.get_virtual_client(signing_actor) + objects = await self.apg.ap_get_list(data, "object") for obj in objects: type_ = obj.get("type") if type_ == "Follow": - follow_node = await self.apg.host.memory.storage.getPubsubNode( + follow_node = await self.apg.host.memory.storage.get_pubsub_node( client, client.jid, node, with_subscriptions=True ) if follow_node is None: @@ -270,7 +270,7 @@ else: log.warning(f"Unmanaged accept type: {type_!r}") - async def handleDeleteActivity( + async def handle_delete_activity( self, request: "HTTPRequest", data: dict, @@ -282,12 +282,12 @@ ): if node is None: node = self.apg._m.namespace - client = await self.apg.getVirtualClient(signing_actor) - objects = await self.apg.apGetList(data, "object") + client = await self.apg.get_virtual_client(signing_actor) + objects = await self.apg.ap_get_list(data, "object") for obj in objects: - await self.apg.newAPDeleteItem(client, account_jid, node, obj) + await self.apg.new_ap_delete_item(client, account_jid, node, obj) - async def handleNewAPItems( + async def handle_new_ap_items( self, request: "HTTPRequest", data: dict, @@ -298,7 +298,7 @@ ): """Helper method to handle workflow for new AP items - accept globally the same parameter as for handleCreateActivity + accept globally the same parameter as for handle_create_activity @param repeated: if True, the item is an item republished from somewhere else """ if "_repeated" in data: f"happen. 
Ignoring object from {signing_actor}\n{data}" ) raise exceptions.DataError("unexpected field in item") - client = await self.apg.getVirtualClient(signing_actor) - objects = await self.apg.apGetList(data, "object") + client = await self.apg.get_virtual_client(signing_actor) + objects = await self.apg.ap_get_list(data, "object") for obj in objects: if node is None: if obj.get("type") == TYPE_EVENT: node = self.apg._events.namespace else: node = self.apg._m.namespace - sender = await self.apg.apGetSenderActor(obj) + sender = await self.apg.ap_get_sender_actor(obj) if repeated: # we don't check sender when item is repeated, as it should be different # from post author in this case - sender_jid = await self.apg.getJIDFromId(sender) - repeater_jid = await self.apg.getJIDFromId(signing_actor) + sender_jid = await self.apg.get_jid_from_id(sender) + repeater_jid = await self.apg.get_jid_from_id(signing_actor) repeated_item_id = obj["id"] - if self.apg.isLocalURL(repeated_item_id): + if self.apg.is_local_url(repeated_item_id): # the repeated object is from XMPP, we need to parse the URL to find # the right ID - url_type, url_args = self.apg.parseAPURL(repeated_item_id) + url_type, url_args = self.apg.parse_apurl(repeated_item_id) if url_type != "item": raise exceptions.DataError( f"local URI is not an item: {repeated_item_id}" @@ -339,7 +339,7 @@ f"local URI is invalid: {repeated_item_id}" ) else: - url_jid, url_node = await self.apg.getJIDAndNode(url_account) + url_jid, url_node = await self.apg.get_jid_and_node(url_account) if ((url_jid != sender_jid or url_node and url_node != self.apg._m.namespace)): raise exceptions.DataError( @@ -352,7 +352,7 @@ obj["_repeated"] = { "by": repeater_jid.full(), "at": data.get("published"), - "uri": uri.buildXMPPUri( + "uri": uri.build_xmpp_uri( "pubsub", path=sender_jid.full(), node=self.apg._m.namespace, @@ -369,9 +369,9 @@ ) continue - await self.apg.newAPItem(client, account_jid, node, obj) + await self.apg.new_ap_item(client, account_jid, node, 
obj) - async def handleCreateActivity( + async def handle_create_activity( self, request: "HTTPRequest", data: dict, @@ -381,9 +381,9 @@ ap_url: str, signing_actor: str ): - await self.handleNewAPItems(request, data, account_jid, node, signing_actor) + await self.handle_new_ap_items(request, data, account_jid, node, signing_actor) - async def handleUpdateActivity( + async def handle_update_activity( self, request: "HTTPRequest", data: dict, @@ -395,9 +395,9 @@ ): # Update is the same as create: the item ID stays the same, thus the item will be # overwritten - await self.handleNewAPItems(request, data, account_jid, node, signing_actor) + await self.handle_new_ap_items(request, data, account_jid, node, signing_actor) - async def handleAnnounceActivity( + async def handle_announce_activity( self, request: "HTTPRequest", data: dict, @@ -408,7 +408,7 @@ signing_actor: str ): # we create a new item - await self.handleNewAPItems( + await self.handle_new_ap_items( request, data, account_jid, @@ -417,7 +417,7 @@ repeated=True ) - async def handleAttachmentItem( + async def handle_attachment_item( self, client: SatXMPPEntity, data: dict, @@ -447,10 +447,10 @@ await client._ap_storage.aset(f"{ST_AP_CACHE}{data['id']}", data) for target_id in target_ids: - if not self.apg.isLocalURL(target_id): + if not self.apg.is_local_url(target_id): log.debug(f"ignoring non local target ID: {target_id}") continue - url_type, url_args = self.apg.parseAPURL(target_id) + url_type, url_args = self.apg.parse_apurl(target_id) if url_type != TYPE_ITEM: log.warning(f"unexpected local URL for attachment on item {target_id}") continue @@ -458,20 +458,20 @@ account, item_id = url_args except ValueError: raise ValueError(f"invalid URL: {target_id}") - author_jid, item_node = await self.apg.getJIDAndNode(account) + author_jid, item_node = await self.apg.get_jid_and_node(account) if item_node is None: item_node = self.apg._m.namespace - attachment_node = self.apg._pa.getAttachmentNodeName( + 
attachment_node = self.apg._pa.get_attachment_node_name( author_jid, item_node, item_id ) - cached_node = await self.apg.host.memory.storage.getPubsubNode( + cached_node = await self.apg.host.memory.storage.get_pubsub_node( client, author_jid, attachment_node, with_subscriptions=True, create=True ) - found_items, __ = await self.apg.host.memory.storage.getItems( + found_items, __ = await self.apg.host.memory.storage.get_items( cached_node, item_ids=[client.jid.userhost()] ) if not found_items: @@ -487,16 +487,16 @@ None ) # we reparse the element, as there can be other attachments - attachments_data = self.apg._pa.items2attachmentData(client, [item_elt]) + attachments_data = self.apg._pa.items_2_attachment_data(client, [item_elt]) # and we update the cache - await self.apg.host.memory.storage.cachePubsubItems( + await self.apg.host.memory.storage.cache_pubsub_items( client, cached_node, [item_elt], attachments_data or [{}] ) - if self.apg.isVirtualJID(author_jid): + if self.apg.is_virtual_jid(author_jid): # the attachment is on a virtual pubsub service (linking to an AP item), # we notify all subscribers for subscription in cached_node.subscriptions: @@ -509,11 +509,11 @@ ) else: # the attachment is on an XMPP item, we publish it to the attachment node - await self.apg._p.sendItems( + await self.apg._p.send_items( client, author_jid, attachment_node, [item_elt] ) - async def handleLikeActivity( + async def handle_like_activity( self, request: "HTTPRequest", data: dict, @@ -523,10 +523,10 @@ ap_url: str, signing_actor: str ) -> None: 
- client = await self.apg.getVirtualClient(signing_actor) - await self.handleAttachmentItem(client, data, { + client = await self.apg.get_virtual_client(signing_actor) + await self.handle_attachment_item(client, data, { "reactions": {"operation": "update", "add": [data["content"]]} }) - async def handleJoinActivity( + async def handle_join_activity( self, request: "HTTPRequest", data: dict, @@ -551,10 +551,10 @@ ap_url: str, signing_actor: str ) -> None: - client = await self.apg.getVirtualClient(signing_actor) - await self.handleAttachmentItem(client, data, {"rsvp": {"attending": "yes"}}) + client = await self.apg.get_virtual_client(signing_actor) + await self.handle_attachment_item(client, data, {"rsvp": {"attending": "yes"}}) - async def handleLeaveActivity( + async def handle_leave_activity( self, request: "HTTPRequest", data: dict, @@ -564,10 +564,10 @@ ap_url: str, signing_actor: str ) -> None: - client = await self.apg.getVirtualClient(signing_actor) - await self.handleAttachmentItem(client, data, {"rsvp": {"attending": "no"}}) + client = await self.apg.get_virtual_client(signing_actor) + await self.handle_attachment_item(client, data, {"rsvp": {"attending": "no"}}) - async def APActorRequest( + async def ap_actor_request( self, request: "HTTPRequest", data: Optional[dict], @@ -577,24 +577,24 @@ ap_url: str, signing_actor: Optional[str] ) -> dict: - inbox = self.apg.buildAPURL(TYPE_INBOX, ap_account) - shared_inbox = self.apg.buildAPURL(TYPE_SHARED_INBOX) - outbox = self.apg.buildAPURL(TYPE_OUTBOX, ap_account) - followers = self.apg.buildAPURL(TYPE_FOLLOWERS, ap_account) - following = self.apg.buildAPURL(TYPE_FOLLOWING, ap_account) + inbox = self.apg.build_apurl(TYPE_INBOX, ap_account) + shared_inbox = self.apg.build_apurl(TYPE_SHARED_INBOX) + outbox = self.apg.build_apurl(TYPE_OUTBOX, ap_account) + followers = self.apg.build_apurl(TYPE_FOLLOWERS, ap_account) + following = self.apg.build_apurl(TYPE_FOLLOWING, ap_account) # we have to use AP account as 
preferredUsername because it is used to retrieve # actor handle (see https://socialhub.activitypub.rocks/t/how-to-retrieve-user-server-tld-handle-from-actors-url/2196) preferred_username = ap_account.split("@", 1)[0] - identity_data = await self.apg._i.getIdentity(self.apg.client, account_jid) + identity_data = await self.apg._i.get_identity(self.apg.client, account_jid) if node and node.startswith(self.apg._events.namespace): events = outbox else: - events_account = await self.apg.getAPAccountFromJidAndNode( + events_account = await self.apg.get_ap_account_from_jid_and_node( account_jid, self.apg._events.namespace ) - events = self.apg.buildAPURL(TYPE_OUTBOX, events_account) + events = self.apg.build_apurl(TYPE_OUTBOX, events_account) actor_data = { "@context": [ @@ -636,7 +636,7 @@ except KeyError: log.error(f"incomplete avatar data: {identity_data!r}") else: - avatar_url = self.apg.buildAPURL("avatar", filename) + avatar_url = self.apg.build_apurl("avatar", filename) actor_data["icon"] = { "type": "Image", "url": avatar_url, @@ -645,14 +645,14 @@ return actor_data - def getCanonicalURL(self, request: "HTTPRequest") -> str: + def get_canonical_url(self, request: "HTTPRequest") -> str: return parse.urljoin( f"https://{self.apg.public_url}", request.path.decode().rstrip("/") - # we unescape "@" for the same reason as in [APActorRequest] + # we unescape "@" for the same reason as in [ap_actor_request] ).replace("%40", "@") - def queryData2RSMRequest( + def query_data_2_rsm_request( self, query_data: Dict[str, List[str]] ) -> rsm.RSMRequest: @@ -673,7 +673,7 @@ return rsm.RSMRequest(**kwargs) raise ValueError(f"Invalid query data: {query_data!r}") - async def APOutboxPageRequest( + async def ap_outbox_page_request( self, request: "HTTPRequest", data: Optional[dict], @@ -690,18 +690,18 @@ url_keys = sorted(set(query_data) & {"page", "index", "before", "after"}) query_data = {k: query_data[k] for k in url_keys} try: - items, metadata = await self.apg._p.getItems( + 
items, metadata = await self.apg._p.get_items( client=self.apg.client, service=account_jid, node=node, - rsm_request=self.queryData2RSMRequest(query_data), + rsm_request=self.query_data_2_rsm_request(query_data), extra = {C.KEY_USE_CACHE: False} ) except error.StanzaError as e: log.warning(f"Can't get data from pubsub node {node} at {account_jid}: {e}") return {} - base_url = self.getCanonicalURL(request) + base_url = self.get_canonical_url(request) url = f"{base_url}?{parse.urlencode(query_data, True)}" if node and node.startswith(self.apg._events.namespace): ordered_items = [ @@ -753,7 +753,7 @@ return ret_data - async def APOutboxRequest( + async def ap_outbox_request( self, request: "HTTPRequest", data: Optional[dict], @@ -769,7 +769,7 @@ parsed_url = parse.urlparse(request.uri.decode()) query_data = parse.parse_qs(parsed_url.query) if query_data: - return await self.APOutboxPageRequest( + return await self.ap_outbox_page_request( request, data, account_jid, node, ap_account, ap_url, query_data ) @@ -779,7 +779,7 @@ # The current workaround is to do a request as if RSM was available, and actually # check its availability according to result. 
try: - __, metadata = await self.apg._p.getItems( + __, metadata = await self.apg._p.get_items( client=self.apg.client, service=account_jid, node=node, @@ -799,7 +799,7 @@ ) items_count = 20 - url = self.getCanonicalURL(request) + url = self.get_canonical_url(request) url_first_page = f"{url}?{parse.urlencode({'page': 'first'})}" url_last_page = f"{url}?{parse.urlencode({'page': 'last'})}" return { @@ -811,7 +811,7 @@ "last": url_last_page, } - async def APInboxRequest( + async def ap_inbox_request( self, request: "HTTPRequest", data: Optional[dict], @@ -824,26 +824,26 @@ assert data is not None if signing_actor is None: raise exceptions.InternalError("signing_actor must be set for inbox requests") - await self.checkSigningActor(data, signing_actor) + await self.check_signing_actor(data, signing_actor) activity_type = (data.get("type") or "").lower() if not activity_type in ACTIVITY_TYPES_LOWER: - return self.responseCode( + return self.response_code( request, http.UNSUPPORTED_MEDIA_TYPE, f"request is not an activity, ignoring" ) if account_jid is None and activity_type not in ACTIVIY_NO_ACCOUNT_ALLOWED: - return self.responseCode( + return self.response_code( request, http.UNSUPPORTED_MEDIA_TYPE, f"{activity_type.title()!r} activity must target an account" ) try: - method = getattr(self, f"handle{activity_type.title()}Activity") + method = getattr(self, f"handle_{activity_type}_activity") except AttributeError: - return self.responseCode( + return self.response_code( request, http.UNSUPPORTED_MEDIA_TYPE, f"{activity_type.title()} activity is not yet supported" @@ -853,7 +853,7 @@ request, data, account_jid, node, ap_account, ap_url, signing_actor ) - async def APFollowersRequest( + async def ap_followers_request( self, request: "HTTPRequest", data: Optional[dict], @@ -866,20 +866,20 @@ if node is None: node = self.apg._m.namespace client = self.apg.client - subscribers = await self.apg._pps.getPublicNodeSubscriptions( + subscribers = await 
self.apg._pps.get_public_node_subscriptions( client, account_jid, node ) followers = [] for subscriber in subscribers.keys(): - if self.apg.isVirtualJID(subscriber): + if self.apg.is_virtual_jid(subscriber): # the subscriber is an AP user subscribed with this gateway ap_account = self.apg._e.unescape(subscriber.user) else: # regular XMPP user - ap_account = await self.apg.getAPAccountFromJidAndNode(subscriber, node) + ap_account = await self.apg.get_ap_account_from_jid_and_node(subscriber, node) followers.append(ap_account) - url = self.getCanonicalURL(request) + url = self.get_canonical_url(request) return { "@context": ["https://www.w3.org/ns/activitystreams"], "type": "OrderedCollection", @@ -892,7 +892,7 @@ } } - async def APFollowingRequest( + async def ap_following_request( self, request: "HTTPRequest", data: Optional[dict], @@ -909,17 +909,17 @@ following = [] for sub_dict in subscriptions: service = jid.JID(sub_dict["service"]) - if self.apg.isVirtualJID(service): + if self.apg.is_virtual_jid(service): # the subscription is to an AP actor with this gateway ap_account = self.apg._e.unescape(service.user) else: # regular XMPP user - ap_account = await self.apg.getAPAccountFromJidAndNode( + ap_account = await self.apg.get_ap_account_from_jid_and_node( service, sub_dict["node"] ) following.append(ap_account) - url = self.getCanonicalURL(request) + url = self.get_canonical_url(request) return { "@context": ["https://www.w3.org/ns/activitystreams"], "type": "OrderedCollection", @@ -953,7 +953,7 @@ to_log.append(f" headers:\n{headers}") return to_log - async def APRequest( + async def ap_request( self, request: "HTTPRequest", data: Optional[dict] = None, @@ -967,13 +967,13 @@ f"https://{self.apg.public_url}", path ) - request_type, extra_args = self.apg.parseAPURL(ap_url) + request_type, extra_args = self.apg.parse_apurl(ap_url) if ((MEDIA_TYPE_AP not in (request.getHeader("accept") or "") and request_type in self.apg.html_redirect)): # this is not an AP request, 
and we have a redirection for it kw = {} if extra_args: - kw["jid"], kw["node"] = await self.apg.getJIDAndNode(extra_args[0]) + kw["jid"], kw["node"] = await self.apg.get_jid_and_node(extra_args[0]) kw["jid_user"] = kw["jid"].user if kw["node"] is None: kw["node"] = self.apg._m.namespace @@ -1007,7 +1007,7 @@ if len(extra_args) == 0: if request_type != "shared_inbox": raise exceptions.DataError(f"Invalid request type: {request_type!r}") - ret_data = await self.APInboxRequest( + ret_data = await self.ap_inbox_request( request, data, None, None, None, ap_url, signing_actor ) elif request_type == "avatar": @@ -1017,14 +1017,14 @@ avatar_path = self.apg.host.common_cache.getPath(avatar_filename) return static.File(str(avatar_path)).render(request) elif request_type == "item": - ret_data = await self.apg.apGetLocalObject(ap_url) + ret_data = await self.apg.ap_get_local_object(ap_url) if "@context" not in ret_data: ret_data["@context"] = [NS_AP] else: if len(extra_args) > 1: log.warning(f"unexpected extra arguments: {extra_args!r}") ap_account = extra_args[0] - account_jid, node = await self.apg.getJIDAndNode(ap_account) + account_jid, node = await self.apg.get_jid_and_node(ap_account) if request_type not in AP_REQUEST_TYPES.get( request.method.decode().upper(), [] ): @@ -1046,12 +1046,12 @@ log.info("\n".join(to_log)) request.finish() - async def APPostRequest(self, request: "HTTPRequest") -> None: + async def ap_post_request(self, request: "HTTPRequest") -> None: try: data = json.load(request.content) if not isinstance(data, dict): log.warning(f"JSON data should be an object (uri={request.uri.decode()})") - self.responseCode( + self.response_code( request, http.BAD_REQUEST, f"invalid body, was expecting a JSON object" @@ -1059,7 +1059,7 @@ request.finish() return except (json.JSONDecodeError, ValueError) as e: - self.responseCode( + self.response_code( request, http.BAD_REQUEST, f"invalid json in inbox request: {e}" @@ -1081,14 +1081,14 @@ pass try: - signing_actor = 
await self.checkSignature(request) + signing_actor = await self.check_signature(request) except exceptions.EncryptionError as e: if self.apg.verbose: to_log = self._get_to_log(request) to_log.append(f" body: {request.content.read()!r}") request.content.seek(0) log.info("\n".join(to_log)) - self.responseCode( + self.response_code( request, http.FORBIDDEN, f"invalid signature: {e}" @@ -1096,7 +1096,7 @@ request.finish() return except Exception as e: - self.responseCode( + self.response_code( request, http.INTERNAL_SERVER_ERROR, f"Can't check signature: {e}" @@ -1115,27 +1115,27 @@ # default response code, may be changed, e.g. in case of exception try: - return await self.APRequest(request, data, signing_actor) + return await self.ap_request(request, data, signing_actor) except Exception as e: - self._onRequestError(failure.Failure(e), request) + self._on_request_error(failure.Failure(e), request) - async def checkSigningActor(self, data: dict, signing_actor: str) -> None: + async def check_signing_actor(self, data: dict, signing_actor: str) -> None: """Check that signing actor corresponds to actor declared in data @param data: request payload @param signing_actor: actor ID of the signing entity, as returned by - checkSignature + check_signature @raise exceptions.NotFound: no actor found in data @raise exceptions.EncryptionError: signing actor doesn't match actor in data """ - actor = await self.apg.apGetSenderActor(data) + actor = await self.apg.ap_get_sender_actor(data) if signing_actor != actor: raise exceptions.EncryptionError( f"signing actor ({signing_actor}) doesn't match actor in data ({actor})" ) - async def checkSignature(self, request: "HTTPRequest") -> str: + async def check_signature(self, request: "HTTPRequest") -> str: """Check and validate HTTP signature @return: id of the signing actor @@ -1242,7 +1242,7 @@ raise exceptions.EncryptionError( "Only SHA-256 algorithm is currently supported for digest" ) - __, computed_digest = self.apg.getDigest(body) + 
__, computed_digest = self.apg.get_digest(body) if given_digest != computed_digest: raise exceptions.EncryptionError( f"SHA-256 given and computed digest differ:\n" @@ -1275,7 +1275,7 @@ raise exceptions.EncryptionError("Signature has expired") try: - return await self.apg.checkSignature( + return await self.apg.check_signature( sign_data["signature"], key_id, headers @@ -1287,7 +1287,7 @@ "Using workaround for (request-target) encoding bug in signature, " "see https://github.com/mastodon/mastodon/issues/18871" ) - return await self.apg.checkSignature( + return await self.apg.check_signature( sign_data["signature"], key_id, headers @@ -1303,8 +1303,8 @@ defer.ensureDeferred(self.webfinger(request)) return server.NOT_DONE_YET elif path.startswith(self.apg.ap_path): - d = defer.ensureDeferred(self.APRequest(request)) - d.addErrback(self._onRequestError, request) + d = defer.ensureDeferred(self.ap_request(request)) + d.addErrback(self._on_request_error, request) return server.NOT_DONE_YET return web_resource.NoResource().render(request) @@ -1313,7 +1313,7 @@ path = request.path.decode().lstrip("/") if not path.startswith(self.apg.ap_path): return web_resource.NoResource().render(request) - defer.ensureDeferred(self.APPostRequest(request)) + defer.ensureDeferred(self.ap_post_request(request)) return server.NOT_DONE_YET diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_comp_ap_gateway/pubsub_service.py --- a/sat/plugins/plugin_comp_ap_gateway/pubsub_service.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_ap_gateway/pubsub_service.py Sat Apr 08 13:54:42 2023 +0200 @@ -34,7 +34,7 @@ from sat.core.constants import Const as C from sat.tools import image from sat.tools.utils import ensure_deferred -from sat.tools.web import downloadFile +from sat.tools.web import download_file from sat.memory.sqla_mapping import PubsubSub, SubscriptionState from .constants import ( @@ -74,7 +74,7 @@ "name": "Libervia ActivityPub Gateway", } - async def 
getAPActorIdsAndInbox( + async def get_ap_actor_ids_and_inbox( self, requestor: jid.JID, recipient: jid.JID, @@ -92,16 +92,16 @@ "item-not-found", text="No user part specified" ) - requestor_actor_id = self.apg.buildAPURL(TYPE_ACTOR, requestor.userhost()) + requestor_actor_id = self.apg.build_apurl(TYPE_ACTOR, requestor.userhost()) recipient_account = self.apg._e.unescape(recipient.user) - recipient_actor_id = await self.apg.getAPActorIdFromAccount(recipient_account) - inbox = await self.apg.getAPInboxFromId(recipient_actor_id, use_shared=False) + recipient_actor_id = await self.apg.get_ap_actor_id_from_account(recipient_account) + inbox = await self.apg.get_ap_inbox_from_id(recipient_actor_id, use_shared=False) return requestor_actor_id, recipient_actor_id, inbox @ensure_deferred async def publish(self, requestor, service, nodeIdentifier, items): - if self.apg.local_only and not self.apg.isLocal(requestor): + if self.apg.local_only and not self.apg.is_local(requestor): raise error.StanzaError( "forbidden", "Only local users can publish on this gateway." @@ -118,19 +118,19 @@ f"{ap_account!r} is not a valid ActivityPub actor account." 
) - client = self.apg.client.getVirtualClient(requestor) - if self.apg._pa.isAttachmentNode(nodeIdentifier): - await self.apg.convertAndPostAttachments( + client = self.apg.client.get_virtual_client(requestor) + if self.apg._pa.is_attachment_node(nodeIdentifier): + await self.apg.convert_and_post_attachments( client, ap_account, service, nodeIdentifier, items, publisher=requestor ) else: - await self.apg.convertAndPostItems( + await self.apg.convert_and_post_items( client, ap_account, service, nodeIdentifier, items ) - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, nodeIdentifier, with_subscriptions=True, create=True ) - await self.host.memory.storage.cachePubsubItems( + await self.host.memory.storage.cache_pubsub_items( client, cached_node, items @@ -144,27 +144,27 @@ [(subscription.subscriber, None, items)] ) - async def apFollowing2Elt(self, ap_item: dict) -> domish.Element: + async def ap_following_2_elt(self, ap_item: dict) -> domish.Element: """Convert actor ID from following collection to XMPP item""" actor_id = ap_item["id"] - actor_jid = await self.apg.getJIDFromId(actor_id) - subscription_elt = self.apg._pps.buildSubscriptionElt( + actor_jid = await self.apg.get_jid_from_id(actor_id) + subscription_elt = self.apg._pps.build_subscription_elt( self.apg._m.namespace, actor_jid ) item_elt = pubsub.Item(id=actor_id, payload=subscription_elt) return item_elt - async def apFollower2Elt(self, ap_item: dict) -> domish.Element: + async def ap_follower_2_elt(self, ap_item: dict) -> domish.Element: """Convert actor ID from followers collection to XMPP item""" actor_id = ap_item["id"] - actor_jid = await self.apg.getJIDFromId(actor_id) - subscriber_elt = self.apg._pps.buildSubscriberElt(actor_jid) + actor_jid = await self.apg.get_jid_from_id(actor_id) + subscriber_elt = self.apg._pps.build_subscriber_elt(actor_jid) item_elt = pubsub.Item(id=actor_id, payload=subscriber_elt) 
return item_elt - async def generateVCard(self, ap_account: str) -> domish.Element: + async def generate_v_card(self, ap_account: str) -> domish.Element: """Generate vCard4 (XEP-0292) item element from ap_account's metadata""" - actor_data = await self.apg.getAPActorDataFromAccount(ap_account) + actor_data = await self.apg.get_ap_actor_data_from_account(ap_account) identity_data = {} summary = actor_data.get("summary") @@ -181,13 +181,13 @@ value = actor_data.get(field) if value: identity_data.setdefault("nicknames", []).append(value) - vcard_elt = self.apg._v.dict2VCard(identity_data) + vcard_elt = self.apg._v.dict_2_v_card(identity_data) item_elt = domish.Element((pubsub.NS_PUBSUB, "item")) item_elt.addChild(vcard_elt) item_elt["id"] = self.apg._p.ID_SINGLETON return item_elt - async def getAvatarData( + async def get_avatar_data( self, client: SatXMPPEntity, ap_account: str @@ -197,9 +197,9 @@ ``cache_uid``, ``path`` and ``media_type`` keys are always filled ``base64`` key is only filled if the file was not already in cache """ - actor_data = await self.apg.getAPActorDataFromAccount(ap_account) + actor_data = await self.apg.get_ap_actor_data_from_account(ap_account) - for icon in await self.apg.apGetList(actor_data, "icon"): + for icon in await self.apg.ap_get_list(actor_data, "icon"): url = icon.get("url") if icon["type"] != "Image" or not url: continue @@ -221,19 +221,19 @@ if cache_uid is None: cache = None else: - cache = self.apg.host.common_cache.getMetadata(cache_uid) + cache = self.apg.host.common_cache.get_metadata(cache_uid) if cache is None: with tempfile.TemporaryDirectory() as dir_name: dest_path = Path(dir_name, filename) - await downloadFile(url, dest_path, max_size=MAX_AVATAR_SIZE) + await download_file(url, dest_path, max_size=MAX_AVATAR_SIZE) avatar_data = { "path": dest_path, "filename": filename, 'media_type': image.guess_type(dest_path), } - await self.apg._i.cacheAvatar( + await self.apg._i.cache_avatar( self.apg.IMPORT_NAME, avatar_data ) 
@@ -246,7 +246,7 @@ return avatar_data - async def generateAvatarMetadata( + async def generate_avatar_metadata( self, client: SatXMPPEntity, ap_account: str @@ -256,14 +256,14 @@ @raise StanzaError("item-not-found"): no avatar is present in actor data (in ``icon`` field) """ - avatar_data = await self.getAvatarData(client, ap_account) - return self.apg._a.buildItemMetadataElt(avatar_data) + avatar_data = await self.get_avatar_data(client, ap_account) + return self.apg._a.build_item_metadata_elt(avatar_data) - def _blockingB64EncodeAvatar(self, avatar_data: Dict[str, Any]) -> None: + def _blocking_b_6_4_encode_avatar(self, avatar_data: Dict[str, Any]) -> None: with avatar_data["path"].open("rb") as f: avatar_data["base64"] = b64encode(f.read()).decode() - async def generateAvatarData( + async def generate_avatar_data( self, client: SatXMPPEntity, ap_account: str, @@ -274,9 +274,9 @@ @raise StanzaError("item-not-found"): no avatar cached with requested ID """ if not itemIdentifiers: - avatar_data = await self.getAvatarData(client, ap_account) + avatar_data = await self.get_avatar_data(client, ap_account) if "base64" not in avatar_data: - await threads.deferToThread(self._blockingB64EncodeAvatar, avatar_data) + await threads.deferToThread(self._blocking_b_6_4_encode_avatar, avatar_data) else: if len(itemIdentifiers) > 1: # only a single item ID is supported @@ -284,16 +284,16 @@ item_id = itemIdentifiers[0] # just to be sure that we don't have an empty string assert item_id - cache_data = self.apg.host.common_cache.getMetadata(item_id) + cache_data = self.apg.host.common_cache.get_metadata(item_id) if cache_data is None: raise error.StanzaError("item-not-found") avatar_data = { "cache_uid": item_id, "path": cache_data["path"] } - await threads.deferToThread(self._blockingB64EncodeAvatar, avatar_data) + await threads.deferToThread(self._blocking_b_6_4_encode_avatar, avatar_data) - return 
self.apg._a.build_item_data_elt(avatar_data) @ensure_deferred async def items( @@ -320,31 +320,31 @@ if node == self.apg._pps.subscriptions_node: collection_name = "following" - parser = self.apFollowing2Elt + parser = self.ap_following_2_elt kwargs["only_ids"] = True use_cache = False elif node.startswith(self.apg._pps.subscribers_node_prefix): collection_name = "followers" - parser = self.apFollower2Elt + parser = self.ap_follower_2_elt kwargs["only_ids"] = True use_cache = False elif node == self.apg._v.node: # vCard4 request - item_elt = await self.generateVCard(ap_account) + item_elt = await self.generate_v_card(ap_account) return [item_elt], None elif node == self.apg._a.namespace_metadata: - item_elt = await self.generateAvatarMetadata(self.apg.client, ap_account) + item_elt = await self.generate_avatar_metadata(self.apg.client, ap_account) return [item_elt], None elif node == self.apg._a.namespace_data: - item_elt = await self.generateAvatarData( + item_elt = await self.generate_avatar_data( self.apg.client, ap_account, itemIdentifiers ) return [item_elt], None - elif self.apg._pa.isAttachmentNode(node): + elif self.apg._pa.is_attachment_node(node): use_cache = True # we check cache here because we emit an item-not-found error if the node is # not in cache, as we are not dealing with real AP items - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, node ) if cached_node is None: @@ -365,14 +365,14 @@ if use_cache: if cached_node is None: - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, node ) # TODO: check if node is synchronised if cached_node is not None: # the node is cached, we return items from cache log.debug(f"node {node!r} from {service} is in cache") - pubsub_items, metadata = await self.apg._c.getItemsFromCache( + pubsub_items, metadata = await 
self.apg._c.get_items_from_cache( client, cached_node, maxItems, itemIdentifiers, rsm_request=rsm_req ) try: @@ -384,7 +384,7 @@ if itemIdentifiers: items = [] for item_id in itemIdentifiers: - item_data = await self.apg.apGet(item_id) + item_data = await self.apg.ap_get(item_id) item_elt = await parser(item_data) items.append(item_elt) return items, None @@ -419,11 +419,11 @@ f"No cache found for node {node} at {service} (AP account {ap_account}), " "using Collection Paging to RSM translation" ) - if self.apg._m.isCommentNode(node): - parent_item = self.apg._m.getParentItem(node) + if self.apg._m.is_comment_node(node): + parent_item = self.apg._m.get_parent_item(node) try: - parent_data = await self.apg.apGet(parent_item) - collection = await self.apg.apGetObject( + parent_data = await self.apg.ap_get(parent_item) + collection = await self.apg.ap_get_object( parent_data.get("object", {}), "replies" ) @@ -433,8 +433,8 @@ text=e ) else: - actor_data = await self.apg.getAPActorDataFromAccount(ap_account) - collection = await self.apg.apGetObject(actor_data, collection_name) + actor_data = await self.apg.get_ap_actor_data_from_account(ap_account) + collection = await self.apg.ap_get_object(actor_data, collection_name) if not collection: raise error.StanzaError( "item-not-found", @@ -442,7 +442,7 @@ ) kwargs["parser"] = parser - return await self.apg.getAPItems(collection, **kwargs) + return await self.apg.get_ap_items(collection, **kwargs) @ensure_deferred async def retract(self, requestor, service, nodeIdentifier, itemIdentifiers): @@ -459,11 +459,11 @@ sub_state = SubscriptionState.PENDING else: sub_state = SubscriptionState.SUBSCRIBED - node = await self.host.memory.storage.getPubsubNode( + node = await self.host.memory.storage.get_pubsub_node( client, service, nodeIdentifier, with_subscriptions=True ) if node is None: - node = await self.host.memory.storage.setPubsubNode( + node = await self.host.memory.storage.set_pubsub_node( client, service, nodeIdentifier, @@ 
-510,13 +510,13 @@ if nodeIdentifier in (self.apg._m.namespace, self.apg._events.namespace): # if we subscribe to microblog or events node, we follow the corresponding # account - req_actor_id, recip_actor_id, inbox = await self.getAPActorIdsAndInbox( + req_actor_id, recip_actor_id, inbox = await self.get_ap_actor_ids_and_inbox( requestor, service ) data = self.apg.create_activity("Follow", req_actor_id, recip_actor_id) - resp = await self.apg.signAndPost(inbox, req_actor_id, data) + resp = await self.apg.sign_and_post(inbox, req_actor_id, data) if resp.code >= 300: text = await resp.text() raise error.StanzaError("service-unavailable", text=text) @@ -524,7 +524,7 @@ @ensure_deferred async def unsubscribe(self, requestor, service, nodeIdentifier, subscriber): - req_actor_id, recip_actor_id, inbox = await self.getAPActorIdsAndInbox( + req_actor_id, recip_actor_id, inbox = await self.get_ap_actor_ids_and_inbox( requestor, service ) data = self.apg.create_activity( @@ -537,7 +537,7 @@ ) ) - resp = await self.apg.signAndPost(inbox, req_actor_id, data) + resp = await self.apg.sign_and_post(inbox, req_actor_id, data) if resp.code >= 300: text = await resp.text() raise error.StanzaError("service-unavailable", text=text) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_comp_file_sharing.py --- a/sat/plugins/plugin_comp_file_sharing.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_file_sharing.py Sat Apr 08 13:54:42 2023 +0200 @@ -113,7 +113,7 @@ return resource.ErrorPage(code, brief, details).render(request) - def getDispositionType(self, media_type, media_subtype): + def get_disposition_type(self, media_type, media_subtype): if media_type in ('image', 'video'): return 'inline' elif media_type == 'application' and media_subtype == 'pdf': @@ -136,7 +136,7 @@ "Date, Content-Length, Content-Range") return super().render(request) - def render_OPTIONS(self, request): + def render_options(self, request): request.setResponseCode(http.OK) return b"" @@ 
-146,17 +146,17 @@ except exceptions.DataError: return self.errorPage(request, http.NOT_FOUND) - defer.ensureDeferred(self.renderGet(request)) + defer.ensureDeferred(self.render_get(request)) return server.NOT_DONE_YET - async def renderGet(self, request): + async def render_get(self, request): try: upload_id, filename = request.upload_data except exceptions.DataError: request.write(self.errorPage(request, http.FORBIDDEN)) request.finish() return - found_files = await request.file_sharing.host.memory.getFiles( + found_files = await request.file_sharing.host.memory.get_files( client=None, peer_jid=None, perms_to_check=None, public_id=upload_id) if not found_files: request.write(self.errorPage(request, http.NOT_FOUND)) @@ -170,7 +170,7 @@ file_res = static.File(file_path) file_res.type = f'{found_file["media_type"]}/{found_file["media_subtype"]}' file_res.encoding = file_res.contentEncodings.get(Path(found_file['name']).suffix) - disp_type = self.getDispositionType( + disp_type = self.get_disposition_type( found_file['media_type'], found_file['media_subtype']) # the URL is percent encoded, and not all browsers/tools unquote the file name, # thus we add a content disposition header @@ -190,10 +190,10 @@ request.finish() def render_PUT(self, request): - defer.ensureDeferred(self.renderPut(request)) + defer.ensureDeferred(self.render_put(request)) return server.NOT_DONE_YET - async def renderPut(self, request): + async def render_put(self, request): try: client, upload_request = request.upload_request_data upload_id, filename = request.upload_data @@ -228,7 +228,7 @@ "path": path } - await request.file_sharing.registerReceivedFile( + await request.file_sharing.register_received_file( client, upload_request.from_, file_data, tmp_file_path, public_id=public_id, ) @@ -273,7 +273,7 @@ def file_tmp_dir(self): return self.channel.site.file_tmp_dir - def refuseRequest(self): + def refuse_request(self): if self.content is not None: self.content.close() self.content = 
open(os.devnull, 'w+b') @@ -287,16 +287,16 @@ upload_id, filename = self.upload_data except exceptions.DataError as e: log.warning(f"Invalid PUT request, we stop here: {e}") - return self.refuseRequest() + return self.refuse_request() try: client, upload_request, timer = self.file_sharing.expected_uploads.pop(upload_id) except KeyError: log.warning(f"unknown (expired?) upload ID received for a PUT: {upload_id!r}") - return self.refuseRequest() + return self.refuse_request() if not timer.active: log.warning(f"upload id {upload_id!r} used for a PUT, but it is expired") - return self.refuseRequest() + return self.refuse_request() timer.cancel() @@ -305,7 +305,7 @@ f"invalid filename for PUT (upload id: {upload_id!r}, URL: {self.channel._path.decode()}). Original " f"{upload_request.filename!r} doesn't match {filename!r}" ) - return self.refuseRequest() + return self.refuse_request() self.upload_request_data = (client, upload_request) @@ -355,24 +355,24 @@ self._h = self.host.plugins["XEP-0300"] self._t = self.host.plugins["XEP-0264"] self._hu = self.host.plugins["XEP-0363"] - self._hu.registerHandler(self._on_http_upload) - self.host.trigger.add("FILE_getDestDir", self._getDestDirTrigger) + self._hu.register_handler(self._on_http_upload) + self.host.trigger.add("FILE_getDestDir", self._get_dest_dir_trigger) self.host.trigger.add( - "XEP-0234_fileSendingRequest", self._fileSendingRequestTrigger, priority=1000 + "XEP-0234_fileSendingRequest", self._file_sending_request_trigger, priority=1000 ) - self.host.trigger.add("XEP-0234_buildFileElement", self._addFileMetadataElts) - self.host.trigger.add("XEP-0234_parseFileElement", self._getFileMetadataElts) - self.host.trigger.add("XEP-0329_compGetFilesFromNode", self._addFileMetadata) + self.host.trigger.add("XEP-0234_buildFileElement", self._add_file_metadata_elts) + self.host.trigger.add("XEP-0234_parseFileElement", self._get_file_metadata_elts) + self.host.trigger.add("XEP-0329_compGetFilesFromNode", 
self._add_file_metadata) self.host.trigger.add( "XEP-0329_compGetFilesFromNode_build_directory", - self._addDirectoryMetadataElts) + self._add_directory_metadata_elts) self.host.trigger.add( "XEP-0329_parseResult_directory", - self._getDirectoryMetadataElts) + self._get_directory_metadata_elts) self.files_path = self.host.get_local_path(None, C.FILES_DIR) - self.http_port = int(self.host.memory.getConfig( + self.http_port = int(self.host.memory.config_get( 'component file-sharing', 'http_upload_port', 8888)) - connection_type = self.host.memory.getConfig( + connection_type = self.host.memory.config_get( 'component file-sharing', 'http_upload_connection_type', 'https') if connection_type not in ('http', 'https'): raise exceptions.ConfigError( @@ -383,51 +383,51 @@ if connection_type == 'http': reactor.listenTCP(self.http_port, self.server) else: - options = tls.getOptionsFromConfig( + options = tls.get_options_from_config( self.host.memory.config, "component file-sharing") - tls.TLSOptionsCheck(options) - context_factory = tls.getTLSContextFactory(options) + tls.tls_options_check(options) + context_factory = tls.get_tls_context_factory(options) reactor.listenSSL(self.http_port, self.server, context_factory) - def getHandler(self, client): + def get_handler(self, client): return Comments_handler(self) - def profileConnecting(self, client): + def profile_connecting(self, client): # we activate HTTP upload client.enabled_features.add("XEP-0363") self.init() - public_base_url = self.host.memory.getConfig( + public_base_url = self.host.memory.config_get( 'component file-sharing', 'http_upload_public_facing_url') if public_base_url is None: client._file_sharing_base_url = f"https://{client.host}:{self.http_port}" else: client._file_sharing_base_url = public_base_url path = client.file_tmp_dir = os.path.join( - self.host.memory.getConfig("", "local_dir"), + self.host.memory.config_get("", "local_dir"), C.FILES_TMP_DIR, - regex.pathEscape(client.profile), + 
regex.path_escape(client.profile), ) if not os.path.exists(path): os.makedirs(path) - def getQuota(self, client, entity): + def get_quota(self, client, entity): """Return maximum size allowed for all files for entity""" - quotas = self.host.memory.getConfig("component file-sharing", "quotas_json", {}) - if self.host.memory.isAdminJID(entity): + quotas = self.host.memory.config_get("component file-sharing", "quotas_json", {}) + if self.host.memory.is_admin_jid(entity): quota = quotas.get("admins") else: try: quota = quotas["jids"][entity.userhost()] except KeyError: quota = quotas.get("users") - return None if quota is None else utils.parseSize(quota) + return None if quota is None else utils.parse_size(quota) async def generate_thumbnails(self, extra: dict, image_path: Path): thumbnails = extra.setdefault(C.KEY_THUMBNAILS, []) for max_thumb_size in self._t.SIZES: try: - thumb_size, thumb_id = await self._t.generateThumbnail( + thumb_size, thumb_id = await self._t.generate_thumbnail( image_path, max_thumb_size, #  we keep thumbnails for 6 months @@ -438,7 +438,7 @@ break thumbnails.append({"id": thumb_id, "size": thumb_size}) - async def registerReceivedFile( + async def register_received_file( self, client, peer_jid, file_data, file_path, public_id=None, extra=None): """Post file reception tasks @@ -460,9 +460,9 @@ log.debug(_("Reusing already generated hash")) file_hash = file_data["hash_hasher"].hexdigest() else: - hasher = self._h.getHasher(HASH_ALGO) + hasher = self._h.get_hasher(HASH_ALGO) with file_path.open('rb') as f: - file_hash = await self._h.calculateHash(f, hasher) + file_hash = await self._h.calculate_hash(f, hasher) final_path = self.files_path/file_hash if final_path.is_file(): @@ -493,7 +493,7 @@ else: await self.generate_thumbnails(extra, thumb_path) - await self.host.memory.setFile( + await self.host.memory.set_file( client, name=name, version="", @@ -508,7 +508,7 @@ extra=extra, ) - async def _getDestDirTrigger( + async def 
_get_dest_dir_trigger( self, client, peer_jid, transfer_data, file_data, stream_object ): """This trigger accept file sending request, and store file locally""" @@ -522,17 +522,17 @@ assert C.KEY_PROGRESS_ID in file_data filename = file_data["name"] assert filename and not "/" in filename - quota = self.getQuota(client, peer_jid) + quota = self.get_quota(client, peer_jid) if quota is not None: - used_space = await self.host.memory.fileGetUsedSpace(client, peer_jid) + used_space = await self.host.memory.file_get_used_space(client, peer_jid) if (used_space + file_data["size"]) > quota: raise error.StanzaError( "not-acceptable", text=OVER_QUOTA_TXT.format( - quota=utils.getHumanSize(quota), - used_space=utils.getHumanSize(used_space), - file_size=utils.getHumanSize(file_data['size']) + quota=utils.get_human_size(quota), + used_space=utils.get_human_size(used_space), + file_size=utils.get_human_size(file_data['size']) ) ) file_tmp_dir = self.host.get_local_path( @@ -543,26 +543,26 @@ transfer_data["finished_d"].addCallback( lambda __: defer.ensureDeferred( - self.registerReceivedFile(client, peer_jid, file_data, file_tmp_path) + self.register_received_file(client, peer_jid, file_data, file_tmp_path) ) ) - self._f.openFileWrite( + self._f.open_file_write( client, file_tmp_path, transfer_data, file_data, stream_object ) return False, True - async def _retrieveFiles( + async def _retrieve_files( self, client, session, content_data, content_name, file_data, file_elt ): """This method retrieve a file on request, and send if after checking permissions""" peer_jid = session["peer_jid"] if session['local_jid'].user: - owner = client.getOwnerFromJid(session['local_jid']) + owner = client.get_owner_from_jid(session['local_jid']) else: owner = peer_jid try: - found_files = await self.host.memory.getFiles( + found_files = await self.host.memory.get_files( client, peer_jid=peer_jid, name=file_data.get("name"), @@ -595,7 +595,7 @@ type_=found_file['type'])) file_hash = 
found_file["file_hash"] file_path = self.files_path / file_hash - file_data["hash_hasher"] = hasher = self._h.getHasher(found_file["hash_algo"]) + file_data["hash_hasher"] = hasher = self._h.get_hasher(found_file["hash_algo"]) size = file_data["size"] = found_file["size"] file_data["file_hash"] = file_hash file_data["hash_algo"] = found_file["hash_algo"] @@ -608,13 +608,13 @@ self.host, client, file_path, - uid=self._jf.getProgressId(session, content_name), + uid=self._jf.get_progress_id(session, content_name), size=size, data_cb=lambda data: hasher.update(data), ) return True - def _fileSendingRequestTrigger( + def _file_sending_request_trigger( self, client, session, content_data, content_name, file_data, file_elt ): if not client.is_component: @@ -622,7 +622,7 @@ else: return ( False, - defer.ensureDeferred(self._retrieveFiles( + defer.ensureDeferred(self._retrieve_files( client, session, content_data, content_name, file_data, file_elt )), ) @@ -642,19 +642,19 @@ if request.from_.host not in client._file_sharing_allowed_hosts: raise error.StanzaError("forbidden") - quota = self.getQuota(client, request.from_) + quota = self.get_quota(client, request.from_) if quota is not None: - used_space = await self.host.memory.fileGetUsedSpace(client, request.from_) + used_space = await self.host.memory.file_get_used_space(client, request.from_) if (used_space + request.size) > quota: raise error.StanzaError( "not-acceptable", text=OVER_QUOTA_TXT.format( - quota=utils.getHumanSize(quota), - used_space=utils.getHumanSize(used_space), - file_size=utils.getHumanSize(request.size) + quota=utils.get_human_size(quota), + used_space=utils.get_human_size(used_space), + file_size=utils.get_human_size(request.size) ), - appCondition = self._hu.getFileTooLargeElt(max(quota - used_space, 0)) + appCondition = self._hu.get_file_too_large_elt(max(quota - used_space, 0)) ) upload_id = shortuuid.ShortUUID().random(length=30) @@ -671,7 +671,7 @@ ## metadata triggers ## - def 
_addFileMetadataElts(self, client, file_elt, extra_args): + def _add_file_metadata_elts(self, client, file_elt, extra_args): # affiliation affiliation = extra_args.get('affiliation') if affiliation is not None: @@ -693,7 +693,7 @@ comment_elt["count"] = str(count) return True - def _getFileMetadataElts(self, client, file_elt, file_data): + def _get_file_metadata_elts(self, client, file_elt, file_data): # affiliation try: affiliation_elt = next(file_elt.elements(NS_FS_AFFILIATION, "affiliation")) @@ -712,17 +712,17 @@ file_data["comments_count"] = comments_elt["count"] return True - def _addFileMetadata( + def _add_file_metadata( self, client, iq_elt, iq_result_elt, owner, node_path, files_data): for file_data in files_data: - file_data["comments_url"] = uri.buildXMPPUri( + file_data["comments_url"] = uri.build_xmpp_uri( "pubsub", path=client.jid.full(), node=COMMENT_NODE_PREFIX + file_data["id"], ) return True - def _addDirectoryMetadataElts( + def _add_directory_metadata_elts( self, client, file_data, directory_elt, owner, node_path): affiliation = file_data.get('affiliation') if affiliation is not None: @@ -731,7 +731,7 @@ content=affiliation ) - def _getDirectoryMetadataElts( + def _get_directory_metadata_elts( self, client, elt, file_data): try: affiliation_elt = next(elt.elements(NS_FS_AFFILIATION, "affiliation")) @@ -754,7 +754,7 @@ "name": "files commenting service", } - def _getFileId(self, nodeIdentifier): + def _get_file_id(self, nodeIdentifier): if not nodeIdentifier.startswith(COMMENT_NODE_PREFIX): raise error.StanzaError("item-not-found") file_id = nodeIdentifier[len(COMMENT_NODE_PREFIX) :] @@ -762,10 +762,10 @@ raise error.StanzaError("item-not-found") return file_id - async def getFileData(self, requestor, nodeIdentifier): - file_id = self._getFileId(nodeIdentifier) + async def get_file_data(self, requestor, nodeIdentifier): + file_id = self._get_file_id(nodeIdentifier) try: - files = await self.host.memory.getFiles(self.parent, requestor, file_id) + 
files = await self.host.memory.get_files(self.parent, requestor, file_id) except (exceptions.NotFound, exceptions.PermissionError): # we don't differenciate between NotFound and PermissionError # to avoid leaking information on existing files @@ -776,7 +776,7 @@ raise error.InternalError("there should be only one file") return files[0] - def commentsUpdate(self, extra, new_comments, peer_jid): + def comments_update(self, extra, new_comments, peer_jid): """update comments (replace or insert new_comments) @param extra(dict): extra data to update @@ -807,7 +807,7 @@ current_comments.extend(new_comments) - def commentsDelete(self, extra, comments): + def comments_delete(self, extra, comments): try: comments_dict = extra["comments"] except KeyError: @@ -818,7 +818,7 @@ except ValueError: continue - def _getFrom(self, item_elt): + def _get_from(self, item_elt): """retrieve publisher of an item @param item_elt(domish.element): element @@ -832,22 +832,22 @@ @ensure_deferred async def publish(self, requestor, service, nodeIdentifier, items): #  we retrieve file a first time to check authorisations - file_data = await self.getFileData(requestor, nodeIdentifier) + file_data = await self.get_file_data(requestor, nodeIdentifier) file_id = file_data["id"] - comments = [(item["id"], self._getFrom(item), item.toXml()) for item in items] + comments = [(item["id"], self._get_from(item), item.toXml()) for item in items] if requestor.userhostJID() == file_data["owner"]: peer_jid = None else: peer_jid = requestor.userhost() - update_cb = partial(self.commentsUpdate, new_comments=comments, peer_jid=peer_jid) + update_cb = partial(self.comments_update, new_comments=comments, peer_jid=peer_jid) try: - await self.host.memory.fileUpdate(file_id, "extra", update_cb) + await self.host.memory.file_update(file_id, "extra", update_cb) except exceptions.PermissionError: raise error.StanzaError("not-authorized") @ensure_deferred async def items(self, requestor, service, nodeIdentifier, maxItems, 
itemIdentifiers): - file_data = await self.getFileData(requestor, nodeIdentifier) + file_data = await self.get_file_data(requestor, nodeIdentifier) comments = file_data["extra"].get("comments", []) if itemIdentifiers: return [generic.parseXml(c[2]) for c in comments if c[0] in itemIdentifiers] @@ -856,7 +856,7 @@ @ensure_deferred async def retract(self, requestor, service, nodeIdentifier, itemIdentifiers): - file_data = await self.getFileData(requestor, nodeIdentifier) + file_data = await self.get_file_data(requestor, nodeIdentifier) file_id = file_data["id"] try: comments = file_data["extra"]["comments"] @@ -880,5 +880,5 @@ if not all([c[1] == requestor.userhost() for c in to_remove]): raise error.StanzaError("not-authorized") - remove_cb = partial(self.commentsDelete, comments=to_remove) - await self.host.memory.fileUpdate(file_id, "extra", remove_cb) + remove_cb = partial(self.comments_delete, comments=to_remove) + await self.host.memory.file_update(file_id, "extra", remove_cb) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_comp_file_sharing_management.py --- a/sat/plugins/plugin_comp_file_sharing_management.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_file_sharing_management.py Sat Apr 08 13:54:42 2023 +0200 @@ -74,8 +74,8 @@ self._c = host.plugins["XEP-0050"] self._t = host.plugins["XEP-0264"] self.files_path = host.get_local_path(None, C.FILES_DIR) - host.bridge.addMethod( - "fileSharingDelete", + host.bridge.add_method( + "file_sharing_delete", ".plugin", in_sign="ssss", out_sign="", @@ -83,30 +83,30 @@ async_=True, ) - def profileConnected(self, client): - self._c.addAdHocCommand( - client, self._onChangeFile, "Change Permissions of File(s)", + def profile_connected(self, client): + self._c.add_ad_hoc_command( + client, self._on_change_file, "Change Permissions of File(s)", node=NS_FILE_MANAGEMENT_PERM, allowed_magics=C.ENTITY_ALL, ) - self._c.addAdHocCommand( - client, self._onDeleteFile, "Delete File(s)", + 
self._c.add_ad_hoc_command( + client, self._on_delete_file, "Delete File(s)", node=NS_FILE_MANAGEMENT_DELETE, allowed_magics=C.ENTITY_ALL, ) - self._c.addAdHocCommand( - client, self._onGenThumbnails, "Generate Thumbnails", + self._c.add_ad_hoc_command( + client, self._on_gen_thumbnails, "Generate Thumbnails", node=NS_FILE_MANAGEMENT_THUMB, allowed_magics=C.ENTITY_ALL, ) - self._c.addAdHocCommand( - client, self._onQuota, "Get Quota", + self._c.add_ad_hoc_command( + client, self._on_quota, "Get Quota", node=NS_FILE_MANAGEMENT_QUOTA, allowed_magics=C.ENTITY_ALL, ) def _delete(self, service_jid_s, path, namespace, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) service_jid = jid.JID(service_jid_s) if service_jid_s else None return defer.ensureDeferred(self._c.sequence( client, @@ -127,7 +127,7 @@ note = (self._c.NOTE.ERROR, reason) return payload, status, None, note - def _getRootArgs(self): + def _get_root_args(self): """Create the form to select the file to use @return (tuple): arguments to use in defer.returnValue @@ -149,7 +149,7 @@ payload = form.toElement() return payload, status, None, None - async def _getFileData(self, client, session_data, command_form): + async def _get_file_data(self, client, session_data, command_form): """Retrieve field requested in root form "found_file" will also be set in session_data @@ -162,10 +162,10 @@ path = fields['path'].value.strip() namespace = fields['namespace'].value or None except KeyError: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) if not path: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) requestor = session_data['requestor'] requestor_bare = requestor.userhostJID() @@ -176,7 +176,7 @@ # this must be managed try: - found_files = await self.host.memory.getFiles( + found_files = await self.host.memory.get_files( client, requestor_bare, path=parent_path, name=basename, 
namespace=namespace) found_file = found_files[0] @@ -194,7 +194,7 @@ session_data['namespace'] = namespace return found_file - def _updateReadPermission(self, access, allowed_jids): + def _update_read_permission(self, access, allowed_jids): if not allowed_jids: if C.ACCESS_PERM_READ in access: del access[C.ACCESS_PERM_READ] @@ -208,27 +208,27 @@ "jids": [j.full() for j in allowed_jids] } - async def _updateDir(self, client, requestor, namespace, file_data, allowed_jids): + async def _update_dir(self, client, requestor, namespace, file_data, allowed_jids): """Recursively update permission of a directory and all subdirectories @param file_data(dict): metadata of the file @param allowed_jids(list[jid.JID]): list of entities allowed to read the file """ assert file_data['type'] == C.FILE_TYPE_DIRECTORY - files_data = await self.host.memory.getFiles( + files_data = await self.host.memory.get_files( client, requestor, parent=file_data['id'], namespace=namespace) for file_data in files_data: if not file_data['access'].get(C.ACCESS_PERM_READ, {}): log.debug("setting {perm} read permission for {name}".format( perm=allowed_jids, name=file_data['name'])) - await self.host.memory.fileUpdate( + await self.host.memory.file_update( file_data['id'], 'access', - partial(self._updateReadPermission, allowed_jids=allowed_jids)) + partial(self._update_read_permission, allowed_jids=allowed_jids)) if file_data['type'] == C.FILE_TYPE_DIRECTORY: - await self._updateDir(client, requestor, namespace, file_data, 'PUBLIC') + await self._update_dir(client, requestor, namespace, file_data, 'PUBLIC') - async def _onChangeFile(self, client, command_elt, session_data, action, node): + async def _on_change_file(self, client, command_elt, session_data, action, node): try: x_elt = next(command_elt.elements(data_form.NS_X_DATA, "x")) command_form = data_form.Form.fromElement(x_elt) @@ -241,12 +241,12 @@ if command_form is None or len(command_form.fields) == 0: # root request - return 
self._getRootArgs() + return self._get_root_args() elif found_file is None: # file selected, we retrieve it and ask for permissions try: - found_file = await self._getFileData(client, session_data, command_form) + found_file = await self._get_file_data(client, session_data, command_form) except WorkflowError as e: return e.err_args @@ -288,7 +288,7 @@ try: read_allowed = command_form.fields['read_allowed'] except KeyError: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) if read_allowed.value == 'PUBLIC': allowed_jids = 'PUBLIC' @@ -301,26 +301,26 @@ except RuntimeError as e: log.warning(_("Can't use read_allowed values: {reason}").format( reason=e)) - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) if found_file['type'] == C.FILE_TYPE_FILE: - await self.host.memory.fileUpdate( + await self.host.memory.file_update( found_file['id'], 'access', - partial(self._updateReadPermission, allowed_jids=allowed_jids)) + partial(self._update_read_permission, allowed_jids=allowed_jids)) else: try: recursive = command_form.fields['recursive'] except KeyError: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) - await self.host.memory.fileUpdate( + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) + await self.host.memory.file_update( found_file['id'], 'access', - partial(self._updateReadPermission, allowed_jids=allowed_jids)) + partial(self._update_read_permission, allowed_jids=allowed_jids)) if recursive: # we set all file under the directory as public (if they haven't # already a permission set), so allowed entities of root directory # can read them. 
         namespace = session_data['namespace']
-        await self._updateDir(
+        await self._update_dir(
             client, requestor_bare, namespace, found_file, 'PUBLIC')

         # job done, we can end the session
@@ -329,7 +329,7 @@
         note = (self._c.NOTE.INFO, _("management session done"))
         return (payload, status, None, note)

-    async def _onDeleteFile(self, client, command_elt, session_data, action, node):
+    async def _on_delete_file(self, client, command_elt, session_data, action, node):
         try:
             x_elt = next(command_elt.elements(data_form.NS_X_DATA, "x"))
             command_form = data_form.Form.fromElement(x_elt)
@@ -342,12 +342,12 @@
         if command_form is None or len(command_form.fields) == 0:
             # root request
-            return self._getRootArgs()
+            return self._get_root_args()

         elif found_file is None:
             # file selected, we need confirmation before actually deleting
             try:
-                found_file = await self._getFileData(client, session_data, command_form)
+                found_file = await self._get_file_data(client, session_data, command_form)
             except WorkflowError as e:
                 return e.err_args
             if found_file['type'] == C.FILE_TYPE_DIRECTORY:
@@ -373,31 +373,31 @@
             try:
                 confirmed = C.bool(command_form.fields['confirm'].value)
             except KeyError:
-                self._c.adHocError(self._c.ERROR.BAD_PAYLOAD)
+                self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD)
             if not confirmed:
                 note = None
             else:
                 recursive = found_file['type'] == C.FILE_TYPE_DIRECTORY
-                await self.host.memory.fileDelete(
+                await self.host.memory.file_delete(
                     client, requestor_bare, found_file['id'], recursive)
                 note = (self._c.NOTE.INFO, _("file deleted"))
         status = self._c.STATUS.COMPLETED
         payload = None
         return (payload, status, None, note)

-    def _updateThumbs(self, extra, thumbnails):
+    def _update_thumbs(self, extra, thumbnails):
         extra[C.KEY_THUMBNAILS] = thumbnails

-    async def _genThumbs(self, client, requestor, namespace, file_data):
+    async def _gen_thumbs(self, client, requestor, namespace, file_data):
         """Recursively generate thumbnails

         @param file_data(dict): metadata of the file
         """
         if file_data['type'] == C.FILE_TYPE_DIRECTORY:
-            sub_files_data = await self.host.memory.getFiles(
+            sub_files_data = await self.host.memory.get_files(
                 client, requestor, parent=file_data['id'], namespace=namespace)
             for sub_file_data in sub_files_data:
-                await self._genThumbs(client, requestor, namespace, sub_file_data)
+                await self._gen_thumbs(client, requestor, namespace, sub_file_data)
         elif file_data['type'] == C.FILE_TYPE_FILE:
             media_type = file_data['media_type']
@@ -407,7 +407,7 @@
             for max_thumb_size in self._t.SIZES:
                 try:
-                    thumb_size, thumb_id = await self._t.generateThumbnail(
+                    thumb_size, thumb_id = await self._t.generate_thumbnail(
                         file_path,
                         max_thumb_size,
                         #  we keep thumbnails for 6 months
@@ -419,9 +419,9 @@
                     break
                 thumbnails.append({"id": thumb_id, "size": thumb_size})

-            await self.host.memory.fileUpdate(
+            await self.host.memory.file_update(
                 file_data['id'], 'extra',
-                partial(self._updateThumbs, thumbnails=thumbnails))
+                partial(self._update_thumbs, thumbnails=thumbnails))

             log.info("thumbnails for [{file_name}] generated"
                      .format(file_name=file_data['name']))
@@ -429,7 +429,7 @@
         else:
             log.warning("unmanaged file type: {type_}".format(type_=file_data['type']))

-    async def _onGenThumbnails(self, client, command_elt, session_data, action, node):
+    async def _on_gen_thumbnails(self, client, command_elt, session_data, action, node):
         try:
             x_elt = next(command_elt.elements(data_form.NS_X_DATA, "x"))
             command_form = data_form.Form.fromElement(x_elt)
@@ -441,17 +441,17 @@
         if command_form is None or len(command_form.fields) == 0:
             # root request
-            return self._getRootArgs()
+            return self._get_root_args()

         elif found_file is None:
             # file selected, we retrieve it and ask for permissions
             try:
-                found_file = await self._getFileData(client, session_data, command_form)
+                found_file = await self._get_file_data(client, session_data, command_form)
             except WorkflowError as e:
                 return e.err_args

         log.info("Generating thumbnails as requested")
-        await self._genThumbs(client, requestor, found_file['namespace'], found_file)
+        await self._gen_thumbs(client, requestor, found_file['namespace'], found_file)

         # job done, we can end the session
         status = self._c.STATUS.COMPLETED
@@ -459,11 +459,11 @@
         note = (self._c.NOTE.INFO, _("thumbnails generated"))
         return (payload, status, None, note)

-    async def _onQuota(self, client, command_elt, session_data, action, node):
+    async def _on_quota(self, client, command_elt, session_data, action, node):
         requestor = session_data['requestor']
-        quota = self.host.plugins["file_sharing"].getQuota(client, requestor)
+        quota = self.host.plugins["file_sharing"].get_quota(client, requestor)
         try:
-            size_used = await self.host.memory.fileGetUsedSpace(client, requestor)
+            size_used = await self.host.memory.file_get_used_space(client, requestor)
         except exceptions.PermissionError:
             raise WorkflowError(self._err(_("forbidden")))
         status = self._c.STATUS.COMPLETED
@@ -473,10 +473,10 @@
         note = (
             self._c.NOTE.INFO,
             _("You are currently using {size_used} on {size_quota}").format(
-                size_used = utils.getHumanSize(size_used),
+                size_used = utils.get_human_size(size_used),
                 size_quota = (
                     _("unlimited quota") if quota is None
-                    else utils.getHumanSize(quota)
+                    else utils.get_human_size(quota)
                 )
             )
         )
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_dbg_manhole.py
--- a/sat/plugins/plugin_dbg_manhole.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_dbg_manhole.py	Sat Apr 08 13:54:42 2023 +0200
@@ -45,11 +45,11 @@
     def __init__(self, host):
         self.host = host
-        port = int(host.memory.getConfig(None, "manhole_debug_dangerous_port_int", 0))
+        port = int(host.memory.config_get(None, "manhole_debug_dangerous_port_int", 0))
         if port:
-            self.startManhole(port)
+            self.start_manhole(port)

-    def startManhole(self, port):
+    def start_manhole(self, port):
         log.warning(_("/!\\ Manhole debug server activated, be sure to not use it in "
                       "production, this is dangerous /!\\"))
         log.info(_("You can connect to manhole server using telnet on port {port}")
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_command_export.py
--- a/sat/plugins/plugin_exp_command_export.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_command_export.py	Sat Apr 08 13:54:42 2023 +0200
@@ -72,7 +72,7 @@
     def write(self, message):
         self.transport.write(message.encode("utf-8"))

-    def boolOption(self, key):
+    def bool_option(self, key):
         """ Get boolean value from options

         @param key: name of the option
         @return: True if key exists and set to "true" (case insensitive),
@@ -92,13 +92,13 @@
         log.info(_("Plugin command export initialization"))
         self.host = host
         self.spawned = {}  # key = entity
-        host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=10000)
-        host.bridge.addMethod(
-            "exportCommand",
+        host.trigger.add("messageReceived", self.message_received_trigger, priority=10000)
+        host.bridge.add_method(
+            "command_export",
             ".plugin",
             in_sign="sasasa{ss}s",
             out_sign="",
-            method=self._exportCommand,
+            method=self._export_command,
         )

     def removeProcess(self, entity, process):
@@ -113,7 +113,7 @@
         except ValueError:
             pass

-    def messageReceivedTrigger(self, client, message_elt, post_treat):
+    def message_received_trigger(self, client, message_elt, post_treat):
         """ Check if source is linked and repeat message, else do nothing """
         from_jid = jid.JID(message_elt["from"])
         spawned_key = (from_jid.userhostJID(), client.profile)
@@ -131,15 +131,15 @@
             exclusive = False
             for process in processes_set:
                 process.write(mess_data)
-                _continue &= process.boolOption("continue")
-                exclusive |= process.boolOption("exclusive")
+                _continue &= process.bool_option("continue")
+                exclusive |= process.bool_option("exclusive")
             if exclusive:
                 raise trigger.SkipOtherTriggers
             return _continue
         return True

-    def _exportCommand(self, command, args, targets, options, profile_key):
+    def _export_command(self, command, args, targets, options, profile_key):
         """ Export a commands to authorised targets

         @param command: full path of the command to execute
         @param args: list of arguments, with command name as first one
@@ -150,7 +150,7 @@
             - pty: if set, launch in a pseudo terminal
             - continue: continue normal messageReceived handling
         """
-        client = self.host.getClient(profile_key)
+        client = self.host.get_client(profile_key)
         for target in targets:
             try:
                 _jid = jid.JID(target)
@@ -163,5 +163,5 @@
         process_prot = ExportCommandProtocol(self, client, _jid, options)
         self.spawned.setdefault((_jid, client.profile), set()).add(process_prot)
         reactor.spawnProcess(
-            process_prot, command, args, usePTY=process_prot.boolOption("pty")
+            process_prot, command, args, usePTY=process_prot.bool_option("pty")
         )
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_invitation.py
--- a/sat/plugins/plugin_exp_invitation.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_invitation.py	Sat Apr 08 13:54:42 2023 +0200
@@ -61,10 +61,10 @@
         # map from namespace of the invitation to callback handling it
         self._ns_cb = {}

-    def getHandler(self, client):
+    def get_handler(self, client):
         return PubsubInvitationHandler(self)

-    def registerNamespace(self, namespace, callback):
+    def register_namespace(self, namespace, callback):
         """Register a callback for a namespace

         @param namespace(unicode): namespace handled
@@ -95,7 +95,7 @@
                 .format(namespace=namespace, callback=self._ns_cb[namespace]))
         self._ns_cb[namespace] = callback

-    def _generateBaseInvitation(self, client, invitee_jid, name, extra):
+    def _generate_base_invitation(self, client, invitee_jid, name, extra):
         """Generate common mess_data end invitation_elt

         @param invitee_jid(jid.JID): entitee to send invitation to
@@ -113,8 +113,8 @@
             "subject": {},
             "extra": {},
         }
-        client.generateMessageXML(mess_data)
-        self._h.addHintElements(mess_data["xml"], [self._h.HINT_STORE])
+        client.generate_message_xml(mess_data)
+        self._h.add_hint_elements(mess_data["xml"], [self._h.HINT_STORE])
         invitation_elt = mess_data["xml"].addElement("invitation", NS_INVITATION)
         if name is not None:
             invitation_elt["name"] = name
@@ -128,7 +128,7 @@
             invitation_elt['thumb_url'] = thumb_url
         return mess_data, invitation_elt

-    def sendPubsubInvitation(
+    def send_pubsub_invitation(
         self,
         client: SatXMPPEntity,
         invitee_jid: jid.JID,
@@ -145,12 +145,12 @@
         @param node: pubsub node
         @param item_id: pubsub id
             None when the invitation is for a whole node
-        @param name: see [_generateBaseInvitation]
-        @param extra: see [_generateBaseInvitation]
+        @param name: see [_generate_base_invitation]
+        @param extra: see [_generate_base_invitation]
         """
         if extra is None:
             extra = {}
-        mess_data, invitation_elt = self._generateBaseInvitation(
+        mess_data, invitation_elt = self._generate_base_invitation(
             client, invitee_jid, name, extra)
         pubsub_elt = invitation_elt.addElement("pubsub")
         pubsub_elt["service"] = service.full()
@@ -172,7 +172,7 @@
             invitation_elt.addChild(extra.pop("element"))
         client.send(mess_data["xml"])

-    async def sendFileSharingInvitation(
+    async def send_file_sharing_invitation(
         self, client, invitee_jid, service, repos_type=None, namespace=None, path=None,
         name=None, extra=None
     ):
@@ -185,13 +185,13 @@
             - "photos": photos album
         @param namespace(unicode, None): namespace of the shared repository
         @param path(unicode, None): path of the shared repository
-        @param name(unicode, None): see [_generateBaseInvitation]
-        @param extra(dict, None): see [_generateBaseInvitation]
+        @param name(unicode, None): see [_generate_base_invitation]
+        @param extra(dict, None): see [_generate_base_invitation]
         """
         if extra is None:
             extra = {}
         li_plg = self.host.plugins["LIST_INTEREST"]
-        li_plg.normaliseFileSharingService(client, service)
+        li_plg.normalise_file_sharing_service(client, service)
         # FIXME: not the best place to adapt permission, but it's necessary to check them
         #   for UX
@@ -205,7 +205,7 @@
         if "thumb_url" not in extra:
             # we have no thumbnail, we check in our own list of interests if there is one
             try:
-                item_id = li_plg.getFileSharingId(service, namespace, path)
+                item_id = li_plg.get_file_sharing_id(service, namespace, path)
                 own_interest = await li_plg.get(client, item_id)
             except exceptions.NotFound:
                 log.debug(
@@ -218,7 +218,7 @@
             except KeyError:
                 pass

-        mess_data, invitation_elt = self._generateBaseInvitation(
+        mess_data, invitation_elt = self._generate_base_invitation(
             client, invitee_jid, name, extra)
         file_sharing_elt = invitation_elt.addElement("file_sharing")
         file_sharing_elt["service"] = service.full()
@@ -235,7 +235,7 @@
             file_sharing_elt["path"] = path
         client.send(mess_data["xml"])

-    async def _parsePubsubElt(self, client, pubsub_elt):
+    async def _parse_pubsub_elt(self, client, pubsub_elt):
         try:
             service = jid.JID(pubsub_elt["service"])
             node = pubsub_elt["node"]
@@ -246,7 +246,7 @@
         if item_id is not None:
             try:
-                items, metadata = await self._p.getItems(
+                items, metadata = await self._p.get_items(
                     client, service, node, item_ids=[item_id]
                 )
             except Exception as e:
@@ -276,7 +276,7 @@
         return namespace, args

-    async def _parseFileSharingElt(self, client, file_sharing_elt):
+    async def _parse_file_sharing_elt(self, client, file_sharing_elt):
         try:
             service = jid.JID(file_sharing_elt["service"])
         except (RuntimeError, KeyError):
@@ -286,10 +286,10 @@
         sharing_ns = file_sharing_elt.getAttribute("namespace")
         path = file_sharing_elt.getAttribute("path")
         args = [service, repos_type, sharing_ns, path]
-        ns_fis = self.host.getNamespace("fis")
+        ns_fis = self.host.get_namespace("fis")
         return ns_fis, args

-    async def onInvitation(self, message_elt, client):
+    async def on_invitation(self, message_elt, client):
         log.debug("invitation received [{profile}]".format(profile=client.profile))
         invitation_elt = message_elt.invitation
@@ -303,9 +303,9 @@
                 log.warning("unexpected element: {xml}".format(xml=elt.toXml()))
                 continue
             if elt.name == "pubsub":
-                method = self._parsePubsubElt
+                method = self._parse_pubsub_elt
             elif elt.name == "file_sharing":
-                method = self._parseFileSharingElt
+                method = self._parse_file_sharing_elt
             else:
                 log.warning("not implemented invitation element: {xml}".format(
                     xml = elt.toXml()))
@@ -324,7 +324,7 @@
                 'No handler for namespace "{namespace}", invitation ignored')
                 .format(namespace=namespace))
         else:
-            await utils.asDeferred(cb, client, namespace, name, extra, *args)
+            await utils.as_deferred(cb, client, namespace, name, extra, *args)


@implementer(iwokkel.IDisco)
@@ -337,7 +337,7 @@
         self.xmlstream.addObserver(
             INVITATION,
             lambda message_elt: defer.ensureDeferred(
-                self.plugin_parent.onInvitation(message_elt, client=self.parent)
+                self.plugin_parent.on_invitation(message_elt, client=self.parent)
             ),
         )
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_invitation_file.py
--- a/sat/plugins/plugin_exp_invitation_file.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_invitation_file.py	Sat Apr 08 13:54:42 2023 +0200
@@ -45,32 +45,32 @@
     def __init__(self, host):
         log.info(_("File Sharing Invitation plugin initialization"))
         self.host = host
-        ns_fis = host.getNamespace("fis")
-        host.plugins["INVITATION"].registerNamespace(ns_fis, self.onInvitation)
-        host.bridge.addMethod(
-            "FISInvite",
+        ns_fis = host.get_namespace("fis")
+        host.plugins["INVITATION"].register_namespace(ns_fis, self.on_invitation)
+        host.bridge.add_method(
+            "fis_invite",
             ".plugin",
             in_sign="ssssssss",
             out_sign="",
-            method=self._sendFileSharingInvitation,
+            method=self._send_file_sharing_invitation,
             async_=True
         )

-    def _sendFileSharingInvitation(
+    def _send_file_sharing_invitation(
             self, invitee_jid_s, service_s, repos_type=None, namespace=None, path=None,
             name=None, extra_s='', profile_key=C.PROF_KEY_NONE):
-        client = self.host.getClient(profile_key)
+        client = self.host.get_client(profile_key)
         invitee_jid = jid.JID(invitee_jid_s)
         service = jid.JID(service_s)
         extra = data_format.deserialise(extra_s)
         return defer.ensureDeferred(
-            self.host.plugins["INVITATION"].sendFileSharingInvitation(
+            self.host.plugins["INVITATION"].send_file_sharing_invitation(
                 client, invitee_jid, service,
                repos_type=repos_type or None, namespace=namespace or None,
                path=path or None, name=name or None, extra=extra)
        )

-    def onInvitation(
+    def on_invitation(
        self,
        client: SatXMPPEntity,
        namespace: str,
@@ -97,7 +97,7 @@
                path=path)
        )
        return defer.ensureDeferred(
-            self.host.plugins['LIST_INTEREST'].registerFileSharing(
+            self.host.plugins['LIST_INTEREST'].register_file_sharing(
                client, service, repos_type, sharing_ns, path, name, extra
            )
        )
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_invitation_pubsub.py
--- a/sat/plugins/plugin_exp_invitation_pubsub.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_invitation_pubsub.py	Sat Apr 08 13:54:42 2023 +0200
@@ -51,12 +51,12 @@
        self._p = host.plugins["XEP-0060"]
        # namespace to handler map
        self._ns_handler = {}
-        host.bridge.addMethod(
-            "psInvite",
+        host.bridge.add_method(
+            "ps_invite",
            ".plugin",
            in_sign="sssssss",
            out_sign="",
-            method=self._sendPubsubInvitation,
+            method=self._send_pubsub_invitation,
            async_=True
        )
@@ -66,12 +66,12 @@
        handler
    ) -> None:
        self._ns_handler[namespace] = handler
-        self.host.plugins["INVITATION"].registerNamespace(namespace, self.onInvitation)
+        self.host.plugins["INVITATION"].register_namespace(namespace, self.on_invitation)

-    def _sendPubsubInvitation(
+    def _send_pubsub_invitation(
            self, invitee_jid_s, service_s, node, item_id=None, name=None, extra_s='',
            profile_key=C.PROF_KEY_NONE):
-        client = self.host.getClient(profile_key)
+        client = self.host.get_client(profile_key)
        invitee_jid = jid.JID(invitee_jid_s)
        service = jid.JID(service_s)
        extra = data_format.deserialise(extra_s)
@@ -104,13 +104,13 @@
        if namespace:
            try:
                handler = self._ns_handler[namespace]
-                preflight = handler.invitePreflight
+                preflight = handler.invite_preflight
            except KeyError:
                pass
            except AttributeError:
-                log.debug(f"no invitePreflight method found for {namespace!r}")
+                log.debug(f"no invite_preflight method found for {namespace!r}")
            else:
-                await utils.asDeferred(
+                await utils.as_deferred(
                    preflight, client, invitee_jid, service, node, item_id, name, extra
                )
@@ -118,11 +118,11 @@
            item_id = extra.pop("default_item_id", None)

        # we authorize our invitee to see the nodes of interest
-        await self._p.setNodeAffiliations(client, service, node, {invitee_jid: "member"})
+        await self._p.set_node_affiliations(client, service, node, {invitee_jid: "member"})
        log.debug(f"affiliation set on {service}'s {node!r} node")

        # now we send the invitation
-        self.host.plugins["INVITATION"].sendPubsubInvitation(
+        self.host.plugins["INVITATION"].send_pubsub_invitation(
            client,
            invitee_jid,
            service,
@@ -132,7 +132,7 @@
            extra=extra
        )

-    async def onInvitation(
+    async def on_invitation(
        self,
        client: SatXMPPEntity,
        namespace: str,
@@ -153,7 +153,7 @@
        except AttributeError:
            log.debug(f"no on_invitation_preflight method found for {namespace!r}")
        else:
-            await utils.asDeferred(
+            await utils.as_deferred(
                preflight, client, namespace, name, extra, service, node, item_id, item_elt
            )
@@ -164,6 +164,6 @@
        if not name:
            name = extra.pop("name", "")

-        return await self.host.plugins['LIST_INTEREST'].registerPubsub(
+        return await self.host.plugins['LIST_INTEREST'].register_pubsub(
            client, namespace, service, node, item_id, creator, name, element, extra)
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_jingle_stream.py
--- a/sat/plugins/plugin_exp_jingle_stream.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_jingle_stream.py	Sat Apr 08 13:54:42 2023 +0200
@@ -60,7 +60,7 @@
    def __init__(self):
        self.pause = False

-    def setPause(self, paused):
+    def set_pause(self, paused):
        # in Python 2.x, Twisted classes are old style
        # so we can use property and setter
        if paused:
@@ -78,10 +78,10 @@
    def connectionMade(self):
        if self.factory.client_conn is not None:
            self.transport.loseConnection()
-        self.factory.setClientConn(self)
+        self.factory.set_client_conn(self)

    def dataReceived(self, data):
-        self.factory.writeToConsumer(data)
+        self.factory.write_to_consumer(data)

    def sendData(self, data):
        self.transport.write(data)
@@ -92,9 +92,9 @@
            return
        if reason.type == error.ConnectionDone:
-            self.factory.streamFinished()
+            self.factory.stream_finished()
        else:
-            self.factory.streamFailed(reason)
+            self.factory.stream_failed(reason)


@interface.implementer(stream.IStreamProducer)
@@ -109,15 +109,15 @@
    def __init__(self):
        self.client_conn = None

-    def setClientConn(self, stream_protocol):
+    def set_client_conn(self, stream_protocol):
        # in Python 2.x, Twisted classes are old style
        # so we can use property and setter
        assert self.client_conn is None
        self.client_conn = stream_protocol
        if self.consumer is None:
-            self.client_conn.setPause(True)
+            self.client_conn.set_pause(True)

-    def startStream(self, consumer):
+    def start_stream(self, consumer):
        if self.consumer is not None:
            raise exceptions.InternalError(
                _("stream can't be used with multiple consumers")
@@ -127,17 +127,17 @@
        consumer.registerProducer(self, True)
        self.deferred = defer.Deferred()
        if self.client_conn is not None:
-            self.client_conn.setPause(False)
+            self.client_conn.set_pause(False)
        return self.deferred

-    def streamFinished(self):
+    def stream_finished(self):
        self.client_conn = None
        if self.consumer:
            self.consumer.unregisterProducer()
            self.port_listening.stopListening()
        self.deferred.callback(None)

-    def streamFailed(self, failure_):
+    def stream_failed(self, failure_):
        self.client_conn = None
        if self.consumer:
            self.consumer.unregisterProducer()
@@ -146,7 +146,7 @@
        elif self.producer:
            self.producer.stopProducing()

-    def stopStream(self):
+    def stop_stream(self):
        if self.client_conn is not None:
            self.client_conn.disconnect()
@@ -154,10 +154,10 @@
        self.producer = producer

    def pauseProducing(self):
-        self.client_conn.setPause(True)
+        self.client_conn.set_pause(True)

    def resumeProducing(self):
-        self.client_conn.setPause(False)
+        self.client_conn.set_pause(False)

    def stopProducing(self):
        if self.client_conn:
@@ -169,7 +169,7 @@
        except AttributeError:
            log.warning(_("No client connected, can't send data"))

-    def writeToConsumer(self, data):
+    def write_to_consumer(self, data):
        self.consumer.write(data)
@@ -180,23 +180,23 @@
        log.info(_("Plugin Stream initialization"))
        self.host = host
        self._j = host.plugins["XEP-0166"]  # shortcut to access jingle
-        self._j.registerApplication(NS_STREAM, self)
-        host.bridge.addMethod(
-            "streamOut",
+        self._j.register_application(NS_STREAM, self)
+        host.bridge.add_method(
+            "stream_out",
            ".plugin",
            in_sign="ss",
            out_sign="s",
-            method=self._streamOut,
+            method=self._stream_out,
            async_=True,
        )

    # jingle callbacks

-    def _streamOut(self, to_jid_s, profile_key):
-        client = self.host.getClient(profile_key)
-        return defer.ensureDeferred(self.streamOut(client, jid.JID(to_jid_s)))
+    def _stream_out(self, to_jid_s, profile_key):
+        client = self.host.get_client(profile_key)
+        return defer.ensureDeferred(self.stream_out(client, jid.JID(to_jid_s)))

-    async def streamOut(self, client, to_jid):
+    async def stream_out(self, client, to_jid):
        """send a stream

        @param peer_jid(jid.JID): recipient
@@ -230,7 +230,7 @@
        ))
        return str(port)

-    def jingleSessionInit(self, client, session, content_name, stream_object):
+    def jingle_session_init(self, client, session, content_name, stream_object):
        content_data = session["contents"][content_name]
        application_data = content_data["application_data"]
        assert "stream_object" not in application_data
@@ -239,7 +239,7 @@
        return desc_elt

    @defer.inlineCallbacks
-    def jingleRequestConfirmation(self, client, action, session, content_name, desc_elt):
+    def jingle_request_confirmation(self, client, action, session, content_name, desc_elt):
        """This method request confirmation for a jingle session"""
        content_data = session["contents"][content_name]
        if content_data["senders"] not in (
@@ -249,7 +249,7 @@
            log.warning("Bad sender, assuming initiator")
            content_data["senders"] = self._j.ROLE_INITIATOR

-        confirm_data = yield xml_tools.deferDialog(
+        confirm_data = yield xml_tools.defer_dialog(
            self.host,
            _(CONFIRM).format(peer=session["peer_jid"].full()),
            _(CONFIRM_TITLE),
@@ -274,10 +274,10 @@
        content_data["stream_object"] = factory
        finished_d = content_data["finished_d"] = defer.Deferred()
        args = [client, session, content_name, content_data]
-        finished_d.addCallbacks(self._finishedCb, self._finishedEb, args, None, args)
+        finished_d.addCallbacks(self._finished_cb, self._finished_eb, args, None, args)
        defer.returnValue(True)

-    def jingleHandler(self, client, action, session, content_name, desc_elt):
+    def jingle_handler(self, client, action, session, content_name, desc_elt):
        content_data = session["contents"][content_name]
        application_data = content_data["application_data"]
        if action in (self._j.A_ACCEPTED_ACK, self._j.A_SESSION_INITIATE):
@@ -287,19 +287,19 @@
            content_data["stream_object"] = application_data["stream_object"]
            finished_d = content_data["finished_d"] = defer.Deferred()
            args = [client, session, content_name, content_data]
-            finished_d.addCallbacks(self._finishedCb, self._finishedEb, args, None, args)
+            finished_d.addCallbacks(self._finished_cb, self._finished_eb, args, None, args)
        else:
            log.warning("FIXME: unmanaged action {}".format(action))
        return desc_elt

-    def _finishedCb(self, __, client, session, content_name, content_data):
+    def _finished_cb(self, __, client, session, content_name, content_data):
        log.info("Pipe transfer completed")
-        self._j.contentTerminate(client, session, content_name)
-        content_data["stream_object"].stopStream()
+        self._j.content_terminate(client, session, content_name)
+        content_data["stream_object"].stop_stream()

-    def _finishedEb(self, failure, client, session, content_name, content_data):
+    def _finished_eb(self, failure, client, session, content_name, content_data):
        log.warning("Error while streaming pipe: {}".format(failure))
-        self._j.contentTerminate(
+        self._j.content_terminate(
            client, session, content_name, reason=self._j.REASON_FAILED_TRANSPORT
        )
-        content_data["stream_object"].stopStream()
+        content_data["stream_object"].stop_stream()
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_lang_detect.py
--- a/sat/plugins/plugin_exp_lang_detect.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_lang_detect.py	Sat Apr 08 13:54:42 2023 +0200
@@ -65,9 +65,9 @@
    def __init__(self, host):
        log.info(_("Language detection plugin initialization"))
        self.host = host
-        host.memory.updateParams(PARAMS)
-        host.trigger.add("messageReceived", self.messageReceivedTrigger)
-        host.trigger.add("sendMessage", self.MessageSendTrigger)
+        host.memory.update_params(PARAMS)
+        host.trigger.add("messageReceived", self.message_received_trigger)
+        host.trigger.add("sendMessage", self.message_send_trigger)

    def add_language(self, mess_data):
        message = mess_data["message"]
@@ -78,18 +78,18 @@
        mess_data["message"] = {lang: msg}
        return mess_data

-    def messageReceivedTrigger(self, client, message_elt, post_treat):
+    def message_received_trigger(self, client, message_elt, post_treat):
        """ Check if source is linked and repeat message, else do nothing """
-        lang_detect = self.host.memory.getParamA(
+        lang_detect = self.host.memory.param_get_a(
            NAME, CATEGORY, profile_key=client.profile
        )
        if lang_detect:
            post_treat.addCallback(self.add_language)
        return True

-    def MessageSendTrigger(self, client, data, pre_xml_treatments, post_xml_treatments):
-        lang_detect = self.host.memory.getParamA(
+    def message_send_trigger(self, client, data, pre_xml_treatments, post_xml_treatments):
+        lang_detect = self.host.memory.param_get_a(
            NAME, CATEGORY, profile_key=client.profile
        )
        if lang_detect:
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_list_of_interest.py
--- a/sat/plugins/plugin_exp_list_of_interest.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_list_of_interest.py	Sat Apr 08 13:54:42 2023 +0200
@@ -56,32 +56,32 @@
        log.info(_("List of Interest plugin initialization"))
        self.host = host
        self._p = self.host.plugins["XEP-0060"]
-        host.bridge.addMethod(
-            "interestsList",
+        host.bridge.add_method(
+            "interests_list",
            ".plugin",
            in_sign="ssss",
            out_sign="aa{ss}",
-            method=self._listInterests,
+            method=self._list_interests,
            async_=True,
        )
-        host.bridge.addMethod(
-            "interestsRegisterFileSharing",
+        host.bridge.add_method(
+            "interests_file_sharing_register",
            ".plugin",
            in_sign="sssssss",
            out_sign="",
-            method=self._registerFileSharing,
+            method=self._register_file_sharing,
            async_=True,
        )
-        host.bridge.addMethod(
-            "interestRetract",
+        host.bridge.add_method(
+            "interest_retract",
            ".plugin",
            in_sign="sss",
            out_sign="",
-            method=self._interestRetract,
+            method=self._interest_retract,
            async_=True,
        )

-    def getHandler(self, client):
+    def get_handler(self, client):
        return ListInterestHandler(self)

    @defer.inlineCallbacks
@@ -99,7 +99,7 @@
            if e.condition == "conflict":
                log.debug(_("requested node already exists"))

-    async def registerPubsub(self, client, namespace, service, node, item_id=None,
+    async def register_pubsub(self, client, namespace, service, node, item_id=None,
                             creator=False, name=None, element=None, extra=None):
        """Register an interesting element in personal list
@@ -142,26 +142,26 @@
        }
        if item_id:
            uri_kwargs['id'] = item_id
-        interest_uri = uri.buildXMPPUri("pubsub", **uri_kwargs)
+        interest_uri = uri.build_xmpp_uri("pubsub", **uri_kwargs)
        # we use URI of the interest as item id to avoid duplicates
        item_elt = pubsub.Item(interest_uri, payload=interest_elt)
        await self._p.publish(
            client, client.jid.userhostJID(), NS_LIST_INTEREST, items=[item_elt]
        )

-    def _registerFileSharing(
+    def _register_file_sharing(
        self, service, repos_type, namespace, path, name, extra_raw, profile
    ):
-        client = self.host.getClient(profile)
+        client = self.host.get_client(profile)
        extra = data_format.deserialise(extra_raw)
-        return defer.ensureDeferred(self.registerFileSharing(
+        return defer.ensureDeferred(self.register_file_sharing(
            client, jid.JID(service), repos_type or None, namespace or None,
            path or None, name or None, extra
        ))
-    def normaliseFileSharingService(self, client, service):
+    def normalise_file_sharing_service(self, client, service):
        # FIXME: Q&D fix as the bare file sharing service JID will lead to user own
        #   repository, which thus would not be the same for the host and the guest.
        #   By specifying the user part, we for the use of the host repository.
@@ -169,10 +169,10 @@
        if service.user is None:
            service.user = self.host.plugins['XEP-0106'].escape(client.jid.user)

-    def getFileSharingId(self, service, namespace, path):
+    def get_file_sharing_id(self, service, namespace, path):
        return f"{service}_{namespace or ''}_{path or ''}"

-    async def registerFileSharing(
+    async def register_file_sharing(
            self, client, service, repos_type=None, namespace=None, path=None,
            name=None, extra=None):
        """Register an interesting file repository in personal list
@@ -182,15 +182,15 @@
        @param namespace(unicode, None): namespace of the repository
        @param path(unicode, None): path of the repository
        @param name(unicode, None): name of the repository
-        @param extra(dict, None): same as [registerPubsub]
+        @param extra(dict, None): same as [register_pubsub]
        """
        if extra is None:
            extra = {}
-        self.normaliseFileSharingService(client, service)
+        self.normalise_file_sharing_service(client, service)
        await self.createNode(client)
-        item_id = self.getFileSharingId(service, namespace, path)
+        item_id = self.get_file_sharing_id(service, namespace, path)
        interest_elt = domish.Element((NS_LIST_INTEREST, "interest"))
-        interest_elt["namespace"] = self.host.getNamespace("fis")
+        interest_elt["namespace"] = self.host.get_namespace("fis")
        if name is not None:
            interest_elt['name'] = name
        thumb_url = extra.get('thumb_url')
@@ -210,7 +210,7 @@
            client, client.jid.userhostJID(), NS_LIST_INTEREST, items=[item_elt]
        )

-    def _listInterestsSerialise(self, interests_data):
+    def _list_interests_serialise(self, interests_data):
        interests = []
        for item_elt in interests_data[0]:
            interest_data = {"id": item_elt['id']}
@@ -252,16 +252,16 @@
        return interests

-    def _listInterests(self, service, node, namespace, profile):
+    def _list_interests(self, service, node, namespace, profile):
        service = jid.JID(service) if service else None
        node = node or None
        namespace = namespace or None
-        client = self.host.getClient(profile)
-        d = defer.ensureDeferred(self.listInterests(client, service, node, namespace))
-        d.addCallback(self._listInterestsSerialise)
+        client = self.host.get_client(profile)
+        d = defer.ensureDeferred(self.list_interests(client, service, node, namespace))
+        d.addCallback(self._list_interests_serialise)
        return d

-    async def listInterests(self, client, service=None, node=None, namespace=None):
+    async def list_interests(self, client, service=None, node=None, namespace=None):
        """Retrieve list of interests

        @param service(jid.JID, None): service to use
@@ -270,12 +270,12 @@
            None to use default node
        @param namespace(unicode, None): filter interests of this namespace
            None to retrieve all interests
-        @return: same as [XEP_0060.getItems]
+        @return: same as [XEP_0060.get_items]
        """
        # TODO: if a MAM filter were available, it would improve performances
        if not node:
            node = NS_LIST_INTEREST
-        items, metadata = await self._p.getItems(client, service, node)
+        items, metadata = await self._p.get_items(client, service, node)
        if namespace is not None:
            filtered_items = []
            for item in items:
@@ -291,17 +291,17 @@
        return (items, metadata)

-    def _interestRetract(self, service_s, item_id, profile_key):
-        d = self._p._retractItem(
+    def _interest_retract(self, service_s, item_id, profile_key):
+        d = self._p._retract_item(
            service_s, NS_LIST_INTEREST, item_id, True, profile_key)
        d.addCallback(lambda __: None)
        return d

    async def get(self, client: SatXMPPEntity, item_id: str) -> dict:
        """Retrieve a specific interest in profile's list"""
-        items_data = await self._p.getItems(client, None, NS_LIST_INTEREST, item_ids=[item_id])
+        items_data = await self._p.get_items(client, None, NS_LIST_INTEREST, item_ids=[item_id])
        try:
-            return self._listInterestsSerialise(items_data)[0]
+            return self._list_interests_serialise(items_data)[0]
        except IndexError:
            raise exceptions.NotFound
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_parrot.py
--- a/sat/plugins/plugin_exp_parrot.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_parrot.py	Sat Apr 08 13:54:42 2023 +0200
@@ -48,21 +48,21 @@
    # XXX: This plugin can be potentially dangerous if we don't trust entities linked
    #      this is specially true if we have other triggers.
-   #      sendMessageTrigger avoid other triggers execution, it's deactivated to allow
+   #      send_message_trigger avoid other triggers execution, it's deactivated to allow
    #      /unparrot command in text commands plugin.
    # FIXME: potentially unsecure, specially with e2e encryption

    def __init__(self, host):
        log.info(_("Plugin Parrot initialization"))
        self.host = host
-        host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=100)
-        # host.trigger.add("sendMessage", self.sendMessageTrigger, priority=100)
+        host.trigger.add("messageReceived", self.message_received_trigger, priority=100)
+        # host.trigger.add("sendMessage", self.send_message_trigger, priority=100)
        try:
-            self.host.plugins[C.TEXT_CMDS].registerTextCommands(self)
+            self.host.plugins[C.TEXT_CMDS].register_text_commands(self)
        except KeyError:
            log.info(_("Text commands not available"))

-    # def sendMessageTrigger(self, client, mess_data, treatments):
+    # def send_message_trigger(self, client, mess_data, treatments):
    #     """ Deactivate other triggers if recipient is in parrot links """
    #     try:
    #         _links = client.parrot_links
@@ -73,7 +73,7 @@
    #             log.debug("Parrot link detected, skipping other triggers")
    #             raise trigger.SkipOtherTriggers

-    def messageReceivedTrigger(self, client, message_elt, post_treat):
+    def message_received_trigger(self, client, message_elt, post_treat):
        """ Check if source is linked and repeat message, else do nothing """
        # TODO: many things are not repeated (subject, thread, etc)
        from_jid = message_elt["from"]
@@ -92,13 +92,13 @@
            lang = e.getAttribute("lang") or ""
            try:
-                entity_type = self.host.memory.getEntityData(
+                entity_type = self.host.memory.entity_data_get(
                    client, from_jid, [C.ENTITY_TYPE])[C.ENTITY_TYPE]
            except (UnknownEntityError, KeyError):
                entity_type = "contact"
            if entity_type == C.ENTITY_TYPE_MUC:
                src_txt = from_jid.resource
-                if src_txt == self.host.plugins["XEP-0045"].getRoomNick(
+                if src_txt == self.host.plugins["XEP-0045"].get_room_nick(
                    client, from_jid.userhostJID()
                ):
                    # we won't repeat our own messages
@@ -115,7 +115,7 @@
        return True

-    def addParrot(self, client, source_jid, dest_jid):
+    def add_parrot(self, client, source_jid, dest_jid):
        """Add a parrot link from one entity to another one

        @param source_jid: entity from who messages will be repeated
@@ -132,7 +132,7 @@
            % (source_jid.userhost(), str(dest_jid))
        )

-    def removeParrot(self, client, source_jid):
+    def remove_parrot(self, client, source_jid):
        """Remove parrot link

        @param source_jid: this entity will no more be repeated
@@ -152,17 +152,17 @@
            if not link_left_jid.user or not link_left_jid.host:
                raise jid.InvalidFormat
        except (RuntimeError, jid.InvalidFormat, AttributeError):
-            txt_cmd.feedBack(
+            txt_cmd.feed_back(
                client, "Can't activate Parrot mode for invalid jid", mess_data
            )
            return False

        link_right_jid = mess_data["to"]

-        self.addParrot(client, link_left_jid, link_right_jid)
-        self.addParrot(client, link_right_jid, link_left_jid)
+        self.add_parrot(client, link_left_jid, link_right_jid)
+        self.add_parrot(client, link_right_jid, link_left_jid)

-        txt_cmd.feedBack(
+        txt_cmd.feed_back(
            client,
            "Parrot mode activated for {}".format(str(link_left_jid)),
            mess_data,
@@ -180,17 +180,17 @@
            if not link_left_jid.user or not link_left_jid.host:
                raise jid.InvalidFormat
        except jid.InvalidFormat:
-            txt_cmd.feedBack(
+            txt_cmd.feed_back(
                client, "Can't deactivate Parrot mode for invalid jid", mess_data
            )
            return False

        link_right_jid = mess_data["to"]

-        self.removeParrot(client, link_left_jid)
-        self.removeParrot(client, link_right_jid)
+        self.remove_parrot(client, link_left_jid)
+        self.remove_parrot(client, link_right_jid)

-        txt_cmd.feedBack(
+        txt_cmd.feed_back(
            client,
            "Parrot mode deactivated for {} and {}".format(
                str(link_left_jid), str(link_right_jid)
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_pubsub_admin.py
--- a/sat/plugins/plugin_exp_pubsub_admin.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_pubsub_admin.py	Sat Apr 08 13:54:42 2023 +0200
@@ -49,8 +49,8 @@
    def __init__(self, host):
        self.host = host
-        host.bridge.addMethod(
-            "psAdminItemsSend",
+        host.bridge.add_method(
+            "ps_admin_items_send",
            ".plugin",
            in_sign="ssasss",
            out_sign="as",
@@ -60,7 +60,7 @@
    def _publish(self, service, nodeIdentifier, items, extra=None,
                 profile_key=C.PROF_KEY_NONE):
-        client = self.host.getClient(profile_key)
+        client = self.host.get_client(profile_key)
        service = None if not service else jid.JID(service)
        extra = data_format.deserialise(extra)
        items = [generic.parseXml(i.encode('utf-8')) for i in items]
@@ -68,7 +68,7 @@
            client, service, nodeIdentifier, items, extra
        )

-    def _sendCb(self, iq_result):
+    def _send_cb(self, iq_result):
        publish_elt = iq_result.admin.pubsub.publish
        ids = []
        for item_elt in publish_elt.elements(pubsub.NS_PUBSUB, 'item'):
@@ -90,5 +90,5 @@
        for item in items:
            publish_elt.addChild(item)
        d = iq_elt.send()
-        d.addCallback(self._sendCb)
+        d.addCallback(self._send_cb)
        return d
diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_exp_pubsub_hook.py
--- a/sat/plugins/plugin_exp_pubsub_hook.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_exp_pubsub_hook.py	Sat Apr 08 13:54:42 2023 +0200
@@ -56,51 +56,51 @@
        log.info(_("PubSub Hook initialization"))
        self.host = host
        self.node_hooks = {}  # keep track of the number of hooks per node (for all profiles)
-        host.bridge.addMethod(
-            "psHookAdd", ".plugin", in_sign="ssssbs", out_sign="", method=self._addHook
+        host.bridge.add_method(
+            "ps_hook_add", ".plugin", in_sign="ssssbs", out_sign="", method=self._addHook
        )
-        host.bridge.addMethod(
-            "psHookRemove",
+        host.bridge.add_method(
+            "ps_hook_remove",
            ".plugin",
            in_sign="sssss",
            out_sign="i",
            method=self._removeHook,
        )
-        host.bridge.addMethod(
-            "psHookList",
+        host.bridge.add_method(
+            "ps_hook_list",
            ".plugin",
            in_sign="s",
            out_sign="aa{ss}",
-            method=self._listHooks,
+            method=self._list_hooks,
        )

    @defer.inlineCallbacks
-    def profileConnected(self, client):
+    def profile_connected(self, client):
        hooks = client._hooks = persistent.PersistentBinaryDict(
            NS_PUBSUB_HOOK, client.profile
        )
        client._hooks_temporary = {}
        yield hooks.load()
        for node in hooks:
-            self._installNodeManager(client, node)
+            self._install_node_manager(client, node)

-    def profileDisconnected(self, client):
+    def profile_disconnected(self, client):
        for node in client._hooks:
-            self._removeNodeManager(client, node)
+            self._remove_node_manager(client, node)

-    def _installNodeManager(self, client, node):
+    def _install_node_manager(self, client, node):
        if node in self.node_hooks:
            log.debug(_("node manager already set for {node}").format(node=node))
            self.node_hooks[node] += 1
        else:
            # first hook on this node
-            self.host.plugins["XEP-0060"].addManagedNode(
-                node, items_cb=self._itemsReceived
+            self.host.plugins["XEP-0060"].add_managed_node(
+                node, items_cb=self._items_received
            )
            self.node_hooks[node] = 0
            log.info(_("node manager installed on {node}").format(node=node))

-    def _removeNodeManager(self, client, node):
+    def _remove_node_manager(self, client, node):
        try:
            self.node_hooks[node] -= 1
        except KeyError:
@@ -108,12 +108,12 @@
        else:
            if self.node_hooks[node] == 0:
                del self.node_hooks[node]
-                self.host.plugins["XEP-0060"].removeManagedNode(node, self._itemsReceived)
+                self.host.plugins["XEP-0060"].remove_managed_node(node, self._items_received)
                log.debug(_("hook removed"))
            else:
                log.debug(_("node still needed for an other hook"))

-    def installHook(self,
client, service, node, hook_type, hook_arg, persistent): + def install_hook(self, client, service, node, hook_type, hook_arg, persistent): if hook_type not in HOOK_TYPES: raise exceptions.DataError( _("{hook_type} is not handled").format(hook_type=hook_type) @@ -124,7 +124,7 @@ hook_type=hook_type ) ) - self._installNodeManager(client, node) + self._install_node_manager(client, node) hook_data = {"service": service, "type": hook_type, "arg": hook_arg} if persistent: @@ -143,7 +143,7 @@ ) ) - def _itemsReceived(self, client, itemsEvent): + def _items_received(self, client, itemsEvent): node = itemsEvent.nodeIdentifier for hooks in (client._hooks, client._hooks_temporary): if node not in hooks: @@ -188,9 +188,9 @@ ) def _addHook(self, service, node, hook_type, hook_arg, persistent, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) service = jid.JID(service) if service else client.jid.userhostJID() - return self.addHook( + return self.add_hook( client, service, str(node), @@ -199,7 +199,7 @@ persistent, ) - def addHook(self, client, service, node, hook_type, hook_arg, persistent): + def add_hook(self, client, service, node, hook_type, hook_arg, persistent): r"""Add a hook which will be triggered on a pubsub notification @param service(jid.JID): service of the node @@ -219,21 +219,21 @@ can be a module path, file path, python code """ assert service is not None - return self.installHook(client, service, node, hook_type, hook_arg, persistent) + return self.install_hook(client, service, node, hook_type, hook_arg, persistent) def _removeHook(self, service, node, hook_type, hook_arg, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) service = jid.JID(service) if service else client.jid.userhostJID() - return self.removeHook(client, service, node, hook_type or None, hook_arg or None) + return self.remove_hook(client, service, node, hook_type or None, hook_arg or None) - def removeHook(self, 
client, service, node, hook_type=None, hook_arg=None): + def remove_hook(self, client, service, node, hook_type=None, hook_arg=None): """Remove a persistent or temporaty root @param service(jid.JID): service of the node @param node(unicode): Pubsub node - @param hook_type(unicode, None): same as for [addHook] + @param hook_type(unicode, None): same as for [add_hook] match all if None - @param hook_arg(unicode, None): same as for [addHook] + @param hook_arg(unicode, None): same as for [add_hook] match all if None @return(int): number of hooks removed """ @@ -254,20 +254,20 @@ if not hooks[node]: #  no more hooks, we can remove the node del hooks[node] - self._removeNodeManager(client, node) + self._remove_node_manager(client, node) else: if hooks == client._hooks: hooks.force(node) return removed - def _listHooks(self, profile): - hooks_list = self.listHooks(self.host.getClient(profile)) + def _list_hooks(self, profile): + hooks_list = self.list_hooks(self.host.get_client(profile)) for hook in hooks_list: hook["service"] = hook["service"].full() - hook["persistent"] = C.boolConst(hook["persistent"]) + hook["persistent"] = C.bool_const(hook["persistent"]) return hooks_list - def listHooks(self, client): + def list_hooks(self, client): """return list of registered hooks""" hooks_list = [] for hooks in (client._hooks, client._hooks_temporary): diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_import.py --- a/sat/plugins/plugin_import.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_import.py Sat Apr 08 13:54:42 2023 +0200 @@ -46,7 +46,7 @@ class ImportPlugin(object): def __init__(self, host): - log.info(_("plugin Import initialization")) + log.info(_("plugin import initialization")) self.host = host def initialize(self, import_handler, name): @@ -54,13 +54,13 @@ @param import_handler(object): specialized import handler instance must have the following methods: - - importItem: import a single main item (i.e. 
prepare data for publishing) + - import_item: import a single main item (i.e. prepare data for publishing) - importSubitems: import sub items (i.e. items linked to main item, e.g. comments). - Must return a dict with kwargs for recursiveImport if items are to be imported recursively. + Must return a dict with kwargs for recursive_import if items are to be imported recursively. At least "items_import_data", "service" and "node" keys must be provided. if None is returned, no recursion will be done to import subitems, but import can still be done directly by the method. - - publishItem: actualy publish an item - - itemFilters: modify item according to options + - publish_item: actualy publish an item + - item_filters: modify item according to options @param name(unicode): import handler name """ assert name == name.lower().strip() @@ -71,7 +71,7 @@ import_handler.importers = {} def _import(name, location, options, pubsub_service, pubsub_node, profile): - return self._doImport( + return self._do_import( import_handler, name, location, @@ -81,40 +81,40 @@ profile, ) - def _importList(): - return self.listImporters(import_handler) + def _import_list(): + return self.list_importers(import_handler) - def _importDesc(name): + def _import_desc(name): return self.getDescription(import_handler, name) - self.host.bridge.addMethod( - name + "Import", + self.host.bridge.add_method( + name + "import", ".plugin", in_sign="ssa{ss}sss", out_sign="s", method=_import, async_=True, ) - self.host.bridge.addMethod( + self.host.bridge.add_method( name + "ImportList", ".plugin", in_sign="", out_sign="a(ss)", - method=_importList, + method=_import_list, ) - self.host.bridge.addMethod( + self.host.bridge.add_method( name + "ImportDesc", ".plugin", in_sign="s", out_sign="(ss)", - method=_importDesc, + method=_import_desc, ) - def getProgress(self, import_handler, progress_id, profile): - client = self.host.getClient(profile) + def get_progress(self, import_handler, progress_id, profile): + 
client = self.host.get_client(profile) return client._import[import_handler.name][progress_id] - def listImporters(self, import_handler): + def list_importers(self, import_handler): importers = list(import_handler.importers.keys()) importers.sort() return [ @@ -139,9 +139,9 @@ else: return importer.short_desc, importer.long_desc - def _doImport(self, import_handler, name, location, options, pubsub_service="", + def _do_import(self, import_handler, name, location, options, pubsub_service="", pubsub_node="", profile=C.PROF_KEY_NONE): - client = self.host.getClient(profile) + client = self.host.get_client(profile) options = {key: str(value) for key, value in options.items()} for option in import_handler.BOOL_OPTIONS: try: @@ -158,7 +158,7 @@ _("invalid json option: {option}").format(option=option) ) pubsub_service = jid.JID(pubsub_service) if pubsub_service else None - return self.doImport( + return self.do_import( client, import_handler, str(name), @@ -169,9 +169,9 @@ ) @defer.inlineCallbacks - def doImport(self, client, import_handler, name, location, options=None, + def do_import(self, client, import_handler, name, location, options=None, pubsub_service=None, pubsub_node=None,): - """Import data + """import data @param import_handler(object): instance of the import handler @param name(unicode): name of the importer @@ -221,18 +221,18 @@ "direction": "out", "type": import_handler.name.upper() + "_IMPORT", } - self.host.registerProgressCb( + self.host.register_progress_cb( progress_id, - partial(self.getProgress, import_handler), + partial(self.get_progress, import_handler), metadata, profile=client.profile, ) - self.host.bridge.progressStarted(progress_id, metadata, client.profile) + self.host.bridge.progress_started(progress_id, metadata, client.profile) session = { #  session data, can be used by importers "root_service": pubsub_service, "root_node": pubsub_node, } - self.recursiveImport( + self.recursive_import( client, import_handler, items_import_data, @@ 
-246,7 +246,7 @@ defer.returnValue(progress_id) @defer.inlineCallbacks - def recursiveImport( + def recursive_import( self, client, import_handler, @@ -268,7 +268,7 @@ can be used by importer so store any useful data "root_service" and "root_node" are set to the main pubsub service and node of the import @param options(dict): import options - @param return_data(dict): data to return on progressFinished + @param return_data(dict): data to return on progress_finished @param service(jid.JID, None): PubSub service to use @param node(unicode, None): PubSub node to use @param depth(int): level of recursion @@ -276,14 +276,14 @@ if return_data is None: return_data = {} for idx, item_import_data in enumerate(items_import_data): - item_data = yield import_handler.importItem( + item_data = yield import_handler.import_item( client, item_import_data, session, options, return_data, service, node ) - yield import_handler.itemFilters(client, item_data, session, options) - recurse_kwargs = yield import_handler.importSubItems( + yield import_handler.item_filters(client, item_data, session, options) + recurse_kwargs = yield import_handler.import_sub_items( client, item_import_data, item_data, session, options ) - yield import_handler.publishItem(client, item_data, service, node, session) + yield import_handler.publish_item(client, item_data, service, node, session) if recurse_kwargs is not None: recurse_kwargs["client"] = client @@ -294,7 +294,7 @@ recurse_kwargs["return_data"] = return_data recurse_kwargs["depth"] = depth + 1 log.debug(_("uploading subitems")) - yield self.recursiveImport(**recurse_kwargs) + yield self.recursive_import(**recurse_kwargs) if depth == 0: client._import[import_handler.name][progress_id]["position"] = str( @@ -302,8 +302,8 @@ ) if depth == 0: - self.host.bridge.progressFinished(progress_id, return_data, client.profile) - self.host.removeProgressCb(progress_id, client.profile) + self.host.bridge.progress_finished(progress_id, return_data, client.profile) 
+ self.host.remove_progress_cb(progress_id, client.profile) del client._import[import_handler.name][progress_id] def register(self, import_handler, name, callback, short_desc="", long_desc=""): @@ -311,10 +311,10 @@ @param name(unicode): unique importer name, should indicate the software it can import and always lowercase @param callback(callable): method to call: - the signature must be (client, location, options) (cf. [doImport]) + the signature must be (client, location, options) (cf. [do_import]) the importer must return a tuple with (items_import_data, items_count) items_import_data(iterable[dict]) data specific to specialized importer - cf. importItem docstring of specialized importer for details + cf. import_item docstring of specialized importer for details items_count (int, None) indicate the total number of items (without subitems) useful to display a progress indicator when the iterator is a generator use None if you can't guess the total number of items diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_account.py --- a/sat/plugins/plugin_misc_account.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_account.py Sat Apr 08 13:54:42 2023 +0200 @@ -108,72 +108,72 @@ def __init__(self, host): log.info(_("Plugin Account initialization")) self.host = host - host.bridge.addMethod( - "registerSatAccount", + host.bridge.add_method( + "libervia_account_register", ".plugin", in_sign="sss", out_sign="", - method=self._registerAccount, + method=self._register_account, async_=True, ) - host.bridge.addMethod( - "getNewAccountDomain", + host.bridge.add_method( + "account_domain_new_get", ".plugin", in_sign="", out_sign="s", - method=self.getNewAccountDomain, + method=self.account_domain_new_get, async_=False, ) - host.bridge.addMethod( - "getAccountDialogUI", + host.bridge.add_method( + "account_dialog_ui_get", ".plugin", in_sign="s", out_sign="s", - method=self._getAccountDialogUI, + method=self._get_account_dialog_ui, async_=False, ) - 
host.bridge.addMethod( - "asyncConnectWithXMPPCredentials", + host.bridge.add_method( + "credentials_xmpp_connect", ".plugin", in_sign="ss", out_sign="b", - method=self.asyncConnectWithXMPPCredentials, + method=self.credentials_xmpp_connect, async_=True, ) - self.fixEmailAdmins() + self.fix_email_admins() self._sessions = Sessions() - self.__account_cb_id = host.registerCallback( - self._accountDialogCb, with_data=True + self.__account_cb_id = host.register_callback( + self._account_dialog_cb, with_data=True ) - self.__change_password_id = host.registerCallback( - self.__changePasswordCb, with_data=True + self.__change_password_id = host.register_callback( + self.__change_password_cb, with_data=True ) - def deleteBlogCallback(posts, comments): - return lambda data, profile: self.__deleteBlogPostsCb( + def delete_blog_callback(posts, comments): + return lambda data, profile: self.__delete_blog_posts_cb( posts, comments, data, profile ) - self.__delete_posts_id = host.registerCallback( - deleteBlogCallback(True, False), with_data=True + self.__delete_posts_id = host.register_callback( + delete_blog_callback(True, False), with_data=True ) - self.__delete_comments_id = host.registerCallback( - deleteBlogCallback(False, True), with_data=True + self.__delete_comments_id = host.register_callback( + delete_blog_callback(False, True), with_data=True ) - self.__delete_posts_comments_id = host.registerCallback( - deleteBlogCallback(True, True), with_data=True + self.__delete_posts_comments_id = host.register_callback( + delete_blog_callback(True, True), with_data=True ) - self.__delete_account_id = host.registerCallback( - self.__deleteAccountCb, with_data=True + self.__delete_account_id = host.register_callback( + self.__delete_account_cb, with_data=True ) # FIXME: remove this after some time, when the deprecated parameter is really abandoned - def fixEmailAdmins(self): + def fix_email_admins(self): """Handle deprecated config option "admin_email" to fix the admin emails 
list""" - admin_email = self.getConfig("admin_email") + admin_email = self.config_get("admin_email") if not admin_email: return log.warning( @@ -182,10 +182,10 @@ param_name = "email_admins_list" try: section = "" - value = self.host.memory.getConfig(section, param_name, Exception) + value = self.host.memory.config_get(section, param_name, Exception) except (configparser.NoOptionError, configparser.NoSectionError): section = CONFIG_SECTION - value = self.host.memory.getConfig( + value = self.host.memory.config_get( section, param_name, default_conf[param_name] ) @@ -193,13 +193,13 @@ value.add(admin_email) self.host.memory.config.set(section, param_name, ",".join(value)) - def getConfig(self, name, section=CONFIG_SECTION): + def config_get(self, name, section=CONFIG_SECTION): if name.startswith("email_"): # XXX: email_ parameters were first in [plugin account] section # but as it make more sense to have them in common with other plugins, # they can now be in [DEFAULT] section try: - value = self.host.memory.getConfig(None, name, Exception) + value = self.host.memory.config_get(None, name, Exception) except (configparser.NoOptionError, configparser.NoSectionError): pass else: @@ -209,9 +209,9 @@ default = default_conf[name] else: default = None - return self.host.memory.getConfig(section, name, default) + return self.host.memory.config_get(section, name, default) - def _registerAccount(self, email, password, profile): + def _register_account(self, email, password, profile): return self.registerAccount(email, password, None, profile) def registerAccount(self, email, password, jid_s, profile): @@ -226,11 +226,11 @@ @param profile @return Deferred """ - d = self.createProfile(password, jid_s, profile) - d.addCallback(lambda __: self.sendEmails(email, profile)) + d = self.create_profile(password, jid_s, profile) + d.addCallback(lambda __: self.send_emails(email, profile)) return d - def createProfile(self, password, jid_s, profile): + def create_profile(self, password, 
jid_s, profile): """Register a new profile and its associated XMPP account. @param password (unicode): password chosen by the user @@ -244,14 +244,14 @@ if not password or not profile: raise exceptions.DataError - if profile.lower() in self.getConfig("reserved_list"): + if profile.lower() in self.config_get("reserved_list"): return defer.fail(Failure(exceptions.ConflictError)) - d = self.host.memory.createProfile(profile, password) - d.addCallback(lambda __: self.profileCreated(password, jid_s, profile)) + d = self.host.memory.create_profile(profile, password) + d.addCallback(lambda __: self.profile_created(password, jid_s, profile)) return d - def profileCreated(self, password, jid_s, profile): + def profile_created(self, password, jid_s, profile): """Create the XMPP account and set the profile connection parameters. @param password (unicode): password chosen by the user @@ -265,30 +265,30 @@ d = defer.succeed(None) jid_ = jid.JID(jid_s) else: - jid_s = profile + "@" + self.getNewAccountDomain() + jid_s = profile + "@" + self.account_domain_new_get() jid_ = jid.JID(jid_s) - d = self.host.plugins["XEP-0077"].registerNewAccount(jid_, password) + d = self.host.plugins["XEP-0077"].register_new_account(jid_, password) def setParams(__): - self.host.memory.setParam( + self.host.memory.param_set( "JabberID", jid_s, "Connection", profile_key=profile ) - d = self.host.memory.setParam( + d = self.host.memory.param_set( "Password", password, "Connection", profile_key=profile ) return d - def removeProfile(failure): - self.host.memory.asyncDeleteProfile(profile) + def remove_profile(failure): + self.host.memory.profile_delete_async(profile) return failure - d.addCallback(lambda __: self.host.memory.startSession(password, profile)) + d.addCallback(lambda __: self.host.memory.start_session(password, profile)) d.addCallback(setParams) - d.addCallback(lambda __: self.host.memory.stopSession(profile)) - d.addErrback(removeProfile) + d.addCallback(lambda __: 
self.host.memory.stop_session(profile)) + d.addErrback(remove_profile) return d - def _sendEmailEb(self, failure_, email): + def _send_email_eb(self, failure_, email): # TODO: return error code to user log.error( _("Failed to send account creation confirmation to {email}: {msg}").format( @@ -296,13 +296,13 @@ ) ) - def sendEmails(self, email, profile): + def send_emails(self, email, profile): # time to send the email - domain = self.getNewAccountDomain() + domain = self.account_domain_new_get() # email to the administrators - admins_emails = self.getConfig("email_admins_list") + admins_emails = self.config_get("email_admins_list") if not admins_emails: log.warning( "No known admin email, we can't send email to administrator(s).\n" @@ -313,7 +313,7 @@ subject = _("New Libervia account created") # there is no email when an existing XMPP account is used body = f"New account created on {domain}: {profile} [{email or ''}]" - d_admin = sat_email.sendEmail( + d_admin = sat_email.send_email( self.host.memory.config, admins_emails, subject, body) admins_emails_txt = ", ".join(["<" + addr + ">" for addr in admins_emails]) @@ -333,7 +333,7 @@ # TODO: if use register with an existing account, an XMPP message should be sent return d_admin - jid_s = self.host.memory.getParamA( + jid_s = self.host.memory.param_get_a( "JabberID", "Connection", profile_key=profile ) subject = _("Your Libervia account has been created") @@ -342,20 +342,20 @@ # XXX: this will not fail when the email address doesn't exist # FIXME: check email reception to validate email given by the user # FIXME: delete the profile if the email could not been sent? 
- d_user = sat_email.sendEmail(self.host.memory.config, [email], subject, body) + d_user = sat_email.send_email(self.host.memory.config, [email], subject, body) d_user.addCallbacks( lambda __: log.debug( "Account creation confirmation sent to <{}>".format(email) ), - self._sendEmailEb, + self._send_email_eb, errbackArgs=[email] ) return defer.DeferredList([d_user, d_admin]) - def getNewAccountDomain(self): + def account_domain_new_get(self): """get the domain that will be set to new account""" - domain = self.getConfig("new_account_domain") or self.getConfig( + domain = self.config_get("new_account_domain") or self.config_get( "xmpp_domain", None ) if not domain: @@ -367,7 +367,7 @@ return DEFAULT_DOMAIN return domain - def _getAccountDialogUI(self, profile): + def _get_account_dialog_ui(self, profile): """Get the main dialog to manage your account @param menu_data @param profile: %(doc_profile)s @@ -381,7 +381,7 @@ ) tab_container = form_ui.current_container - tab_container.addTab( + tab_container.add_tab( "update", D_("Change your password"), container=xml_tools.PairsContainer ) form_ui.addLabel(D_("Current profile password")) @@ -394,7 +394,7 @@ # FIXME: uncomment and fix these features """ if 'GROUPBLOG' in self.host.plugins: - tab_container.addTab("delete_posts", D_("Delete your posts"), container=xml_tools.PairsContainer) + tab_container.add_tab("delete_posts", D_("Delete your posts"), container=xml_tools.PairsContainer) form_ui.addLabel(D_("Current profile password")) form_ui.addPassword("delete_posts_passwd", value="") form_ui.addLabel(D_("Delete all your posts and their comments")) @@ -402,7 +402,7 @@ form_ui.addLabel(D_("Delete all your comments on other's posts")) form_ui.addBool("delete_comments_checkbox", "false") - tab_container.addTab("delete", D_("Delete your account"), container=xml_tools.PairsContainer) + tab_container.add_tab("delete", D_("Delete your account"), container=xml_tools.PairsContainer) form_ui.addLabel(D_("Current profile password")) 
form_ui.addPassword("delete_passwd", value="") form_ui.addLabel(D_("Delete your account")) @@ -412,12 +412,12 @@ return form_ui.toXml() @defer.inlineCallbacks - def _accountDialogCb(self, data, profile): + def _account_dialog_cb(self, data, profile): """Called when the user submits the main account dialog @param data @param profile """ - sat_cipher = yield self.host.memory.asyncGetParamA( + sat_cipher = yield self.host.memory.param_get_a_async( C.PROFILE_PASS_PATH[1], C.PROFILE_PASS_PATH[0], profile_key=profile ) @@ -442,7 +442,7 @@ verified = yield verify(delete_passwd) assert isinstance(verified, bool) if verified: - defer.returnValue(self.__deleteAccount(profile)) + defer.returnValue(self.__delete_account(profile)) defer.returnValue(error_ui()) # check for blog posts deletion @@ -456,7 +456,7 @@ verified = yield verify(delete_posts_passwd) assert isinstance(verified, bool) if verified: - defer.returnValue(self.__deleteBlogPosts(posts, comments, profile)) + defer.returnValue(self.__delete_blog_posts(posts, comments, profile)) defer.returnValue(error_ui()) """ @@ -469,7 +469,7 @@ assert isinstance(verified, bool) if verified: if new_passwd1 == new_passwd2: - data = yield self.__changePassword(new_passwd1, profile=profile) + data = yield self.__change_password(new_passwd1, profile=profile) defer.returnValue(data) else: defer.returnValue( @@ -481,13 +481,13 @@ defer.returnValue({}) - def __changePassword(self, password, profile): + def __change_password(self, password, profile): """Ask for a confirmation before changing the XMPP account and SàT profile passwords. 
@param password (str): the new password @param profile (str): %(doc_profile)s """ - session_id, __ = self._sessions.newSession( + session_id, __ = self._sessions.new_session( {"new_password": password}, profile=profile ) form_ui = xml_tools.XMLUI( @@ -504,24 +504,24 @@ form_ui.addText(D_("Continue with changing the password?")) return {"xmlui": form_ui.toXml()} - def __changePasswordCb(self, data, profile): + def __change_password_cb(self, data, profile): """Actually change the user XMPP account and SàT profile password @param data (dict) @profile (str): %(doc_profile)s """ - client = self.host.getClient(profile) - password = self._sessions.profileGet(data["session_id"], profile)["new_password"] + client = self.host.get_client(profile) + password = self._sessions.profile_get(data["session_id"], profile)["new_password"] del self._sessions[data["session_id"]] - def passwordChanged(__): - d = self.host.memory.setParam( + def password_changed(__): + d = self.host.memory.param_set( C.PROFILE_PASS_PATH[1], password, C.PROFILE_PASS_PATH[0], profile_key=profile, ) d.addCallback( - lambda __: self.host.memory.setParam( + lambda __: self.host.memory.param_set( "Password", password, "Connection", profile_key=profile ) ) @@ -536,11 +536,11 @@ ) return defer.succeed({"xmlui": error_ui.toXml()}) - d = self.host.plugins["XEP-0077"].changePassword(client, password) - d.addCallbacks(passwordChanged, errback) + d = self.host.plugins["XEP-0077"].change_password(client, password) + d.addCallbacks(password_changed, errback) return d - def __deleteAccount(self, profile): + def __delete_account(self, profile): """Ask for a confirmation before deleting the XMPP account and SàT profile @param profile """ @@ -561,7 +561,7 @@ D_( "All your data stored on %(server)s, including your %(target)s will be erased." 
) - % {"server": self.getNewAccountDomain(), "target": target} + % {"server": self.account_domain_new_get(), "target": target} ) form_ui.addText( D_( @@ -570,26 +570,26 @@ ) return {"xmlui": form_ui.toXml()} - def __deleteAccountCb(self, data, profile): + def __delete_account_cb(self, data, profile): """Actually delete the XMPP account and SàT profile @param data @param profile """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) - def userDeleted(__): + def user_deleted(__): # FIXME: client should be disconnected at this point, so 2 next loop should be removed (to be confirmed) for jid_ in client.roster._jids: # empty roster client.presence.unsubscribe(jid_) - for jid_ in self.host.memory.getWaitingSub( + for jid_ in self.host.memory.sub_waiting_get( profile ): # delete waiting subscriptions - self.host.memory.delWaitingSub(jid_) + self.host.memory.del_waiting_sub(jid_) - delete_profile = lambda: self.host.memory.asyncDeleteProfile( + delete_profile = lambda: self.host.memory.profile_delete_async( profile, force=True ) if "GROUPBLOG" in self.host.plugins: @@ -611,10 +611,10 @@ return defer.succeed({"xmlui": error_ui.toXml()}) d = self.host.plugins["XEP-0077"].unregister(client, jid.JID(client.jid.host)) - d.addCallbacks(userDeleted, errback) + d.addCallbacks(user_deleted, errback) return d - def __deleteBlogPosts(self, posts, comments, profile): + def __delete_blog_posts(self, posts, comments, profile): """Ask for a confirmation before deleting the blog posts @param posts: delete all posts of the user (and their comments) @param comments: delete all the comments of the user on other's posts @@ -678,7 +678,7 @@ return {"xmlui": form_ui.toXml()} - def __deleteBlogPostsCb(self, posts, comments, data, profile): + def __delete_blog_posts_cb(self, posts, comments, data, profile): """Actually delete the XMPP account and SàT profile @param posts: delete all posts of the user (and their comments) @param comments: delete all the comments of 
the user on other's posts @@ -723,7 +723,7 @@ d.addCallbacks(deleted, errback) return d - def asyncConnectWithXMPPCredentials(self, jid_s, password): + def credentials_xmpp_connect(self, jid_s, password): """Create and connect a new SàT profile using the given XMPP credentials. Re-use given JID and XMPP password for the profile name and profile password. @@ -733,34 +733,34 @@ @raise exceptions.PasswordError, exceptions.ConflictError """ try: # be sure that the profile doesn't exist yet - self.host.memory.getProfileName(jid_s) + self.host.memory.get_profile_name(jid_s) except exceptions.ProfileUnknownError: pass else: raise exceptions.ConflictError - d = self.createProfile(password, jid_s, jid_s) + d = self.create_profile(password, jid_s, jid_s) d.addCallback( - lambda __: self.host.memory.getProfileName(jid_s) + lambda __: self.host.memory.get_profile_name(jid_s) ) # checks if the profile has been successfuly created d.addCallback(lambda profile: defer.ensureDeferred( self.host.connect(profile, password, {}, 0))) def connected(result): - self.sendEmails(None, profile=jid_s) + self.send_emails(None, profile=jid_s) return result - def removeProfile( + def remove_profile( failure ): # profile has been successfully created but the XMPP credentials are wrong! log.debug( "Removing previously auto-created profile: %s" % failure.getErrorMessage() ) - self.host.memory.asyncDeleteProfile(jid_s) + self.host.memory.profile_delete_async(jid_s) raise failure # FIXME: we don't catch the case where the JID host is not an XMPP server, and the user # has to wait until the DBUS timeout ; as a consequence, emails are sent to the admins - # and the profile is not deleted. When the host exists, removeProfile is well called. - d.addCallbacks(connected, removeProfile) + # and the profile is not deleted. When the host exists, remove_profile is well called. 
+ d.addCallbacks(connected, remove_profile) return d diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_android.py --- a/sat/plugins/plugin_misc_android.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_android.py Sat Apr 08 13:54:42 2023 +0200 @@ -143,7 +143,7 @@ notification_intent.setFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP) notification_intent.setAction(Intent.ACTION_MAIN) - notification_intent.addCategory(Intent.CATEGORY_LAUNCHER) + notification_intent.add_category(Intent.CATEGORY_LAUNCHER) if sat_action is not None: action_data = AndroidString(json.dumps(sat_action).encode()) log.debug(f"adding extra {INTENT_EXTRA_ACTION} ==> {action_data}") @@ -233,10 +233,10 @@ category_label=D_(PARAM_VIBRATE_CATEGORY), vibrate_param_name=PARAM_VIBRATE_NAME, vibrate_param_label=PARAM_VIBRATE_LABEL, - vibrate_options=params.makeOptions(VIBRATION_OPTS, "always"), + vibrate_options=params.make_options(VIBRATION_OPTS, "always"), ring_param_name=PARAM_RING_NAME, ring_param_label=PARAM_RING_LABEL, - ring_options=params.makeOptions(RING_OPTS, "normal"), + ring_options=params.make_options(RING_OPTS, "normal"), ) def __init__(self, host): @@ -245,7 +245,7 @@ self.host = host self._csi = host.plugins.get('XEP-0352') self._csi_timer = None - host.memory.updateParams(self.params) + host.memory.update_params(self.params) try: os.mkdir(SOCKET_DIR, 0o700) except OSError as e: @@ -268,15 +268,15 @@ raise e # we set a low priority because we want the notification to be sent after all # plugins have done their job - host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=-1000) + host.trigger.add("messageReceived", self.message_received_trigger, priority=-1000) # profiles autoconnection - host.bridge.addMethod( - "profileAutoconnectGet", + host.bridge.add_method( + "profile_autoconnect_get", ".plugin", in_sign="", out_sign="s", - method=self._profileAutoconnectGet, + method=self._profile_autoconnect_get, async_=True, ) @@ -284,7 +284,7 @@ self.am = 
activity.getSystemService(Context.AUDIO_SERVICE) # sound notification - media_dir = Path(host.memory.getConfig("", "media_dir")) + media_dir = Path(host.memory.config_get("", "media_dir")) assert media_dir is not None notif_path = media_dir / "sounds" / "notifications" / "music-box.mp3" self.notif_player = MediaPlayer() @@ -297,20 +297,20 @@ log.info("SSL Android patch applied") # DNS fix - defer.ensureDeferred(self.updateResolver()) + defer.ensureDeferred(self.update_resolver()) # Connectivity handling self.cm = activity.getSystemService(Context.CONNECTIVITY_SERVICE) self._net_type = None - d = defer.ensureDeferred(self._checkConnectivity()) - d.addErrback(host.logErrback) + d = defer.ensureDeferred(self._check_connectivity()) + d.addErrback(host.log_errback) # XXX: we need to keep a reference to BroadcastReceiver to avoid # "XXX has no attribute 'invoke'" error (looks like the same issue as # https://github.com/kivy/pyjnius/issues/59) self.br = BroadcastReceiver( callback=lambda *args, **kwargs: reactor.callFromThread( - self.onConnectivityChange + self.on_connectivity_change ), actions=["android.net.conn.CONNECTIVITY_CHANGE"] ) @@ -326,29 +326,29 @@ previous_state = self._state self._state = new_state if new_state == STATE_RUNNING: - self._onRunning(previous_state) + self._on_running(previous_state) elif new_state == STATE_PAUSED: - self._onPaused(previous_state) + self._on_paused(previous_state) elif new_state == STATE_STOPPED: - self._onStopped(previous_state) + self._on_stopped(previous_state) @property def cagou_active(self): return self._state == STATE_RUNNING - def _onRunning(self, previous_state): + def _on_running(self, previous_state): if previous_state is not None: - self.host.bridge.bridgeReactivateSignals() - self.setActive() + self.host.bridge.bridge_reactivate_signals() + self.set_active() - def _onPaused(self, previous_state): - self.host.bridge.bridgeDeactivateSignals() - self.setInactive() + def _on_paused(self, previous_state): + 
self.host.bridge.bridge_deactivate_signals() + self.set_inactive() - def _onStopped(self, previous_state): - self.setInactive() + def _on_stopped(self, previous_state): + self.set_inactive() - def _notifyMessage(self, mess_data, client): + def _notify_message(self, mess_data, client): """Send notification when suitable notification is sent if: @@ -378,7 +378,7 @@ ringer_mode = self.am.getRingerMode() vibrate_mode = ringer_mode == AudioManager.RINGER_MODE_VIBRATE - ring_setting = self.host.memory.getParamA( + ring_setting = self.host.memory.param_get_a( PARAM_RING_NAME, PARAM_RING_CATEGORY, profile_key=client.profile @@ -387,7 +387,7 @@ if ring_setting != 'never' and ringer_mode == AudioManager.RINGER_MODE_NORMAL: self.notif_player.start() - vibration_setting = self.host.memory.getParamA( + vibration_setting = self.host.memory.param_get_a( PARAM_VIBRATE_NAME, PARAM_VIBRATE_CATEGORY, profile_key=client.profile @@ -400,27 +400,27 @@ log.warning("Can't use vibrator: {e}".format(e=e)) return mess_data - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): if not self.cagou_active: # we only send notification is the frontend is not displayed - post_treat.addCallback(self._notifyMessage, client) + post_treat.addCallback(self._notify_message, client) return True # Profile autoconnection - def _profileAutoconnectGet(self): - return defer.ensureDeferred(self.profileAutoconnectGet()) + def _profile_autoconnect_get(self): + return defer.ensureDeferred(self.profile_autoconnect_get()) - async def _getProfilesAutoconnect(self): - autoconnect_dict = await self.host.memory.storage.getIndParamValues( + async def _get_profiles_autoconnect(self): + autoconnect_dict = await self.host.memory.storage.get_ind_param_values( category='Connection', name='autoconnect_backend', ) return [p for p, v in autoconnect_dict.items() if C.bool(v)] - async def profileAutoconnectGet(self): + async def 
profile_autoconnect_get(self): """Return profile to connect automatically by frontend, if any""" - profiles_autoconnect = await self._getProfilesAutoconnect() + profiles_autoconnect = await self._get_profiles_autoconnect() if not profiles_autoconnect: return None if len(profiles_autoconnect) > 1: @@ -431,56 +431,56 @@ # CSI - def _setInactive(self): + def _set_inactive(self): self._csi_timer = None - for client in self.host.getClients(C.PROF_KEY_ALL): - self._csi.setInactive(client) + for client in self.host.get_clients(C.PROF_KEY_ALL): + self._csi.set_inactive(client) - def setInactive(self): + def set_inactive(self): if self._csi is None or self._csi_timer is not None: return - self._csi_timer = reactor.callLater(CSI_DELAY, self._setInactive) + self._csi_timer = reactor.callLater(CSI_DELAY, self._set_inactive) - def setActive(self): + def set_active(self): if self._csi is None: return if self._csi_timer is not None: self._csi_timer.cancel() self._csi_timer = None - for client in self.host.getClients(C.PROF_KEY_ALL): - self._csi.setActive(client) + for client in self.host.get_clients(C.PROF_KEY_ALL): + self._csi.set_active(client) # Connectivity - async def _handleNetworkChange(self, net_type): + async def _handle_network_change(self, net_type): """Notify the clients about network changes. This way the client can disconnect/reconnect transport, or change delays """ log.debug(f"handling network change ({net_type})") if net_type == NET_TYPE_NONE: - for client in self.host.getClients(C.PROF_KEY_ALL): - client.networkDisabled() + for client in self.host.get_clients(C.PROF_KEY_ALL): + client.network_disabled() else: # DNS servers may have changed - await self.updateResolver() + await self.update_resolver() # client may be there but disabled (e.g. 
with stream management) - for client in self.host.getClients(C.PROF_KEY_ALL): + for client in self.host.get_clients(C.PROF_KEY_ALL): log.debug(f"enabling network for {client.profile}") - client.networkEnabled() + client.network_enabled() # profiles may have been disconnected and then purged, we try # to reconnect them in case - profiles_autoconnect = await self._getProfilesAutoconnect() + profiles_autoconnect = await self._get_profiles_autoconnect() for profile in profiles_autoconnect: - if not self.host.isConnected(profile): + if not self.host.is_connected(profile): log.info(f"{profile} is not connected, reconnecting it") try: await self.host.connect(profile) except Exception as e: log.error(f"Can't connect profile {profile}: {e}") - async def _checkConnectivity(self): + async def _check_connectivity(self): active_network = self.cm.getActiveNetworkInfo() if active_network is None: net_type = NET_TYPE_NONE @@ -506,24 +506,24 @@ log.info("network activated (type={net_type_android})" .format(net_type_android=net_type_android)) else: - log.debug("_checkConnectivity called without network change ({net_type})" + log.debug("_check_connectivity called without network change ({net_type})" .format(net_type = net_type)) - # we always call _handleNetworkChange even if there is not connectivity change + # we always call _handle_network_change even if there is not connectivity change # to be sure to reconnect when necessary - await self._handleNetworkChange(net_type) + await self._handle_network_change(net_type) - def onConnectivityChange(self): - log.debug("onConnectivityChange called") - d = defer.ensureDeferred(self._checkConnectivity()) - d.addErrback(self.host.logErrback) + def on_connectivity_change(self): + log.debug("on_connectivity_change called") + d = defer.ensureDeferred(self._check_connectivity()) + d.addErrback(self.host.log_errback) - async def updateResolver(self): + async def update_resolver(self): # There is no "/etc/resolv.conf" on Android, which confuse 
Twisted and makes # SRV record checking unusable. We fixe that by checking DNS server used, and # updating Twisted's resolver accordingly - dns_servers = await self.getDNSServers() + dns_servers = await self.get_dns_servers() log.info( "Patching Twisted to use Android DNS resolver ({dns_servers})".format( @@ -531,7 +531,7 @@ ) dns_client.theResolver = dns_client.createResolver(servers=dns_servers) - async def getDNSServers(self): + async def get_dns_servers(self): servers = [] if api_version < 26: diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_app_manager.py --- a/sat/plugins/plugin_misc_app_manager.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_app_manager.py Sat Apr 08 13:54:42 2023 +0200 @@ -84,31 +84,31 @@ self._started = {} # instance id to app data map self._instances = {} - host.bridge.addMethod( - "applicationsList", + host.bridge.add_method( + "applications_list", ".plugin", in_sign="as", out_sign="as", method=self.list_applications, ) - host.bridge.addMethod( - "applicationStart", + host.bridge.add_method( + "application_start", ".plugin", in_sign="ss", out_sign="s", method=self._start, async_=True, ) - host.bridge.addMethod( - "applicationStop", + host.bridge.add_method( + "application_stop", ".plugin", in_sign="sss", out_sign="", method=self._stop, async_=True, ) - host.bridge.addMethod( - "applicationExposedGet", + host.bridge.add_method( + "application_exposed_get", ".plugin", in_sign="sss", out_sign="s", @@ -117,12 +117,12 @@ ) # application has been started succeesfully, # args: name, instance_id, extra - host.bridge.addSignal( + host.bridge.add_signal( "application_started", ".plugin", signature="sss" ) # application went wrong with the application # args: name, instance_id, extra - host.bridge.addSignal( + host.bridge.add_signal( "application_error", ".plugin", signature="sss" ) yaml.add_constructor( @@ -169,7 +169,7 @@ "expected" ) - value = self.host.memory.getConfig(section, name, default) + value = 
self.host.memory.config_get(section, name, default) # FIXME: "public_url" is used only here and doesn't take multi-sites into account if name == "public_url" and (not value or value.startswith('http')): if not value: @@ -408,7 +408,7 @@ log.info(f"{app_name!r} is already started or being started") return ret_data else: - cache_path = self.host.memory.getCachePath( + cache_path = self.host.memory.get_cache_path( PLUGIN_INFO[C.PI_IMPORT_NAME], app_name ) cache_path.mkdir(0o700, parents=True, exist_ok=True) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_attach.py --- a/sat/plugins/plugin_misc_attach.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_attach.py Sat Apr 08 13:54:42 2023 +0200 @@ -58,10 +58,10 @@ log.info(_("plugin Attach initialization")) self.host = host self._u = host.plugins["UPLOAD"] - host.trigger.add("sendMessage", self._sendMessageTrigger) - host.trigger.add("sendMessageComponent", self._sendMessageTrigger) + host.trigger.add("sendMessage", self._send_message_trigger) + host.trigger.add("sendMessageComponent", self._send_message_trigger) self._attachments_handlers = {'clear': [], 'encrypted': []} - self.register(self.defaultCanHandle, self.defaultAttach, False, -1000) + self.register(self.default_can_handle, self.default_attach, False, -1000) def register(self, can_handle, attach, encrypted=False, priority=0): """Register an attachments handler @@ -94,7 +94,7 @@ handlers.sort(key=lambda h: h.priority, reverse=True) log.debug(f"new attachments handler: {handler}") - async def attachFiles(self, client, data): + async def attach_files(self, client, data): """Main method to attach file It will do generic pre-treatment, and call the suitable attachments handler @@ -135,13 +135,13 @@ _("Can't resize attachment of type {main_type!r}: {attachment}") .format(main_type=main_type, attachment=attachment)) - if client.encryption.isEncryptionRequested(data): + if client.encryption.is_encryption_requested(data): handlers = 
self._attachments_handlers['encrypted'] else: handlers = self._attachments_handlers['clear'] for handler in handlers: - can_handle = await utils.asDeferred(handler.can_handle, client, data) + can_handle = await utils.as_deferred(handler.can_handle, client, data) if can_handle: break else: @@ -150,7 +150,7 @@ destinee = data['to'] )) - await utils.asDeferred(handler.attach, client, data) + await utils.as_deferred(handler.attach, client, data) for dir_path in tmp_dirs_to_clean: log.debug(f"Cleaning temporary directory at {dir_path}") @@ -220,7 +220,7 @@ progress_id = attachment.pop("progress_id", None) if progress_id: extra["progress_id"] = progress_id - check_certificate = self.host.memory.getParamA( + check_certificate = self.host.memory.param_get_a( "check_certificate", "Connection", profile_key=client.profile) if not check_certificate: extra['ignore_tls_errors'] = True @@ -251,19 +251,19 @@ return data - def _attachFiles(self, data, client): - return defer.ensureDeferred(self.attachFiles(client, data)) + def _attach_files(self, data, client): + return defer.ensureDeferred(self.attach_files(client, data)) - def _sendMessageTrigger( + def _send_message_trigger( self, client, mess_data, pre_xml_treatments, post_xml_treatments): if mess_data['extra'].get(C.KEY_ATTACHMENTS): - post_xml_treatments.addCallback(self._attachFiles, client=client) + post_xml_treatments.addCallback(self._attach_files, client=client) return True - async def defaultCanHandle(self, client, data): + async def default_can_handle(self, client, data): return True - async def defaultAttach(self, client, data): + async def default_attach(self, client, data): await self.upload_files(client, data) # TODO: handle xhtml-im body_elt = data["xml"].body diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_debug.py --- a/sat/plugins/plugin_misc_debug.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_debug.py Sat Apr 08 13:54:42 2023 +0200 @@ -40,15 +40,15 @@ def __init__(self, host): 
log.info(_("Plugin Debug initialization")) self.host = host - host.bridge.addMethod( - "debugFakeSignal", + host.bridge.add_method( + "debug_signal_fake", ".plugin", in_sign="sss", out_sign="", - method=self._fakeSignal, + method=self._fake_signal, ) - def _fakeSignal(self, signal, arguments, profile_key): + def _fake_signal(self, signal, arguments, profile_key): """send a signal from backend @param signal(str): name of the signal @@ -58,6 +58,6 @@ args = json.loads(arguments) method = getattr(self.host.bridge, signal) if profile_key != C.PROF_KEY_NONE: - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) args.append(profile) method(*args) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_download.py --- a/sat/plugins/plugin_misc_download.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_download.py Sat Apr 08 13:54:42 2023 +0200 @@ -54,20 +54,20 @@ def __init__(self, host): log.info(_("plugin Download initialization")) self.host = host - host.bridge.addMethod( - "fileDownload", + host.bridge.add_method( + "file_download", ".plugin", in_sign="ssss", out_sign="s", - method=self._fileDownload, + method=self._file_download, async_=True, ) - host.bridge.addMethod( - "fileDownloadComplete", + host.bridge.add_method( + "file_download_complete", ".plugin", in_sign="ssss", out_sign="s", - method=self._fileDownloadComplete, + method=self._file_download_complete, async_=True, ) self._download_callbacks = {} @@ -75,11 +75,11 @@ self.register_scheme('http', self.download_http) self.register_scheme('https', self.download_http) - def _fileDownload( + def _file_download( self, attachment_s: str, dest_path: str, extra_s: str, profile: str ) -> defer.Deferred: d = defer.ensureDeferred(self.file_download( - self.host.getClient(profile), + self.host.get_client(profile), data_format.deserialise(attachment_s), Path(dest_path), data_format.deserialise(extra_s) @@ -118,11 +118,11 @@ else: return 
{"progress": progress_id} - def _fileDownloadComplete( + def _file_download_complete( self, attachment_s: str, dest_path: str, extra_s: str, profile: str ) -> defer.Deferred: d = defer.ensureDeferred(self.file_download_complete( - self.host.getClient(profile), + self.host.get_client(profile), data_format.deserialise(attachment_s), Path(dest_path), data_format.deserialise(extra_s) @@ -168,7 +168,7 @@ # we hash the URL to have an unique identifier, and avoid double download url_hash = hashlib.sha256(uri_parsed.geturl().encode()).hexdigest() cache_uid = f"{stem}_{url_hash}" - cache_data = client.cache.getMetadata(cache_uid) + cache_data = client.cache.get_metadata(cache_uid) if cache_data is not None: # file is already in cache, we return it download_d = defer.succeed(cache_data['path']) @@ -176,14 +176,14 @@ else: # the file is not in cache unique_name = '.'.join([cache_uid] + suffixes) - with client.cache.cacheData( + with client.cache.cache_data( "DOWNLOAD", cache_uid, filename=unique_name) as f: # we close the file and only use its name, the file will be opened # by the registered callback dest_path = Path(f.name) # should we check certificates? 
- check_certificate = self.host.memory.getParamA( + check_certificate = self.host.memory.param_get_a( "check_certificate", "Connection", profile_key=client.profile) if not check_certificate: extra['ignore_tls_errors'] = True @@ -203,7 +203,7 @@ "Can't download URI {uri}: {reason}").format( uri=uri, reason=e)) if cache_uid is not None: - client.cache.removeFromCache(cache_uid) + client.cache.remove_from_cache(cache_uid) elif dest_path.exists(): dest_path.unlink() raise e diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_email_invitation.py --- a/sat/plugins/plugin_misc_email_invitation.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_email_invitation.py Sat Apr 08 13:54:42 2023 +0200 @@ -77,30 +77,30 @@ log.info(_("plugin Invitations initialization")) self.host = host self.invitations = persistent.LazyPersistentBinaryDict('invitations') - host.bridge.addMethod("invitationCreate", ".plugin", in_sign='sasssssssssa{ss}s', + host.bridge.add_method("invitation_create", ".plugin", in_sign='sasssssssssa{ss}s', out_sign='a{ss}', method=self._create, async_=True) - host.bridge.addMethod("invitationGet", ".plugin", in_sign='s', out_sign='a{ss}', + host.bridge.add_method("invitation_get", ".plugin", in_sign='s', out_sign='a{ss}', method=self.get, async_=True) - host.bridge.addMethod("invitationDelete", ".plugin", in_sign='s', out_sign='', + host.bridge.add_method("invitation_delete", ".plugin", in_sign='s', out_sign='', method=self._delete, async_=True) - host.bridge.addMethod("invitationModify", ".plugin", in_sign='sa{ss}b', + host.bridge.add_method("invitation_modify", ".plugin", in_sign='sa{ss}b', out_sign='', method=self._modify, async_=True) - host.bridge.addMethod("invitationList", ".plugin", in_sign='s', + host.bridge.add_method("invitation_list", ".plugin", in_sign='s', out_sign='a{sa{ss}}', method=self._list, async_=True) - host.bridge.addMethod("invitationSimpleCreate", ".plugin", in_sign='sssss', + 
host.bridge.add_method("invitation_simple_create", ".plugin", in_sign='sssss', out_sign='a{ss}', - method=self._simpleCreate, + method=self._simple_create, async_=True) - def checkExtra(self, extra): + def check_extra(self, extra): if EXTRA_RESERVED.intersection(extra): raise ValueError( _("You can't use following key(s) in extra, they are reserved: {}") @@ -132,7 +132,7 @@ kwargs[key] = str(value) return defer.ensureDeferred(self.create(**kwargs)) - async def getExistingInvitation(self, email: Optional[str]) -> Optional[dict]: + async def get_existing_invitation(self, email: Optional[str]) -> Optional[dict]: """Retrieve existing invitation with given email @param email: check if any invitation exist with this email @@ -151,7 +151,7 @@ invitation[KEY_ID] = id_ return invitation - async def _createAccountAndProfile( + async def _create_account_and_profile( self, id_: str, kwargs: dict, @@ -161,7 +161,7 @@ ## XMPP account creation password = kwargs.pop('password', None) if password is None: - password = utils.generatePassword() + password = utils.generate_password() assert password # XXX: password is here saved in clear in database # it is needed for invitation as the same password is used for profile @@ -173,7 +173,7 @@ jid_ = kwargs.pop('jid_', None) if not jid_: - domain = self.host.memory.getConfig(None, 'xmpp_domain') + domain = self.host.memory.config_get(None, 'xmpp_domain') if not domain: # TODO: fallback to profile's domain raise ValueError(_("You need to specify xmpp_domain in sat.conf")) @@ -186,7 +186,7 @@ # we don't register account if there is no user as anonymous login is then # used try: - await self.host.plugins['XEP-0077'].registerNewAccount(jid_, password) + await self.host.plugins['XEP-0077'].register_new_account(jid_, password) except error.StanzaError as e: prefix = jid_.user idx = 0 @@ -197,7 +197,7 @@ log.info(_("requested jid already exists, trying with {}".format( jid_.full()))) try: - await self.host.plugins['XEP-0077'].registerNewAccount( 
+ await self.host.plugins['XEP-0077'].register_new_account( jid_, password ) @@ -216,11 +216,11 @@ uuid=id_ ) # profile creation should not fail as we generate unique name ourselves - await self.host.memory.createProfile(guest_profile, password) - await self.host.memory.startSession(password, guest_profile) - await self.host.memory.setParam("JabberID", jid_.full(), "Connection", + await self.host.memory.create_profile(guest_profile, password) + await self.host.memory.start_session(password, guest_profile) + await self.host.memory.param_set("JabberID", jid_.full(), "Connection", profile_key=guest_profile) - await self.host.memory.setParam("Password", password, "Connection", + await self.host.memory.param_set("Password", password, "Connection", profile_key=guest_profile) async def create(self, **kwargs): @@ -290,11 +290,11 @@ _("You can't use following key(s) in both args and extra: {}").format( ', '.join(set(kwargs).intersection(extra)))) - self.checkExtra(extra) + self.check_extra(extra) email = kwargs.pop('email', None) - existing = await self.getExistingInvitation(email) + existing = await self.get_existing_invitation(email) if existing is not None: log.info(f"There is already an invitation for {email!r}") extra.update(existing) @@ -316,7 +316,7 @@ id_ = existing[KEY_ID] if existing else str(shortuuid.uuid()) if existing is None: - await self._createAccountAndProfile(id_, kwargs, extra) + await self._create_account_and_profile(id_, kwargs, extra) profile = kwargs.pop('profile', None) guest_profile = extra[KEY_GUEST_PROFILE] @@ -333,8 +333,8 @@ pass else: await self.host.connect(guest_profile, password) - guest_client = self.host.getClient(guest_profile) - await id_plugin.setIdentity(guest_client, {'nicknames': [name]}) + guest_client = self.host.get_client(guest_profile) + await id_plugin.set_identity(guest_client, {'nicknames': [name]}) await self.host.disconnect(guest_profile) ## email @@ -370,7 +370,7 @@ invite_url = url_template.format(**format_args) 
format_args['url'] = invite_url - await sat_email.sendEmail( + await sat_email.send_email( self.host.memory.config, [email] + emails_extra, (kwargs.pop('message_subject', None) or DEFAULT_SUBJECT).format( @@ -384,11 +384,11 @@ # FIXME: a parameter to disable auto roster adding would be nice if profile is not None: try: - client = self.host.getClient(profile) + client = self.host.get_client(profile) except Exception as e: log.error(f"Can't get host profile: {profile}: {e}") else: - await self.host.updateContact(client, jid_, name, ['guests']) + await self.host.contact_update(client, jid_, name, ['guests']) if kwargs: log.warning(_("Not all arguments have been consumed: {}").format(kwargs)) @@ -400,20 +400,20 @@ return extra - def _simpleCreate(self, invitee_email, invitee_name, url_template, extra_s, profile): - client = self.host.getClient(profile) + def _simple_create(self, invitee_email, invitee_name, url_template, extra_s, profile): + client = self.host.get_client(profile) # FIXME: needed because python-dbus use a specific string class invitee_email = str(invitee_email) invitee_name = str(invitee_name) url_template = str(url_template) extra = data_format.deserialise(extra_s) d = defer.ensureDeferred( - self.simpleCreate(client, invitee_email, invitee_name, url_template, extra) + self.simple_create(client, invitee_email, invitee_name, url_template, extra) ) d.addCallback(lambda data: {k: str(v) for k,v in data.items()}) return d - async def simpleCreate( + async def simple_create( self, client, invitee_email, invitee_name, url_template, extra): """Simplified method to invite somebody by email""" return await self.create( @@ -443,7 +443,7 @@ password = data['password'] try: await self.host.connect(guest_profile, password) - guest_client = self.host.getClient(guest_profile) + guest_client = self.host.get_client(guest_profile) # XXX: be extra careful to use guest_client and not client below, as this will # delete the associated XMPP account log.debug("deleting XMPP 
account") @@ -453,7 +453,7 @@ f"Can't delete {guest_profile}'s XMPP account, maybe it as already been " f"deleted: {e}") try: - await self.host.memory.asyncDeleteProfile(guest_profile, True) + await self.host.memory.profile_delete_async(guest_profile, True) except Exception as e: log.warning(f"Can't delete guest profile {guest_profile}: {e}") log.debug("removing guest data") @@ -474,8 +474,8 @@ else update them @raise KeyError: there is not invitation with this id_ """ - self.checkExtra(new_extra) - def gotCurrentData(current_data): + self.check_extra(new_extra) + def got_current_data(current_data): if replace: new_data = new_extra for k in EXTRA_RESERVED: @@ -500,7 +500,7 @@ self.invitations[id_] = new_data d = self.invitations[id_] - d.addCallback(gotCurrentData) + d.addCallback(got_current_data) return d def _list(self, profile=C.PROF_KEY_NONE): diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_extra_pep.py --- a/sat/plugins/plugin_misc_extra_pep.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_extra_pep.py Sat Apr 08 13:54:42 2023 +0200 @@ -62,13 +62,13 @@ "category_label": D_(PARAM_KEY), "param_name": PARAM_NAME, "param_label": D_(PARAM_LABEL), - "jids": "\n".join({elt.toXml() for elt in params.createJidElts(PARAM_DEFAULT)}), + "jids": "\n".join({elt.toXml() for elt in params.create_jid_elts(PARAM_DEFAULT)}), } def __init__(self, host): log.info(_("Plugin Extra PEP initialization")) self.host = host - host.memory.updateParams(self.params) + host.memory.update_params(self.params) - def getFollowedEntities(self, profile_key): - return self.host.memory.getParamA(PARAM_NAME, PARAM_KEY, profile_key=profile_key) + def get_followed_entities(self, profile_key): + return self.host.memory.param_get_a(PARAM_NAME, PARAM_KEY, profile_key=profile_key) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_file.py --- a/sat/plugins/plugin_misc_file.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_file.py Sat Apr 08 13:54:42 2023 
+0200 @@ -69,24 +69,24 @@ def __init__(self, host): log.info(_("plugin File initialization")) self.host = host - host.bridge.addMethod( - "fileSend", + host.bridge.add_method( + "file_send", ".plugin", in_sign="ssssss", out_sign="a{ss}", - method=self._fileSend, + method=self._file_send, async_=True, ) self._file_managers = [] - host.importMenu( + host.import_menu( (D_("Action"), D_("send file")), - self._fileSendMenu, + self._file_send_menu, security_limit=10, help_string=D_("Send a file"), type_=C.MENU_SINGLE, ) - def _fileSend( + def _file_send( self, peer_jid_s: str, filepath: str, @@ -95,13 +95,13 @@ extra_s: str, profile: str = C.PROF_KEY_NONE ) -> defer.Deferred: - client = self.host.getClient(profile) - return defer.ensureDeferred(self.fileSend( + client = self.host.get_client(profile) + return defer.ensureDeferred(self.file_send( client, jid.JID(peer_jid_s), filepath, name or None, file_desc or None, data_format.deserialise(extra_s) )) - async def fileSend( + async def file_send( self, client, peer_jid, filepath, filename=None, file_desc=None, extra=None ): """Send a file using best available method @@ -119,7 +119,7 @@ if not filename: filename = os.path.basename(filepath) or "_" for manager, priority in self._file_managers: - if await utils.asDeferred(manager.canHandleFileSend, + if await utils.as_deferred(manager.can_handle_file_send, client, peer_jid, filepath): try: method_name = manager.name @@ -131,8 +131,8 @@ ) ) try: - progress_id = await utils.asDeferred( - manager.fileSend, client, peer_jid, filepath, filename, file_desc, + progress_id = await utils.as_deferred( + manager.file_send, client, peer_jid, filepath, filename, file_desc, extra ) except Exception as e: @@ -155,15 +155,15 @@ ).toXml() } - def _onFileChoosed(self, peer_jid, data, profile): - client = self.host.getClient(profile) + def _on_file_choosed(self, peer_jid, data, profile): + client = self.host.get_client(profile) cancelled = C.bool(data.get("cancelled", C.BOOL_FALSE)) if 
cancelled: return path = data["path"] - return self.fileSend(client, peer_jid, path) + return self.file_send(client, peer_jid, path) - def _fileSendMenu(self, data, profile): + def _file_send_menu(self, data, profile): """ XMLUI activated by menu: return file sending UI @param profile: %(doc_profile)s @@ -173,8 +173,8 @@ except RuntimeError: raise exceptions.DataError(_("Invalid JID")) - file_choosed_id = self.host.registerCallback( - partial(self._onFileChoosed, jid_), + file_choosed_id = self.host.register_callback( + partial(self._on_file_choosed, jid_), with_data=True, one_shot=True, ) @@ -193,7 +193,7 @@ def register(self, manager, priority: int = 0) -> None: """Register a fileSending manager - @param manager: object implementing canHandleFileSend, and fileSend methods + @param manager: object implementing can_handle_file_send, and file_send methods @param priority: pririoty of this manager, the higher available will be used """ m_data = (manager, priority) @@ -201,9 +201,9 @@ raise exceptions.ConflictError( f"Manager {manager} is already registered" ) - if not hasattr(manager, "canHandleFileSend") or not hasattr(manager, "fileSend"): + if not hasattr(manager, "can_handle_file_send") or not hasattr(manager, "file_send"): raise ValueError( - f'{manager} must have both "canHandleFileSend" and "fileSend" methods to ' + f'{manager} must have both "can_handle_file_send" and "file_send" methods to ' 'be registered') self._file_managers.append(m_data) self._file_managers.sort(key=lambda m: m[1], reverse=True) @@ -219,7 +219,7 @@ # Dialogs with user # the overwrite check is done here - def openFileWrite(self, client, file_path, transfer_data, file_data, stream_object): + def open_file_write(self, client, file_path, transfer_data, file_data, stream_object): """create SatFile or FileStremaObject for the requested file and fill suitable data """ if stream_object: @@ -245,15 +245,15 @@ data_cb=file_data.get("data_cb"), ) - async def _gotConfirmation( + async def 
_got_confirmation( self, client, data, peer_jid, transfer_data, file_data, stream_object ): """Called when the permission and dest path have been received @param peer_jid(jid.JID): jid of the file sender - @param transfer_data(dict): same as for [self.getDestDir] - @param file_data(dict): same as for [self.getDestDir] - @param stream_object(bool): same as for [self.getDestDir] + @param transfer_data(dict): same as for [self.get_dest_dir] + @param file_data(dict): same as for [self.get_dest_dir] + @param stream_object(bool): same as for [self.get_dest_dir] return (bool): True if copy is wanted and OK False if user wants to cancel if file exists ask confirmation and call again self._getDestDir if needed @@ -266,7 +266,7 @@ # we manage case where file already exists if os.path.exists(file_path): - overwrite = await xml_tools.deferConfirm( + overwrite = await xml_tools.defer_confirm( self.host, _(CONFIRM_OVERWRITE).format(file_path), _(CONFIRM_OVERWRITE_TITLE), @@ -280,12 +280,12 @@ ) if not overwrite: - return await self.getDestDir(client, peer_jid, transfer_data, file_data) + return await self.get_dest_dir(client, peer_jid, transfer_data, file_data) - self.openFileWrite(client, file_path, transfer_data, file_data, stream_object) + self.open_file_write(client, file_path, transfer_data, file_data, stream_object) return True - async def getDestDir( + async def get_dest_dir( self, client, peer_jid, transfer_data, file_data, stream_object=False ): """Request confirmation and destination dir to user @@ -296,7 +296,7 @@ @param filename(unicode): name of the file @param transfer_data(dict): data of the transfer session, it will be only used to store the file_obj. 
- "file_obj" (or "stream_object") key *MUST NOT* exist before using getDestDir + "file_obj" (or "stream_object") key *MUST NOT* exist before using get_dest_dir @param file_data(dict): information about the file to be transferred It MUST contain the following keys: - peer_jid (jid.JID): other peer jid @@ -314,7 +314,7 @@ a stream.FileStreamObject will be used return: True if transfer is accepted """ - cont, ret_value = await self.host.trigger.asyncReturnPoint( + cont, ret_value = await self.host.trigger.async_return_point( "FILE_getDestDir", client, peer_jid, transfer_data, file_data, stream_object ) if not cont: @@ -323,8 +323,8 @@ assert filename and not "/" in filename assert PROGRESS_ID_KEY in file_data # human readable size - file_data["size_human"] = common_utils.getHumanSize(file_data["size"]) - resp_data = await xml_tools.deferDialog( + file_data["size_human"] = common_utils.get_human_size(file_data["size"]) + resp_data = await xml_tools.defer_dialog( self.host, _(CONFIRM).format(peer=peer_jid.full(), **file_data), _(CONFIRM_TITLE), @@ -339,7 +339,7 @@ profile=client.profile, ) - accepted = await self._gotConfirmation( + accepted = await self._got_confirmation( client, resp_data, peer_jid, diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_forums.py --- a/sat/plugins/plugin_misc_forums.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_forums.py Sat Apr 08 13:54:42 2023 +0200 @@ -62,26 +62,26 @@ self._p.OPT_SEND_ITEM_SUBSCRIBE: 1, self._p.OPT_PUBLISH_MODEL: self._p.ACCESS_OPEN, } - host.registerNamespace('forums', NS_FORUMS) - host.bridge.addMethod("forumsGet", ".plugin", + host.register_namespace('forums', NS_FORUMS) + host.bridge.add_method("forums_get", ".plugin", in_sign='ssss', out_sign='s', method=self._get, async_=True) - host.bridge.addMethod("forumsSet", ".plugin", + host.bridge.add_method("forums_set", ".plugin", in_sign='sssss', out_sign='', method=self._set, async_=True) - host.bridge.addMethod("forumTopicsGet", ".plugin", +
host.bridge.add_method("forum_topics_get", ".plugin", in_sign='ssa{ss}s', out_sign='(aa{ss}s)', - method=self._getTopics, + method=self._get_topics, async_=True) - host.bridge.addMethod("forumTopicCreate", ".plugin", + host.bridge.add_method("forum_topic_create", ".plugin", in_sign='ssa{ss}s', out_sign='', - method=self._createTopic, + method=self._create_topic, async_=True) @defer.inlineCallbacks - def _createForums(self, client, forums, service, node, forums_elt=None, names=None): + def _create_forums(self, client, forums, service, node, forums_elt=None, names=None): """Recursively create <forums> element(s) @param forums(list): forums which may have subforums @@ -115,7 +115,7 @@ log.info(_("creating missing forum node")) forum_node = FORUM_TOPICS_NODE_TPL.format(node=node, uuid=shortuuid.uuid()) yield self._p.createNode(client, service, forum_node, self._node_options) - value = uri.buildXMPPUri('pubsub', + value = uri.build_xmpp_uri('pubsub', path=service.full(), node=forum_node) if key in FORUM_ATTR: @@ -124,7 +124,7 @@ forum_elt.addElement(key, content=value) elif key == 'sub-forums': sub_forums_elt = forum_elt.addElement('forums') - yield self._createForums(client, value, service, node, sub_forums_elt, names=names) + yield self._create_forums(client, value, service, node, sub_forums_elt, names=names) else: log.warning(_("Unknown forum attribute: {key}").format(key=key)) if not forum_elt.getAttribute('title'): @@ -137,7 +137,7 @@ raise ValueError(_("forum need uri or sub-forums")) defer.returnValue(forums_elt) - def _parseForums(self, parent_elt=None, forums=None): + def _parse_forums(self, parent_elt=None, forums=None): """Recursively parse <forums> elements and return corresponding forums data @param item(domish.Element): item with <forums> element @@ -170,7 +170,7 @@ data[elt.name] = str(elt) elif elt.name == 'forums': sub_forums = data['sub-forums'] = [] - self._parseForums(elt, sub_forums) + self._parse_forums(elt, sub_forums) if not 'title' in data or not {'uri',
'sub-forums'}.intersection(data): log.warning(_("invalid forum, ignoring: {xml}").format(xml=forum_elt.toXml())) else: @@ -181,7 +181,7 @@ return forums def _get(self, service=None, node=None, forums_key=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) if service.strip(): service = jid.JID(service) else: @@ -199,14 +199,14 @@ node = NS_FORUMS if forums_key is None: forums_key = 'default' - items_data = await self._p.getItems(client, service, node, item_ids=[forums_key]) + items_data = await self._p.get_items(client, service, node, item_ids=[forums_key]) item = items_data[0][0] # we have the item and need to convert it to json - forums = self._parseForums(item) + forums = self._parse_forums(item) return forums def _set(self, forums, service=None, node=None, forums_key=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) forums = json.loads(forums) if service.strip(): service = jid.JID(service) @@ -241,16 +241,16 @@ node = NS_FORUMS if forums_key is None: forums_key = 'default' - forums_elt = await self._createForums(client, forums, service, node) - return await self._p.sendItem( + forums_elt = await self._create_forums(client, forums, service, node) + return await self._p.send_item( client, service, node, forums_elt, item_id=forums_key ) - def _getTopics(self, service, node, extra=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) - extra = self._p.parseExtra(extra) + def _get_topics(self, service, node, extra=None, profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) + extra = self._p.parse_extra(extra) d = defer.ensureDeferred( - self.getTopics( + self.get_topics( client, jid.JID(service), node, rsm_request=extra.rsm_request, extra=extra.extra ) @@ -260,12 +260,12 @@ ) return d - async def getTopics(self, client, service, node, rsm_request=None, extra=None): + async 
def get_topics(self, client, service, node, rsm_request=None, extra=None): """Retrieve topics data Topics are simple microblog URIs with some metadata duplicated from first post """ - topics_data = await self._p.getItems( + topics_data = await self._p.get_items( client, service, node, rsm_request=rsm_request, extra=extra ) topics = [] @@ -279,13 +279,13 @@ topics.append(topic) return (topics, metadata) - def _createTopic(self, service, node, mb_data, profile_key): - client = self.host.getClient(profile_key) + def _create_topic(self, service, node, mb_data, profile_key): + client = self.host.get_client(profile_key) return defer.ensureDeferred( - self.createTopic(client, jid.JID(service), node, mb_data) + self.create_topic(client, jid.JID(service), node, mb_data) ) - async def createTopic(self, client, service, node, mb_data): + async def create_topic(self, client, service, node, mb_data): try: title = mb_data['title'] content = mb_data.pop('content') @@ -296,7 +296,7 @@ topic_node = FORUM_TOPIC_NODE_TPL.format(node=node, uuid=shortuuid.uuid()) await self._p.createNode(client, service, topic_node, self._node_options) await self._m.send(client, mb_data, service, topic_node) - topic_uri = uri.buildXMPPUri('pubsub', + topic_uri = uri.build_xmpp_uri('pubsub', subtype='microblog', path=service.full(), node=topic_node) @@ -304,4 +304,4 @@ topic_elt['uri'] = topic_uri topic_elt['author'] = client.jid.userhost() topic_elt.addElement('title', content = title) - await self._p.sendItem(client, service, node, topic_elt) + await self._p.send_item(client, service, node, topic_elt) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_groupblog.py --- a/sat/plugins/plugin_misc_groupblog.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_groupblog.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,19 +61,19 @@ log.info(_("Group blog plugin initialization")) self.host = host self._p = self.host.plugins["XEP-0060"] - host.trigger.add("XEP-0277_item2data", 
self._item2dataTrigger) - host.trigger.add("XEP-0277_data2entry", self._data2entryTrigger) - host.trigger.add("XEP-0277_comments", self._commentsTrigger) + host.trigger.add("XEP-0277_item2data", self._item_2_data_trigger) + host.trigger.add("XEP-0277_data2entry", self._data_2_entry_trigger) + host.trigger.add("XEP-0277_comments", self._comments_trigger) ## plugin management methods ## - def getHandler(self, client): + def get_handler(self, client): return GroupBlog_handler() @defer.inlineCallbacks - def profileConnected(self, client): + def profile_connected(self, client): try: - yield self.host.checkFeatures(client, (NS_PUBSUB_GROUPBLOG,)) + yield self.host.check_features(client, (NS_PUBSUB_GROUPBLOG,)) except exceptions.FeatureNotFound: client.server_groupblog_available = False log.warning( @@ -85,21 +85,21 @@ client.server_groupblog_available = True log.info(_("Server can manage group blogs")) - def getFeatures(self, profile): + def features_get(self, profile): try: - client = self.host.getClient(profile) + client = self.host.get_client(profile) except exceptions.ProfileNotSetError: return {} try: - return {"available": C.boolConst(client.server_groupblog_available)} + return {"available": C.bool_const(client.server_groupblog_available)} except AttributeError: - if self.host.isConnected(profile): + if self.host.is_connected(profile): log.debug("Profile is not connected, service is not checked yet") else: log.error("client.server_groupblog_available should be available !") return {} - def _item2dataTrigger(self, item_elt, entry_elt, microblog_data): + def _item_2_data_trigger(self, item_elt, entry_elt, microblog_data): """Parse item to find group permission elements""" config_form = data_form.findForm(item_elt, NS_PUBSUB_ITEM_CONFIG) if config_form is None: @@ -109,7 +109,7 @@ opt = self._p.OPT_ROSTER_GROUPS_ALLOWED microblog_data['groups'] = config_form.fields[opt].values - def _data2entryTrigger(self, client, mb_data, entry_elt, item_elt): + def 
_data_2_entry_trigger(self, client, mb_data, entry_elt, item_elt): """Build fine access permission if needed This trigger checks if "group*" keys are present, @@ -130,7 +130,7 @@ form.addField(allowed) item_elt.addChild(form.toElement()) - def _commentsTrigger(self, client, mb_data, options): + def _comments_trigger(self, client, mb_data, options): """This method is called when a comments node is about to be created It changes the access mode to roster if needed, and gives the authorized groups diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_identity.py --- a/sat/plugins/plugin_misc_identity.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_identity.py Sat Apr 08 13:54:42 2023 +0200 @@ -77,11 +77,11 @@ "avatar": { "type": dict, # convert avatar path to avatar metadata (and check validity) - "set_data_filter": self.avatarSetDataFilter, + "set_data_filter": self.avatar_set_data_filter, # update profile avatar, so all frontends are aware - "set_post_treatment": self.avatarSetPostTreatment, - "update_is_new_data": self.avatarUpdateIsNewData, - "update_data_filter": self.avatarUpdateDataFilter, + "set_post_treatment": self.avatar_set_post_treatment, + "update_is_new_data": self.avatar_update_is_new_data, + "update_data_filter": self.avatar_update_data_filter, # we store the metadata in database, to restore it on next connection # (it is stored only for roster entities) "store": True, @@ -92,70 +92,70 @@ # of returning only the data from the first successful callback "get_all": True, # append nicknames from roster, resource, etc.
- "get_post_treatment": self.nicknamesGetPostTreatment, - "update_is_new_data": self.nicknamesUpdateIsNewData, + "get_post_treatment": self.nicknames_get_post_treatment, + "update_is_new_data": self.nicknames_update_is_new_data, "store": True, }, "description": { "type": str, "get_all": True, - "get_post_treatment": self.descriptionGetPostTreatment, + "get_post_treatment": self.description_get_post_treatment, "store": True, } } - host.trigger.add("roster_update", self._rosterUpdateTrigger) - host.memory.setSignalOnUpdate("avatar") - host.memory.setSignalOnUpdate("nicknames") - host.bridge.addMethod( - "identityGet", + host.trigger.add("roster_update", self._roster_update_trigger) + host.memory.set_signal_on_update("avatar") + host.memory.set_signal_on_update("nicknames") + host.bridge.add_method( + "identity_get", ".plugin", in_sign="sasbs", out_sign="s", - method=self._getIdentity, + method=self._get_identity, async_=True, ) - host.bridge.addMethod( - "identitiesGet", + host.bridge.add_method( + "identities_get", ".plugin", in_sign="asass", out_sign="s", - method=self._getIdentities, + method=self._get_identities, async_=True, ) - host.bridge.addMethod( - "identitiesBaseGet", + host.bridge.add_method( + "identities_base_get", ".plugin", in_sign="s", out_sign="s", - method=self._getBaseIdentities, + method=self._get_base_identities, async_=True, ) - host.bridge.addMethod( - "identitySet", + host.bridge.add_method( + "identity_set", ".plugin", in_sign="ss", out_sign="", - method=self._setIdentity, + method=self._set_identity, async_=True, ) - host.bridge.addMethod( - "avatarGet", + host.bridge.add_method( + "avatar_get", ".plugin", in_sign="sbs", out_sign="s", method=self._getAvatar, async_=True, ) - host.bridge.addMethod( - "avatarSet", + host.bridge.add_method( + "avatar_set", ".plugin", in_sign="sss", out_sign="", - method=self._setAvatar, + method=self._set_avatar, async_=True, ) - async def profileConnecting(self, client): + async def profile_connecting(self, 
client): client._identity_update_lock = [] # we restore known identities from database client._identity_storage = persistent.LazyPersistentBinaryDict( @@ -188,22 +188,22 @@ f"{value}") to_delete.append(key) continue - cache = self.host.common_cache.getMetadata(cache_uid) + cache = self.host.common_cache.get_metadata(cache_uid) if cache is None: log.debug( f"purging avatar for {entity}: it is not in cache anymore") to_delete.append(key) continue - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, entity, name, value, silent=True ) for key in to_delete: await client._identity_storage.adel(key) - def _rosterUpdateTrigger(self, client, roster_item): - old_item = client.roster.getItem(roster_item.jid) + def _roster_update_trigger(self, client, roster_item): + old_item = client.roster.get_item(roster_item.jid) if old_item is None or old_item.name != roster_item.name: log.debug( f"roster nickname has been updated to {roster_item.name!r} for " @@ -247,7 +247,7 @@ cb_list.append(callback) cb_list.sort(key=lambda c: c.priority, reverse=True) - def getIdentityJid(self, client, peer_jid): + def get_identity_jid(self, client, peer_jid): """Return jid to use to set identity metadata if it's a jid of a room occupant, full jid will be used @@ -260,9 +260,9 @@ if self._m is None: return peer_jid.userhostJID() else: - return self._m.getBareOrFull(client, peer_jid) + return self._m.get_bare_or_full(client, peer_jid) - def checkType(self, metadata_name, value): + def check_type(self, metadata_name, value): """Check that type used for a metadata is the one declared in self.metadata""" value_type = self.metadata[metadata_name]["type"] if not isinstance(value, value_type): @@ -270,7 +270,7 @@ f"{value} has wrong type: it is {type(value)} while {value_type} was " f"expected") - def getFieldType(self, metadata_name: str) -> str: + def get_field_type(self, metadata_name: str) -> str: """Return the type the requested field @param metadata_name: name of the 
field to check @@ -298,7 +298,7 @@ @param use_cache: if False, cache won't be checked @param prefilled_values: map of origin => value to use when `get_all` is set """ - entity = self.getIdentityJid(client, entity) + entity = self.get_identity_jid(client, entity) try: metadata = self.metadata[metadata_name] except KeyError: @@ -306,7 +306,7 @@ get_all = metadata.get('get_all', False) if use_cache: try: - data = self.host.memory.getEntityDatum( + data = self.host.memory.get_entity_datum( client, entity, metadata_name) except (KeyError, exceptions.UnknownEntityError): pass @@ -343,7 +343,7 @@ .format(callback=callback.get, metadata_name=metadata_name, e=e)) else: if data: - self.checkType(metadata_name, data) + self.check_type(metadata_name, data) if get_all: if isinstance(data, list): all_data.extend(data) @@ -359,9 +359,9 @@ post_treatment = metadata.get("get_post_treatment") if post_treatment is not None: - data = await utils.asDeferred(post_treatment, client, entity, data) + data = await utils.as_deferred(post_treatment, client, entity, data) - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, entity, metadata_name, data) if metadata.get('store', False): @@ -381,12 +381,12 @@ @param entity(jid.JID, None): entity for which avatar is requested None to use profile's jid """ - entity = self.getIdentityJid(client, entity) + entity = self.get_identity_jid(client, entity) metadata = self.metadata[metadata_name] data_filter = metadata.get("set_data_filter") if data_filter is not None: - data = await utils.asDeferred(data_filter, client, entity, data) - self.checkType(metadata_name, data) + data = await utils.as_deferred(data_filter, client, entity, data) + self.check_type(metadata_name, data) try: callbacks = metadata['callbacks'] @@ -411,7 +411,7 @@ post_treatment = metadata.get("set_post_treatment") if post_treatment is not None: - await utils.asDeferred(post_treatment, client, entity, data) + await utils.as_deferred(post_treatment, 
client, entity, data) async def update( self, @@ -426,14 +426,14 @@ This method may be called by plugins when an identity metadata is available. @param origin: namespace of the plugin which is source of the metadata """ - entity = self.getIdentityJid(client, entity) + entity = self.get_identity_jid(client, entity) if (entity, metadata_name) in client._identity_update_lock: log.debug(f"update is locked for {entity}'s {metadata_name}") return metadata = self.metadata[metadata_name] try: - cached_data = self.host.memory.getEntityDatum( + cached_data = self.host.memory.get_entity_datum( client, entity, metadata_name) except (KeyError, exceptions.UnknownEntityError): # metadata is not cached, we do the update @@ -443,7 +443,7 @@ try: update_is_new_data = metadata["update_is_new_data"] except KeyError: - update_is_new_data = self.defaultUpdateIsNewData + update_is_new_data = self.default_update_is_new_data if data is None: if cached_data is None: @@ -467,7 +467,7 @@ # get_all is set, meaning that we have to check all plugins # so we first delete current cache try: - self.host.memory.delEntityDatum(client, entity, metadata_name) + self.host.memory.del_entity_datum(client, entity, metadata_name) except (KeyError, exceptions.UnknownEntityError): pass # then fill it again by calling get, which will retrieve all values @@ -481,32 +481,32 @@ if data is not None: data_filter = metadata['update_data_filter'] if data_filter is not None: - data = await utils.asDeferred(data_filter, client, entity, data) - self.checkType(metadata_name, data) + data = await utils.as_deferred(data_filter, client, entity, data) + self.check_type(metadata_name, data) - self.host.memory.updateEntityData(client, entity, metadata_name, data) + self.host.memory.update_entity_data(client, entity, metadata_name, data) if metadata.get('store', False): key = f"{entity}\n{metadata_name}" await client._identity_storage.aset(key, data) - def defaultUpdateIsNewData(self, client, entity, cached_data, new_data): + 
def default_update_is_new_data(self, client, entity, cached_data, new_data): return new_data != cached_data def _getAvatar(self, entity, use_cache, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) entity = jid.JID(entity) if entity else None d = defer.ensureDeferred(self.get(client, "avatar", entity, use_cache)) d.addCallback(lambda data: data_format.serialise(data)) return d - def _setAvatar(self, file_path, entity, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + def _set_avatar(self, file_path, entity, profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) entity = jid.JID(entity) if entity else None return defer.ensureDeferred( self.set(client, "avatar", file_path, entity)) - def _blockingCacheAvatar( + def _blocking_cache_avatar( self, source: str, avatar_data: dict[str, Any] @@ -546,7 +546,7 @@ img_buf.seek(0) image_hash = hashlib.sha1(img_buf.read()).hexdigest() img_buf.seek(0) - with self.host.common_cache.cacheData( + with self.host.common_cache.cache_data( source, image_hash, media_type ) as f: f.write(img_buf.read()) @@ -554,21 +554,21 @@ avatar_data['filename'] = avatar_data['path'].name avatar_data['cache_uid'] = image_hash - async def cacheAvatar(self, source: str, avatar_data: Dict[str, Any]) -> None: + async def cache_avatar(self, source: str, avatar_data: Dict[str, Any]) -> None: """Resize if necessary and cache avatar @param source: source importing the avatar (usually it is plugin's import name), will be used in cache metadata - @param avatar_data: avatar metadata as build by [avatarSetDataFilter] + @param avatar_data: avatar metadata as built by [avatar_set_data_filter] will be updated with following keys: path: updated path using cached file filename: updated filename using cached file base64: resized and base64 encoded avatar cache_uid: SHA1 hash used as cache unique ID """ - await threads.deferToThread(self._blockingCacheAvatar, source, avatar_data) +
await threads.deferToThread(self._blocking_cache_avatar, source, avatar_data) - async def avatarSetDataFilter(self, client, entity, file_path): + async def avatar_set_data_filter(self, client, entity, file_path): """Convert avatar file path to dict data""" file_path = Path(file_path) if not file_path.is_file(): @@ -583,14 +583,14 @@ raise ValueError(f"Can't identify type of image at {file_path}") if not media_type.startswith('image/'): raise ValueError(f"File at {file_path} doesn't appear to be an image") - await self.cacheAvatar(IMPORT_NAME, avatar_data) + await self.cache_avatar(IMPORT_NAME, avatar_data) return avatar_data - async def avatarSetPostTreatment(self, client, entity, avatar_data): + async def avatar_set_post_treatment(self, client, entity, avatar_data): """Update our own avatar""" await self.update(client, IMPORT_NAME, "avatar", avatar_data, entity) - def avatarBuildMetadata( + def avatar_build_metadata( self, path: Path, media_type: Optional[str] = None, @@ -622,10 +622,10 @@ "cache_uid": cache_uid, } - def avatarUpdateIsNewData(self, client, entity, cached_data, new_data): + def avatar_update_is_new_data(self, client, entity, cached_data, new_data): return new_data['path'] != cached_data['path'] - async def avatarUpdateDataFilter(self, client, entity, data): + async def avatar_update_data_filter(self, client, entity, data): if not isinstance(data, dict): raise ValueError(f"Invalid data type ({type(data)}), a dict is expected") mandatory_keys = {'path', 'filename', 'cache_uid'} @@ -633,7 +633,7 @@ raise ValueError(f"missing avatar data keys: {mandatory_keys - data.keys()}") return data - async def nicknamesGetPostTreatment(self, client, entity, plugin_nicknames): + async def nicknames_get_post_treatment(self, client, entity, plugin_nicknames): """Prepend nicknames from core locations + set default nickname nicknames are checked from many locations, there is always at least @@ -648,13 +648,13 @@ # for MUC we add resource if entity.resource: - # 
getIdentityJid let the resource only if the entity is a MUC room + # get_identity_jid let the resource only if the entity is a MUC room # occupant jid nicknames.append(entity.resource) # we first check roster (if we are not in a component) if not client.is_component: - roster_item = client.roster.getItem(entity.userhostJID()) + roster_item = client.roster.get_item(entity.userhostJID()) if roster_item is not None and roster_item.name: # user set name has priority over entity set name nicknames.append(roster_item.name) @@ -670,10 +670,10 @@ # we remove duplicates while preserving order with dict return list(dict.fromkeys(nicknames)) - def nicknamesUpdateIsNewData(self, client, entity, cached_data, new_nicknames): + def nicknames_update_is_new_data(self, client, entity, cached_data, new_nicknames): return not set(new_nicknames).issubset(cached_data) - async def descriptionGetPostTreatment( + async def description_get_post_treatment( self, client: SatXMPPEntity, entity: jid.JID, @@ -682,15 +682,15 @@ """Join all descriptions in a unique string""" return '\n'.join(plugin_description) - def _getIdentity(self, entity_s, metadata_filter, use_cache, profile): + def _get_identity(self, entity_s, metadata_filter, use_cache, profile): entity = jid.JID(entity_s) - client = self.host.getClient(profile) + client = self.host.get_client(profile) d = defer.ensureDeferred( - self.getIdentity(client, entity, metadata_filter, use_cache)) + self.get_identity(client, entity, metadata_filter, use_cache)) d.addCallback(data_format.serialise) return d - async def getIdentity( + async def get_identity( self, client: SatXMPPEntity, entity: Optional[jid.JID] = None, @@ -719,14 +719,14 @@ return id_data - def _getIdentities(self, entities_s, metadata_filter, profile): + def _get_identities(self, entities_s, metadata_filter, profile): entities = [jid.JID(e) for e in entities_s] - client = self.host.getClient(profile) - d = defer.ensureDeferred(self.getIdentities(client, entities, 
metadata_filter)) + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.get_identities(client, entities, metadata_filter)) d.addCallback(lambda d: data_format.serialise({str(j):i for j, i in d.items()})) return d - async def getIdentities( + async def get_identities( self, client: SatXMPPEntity, entities: List[jid.JID], @@ -735,7 +735,7 @@ """Retrieve several identities at once @param entities: entities from which identities must be retrieved - @param metadata_filter: same as for [getIdentity] + @param metadata_filter: same as for [get_identity] @return: identities metadata where key is jid if an error happens while retrieve a jid entity, it won't be present in the result (and a warning will be logged) @@ -745,7 +745,7 @@ for entity_jid in entities: get_identity_list.append( defer.ensureDeferred( - self.getIdentity( + self.get_identity( client, entity=entity_jid, metadata_filter=metadata_filter, @@ -761,13 +761,13 @@ identities[entity_jid] = identity return identities - def _getBaseIdentities(self, profile_key): - client = self.host.getClient(profile_key) - d = defer.ensureDeferred(self.getBaseIdentities(client)) + def _get_base_identities(self, profile_key): + client = self.host.get_client(profile_key) + d = defer.ensureDeferred(self.get_base_identities(client)) d.addCallback(lambda d: data_format.serialise({str(j):i for j, i in d.items()})) return d - async def getBaseIdentities( + async def get_base_identities( self, client: SatXMPPEntity, ) -> dict: @@ -779,20 +779,20 @@ if client.is_component: entities = [client.jid.userhostJID()] else: - entities = client.roster.getJids() + [client.jid.userhostJID()] + entities = client.roster.get_jids() + [client.jid.userhostJID()] - return await self.getIdentities( + return await self.get_identities( client, entities, ['avatar', 'nicknames'] ) - def _setIdentity(self, id_data_s, profile): - client = self.host.getClient(profile) + def _set_identity(self, id_data_s, profile): + client = 
self.host.get_client(profile) id_data = data_format.deserialise(id_data_s) - return defer.ensureDeferred(self.setIdentity(client, id_data)) + return defer.ensureDeferred(self.set_identity(client, id_data)) - async def setIdentity(self, client, id_data): + async def set_identity(self, client, id_data): """Update profile's identity @param id_data(dict): data to update, key can be one of self.metadata keys diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_ip.py --- a/sat/plugins/plugin_misc_ip.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_ip.py Sat Apr 08 13:54:42 2023 +0200 @@ -99,7 +99,7 @@ def __init__(self, host): log.info(_("plugin IP discovery initialization")) self.host = host - host.memory.updateParams(PARAMS) + host.memory.update_params(PARAMS) # NAT-Port try: @@ -109,50 +109,50 @@ self._nat = None # XXX: cache is kept until SàT is restarted - # if IP may have changed, use self.refreshIP + # if IP may have changed, use self.refresh_ip self._external_ip_cache = None self._local_ip_cache = None - def getHandler(self, client): + def get_handler(self, client): return IPPlugin_handler() - def refreshIP(self): + def refresh_ip(self): # FIXME: use a trigger instead ? 
self._external_ip_cache = None self._local_ip_cache = None - def _externalAllowed(self, client): + def _external_allowed(self, client): """Return value of parameter with authorisation of user to do external requests if parameter is not set, a dialog is shown to user to get their confirmation, and parameter is set according to answer @return (defer.Deferred[bool]): True if external request is authorised """ - allow_get_ip = self.host.memory.params.getParamA( + allow_get_ip = self.host.memory.params.param_get_a( GET_IP_NAME, GET_IP_CATEGORY, use_default=False ) if allow_get_ip is None: # we don't have authorisation from user yet to use get_ip, we ask them - def setParam(allowed): - # FIXME: we need to use boolConst as setParam only manage str/unicode + def param_set(allowed): + # FIXME: we need to use bool_const as param_set only manage str/unicode # need to be fixed when params will be refactored - self.host.memory.setParam( - GET_IP_NAME, C.boolConst(allowed), GET_IP_CATEGORY + self.host.memory.param_set( + GET_IP_NAME, C.bool_const(allowed), GET_IP_CATEGORY ) return allowed - d = xml_tools.deferConfirm( + d = xml_tools.defer_confirm( self.host, _(GET_IP_CONFIRM), _(GET_IP_CONFIRM_TITLE), profile=client.profile, ) - d.addCallback(setParam) + d.addCallback(param_set) return d return defer.succeed(allow_get_ip) - def _filterAddresse(self, ip_addr): + def _filter_addresse(self, ip_addr): """Filter acceptable addresses For now, just remove IPv4 local addresses @@ -161,7 +161,7 @@ """ return not ip_addr.startswith("127.") - def _insertFirst(self, addresses, ip_addr): + def _insert_first(self, addresses, ip_addr): """Insert ip_addr as first item in addresses @param addresses(list): list of IP addresses @@ -174,7 +174,7 @@ else: addresses.insert(0, ip_addr) - async def _getIPFromExternal(self, ext_url): + async def _get_ip_from_external(self, ext_url): """Get local IP by doing a connection on an external url @param ext_url(str): url to connect to @@ -201,7 +201,7 @@ return
local_ip @defer.inlineCallbacks - def getLocalIPs(self, client): + def get_local_i_ps(self, client): """Try to discover local area network IPs @return (deferred): list of lan IP addresses @@ -225,43 +225,43 @@ continue for data in inet_list: addresse = data["addr"] - if self._filterAddresse(addresse): + if self._filter_addresse(addresse): addresses.append(addresse) # then we use our connection to server ip = client.xmlstream.transport.getHost().host - if self._filterAddresse(ip): - self._insertFirst(addresses, ip) + if self._filter_addresse(ip): + self._insert_first(addresses, ip) defer.returnValue(addresses) # if server is local, we try with NAT-Port if self._nat is not None: - nat_ip = yield self._nat.getIP(local=True) + nat_ip = yield self._nat.get_ip(local=True) if nat_ip is not None: - self._insertFirst(addresses, nat_ip) + self._insert_first(addresses, nat_ip) defer.returnValue(addresses) if addresses: defer.returnValue(addresses) # still no luck, we need to contact external website - allow_get_ip = yield self._externalAllowed(client) + allow_get_ip = yield self._external_allowed(client) if not allow_get_ip: defer.returnValue(addresses or localhost) try: - local_ip = yield defer.ensureDeferred(self._getIPFromExternal(GET_IP_PAGE)) + local_ip = yield defer.ensureDeferred(self._get_ip_from_external(GET_IP_PAGE)) except (internet_error.DNSLookupError, internet_error.TimeoutError): log.warning("Can't access Domain Name System") else: if local_ip is not None: - self._insertFirst(addresses, local_ip) + self._insert_first(addresses, local_ip) defer.returnValue(addresses or localhost) @defer.inlineCallbacks - def getExternalIP(self, client): + def get_external_ip(self, client): """Try to discover external IP @return (deferred): external IP address or None if it can't be discovered @@ -295,13 +295,13 @@ # then with NAT-Port if self._nat is not None: - nat_ip = yield self._nat.getIP() + nat_ip = yield self._nat.get_ip() if nat_ip is not None: self._external_ip_cache =
nat_ip defer.returnValue(nat_ip) # and finally by requesting external website - allow_get_ip = yield self._externalAllowed(client) + allow_get_ip = yield self._external_allowed(client) try: ip = ((yield webclient.getPage(GET_IP_PAGE.encode('utf-8'))) if allow_get_ip else None) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_lists.py --- a/sat/plugins/plugin_misc_lists.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_lists.py Sat Apr 08 13:54:42 2023 +0200 @@ -208,16 +208,16 @@ log.info(_("Pubsub lists plugin initialization")) self.host = host self._s = self.host.plugins["XEP-0346"] - self.namespace = self._s.getSubmittedNS(APP_NS_TICKETS) - host.registerNamespace("tickets", APP_NS_TICKETS) - host.registerNamespace("tickets_type", NS_TICKETS_TYPE) + self.namespace = self._s.get_submitted_ns(APP_NS_TICKETS) + host.register_namespace("tickets", APP_NS_TICKETS) + host.register_namespace("tickets_type", NS_TICKETS_TYPE) self.host.plugins["PUBSUB_INVITATION"].register( APP_NS_TICKETS, self ) self._p = self.host.plugins["XEP-0060"] self._m = self.host.plugins["XEP-0277"] - host.bridge.addMethod( - "listGet", + host.bridge.add_method( + "list_get", ".plugin", in_sign="ssiassss", out_sign="s", @@ -232,68 +232,68 @@ default_node=self.namespace, form_ns=APP_NS_TICKETS, filters={ - "author": self._s.valueOrPublisherFilter, - "created": self._s.dateFilter, - "updated": self._s.dateFilter, - "time_limit": self._s.dateFilter, + "author": self._s.value_or_publisher_filter, + "created": self._s.date_filter, + "updated": self._s.date_filter, + "time_limit": self._s.date_filter, }, profile_key=profile_key), async_=True, ) - host.bridge.addMethod( - "listSet", + host.bridge.add_method( + "list_set", ".plugin", in_sign="ssa{sas}ssss", out_sign="s", method=self._set, async_=True, ) - host.bridge.addMethod( - "listDeleteItem", + host.bridge.add_method( + "list_delete_item", ".plugin", in_sign="sssbs", out_sign="", method=self._delete, async_=True, ) - 
host.bridge.addMethod( - "listSchemaGet", + host.bridge.add_method( + "list_schema_get", ".plugin", in_sign="sss", out_sign="s", - method=lambda service, nodeIdentifier, profile_key: self._s._getUISchema( + method=lambda service, nodeIdentifier, profile_key: self._s._get_ui_schema( service, nodeIdentifier, default_node=self.namespace, profile_key=profile_key), async_=True, ) - host.bridge.addMethod( - "listsList", + host.bridge.add_method( + "lists_list", ".plugin", in_sign="sss", out_sign="s", - method=self._listsList, + method=self._lists_list, async_=True, ) - host.bridge.addMethod( - "listTemplatesNamesGet", + host.bridge.add_method( + "list_templates_names_get", ".plugin", in_sign="ss", out_sign="s", - method=self._getTemplatesNames, + method=self._get_templates_names, ) - host.bridge.addMethod( - "listTemplateGet", + host.bridge.add_method( + "list_template_get", ".plugin", in_sign="sss", out_sign="s", - method=self._getTemplate, + method=self._get_template, ) - host.bridge.addMethod( - "listTemplateCreate", + host.bridge.add_method( + "list_template_create", ".plugin", in_sign="ssss", out_sign="(ss)", - method=self._createTemplate, + method=self._create_template, async_=True, ) @@ -309,7 +309,7 @@ item_elt: domish.Element ) -> None: try: - schema = await self._s.getSchemaForm(client, service, node) + schema = await self._s.get_schema_form(client, service, node) except Exception as e: log.warning(f"Can't retrieve node schema at {node!r} [{service}]: {e}") else: @@ -323,7 +323,7 @@ def _set(self, service, node, values, schema=None, item_id=None, extra_s='', profile_key=C.PROF_KEY_NONE): - client, service, node, schema, item_id, extra = self._s.prepareBridgeSet( + client, service, node, schema, item_id, extra = self._s.prepare_bridge_set( service, node, schema, item_id, extra_s, profile_key ) d = defer.ensureDeferred(self.set( @@ -346,22 +346,22 @@ 'created' and 'updated' will be forced to current time: - 'created' is set if item_id is None, i.e.
if it's a new ticket - 'updated' is set every time - @param extra(dict, None): same as for [XEP-0060.sendItem] with additional keys: + @param extra(dict, None): same as for [XEP-0060.send_item] with additional keys: - update(bool): if True, get previous item data to merge with current one if True, item_id must be set - other arguments are same as for [self._s.sendDataFormItem] + other arguments are same as for [self._s.send_data_form_item] @return (unicode): id of the created item """ if not node: node = self.namespace if not item_id: - comments_service = await self._m.getCommentsService(client, service) + comments_service = await self._m.get_comments_service(client, service) # we need to use uuid for comments node, because we don't know item id in # advance (we don't want to set it ourselves to let the server choose, so we # can have a nicer id if serial ids are activated) - comments_node = self._m.getCommentsNode( + comments_node = self._m.get_comments_node( node + "_" + str(shortuuid.uuid()) ) options = { @@ -372,7 +372,7 @@ self._p.OPT_PUBLISH_MODEL: self._p.ACCESS_OPEN, } await self._p.createNode(client, comments_service, comments_node, options) - values["comments_uri"] = uri.buildXMPPUri( + values["comments_uri"] = uri.build_xmpp_uri( "pubsub", subtype="microblog", path=comments_service.full(), @@ -386,7 +386,7 @@ def _delete( self, service_s, nodeIdentifier, itemIdentifier, notify, profile_key ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return defer.ensureDeferred(self.delete( client, jid.JID(service_s) if service_s else None, @@ -405,26 +405,26 @@ ) -> None: if not node: node = self.namespace - return await self._p.retractItems( + return await self._p.retract_items( service, node, (itemIdentifier,), notify, client.profile ) - def _listsList(self, service, node, profile): + def _lists_list(self, service, node, profile): service = jid.JID(service) if service else None node = node or None - client =
self.host.getClient(profile) - d = defer.ensureDeferred(self.listsList(client, service, node)) + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.lists_list(client, service, node)) d.addCallback(data_format.serialise) return d - async def listsList( + async def lists_list( self, client, service: Optional[jid.JID], node: Optional[str]=None ) -> List[dict]: """Retrieve list of pubsub lists registered in personal interests @return list: list of lists metadata """ - items, metadata = await self.host.plugins['LIST_INTEREST'].listInterests( + items, metadata = await self.host.plugins['LIST_INTEREST'].list_interests( client, service, node, namespace=APP_NS_TICKETS) lists = [] for item in items: @@ -454,34 +454,34 @@ return lists - def _getTemplatesNames(self, language, profile): - client = self.host.getClient(profile) - return data_format.serialise(self.getTemplatesNames(client, language)) + def _get_templates_names(self, language, profile): + client = self.host.get_client(profile) + return data_format.serialise(self.get_templates_names(client, language)) - def getTemplatesNames(self, client, language: str) -> list: + def get_templates_names(self, client, language: str) -> list: """Retrieve well known list templates""" templates = [{"id": tpl_id, "name": d["name"], "icon": d["icon"]} for tpl_id, d in TEMPLATES.items()] return templates - def _getTemplate(self, name, language, profile): - client = self.host.getClient(profile) - return data_format.serialise(self.getTemplate(client, name, language)) + def _get_template(self, name, language, profile): + client = self.host.get_client(profile) + return data_format.serialise(self.get_template(client, name, language)) - def getTemplate(self, client, name: str, language: str) -> dict: + def get_template(self, client, name: str, language: str) -> dict: """Retrieve a well known template""" return TEMPLATES[name] - def _createTemplate(self, template_id, name, access_model, profile): - client = 
self.host.getClient(profile) - d = defer.ensureDeferred(self.createTemplate( + def _create_template(self, template_id, name, access_model, profile): + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.create_template( client, template_id, name, access_model )) d.addCallback(lambda node_data: (node_data[0].full(), node_data[1])) return d - async def createTemplate( + async def create_template( self, client, template_id: str, name: str, access_model: str ) -> Tuple[jid.JID, str]: """Create a list from a template""" @@ -493,12 +493,12 @@ 0, {"type": "hidden", "name": NS_TICKETS_TYPE, "value": template_id} ) - schema = xml_tools.dataDict2dataForm( + schema = xml_tools.data_dict_2_data_form( {"namespace": APP_NS_TICKETS, "fields": fields} ).toElement() service = client.jid.userhostJID() - node = self._s.getSubmittedNS(f"{APP_NS_TICKETS}_{name}") + node = self._s.get_submitted_ns(f"{APP_NS_TICKETS}_{name}") options = { self._p.OPT_ACCESS_MODEL: access_model, } @@ -507,11 +507,11 @@ # XXX: should node options be in TEMPLATE? 
options[self._p.OPT_OVERWRITE_POLICY] = self._p.OWPOL_ANY_PUB await self._p.createNode(client, service, node, options) - await self._s.setSchema(client, service, node, schema) + await self._s.set_schema(client, service, node, schema) list_elt = domish.Element((APP_NS_TICKETS, "list")) list_elt["type"] = template_id try: - await self.host.plugins['LIST_INTEREST'].registerPubsub( + await self.host.plugins['LIST_INTEREST'].register_pubsub( client, APP_NS_TICKETS, service, node, creator=True, name=name, element=list_elt) except Exception as e: diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_merge_requests.py --- a/sat/plugins/plugin_misc_merge_requests.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_merge_requests.py Sat Apr 08 13:54:42 2023 +0200 @@ -69,35 +69,35 @@ log.info(_("Merge requests plugin initialization")) self.host = host self._s = self.host.plugins["XEP-0346"] - self.namespace = self._s.getSubmittedNS(APP_NS_MERGE_REQUESTS) - host.registerNamespace('merge_requests', self.namespace) + self.namespace = self._s.get_submitted_ns(APP_NS_MERGE_REQUESTS) + host.register_namespace('merge_requests', self.namespace) self._p = self.host.plugins["XEP-0060"] self._t = self.host.plugins["LISTS"] self._handlers = {} self._handlers_list = [] # handlers sorted by priority self._type_handlers = {} # data type => handler map - host.bridge.addMethod("mergeRequestsGet", ".plugin", + host.bridge.add_method("merge_requests_get", ".plugin", in_sign='ssiassss', out_sign='s', method=self._get, async_=True ) - host.bridge.addMethod("mergeRequestSet", ".plugin", + host.bridge.add_method("merge_request_set", ".plugin", in_sign='ssssa{sas}ssss', out_sign='s', method=self._set, async_=True) - host.bridge.addMethod("mergeRequestsSchemaGet", ".plugin", + host.bridge.add_method("merge_requests_schema_get", ".plugin", in_sign='sss', out_sign='s', method=lambda service, nodeIdentifier, profile_key: - self._s._getUISchema(service, + 
self._s._get_ui_schema(service, nodeIdentifier, default_node=self.namespace, profile_key=profile_key), async_=True) - host.bridge.addMethod("mergeRequestParseData", ".plugin", + host.bridge.add_method("merge_request_parse_data", ".plugin", in_sign='ss', out_sign='aa{ss}', - method=self._parseData, + method=self._parse_data, async_=True) - host.bridge.addMethod("mergeRequestsImport", ".plugin", + host.bridge.add_method("merge_requests_import", ".plugin", in_sign='ssssa{ss}s', out_sign='', method=self._import, async_=True @@ -141,7 +141,7 @@ def serialise(self, get_data): tickets_xmlui, metadata, items_patches = get_data - tickets_xmlui_s, metadata = self._p.transItemsData((tickets_xmlui, metadata)) + tickets_xmlui_s, metadata = self._p.trans_items_data((tickets_xmlui, metadata)) return data_format.serialise({ "items": tickets_xmlui_s, "metadata": metadata, @@ -151,7 +151,7 @@ def _get(self, service='', node='', max_items=10, item_ids=None, sub_id=None, extra="", profile_key=C.PROF_KEY_NONE): extra = data_format.deserialise(extra) - client, service, node, max_items, extra, sub_id = self._s.prepareBridgeGet( + client, service, node, max_items, extra, sub_id = self._s.prepare_bridge_get( service, node, max_items, sub_id, extra, profile_key) d = self.get(client, service, node or None, max_items, item_ids, sub_id or None, extra.rsm_request, extra.extra) @@ -178,11 +178,11 @@ # XXX: Q&D way to get list for labels when displaying them, but text when we # have to modify them if C.bool(extra.get('labels_as_list', C.BOOL_FALSE)): - filters = {'labels': self._s.textbox2ListFilter} + filters = {'labels': self._s.textbox_2_list_filter} else: filters = {} tickets_xmlui, metadata = yield defer.ensureDeferred( - self._s.getDataFormItems( + self._s.get_data_form_items( client, service, node, @@ -199,13 +199,13 @@ for ticket in tickets_xmlui: request_type = ticket.named_widgets[FIELD_DATA_TYPE].value request_data = ticket.named_widgets[FIELD_DATA].value - parsed_data = yield 
self.parseData(request_type, request_data) + parsed_data = yield self.parse_data(request_type, request_data) parsed_patches.append(parsed_data) defer.returnValue((tickets_xmlui, metadata, parsed_patches)) def _set(self, service, node, repository, method, values, schema=None, item_id=None, extra="", profile_key=C.PROF_KEY_NONE): - client, service, node, schema, item_id, extra = self._s.prepareBridgeSet( + client, service, node, schema, item_id, extra = self._s.prepare_bridge_set( service, node, schema, item_id, extra, profile_key) d = defer.ensureDeferred( self.set( @@ -290,13 +290,13 @@ deserialise, form_ns=APP_NS_MERGE_REQUESTS) return item_id - def _parseData(self, data_type, data): - d = self.parseData(data_type, data) + def _parse_data(self, data_type, data): + d = self.parse_data(data_type, data) d.addCallback(lambda parsed_patches: {key: str(value) for key, value in parsed_patches.items()}) return d - def parseData(self, data_type, data): + def parse_data(self, data_type, data): """Parse a merge request data according to type @param data_type(unicode): type of the data to parse @@ -314,7 +314,7 @@ def _import(self, repository, item_id, service=None, node=None, extra=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else None d = self.import_request(client, repository, item_id, service, node or None, extra=extra or None) @@ -323,14 +323,14 @@ @defer.inlineCallbacks def import_request(self, client, repository, item, service=None, node=None, extra=None): """Import a merge request in specified directory @param repository(unicode): path to the repository where the code stands """ if not node: node = self.namespace tickets_xmlui, metadata = yield defer.ensureDeferred( - self._s.getDataFormItems( + self._s.get_data_form_items( client, service, node, diff -r c4464d7ae97b -r 524856bd7b19
sat/plugins/plugin_misc_nat_port.py --- a/sat/plugins/plugin_misc_nat_port.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_nat_port.py Sat Apr 08 13:54:42 2023 +0200 @@ -75,7 +75,7 @@ def unload(self): if self._to_unmap: log.info("Cleaning mapped ports") - return threads.deferToThread(self._unmapPortsBlocking) + return threads.deferToThread(self._unmap_ports_blocking) def _init_failed(self, failure_): e = failure_.trap(exceptions.NotFound, exceptions.FeatureNotFound) @@ -98,23 +98,23 @@ except Exception: raise failure.Failure(exceptions.FeatureNotFound()) - def getIP(self, local=False): + def get_ip(self, local=False): """Return IP address found with UPnP-IGD @param local(bool): True to get local network address, False to get external one @return (None, str): found IP address, or None if something went wrong """ - def getIP(__): + def get_ip(__): if self._upnp is None: return None # lanaddr can be the empty string if not found, # we need to return None in this case return (self._upnp.lanaddr or None) if local else self._external_ip - return self._initialised.addCallback(getIP) + return self._initialised.addCallback(get_ip) - def _unmapPortsBlocking(self): + def _unmap_ports_blocking(self): """Unmap ports mapped in this session""" self._mutex.acquire() try: @@ -137,7 +137,7 @@ finally: self._mutex.release() - def _mapPortBlocking(self, int_port, ext_port, protocol, desc): + def _map_port_blocking(self, int_port, ext_port, protocol, desc): """Internal blocking method to map port @param int_port(int): internal port to use @@ -186,7 +186,7 @@ return ext_port - def mapPort(self, int_port, ext_port=None, protocol="TCP", desc=DEFAULT_DESC): + def map_port(self, int_port, ext_port=None, protocol="TCP", desc=DEFAULT_DESC): """Add a port mapping @param int_port(int): internal port to use @@ -199,7 +199,7 @@ if self._upnp is None: return defer.succeed(None) - def mappingCb(ext_port): + def mapping_cb(ext_port): log.info( "{protocol} mapping from {int_port}
to {ext_port} successful".format( protocol=protocol, int_port=int_port, ext_port=ext_port @@ -207,16 +207,16 @@ ) return ext_port - def mappingEb(failure_): + def mapping_eb(failure_): failure_.trap(MappingError) log.warning("Can't map internal {int_port}".format(int_port=int_port)) - def mappingUnknownEb(failure_): + def mapping_unknown_eb(failure_): log.error(_("error while trying to map ports: {msg}").format(msg=failure_)) d = threads.deferToThread( - self._mapPortBlocking, int_port, ext_port, protocol, desc + self._map_port_blocking, int_port, ext_port, protocol, desc ) - d.addCallbacks(mappingCb, mappingEb) - d.addErrback(mappingUnknownEb) + d.addCallbacks(mapping_cb, mapping_eb) + d.addErrback(mapping_unknown_eb) return d diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_quiz.py --- a/sat/plugins/plugin_misc_quiz.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_quiz.py Sat Apr 08 13:54:42 2023 +0200 @@ -44,7 +44,7 @@ class Quiz(object): - def inheritFromRoomGame(self, host): + def inherit_from_room_game(self, host): global RoomGame RoomGame = host.plugins["ROOM-GAME"].__class__ self.__class__ = type( @@ -53,7 +53,7 @@ def __init__(self, host): log.info(_("Plugin Quiz initialization")) - self.inheritFromRoomGame(host) + self.inherit_from_room_game(host) RoomGame._init_( self, host, @@ -62,39 +62,39 @@ game_init={"stage": None}, player_init={"score": 0}, ) - host.bridge.addMethod( - "quizGameLaunch", + host.bridge.add_method( + "quiz_game_launch", ".plugin", in_sign="asss", out_sign="", - method=self._prepareRoom, + method=self._prepare_room, ) # args: players, room_jid, profile - host.bridge.addMethod( - "quizGameCreate", + host.bridge.add_method( + "quiz_game_create", ".plugin", in_sign="sass", out_sign="", - method=self._createGame, + method=self._create_game, ) # args: room_jid, players, profile - host.bridge.addMethod( - "quizGameReady", + host.bridge.add_method( + "quiz_game_ready", ".plugin", in_sign="sss", out_sign="", - 
method=self._playerReady, + method=self._player_ready, ) # args: player, referee, profile - host.bridge.addMethod( - "quizGameAnswer", + host.bridge.add_method( + "quiz_game_answer", ".plugin", in_sign="ssss", out_sign="", - method=self.playerAnswer, + method=self.player_answer, ) - host.bridge.addSignal( - "quizGameStarted", ".plugin", signature="ssass" + host.bridge.add_signal( + "quiz_game_started", ".plugin", signature="ssass" ) # args: room_jid, referee, players, profile - host.bridge.addSignal( - "quizGameNew", + host.bridge.add_signal( + "quiz_game_new", ".plugin", signature="sa{ss}s", doc={ @@ -104,8 +104,8 @@ "param_2": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGameQuestion", + host.bridge.add_signal( + "quiz_game_question", ".plugin", signature="sssis", doc={ @@ -117,8 +117,8 @@ "param_4": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGamePlayerBuzzed", + host.bridge.add_signal( + "quiz_game_player_buzzed", ".plugin", signature="ssbs", doc={ @@ -129,8 +129,8 @@ "param_3": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGamePlayerSays", + host.bridge.add_signal( + "quiz_game_player_says", ".plugin", signature="sssis", doc={ @@ -142,8 +142,8 @@ "param_4": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGameAnswerResult", + host.bridge.add_signal( + "quiz_game_answer_result", ".plugin", signature="ssba{si}s", doc={ @@ -155,8 +155,8 @@ "param_4": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGameTimerExpired", + host.bridge.add_signal( + "quiz_game_timer_expired", ".plugin", signature="ss", doc={ @@ -165,8 +165,8 @@ "param_1": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGameTimerRestarted", + host.bridge.add_signal( + "quiz_game_timer_restarted", ".plugin", signature="sis", doc={ @@ -238,7 +238,7 @@ def __start_play(self, room_jid, game_data, profile): """Start the game (tell to the first player after dealer to play""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) 
game_data["stage"] = "play" next_player_idx = game_data["current_player"] = ( game_data["init_player"] + 1 @@ -251,9 +251,9 @@ mess.firstChildElement().addElement("your_turn") client.send(mess) - def playerAnswer(self, player, referee, answer, profile_key=C.PROF_KEY_NONE): + def player_answer(self, player, referee, answer, profile_key=C.PROF_KEY_NONE): """Called when a player gives an answer""" - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) log.debug( "new player answer (%(profile)s): %(answer)s" % {"profile": client.profile, "answer": answer} @@ -264,17 +264,17 @@ answer_elt.addContent(answer) client.send(mess) - def timerExpired(self, room_jid, profile): + def timer_expired(self, room_jid, profile): """Called when nobody answered the question in time""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data = self.games[room_jid] game_data["stage"] = "expired" mess = self.createGameElt(room_jid) mess.firstChildElement().addElement("timer_expired") client.send(mess) - reactor.callLater(4, self.askQuestion, room_jid, client.profile) + reactor.callLater(4, self.ask_question, room_jid, client.profile) - def pauseTimer(self, room_jid): + def pause_timer(self, room_jid): """Stop the timer and save the time left""" game_data = self.games[room_jid] left = max(0, game_data["timer"].getTime() - time()) @@ -283,9 +283,9 @@ game_data["previous_stage"] = game_data["stage"] game_data["stage"] = "paused" - def restartTimer(self, room_jid, profile): + def restart_timer(self, room_jid, profile): """Restart a timer with the saved time""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data = self.games[room_jid] assert game_data["time_left"] is not None mess = self.createGameElt(room_jid) @@ -293,15 +293,15 @@ jabber_client.restarted_elt["time_left"] = str(game_data["time_left"]) client.send(mess) game_data["timer"] = reactor.callLater( - game_data["time_left"],
self.timerExpired, room_jid, profile + game_data["time_left"], self.timer_expired, room_jid, profile ) game_data["time_left"] = None game_data["stage"] = game_data["previous_stage"] del game_data["previous_stage"] - def askQuestion(self, room_jid, profile): + def ask_question(self, room_jid, profile): """Ask a new question""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data = self.games[room_jid] game_data["stage"] = "question" game_data["question_id"] = "1" @@ -314,13 +314,13 @@ ) client.send(mess) game_data["timer"] = reactor.callLater( - timer, self.timerExpired, room_jid, profile + timer, self.timer_expired, room_jid, profile ) game_data["time_left"] = None - def checkAnswer(self, room_jid, player, answer, profile): + def check_answer(self, room_jid, player, answer, profile): """Check if the answer given is right""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data = self.games[room_jid] players_data = game_data["players_data"] good_answer = game_data["question_id"] == "1" and answer == "42" @@ -334,11 +334,11 @@ client.send(mess) if good_answer: - reactor.callLater(4, self.askQuestion, room_jid, profile) + reactor.callLater(4, self.ask_question, room_jid, profile) else: - reactor.callLater(4, self.restartTimer, room_jid, profile) + reactor.callLater(4, self.restart_timer, room_jid, profile) - def newGame(self, room_jid, profile): + def new_game(self, room_jid, profile): """Launch a new round""" common_data = {"game_score": 0} new_game_data = { @@ -349,11 +349,11 @@ ) } msg_elts = self.__game_data_to_xml(new_game_data) - RoomGame.newRound(self, room_jid, (common_data, msg_elts), profile) - reactor.callLater(10, self.askQuestion, room_jid, profile) + RoomGame.new_round(self, room_jid, (common_data, msg_elts), profile) + reactor.callLater(10, self.ask_question, room_jid, profile) def room_game_cmd(self, mess_elt, profile): - client = self.host.getClient(profile) + client = 
self.host.get_client(profile) from_jid = jid.JID(mess_elt["from"]) room_jid = jid.JID(from_jid.userhost()) game_elt = mess_elt.firstChildElement() @@ -367,7 +367,7 @@ players = [] for player in elt.elements(): players.append(str(player)) - self.host.bridge.quizGameStarted( + self.host.bridge.quiz_game_started( room_jid.userhost(), from_jid.full(), players, profile ) @@ -383,15 +383,15 @@ if ( list(status.values()).count("ready") == nb_players ): # everybody is ready, we can start the game - self.newGame(room_jid, profile) + self.new_game(room_jid, profile) elif elt.name == "game_data": - self.host.bridge.quizGameNew( + self.host.bridge.quiz_game_new( room_jid.userhost(), self.__xml_to_game_data(elt), profile ) elif elt.name == "question": # A question is asked - self.host.bridge.quizGameQuestion( + self.host.bridge.quiz_game_question( room_jid.userhost(), elt["id"], str(elt), @@ -411,7 +411,7 @@ buzzer_elt["pause"] = str(pause) client.send(mess) if pause: - self.pauseTimer(room_jid) + self.pause_timer(room_jid) # and we send the player answer mess = self.createGameElt(room_jid) _answer = str(elt) @@ -421,16 +421,16 @@ say_elt["delay"] = "3" reactor.callLater(2, client.send, mess) reactor.callLater( - 6, self.checkAnswer, room_jid, player, _answer, profile=profile + 6, self.check_answer, room_jid, player, _answer, profile=profile ) elif elt.name == "player_buzzed": - self.host.bridge.quizGamePlayerBuzzed( + self.host.bridge.quiz_game_player_buzzed( room_jid.userhost(), elt["player"], elt["pause"] == str(True), profile ) elif elt.name == "player_says": - self.host.bridge.quizGamePlayerSays( + self.host.bridge.quiz_game_player_says( room_jid.userhost(), elt["player"], str(elt), @@ -440,15 +440,15 @@ elif elt.name == "answer_result": player, good_answer, score = self.__answer_result_to_signal_args(elt) - self.host.bridge.quizGameAnswerResult( + self.host.bridge.quiz_game_answer_result( room_jid.userhost(), player, good_answer, score, profile ) elif elt.name == 
"timer_expired": - self.host.bridge.quizGameTimerExpired(room_jid.userhost(), profile) + self.host.bridge.quiz_game_timer_expired(room_jid.userhost(), profile) elif elt.name == "timer_restarted": - self.host.bridge.quizGameTimerRestarted( + self.host.bridge.quiz_game_timer_restarted( room_jid.userhost(), int(elt["time_left"]), profile ) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_radiocol.py --- a/sat/plugins/plugin_misc_radiocol.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_radiocol.py Sat Apr 08 13:54:42 2023 +0200 @@ -65,7 +65,7 @@ class Radiocol(object): - def inheritFromRoomGame(self, host): + def inherit_from_room_game(self, host): global RoomGame RoomGame = host.plugins["ROOM-GAME"].__class__ self.__class__ = type( @@ -74,7 +74,7 @@ def __init__(self, host): log.info(_("Radio collective initialization")) - self.inheritFromRoomGame(host) + self.inherit_from_room_game(host) RoomGame._init_( self, host, @@ -89,49 +89,49 @@ }, ) self.host = host - host.bridge.addMethod( - "radiocolLaunch", + host.bridge.add_method( + "radiocol_launch", ".plugin", in_sign="asss", out_sign="", - method=self._prepareRoom, + method=self._prepare_room, async_=True, ) - host.bridge.addMethod( - "radiocolCreate", + host.bridge.add_method( + "radiocol_create", ".plugin", in_sign="sass", out_sign="", - method=self._createGame, + method=self._create_game, ) - host.bridge.addMethod( - "radiocolSongAdded", + host.bridge.add_method( + "radiocol_song_added", ".plugin", in_sign="sss", out_sign="", - method=self._radiocolSongAdded, + method=self._radiocol_song_added, async_=True, ) - host.bridge.addSignal( - "radiocolPlayers", ".plugin", signature="ssass" + host.bridge.add_signal( + "radiocol_players", ".plugin", signature="ssass" ) # room_jid, referee, players, profile - host.bridge.addSignal( - "radiocolStarted", ".plugin", signature="ssasais" + host.bridge.add_signal( + "radiocol_started", ".plugin", signature="ssasais" ) # room_jid, referee, players, 
[QUEUE_TO_START, QUEUE_LIMIT], profile - host.bridge.addSignal( - "radiocolSongRejected", ".plugin", signature="sss" + host.bridge.add_signal( + "radiocol_song_rejected", ".plugin", signature="sss" ) # room_jid, reason, profile - host.bridge.addSignal( - "radiocolPreload", ".plugin", signature="ssssssss" + host.bridge.add_signal( + "radiocol_preload", ".plugin", signature="ssssssss" ) # room_jid, timestamp, filename, title, artist, album, profile - host.bridge.addSignal( - "radiocolPlay", ".plugin", signature="sss" + host.bridge.add_signal( + "radiocol_play", ".plugin", signature="sss" ) # room_jid, filename, profile - host.bridge.addSignal( - "radiocolNoUpload", ".plugin", signature="ss" + host.bridge.add_signal( + "radiocol_no_upload", ".plugin", signature="ss" ) # room_jid, profile - host.bridge.addSignal( - "radiocolUploadOk", ".plugin", signature="ss" + host.bridge.add_signal( + "radiocol_upload_ok", ".plugin", signature="ss" ) # room_jid, profile def __create_preload_elt(self, sender, song_added_elt): @@ -143,10 +143,10 @@ # XXX: the frontend should know the temporary directory where file is put return preload_elt - def _radiocolSongAdded(self, referee_s, song_path, profile): - return self.radiocolSongAdded(jid.JID(referee_s), song_path, profile) + def _radiocol_song_added(self, referee_s, song_path, profile): + return self.radiocol_song_added(jid.JID(referee_s), song_path, profile) - def radiocolSongAdded(self, referee, song_path, profile): + def radiocol_song_added(self, referee, song_path, profile): """This method is called by libervia when a song has been uploaded @param referee (jid.JID): JID of the referee in the room (room userhost + '/' + nick) @param song_path (unicode): absolute path of the song added @@ -174,7 +174,7 @@ song = OggVorbis(song_path) except (OggVorbisHeaderError, HeaderNotFoundError): # this file is not ogg vorbis nor mp3, we reject it - self.deleteFile(song_path) # FIXME: same host trick (see note above) + self.delete_file(song_path) 
# FIXME: same host trick (see note above) return defer.fail( exceptions.DataError( D_( @@ -200,7 +200,7 @@ ) # FIXME: works only because of the same host trick, see the note under the docstring return self.send(referee, ("", "song_added"), attrs, profile=profile) - def playNext(self, room_jid, profile): + def play_next(self, room_jid, profile): """Play next song in queue if it exists, and set a timer which triggers after the song has been played to play the next one""" # TODO: songs need to be erased once played or found invalid @@ -210,7 +210,7 @@ log.debug(_("No more participants in the radiocol: cleaning data")) radio_data["queue"] = [] for filename in radio_data["to_delete"]: - self.deleteFile(filename, radio_data) + self.delete_file(filename, radio_data) radio_data["to_delete"] = {} queue = radio_data["queue"] if not queue: @@ -228,13 +228,13 @@ self.send(room_jid, ("", "upload_ok"), profile=profile) radio_data["upload"] = True - reactor.callLater(length, self.playNext, room_jid, profile) + reactor.callLater(length, self.play_next, room_jid, profile) # we wait more than the song length to delete the file, to manage poorly reactive networks/clients reactor.callLater( - length + 90, self.deleteFile, filename, radio_data + length + 90, self.delete_file, filename, radio_data ) # FIXME: same host trick (see above) - def deleteFile(self, filename, radio_data=None): + def delete_file(self, filename, radio_data=None): """ Delete a previously uploaded file.
@param filename: filename to delete, or full filepath if radio_data is None @@ -263,16 +263,16 @@ def room_game_cmd(self, mess_elt, profile): from_jid = jid.JID(mess_elt["from"]) room_jid = from_jid.userhostJID() - nick = self.host.plugins["XEP-0045"].getRoomNick(room_jid, profile) + nick = self.host.plugins["XEP-0045"].get_room_nick(room_jid, profile) radio_elt = mess_elt.firstChildElement() radio_data = self.games[room_jid] if "queue" in radio_data: queue = radio_data["queue"] - from_referee = self.isReferee(room_jid, from_jid.resource) - to_referee = self.isReferee(room_jid, jid.JID(mess_elt["to"]).user) - is_player = self.isPlayer(room_jid, nick) + from_referee = self.is_referee(room_jid, from_jid.resource) + to_referee = self.is_referee(room_jid, jid.JID(mess_elt["to"]).user) + is_player = self.is_player(room_jid, nick) for elt in radio_elt.elements(): if not from_referee and not (to_referee and elt.name == "song_added"): continue # sender must be referee, except when a song is submitted @@ -287,9 +287,9 @@ for player in elt.elements(): players.append(str(player)) signal = ( - self.host.bridge.radiocolStarted + self.host.bridge.radiocol_started if elt.name == "started" - else self.host.bridge.radiocolPlayers + else self.host.bridge.radiocol_players ) signal( room_jid.userhost(), @@ -299,7 +299,7 @@ profile, ) elif elt.name == "preload": # a song is in queue and must be preloaded - self.host.bridge.radiocolPreload( + self.host.bridge.radiocol_preload( room_jid.userhost(), elt["timestamp"], elt["filename"], @@ -310,17 +310,17 @@ profile, ) elif elt.name == "play": - self.host.bridge.radiocolPlay( + self.host.bridge.radiocol_play( room_jid.userhost(), elt["filename"], profile ) elif elt.name == "song_rejected": # a song has been refused - self.host.bridge.radiocolSongRejected( + self.host.bridge.radiocol_song_rejected( room_jid.userhost(), elt["reason"], profile ) elif elt.name == "no_upload": - self.host.bridge.radiocolNoUpload(room_jid.userhost(), profile) +
self.host.bridge.radiocol_no_upload(room_jid.userhost(), profile) elif elt.name == "upload_ok": - self.host.bridge.radiocolUploadOk(room_jid.userhost(), profile) + self.host.bridge.radiocol_upload_ok(room_jid.userhost(), profile) elif elt.name == "song_added": # a song has been added # FIXME: we are KISS for the proof of concept: every song is added, to a limit of 3 in queue. # Need to manage some sort of rules to allow peoples to send songs @@ -348,11 +348,11 @@ if not radio_data["playing"] and len(queue) == QUEUE_TO_START: # We have not started playing yet, and we have QUEUE_TO_START # songs in queue. We can now start the party :) - self.playNext(room_jid, profile) + self.play_next(room_jid, profile) else: log.error(_("Unmanaged game element: %s") % elt.name) - def getSyncDataForPlayer(self, room_jid, nick): + def get_sync_data_for_player(self, room_jid, nick): game_data = self.games[room_jid] elements = [] if game_data["playing"]: diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_register_account.py --- a/sat/plugins/plugin_misc_register_account.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_register_account.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,14 +49,14 @@ log.info(_("Plugin Register Account initialization")) self.host = host self._sessions = Sessions() - host.registerCallback( - self.registerNewAccountCB, with_data=True, force_id="registerNewAccount" + host.register_callback( + self.register_new_account_cb, with_data=True, force_id="register_new_account" ) - self.__register_account_id = host.registerCallback( - self._registerConfirmation, with_data=True + self.__register_account_id = host.register_callback( + self._register_confirmation, with_data=True ) - def registerNewAccountCB(self, data, profile): + def register_new_account_cb(self, data, profile): """Called when the user click on the "New account" button.""" session_data = {} @@ -81,7 +81,7 @@ session_data["user"], host, resource = jid.parse(session_data["JabberID"]) 
session_data["server"] = session_data[C.FORCE_SERVER_PARAM] or host - session_id, __ = self._sessions.newSession(session_data, profile=profile) + session_id, __ = self._sessions.new_session(session_data, profile=profile) form_ui = xml_tools.XMLUI( "form", title=D_("Register new account"), @@ -95,30 +95,30 @@ ) return {"xmlui": form_ui.toXml()} - def _registerConfirmation(self, data, profile): + def _register_confirmation(self, data, profile): """Save the related parameters and proceed the registration.""" - session_data = self._sessions.profileGet(data["session_id"], profile) + session_data = self._sessions.profile_get(data["session_id"], profile) - self.host.memory.setParam( + self.host.memory.param_set( "JabberID", session_data["JabberID"], "Connection", profile_key=profile ) - self.host.memory.setParam( + self.host.memory.param_set( "Password", session_data["Password"], "Connection", profile_key=profile ) - self.host.memory.setParam( + self.host.memory.param_set( C.FORCE_SERVER_PARAM, session_data[C.FORCE_SERVER_PARAM], "Connection", profile_key=profile, ) - self.host.memory.setParam( + self.host.memory.param_set( C.FORCE_PORT_PARAM, session_data[C.FORCE_PORT_PARAM], "Connection", profile_key=profile, ) - d = self._registerNewAccount( + d = self._register_new_account( jid.JID(session_data["JabberID"]), session_data["Password"], None, @@ -127,14 +127,14 @@ del self._sessions[data["session_id"]] return d - def _registerNewAccount(self, client, jid_, password, email, server): + def _register_new_account(self, client, jid_, password, email, server): #  FIXME: port is not set here - def registeredCb(__): + def registered_cb(__): xmlui = xml_tools.XMLUI("popup", title=D_("Confirmation")) xmlui.addText(D_("Registration successful.")) return {"xmlui": xmlui.toXml()} - def registeredEb(failure): + def registered_eb(failure): xmlui = xml_tools.XMLUI("popup", title=D_("Failure")) xmlui.addText(D_("Registration failed: %s") % failure.getErrorMessage()) try: @@ -146,8 +146,8 
@@ pass return {"xmlui": xmlui.toXml()} - registered_d = self.host.plugins["XEP-0077"].registerNewAccount( + registered_d = self.host.plugins["XEP-0077"].register_new_account( client, jid_, password, email=email, host=server, port=C.XMPP_C2S_PORT ) - registered_d.addCallbacks(registeredCb, registeredEb) + registered_d.addCallbacks(registered_cb, registered_eb) return registered_d diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_room_game.py --- a/sat/plugins/plugin_misc_room_game.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_room_game.py Sat Apr 08 13:54:42 2023 +0200 @@ -56,9 +56,9 @@ class RoomGame(object): """This class is used to help launching a MUC game. - Bridge methods callbacks: _prepareRoom, _playerReady, _createGame - Triggered methods: userJoinedTrigger, userLeftTrigger - Also called from subclasses: newRound + Bridge methods callbacks: _prepare_room, _player_ready, _create_game + Triggered methods: user_joined_trigger, user_left_trigger + Also called from subclasses: new_round For examples of messages sequences, please look in sub-classes. """ @@ -81,13 +81,13 @@ class MyGame(object): - def inheritFromRoomGame(self, host): + def inherit_from_room_game(self, host): global RoomGame RoomGame = host.plugins["ROOM-GAME"].__class__ self.__class__ = type(self.__class__.__name__, (self.__class__, RoomGame, object), {}) def __init__(self, host): - self.inheritFromRoomGame(host) + self.inherit_from_room_game(host) RoomGame._init_(self, host, ...) """ @@ -125,41 +125,41 @@ # by an arbitrary value. If needed, this attribute would be set to True from the testcase.
self.testing = False - host.trigger.add("MUC user joined", self.userJoinedTrigger) - host.trigger.add("MUC user left", self.userLeftTrigger) + host.trigger.add("MUC user joined", self.user_joined_trigger) + host.trigger.add("MUC user left", self.user_left_trigger) - def _createOrInvite(self, room_jid, other_players, profile): + def _create_or_invite(self, room_jid, other_players, profile): """ This is called only when someone explicitly wants to play. The game will not be created if one already exists in the room, also its creation could be postponed until all the expected players - join the room (in that case it will be created from userJoinedTrigger). + join the room (in that case it will be created from user_joined_trigger). @param room (wokkel.muc.Room): the room @param other_players (list[jid.JID]): list of the other players JID (bare) """ # FIXME: broken ! raise NotImplementedError("To be fixed") - client = self.host.getClient(profile) - user_jid = self.host.getJidNStream(profile)[0] - nick = self.host.plugins["XEP-0045"].getRoomNick(client, room_jid) + client = self.host.get_client(profile) + user_jid = self.host.get_jid_n_stream(profile)[0] + nick = self.host.plugins["XEP-0045"].get_room_nick(client, room_jid) nicks = [nick] - if self._gameExists(room_jid): - if not self._checkJoinAuth(room_jid, user_jid, nick): + if self._game_exists(room_jid): + if not self._check_join_auth(room_jid, user_jid, nick): return - nicks.extend(self._invitePlayers(room_jid, other_players, nick, profile)) - self._updatePlayers(room_jid, nicks, True, profile) + nicks.extend(self._invite_players(room_jid, other_players, nick, profile)) + self._update_players(room_jid, nicks, True, profile) else: - self._initGame(room_jid, nick) - (auth, waiting, missing) = self._checkWaitAuth(room_jid, other_players) + self._init_game(room_jid, nick) + (auth, waiting, missing) = self._check_wait_auth(room_jid, other_players) nicks.extend(waiting) - nicks.extend(self._invitePlayers(room_jid, 
missing, nick, profile)) + nicks.extend(self._invite_players(room_jid, missing, nick, profile)) if auth: - self.createGame(room_jid, nicks, profile) + self.create_game(room_jid, nicks, profile) else: - self._updatePlayers(room_jid, nicks, False, profile) + self._update_players(room_jid, nicks, False, profile) - def _initGame(self, room_jid, referee_nick): + def _init_game(self, room_jid, referee_nick): """ @param room_jid (jid.JID): JID of the room @@ -167,7 +167,7 @@ """ # Important: do not add the referee to 'players' yet. For a # message to be emitted whenever a new player is joining, - # it is necessary to not modify 'players' outside of _updatePlayers. + # it is necessary to not modify 'players' outside of _update_players. referee_jid = jid.JID(room_jid.userhost() + "/" + referee_nick) self.games[room_jid] = { "referee": referee_jid, @@ -178,19 +178,19 @@ self.games[room_jid].update(copy.deepcopy(self.game_init)) self.invitations.setdefault(room_jid, []) - def _gameExists(self, room_jid, started=False): + def _game_exists(self, room_jid, started=False): """Return True if a game has been initialized/started. @param started: if False, the game must be initialized to return True, - otherwise it must be initialized and started with createGame. + otherwise it must be initialized and started with create_game. @return: True if a game is initialized/started in that room""" return room_jid in self.games and (not started or self.games[room_jid]["started"]) - def _checkJoinAuth(self, room_jid, user_jid=None, nick="", verbose=False): + def _check_join_auth(self, room_jid, user_jid=None, nick="", verbose=False): """Checks if this profile is allowed to join the game. The parameter nick is used to check if the user is already a player in that game. When this method is called from - userJoinedTrigger, nick is also used to check the user + user_joined_trigger, nick is also used to check the user identity instead of user_jid_s (see TODO comment below). 
@param room_jid (jid.JID): the JID of the room hosting the game @param user_jid (jid.JID): JID of the user @@ -198,9 +198,9 @@ @return: True if this profile can join the game """ auth = False - if not self._gameExists(room_jid): + if not self._game_exists(room_jid): auth = False - elif self.join_mode == self.ALL or self.isPlayer(room_jid, nick): + elif self.join_mode == self.ALL or self.is_player(room_jid, nick): auth = True elif self.join_mode == self.INVITED: # considering all the batches of invitations @@ -227,7 +227,7 @@ ) return auth - def _updatePlayers(self, room_jid, nicks, sync, profile): + def _update_players(self, room_jid, nicks, sync, profile): """Update the list of players and signal to the room that some players joined the game. If sync is True, the new players are synchronized with the game data they have missed. Remark: self.games[room_jid]['players'] should not be modified outside this method. @@ -251,16 +251,16 @@ sync = ( sync - and self._gameExists(room_jid, True) + and self._game_exists(room_jid, True) and len(self.games[room_jid]["players"]) > 0 ) setStatus("desync" if sync else "init") self.games[room_jid]["players"].extend(new_nicks) - self._synchronizeRoom(room_jid, [room_jid], profile) + self._synchronize_room(room_jid, [room_jid], profile) if sync: setStatus("init") - def _synchronizeRoom(self, room_jid, recipients, profile): + def _synchronize_room(self, room_jid, recipients, profile): """Communicate the list of players to the whole room or only to some users, also send the synchronization data to the players who recently joined the game.
@param room_jid (jid.JID): JID of the room @@ -269,16 +269,16 @@ - room JID + "/" + user nick @param profile (unicode): %(doc_profile)s """ - if self._gameExists(room_jid, started=True): - element = self._createStartElement(self.games[room_jid]["players"]) + if self._game_exists(room_jid, started=True): + element = self._create_start_element(self.games[room_jid]["players"]) else: - element = self._createStartElement( + element = self._create_start_element( self.games[room_jid]["players"], name="players" ) elements = [(element, None, None)] sync_args = [] - sync_data = self._getSyncData(room_jid) + sync_data = self._get_sync_data(room_jid) for nick in sync_data: user_jid = jid.JID(room_jid.userhost() + "/" + nick) if user_jid in recipients: @@ -291,19 +291,19 @@ sync_args.append(([user_jid, user_elements], {"profile": profile})) for recipient in recipients: - self._sendElements(recipient, elements, profile=profile) + self._send_elements(recipient, elements, profile=profile) for args, kwargs in sync_args: - self._sendElements(*args, **kwargs) + self._send_elements(*args, **kwargs) - def _getSyncData(self, room_jid, force_nicks=None): + def _get_sync_data(self, room_jid, force_nicks=None): """The synchronization data are returned for each player who has the state 'desync' or whose nick is contained in force_nicks. @param room_jid (jid.JID): JID of the room @param force_nicks: force the synchronization for this list of nicks @return: a mapping between player nicks and a list of elements to - be sent by self._synchronizeRoom for the game to be synchronized. + be sent by self._synchronize_room for the game to be synchronized.
""" - if not self._gameExists(room_jid): + if not self._game_exists(room_jid): return {} data = {} status = self.games[room_jid]["status"] @@ -314,12 +314,12 @@ if nick not in nicks: nicks.append(nick) for nick in nicks: - elements = self.getSyncDataForPlayer(room_jid, nick) + elements = self.get_sync_data_for_player(room_jid, nick) if elements: data[nick] = elements return data - def getSyncDataForPlayer(self, room_jid, nick): + def get_sync_data_for_player(self, room_jid, nick): """This method may (and should probably) be overwritten by a child class. @param room_jid (jid.JID): JID of the room @param nick: the nick of the player to be synchronized @@ -327,7 +327,7 @@ """ return [] - def _invitePlayers(self, room_jid, other_players, nick, profile): + def _invite_players(self, room_jid, other_players, nick, profile): """Invite players to a room, associated game may exist or not. @param other_players (list[jid.JID]): list of the players to invite @@ -336,7 +336,7 @@ """ raise NotImplementedError("Need to be fixed !") # FIXME: this is broken and unsecure ! 
- if not self._checkInviteAuth(room_jid, nick): + if not self._check_invite_auth(room_jid, nick): return [] # TODO: remove invitation waiting for too long, using the time data self.invitations[room_jid].append( @@ -356,7 +356,7 @@ nicks.append(other_nick) return nicks - def _checkInviteAuth(self, room_jid, nick, verbose=False): + def _check_invite_auth(self, room_jid, nick, verbose=False): """Checks if this user is allowed to invite players @param room_jid (jid.JID): JID of the room @@ -365,16 +365,16 @@ @return: True if the user is allowed to invite other players """ auth = False - if self.invite_mode == self.FROM_ALL or not self._gameExists(room_jid): + if self.invite_mode == self.FROM_ALL or not self._game_exists(room_jid): auth = True elif self.invite_mode == self.FROM_NONE: - auth = not self._gameExists(room_jid, started=True) and self.isReferee( + auth = not self._game_exists(room_jid, started=True) and self.is_referee( room_jid, nick ) elif self.invite_mode == self.FROM_REFEREE: - auth = self.isReferee(room_jid, nick) + auth = self.is_referee(room_jid, nick) elif self.invite_mode == self.FROM_PLAYERS: - auth = self.isPlayer(room_jid, nick) + auth = self.is_player(room_jid, nick) if not auth and (verbose or _DEBUG): log.debug( _("%(user)s not allowed to invite for the game %(game)s in %(room)s") @@ -382,31 +382,31 @@ ) return auth - def isReferee(self, room_jid, nick): + def is_referee(self, room_jid, nick): """Checks if the player with this nick is the referee for the game in this room" @param room_jid (jid.JID): room JID @param nick: user nick in the room @return: True if the user is the referee of the game in this room """ - if not self._gameExists(room_jid): + if not self._game_exists(room_jid): return False return ( jid.JID(room_jid.userhost() + "/" + nick) == self.games[room_jid]["referee"] ) - def isPlayer(self, room_jid, nick): + def is_player(self, room_jid, nick): """Checks if the user with this nick is a player for the game in this room. 
@param room_jid (jid.JID): JID of the room @param nick: user nick in the room @return: True if the user is a player of the game in this room """ - if not self._gameExists(room_jid): + if not self._game_exists(room_jid): return False # Important: the referee is not in the 'players' list right after - # the game initialization, that's why we do also check with isReferee - return nick in self.games[room_jid]["players"] or self.isReferee(room_jid, nick) + # the game initialization, that's why we also check with is_referee + return nick in self.games[room_jid]["players"] or self.is_referee(room_jid, nick) - def _checkWaitAuth(self, room, other_players, verbose=False): + def _check_wait_auth(self, room, other_players, verbose=False): """Check if we must wait for other players before starting the game. @param room (wokkel.muc.Room): the room @@ -441,26 +441,26 @@ ) return result - def getUniqueName(self, muc_service=None, profile_key=C.PROF_KEY_NONE): + def get_unique_name(self, muc_service=None, profile_key=C.PROF_KEY_NONE): """Generate unique room name @param muc_service (jid.JID): you can leave empty to autofind the muc service @param profile_key (unicode): %(doc_profile_key)s @return: jid.JID (unique name for a new room to be created) """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) # FIXME: jid.JID must be used instead of strings - room = self.host.plugins["XEP-0045"].getUniqueName(client, muc_service) + room = self.host.plugins["XEP-0045"].get_unique_name(client, muc_service) return jid.JID("sat_%s_%s" % (self.name.lower(), room.userhost())) - def _prepareRoom( + def _prepare_room( self, other_players=None, room_jid_s="", profile_key=C.PROF_KEY_NONE ): room_jid = jid.JID(room_jid_s) if room_jid_s else None other_players = [jid.JID(player).userhostJID() for player in other_players] - return self.prepareRoom(other_players, room_jid, profile_key) + return self.prepare_room(other_players, room_jid, profile_key) - def
prepareRoom(self, other_players=None, room_jid=None, profile_key=C.PROF_KEY_NONE): + def prepare_room(self, other_players=None, room_jid=None, profile_key=C.PROF_KEY_NONE): """Prepare the room for a game: create it if it doesn't exist and invite players. @param other_players (list[JID]): list of other players JID (bare) @@ -468,9 +468,9 @@ @param profile_key (unicode): %(doc_profile_key)s """ # FIXME: need to be refactored - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) log.debug(_("Preparing room for %s game") % self.name) - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: log.error(_("Unknown profile")) return defer.succeed(None) @@ -479,19 +479,19 @@ # Create/join the given room, or a unique generated one if no room is specified. if room_jid is None: - room_jid = self.getUniqueName(profile_key=profile_key) + room_jid = self.get_unique_name(profile_key=profile_key) else: - self.host.plugins["XEP-0045"].checkRoomJoined(client, room_jid) - self._createOrInvite(client, room_jid, other_players) + self.host.plugins["XEP-0045"].check_room_joined(client, room_jid) + self._create_or_invite(client, room_jid, other_players) return defer.succeed(None) - user_jid = self.host.getJidNStream(profile)[0] + user_jid = self.host.get_jid_n_stream(profile)[0] d = self.host.plugins["XEP-0045"].join(room_jid, user_jid.user, {}, profile) return d.addCallback( - lambda __: self._createOrInvite(client, room_jid, other_players) + lambda __: self._create_or_invite(client, room_jid, other_players) ) - def userJoinedTrigger(self, room, user, profile): + def user_joined_trigger(self, room, user, profile): """This trigger is used to check if the new user can take part of a game, create the game if we were waiting for him or just update the players list. @room: wokkel.muc.Room object. 
room.roster is a dict{wokkel.muc.User.nick: wokkel.muc.User} @@ -500,13 +500,13 @@ """ room_jid = room.occupantJID.userhostJID() profile_nick = room.occupantJID.resource - if not self.isReferee(room_jid, profile_nick): + if not self.is_referee(room_jid, profile_nick): return True # profile is not the referee - if not self._checkJoinAuth( + if not self._check_join_auth( room_jid, user.entity if user.entity else None, user.nick ): # user not allowed but let him know that we are playing :p - self._synchronizeRoom( + self._synchronize_room( room_jid, [jid.JID(room_jid.userhost() + "/" + user.nick)], profile ) return True @@ -520,17 +520,17 @@ ) return True other_players = self.invitations[room_jid][batch][1] - (auth, nicks, __) = self._checkWaitAuth(room, other_players) + (auth, nicks, __) = self._check_wait_auth(room, other_players) if auth: del self.invitations[room_jid][batch] nicks.insert(0, profile_nick) # add the referee - self.createGame(room_jid, nicks, profile_key=profile) + self.create_game(room_jid, nicks, profile_key=profile) return True # let the room know that a new player joined - self._updatePlayers(room_jid, [user.nick], True, profile) + self._update_players(room_jid, [user.nick], True, profile) return True - def userLeftTrigger(self, room, user, profile): + def user_left_trigger(self, room, user, profile): """This trigger is used to update or stop the game when a user leaves. @room: wokkel.muc.Room object. 
room.roster is a dict{wokkel.muc.User.nick: wokkel.muc.User} @@ -539,9 +539,9 @@ """ room_jid = room.occupantJID.userhostJID() profile_nick = room.occupantJID.resource - if not self.isReferee(room_jid, profile_nick): + if not self.is_referee(room_jid, profile_nick): return True # profile is not the referee - if self.isPlayer(room_jid, user.nick): + if self.is_player(room_jid, user.nick): try: self.games[room_jid]["players"].remove(user.nick) except ValueError: @@ -559,7 +559,7 @@ self.invitations[room_jid][batch][1].append(user_jid) return True - def _checkCreateGameAndInit(self, room_jid, profile): + def _check_create_game_and_init(self, room_jid, profile): """Check if that profile can create the game. If the game can be created but is not initialized yet, this method will also do the initialization. @@ -569,16 +569,16 @@ - create: set to True to allow the game creation - sync: set to True to advice a game synchronization """ - user_nick = self.host.plugins["XEP-0045"].getRoomNick(room_jid, profile) + user_nick = self.host.plugins["XEP-0045"].get_room_nick(room_jid, profile) if not user_nick: log.error( "Internal error: profile %s has not joined the room %s" % (profile, room_jid.userhost()) ) return False, False - if self._gameExists(room_jid): - is_referee = self.isReferee(room_jid, user_nick) - if self._gameExists(room_jid, started=True): + if self._game_exists(room_jid): + is_referee = self.is_referee(room_jid, user_nick) + if self._game_exists(room_jid, started=True): log.info( _("%(game)s game already created in room %(room)s") % {"game": self.name, "room": room_jid.userhost()} @@ -591,13 +591,13 @@ ) return False, False else: - self._initGame(room_jid, user_nick) + self._init_game(room_jid, user_nick) return True, False - def _createGame(self, room_jid_s, nicks=None, profile_key=C.PROF_KEY_NONE): - self.createGame(jid.JID(room_jid_s), nicks, profile_key) + def _create_game(self, room_jid_s, nicks=None, profile_key=C.PROF_KEY_NONE): + 
self.create_game(jid.JID(room_jid_s), nicks, profile_key) - def createGame(self, room_jid, nicks=None, profile_key=C.PROF_KEY_NONE): + def create_game(self, room_jid, nicks=None, profile_key=C.PROF_KEY_NONE): """Create a new game. This can be called directly from a frontend and skips all the checks and invitation system, @@ -610,19 +610,19 @@ _("Creating %(game)s game in room %(room)s") % {"game": self.name, "room": room_jid} ) - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: log.error(_("profile %s is unknown") % profile_key) return - (create, sync) = self._checkCreateGameAndInit(room_jid, profile) + (create, sync) = self._check_create_game_and_init(room_jid, profile) if nicks is None: nicks = [] if not create: if sync: - self._updatePlayers(room_jid, nicks, True, profile) + self._update_players(room_jid, nicks, True, profile) return self.games[room_jid]["started"] = True - self._updatePlayers(room_jid, nicks, False, profile) + self._update_players(room_jid, nicks, False, profile) if self.player_init: # specific data to each player (score, private data) self.games[room_jid].setdefault("players_data", {}) @@ -632,16 +632,16 @@ self.player_init ) - def _playerReady(self, player_nick, referee_jid_s, profile_key=C.PROF_KEY_NONE): - self.playerReady(player_nick, jid.JID(referee_jid_s), profile_key) + def _player_ready(self, player_nick, referee_jid_s, profile_key=C.PROF_KEY_NONE): + self.player_ready(player_nick, jid.JID(referee_jid_s), profile_key) - def playerReady(self, player_nick, referee_jid, profile_key=C.PROF_KEY_NONE): + def player_ready(self, player_nick, referee_jid, profile_key=C.PROF_KEY_NONE): """Must be called when player is ready to start a new game @param player: the player nick in the room @param referee_jid (jid.JID): JID of the referee """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not 
profile: log.error(_("profile %s is unknown") % profile_key) return @@ -649,7 +649,7 @@ # TODO: we probably need to add the game and room names in the sent message self.send(referee_jid, "player_ready", {"player": player_nick}, profile=profile) - def newRound(self, room_jid, data, profile): + def new_round(self, room_jid, data, profile): """Launch a new round (reinit the user data) @param room_jid: room userhost @@ -681,7 +681,7 @@ for player in players: players_data[player].update(copy.deepcopy(common_data)) - def _createGameElt(self, to_jid): + def _create_game_elt(self, to_jid): """Create a generic domish Element for the game messages @param to_jid: JID of the recipient @@ -694,7 +694,7 @@ elt.addElement(self.ns_tag) return elt - def _createStartElement(self, players=None, name="started"): + def _create_start_element(self, players=None, name="started"): """Create a domish Element listing the game users @param players: list of the players @@ -715,7 +715,7 @@ started_elt.addChild(player_elt) return started_elt - def _sendElements(self, to_jid, data, profile=None): + def _send_elements(self, to_jid, data, profile=None): """ TODO @param to_jid: recipient JID @@ -729,8 +729,8 @@ @param profile: the profile from which the message is sent @return: a Deferred instance """ - client = self.host.getClient(profile) - msg = self._createGameElt(to_jid) + client = self.host.get_client(profile) + msg = self._create_game_elt(to_jid) for elem, attrs, content in data: if elem is not None: if isinstance(elem, domish.Element): @@ -757,9 +757,9 @@ @param profile: the profile from which the message is sent @return: a Deferred instance """ - return self._sendElements(to_jid, [(elem, attrs, content)], profile) + return self._send_elements(to_jid, [(elem, attrs, content)], profile) - def getHandler(self, client): + def get_handler(self, client): return RoomGameHandler(self) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_static_blog.py --- 
a/sat/plugins/plugin_misc_static_blog.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_static_blog.py Sat Apr 08 13:54:42 2023 +0200 @@ -76,13 +76,13 @@ def __init__(self, host): try: # TODO: remove this attribute when all blogs can be retrieved - self.domain = host.plugins["MISC-ACCOUNT"].getNewAccountDomain() + self.domain = host.plugins["MISC-ACCOUNT"].account_domain_new_get() except KeyError: self.domain = None - host.memory.updateParams(self.params) - # host.importMenu((D_("User"), D_("Public blog")), self._displayPublicBlog, security_limit=1, help_string=D_("Display public blog page"), type_=C.MENU_JID_CONTEXT) + host.memory.update_params(self.params) + # host.import_menu((D_("User"), D_("Public blog")), self._display_public_blog, security_limit=1, help_string=D_("Display public blog page"), type_=C.MENU_JID_CONTEXT) - def _displayPublicBlog(self, menu_data, profile): + def _display_public_blog(self, menu_data, profile): """Check if the blog can be displayed and answer the frontend. 
@param menu_data: %(menu_data)s diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_tarot.py --- a/sat/plugins/plugin_misc_tarot.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_tarot.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,7 +49,7 @@ class Tarot(object): - def inheritFromRoomGame(self, host): + def inherit_from_room_game(self, host): global RoomGame RoomGame = host.plugins["ROOM-GAME"].__class__ self.__class__ = type( @@ -59,7 +59,7 @@ def __init__(self, host): log.info(_("Plugin Tarot initialization")) self._sessions = memory.Sessions() - self.inheritFromRoomGame(host) + self.inherit_from_room_game(host) RoomGame._init_( self, host, @@ -81,61 +81,61 @@ _("Garde Sans"), _("Garde Contre"), ] - host.bridge.addMethod( - "tarotGameLaunch", + host.bridge.add_method( + "tarot_game_launch", ".plugin", in_sign="asss", out_sign="", - method=self._prepareRoom, + method=self._prepare_room, async_=True, ) # args: players, room_jid, profile - host.bridge.addMethod( - "tarotGameCreate", + host.bridge.add_method( + "tarot_game_create", ".plugin", in_sign="sass", out_sign="", - method=self._createGame, + method=self._create_game, ) # args: room_jid, players, profile - host.bridge.addMethod( - "tarotGameReady", + host.bridge.add_method( + "tarot_game_ready", ".plugin", in_sign="sss", out_sign="", - method=self._playerReady, + method=self._player_ready, ) # args: player, referee, profile - host.bridge.addMethod( - "tarotGamePlayCards", + host.bridge.add_method( + "tarot_game_play_cards", ".plugin", in_sign="ssa(ss)s", out_sign="", method=self.play_cards, ) # args: player, referee, cards, profile - host.bridge.addSignal( - "tarotGamePlayers", ".plugin", signature="ssass" + host.bridge.add_signal( + "tarot_game_players", ".plugin", signature="ssass" ) # args: room_jid, referee, players, profile - host.bridge.addSignal( - "tarotGameStarted", ".plugin", signature="ssass" + host.bridge.add_signal( + "tarot_game_started", ".plugin", signature="ssass" ) # args: 
room_jid, referee, players, profile - host.bridge.addSignal( - "tarotGameNew", ".plugin", signature="sa(ss)s" + host.bridge.add_signal( + "tarot_game_new", ".plugin", signature="sa(ss)s" ) # args: room_jid, hand, profile - host.bridge.addSignal( - "tarotGameChooseContrat", ".plugin", signature="sss" + host.bridge.add_signal( + "tarot_game_choose_contrat", ".plugin", signature="sss" ) # args: room_jid, xml_data, profile - host.bridge.addSignal( - "tarotGameShowCards", ".plugin", signature="ssa(ss)a{ss}s" + host.bridge.add_signal( + "tarot_game_show_cards", ".plugin", signature="ssa(ss)a{ss}s" ) # args: room_jid, type ["chien", "poignée",...], cards, data[dict], profile - host.bridge.addSignal( - "tarotGameCardsPlayed", ".plugin", signature="ssa(ss)s" + host.bridge.add_signal( + "tarot_game_cards_played", ".plugin", signature="ssa(ss)s" ) # args: room_jid, player, type ["chien", "poignée",...], cards, data[dict], profile - host.bridge.addSignal( - "tarotGameYourTurn", ".plugin", signature="ss" + host.bridge.add_signal( + "tarot_game_your_turn", ".plugin", signature="ss" ) # args: room_jid, profile - host.bridge.addSignal( - "tarotGameScore", ".plugin", signature="ssasass" + host.bridge.add_signal( + "tarot_game_score", ".plugin", signature="ssasass" ) # args: room_jid, xml_data, winners (list of nicks), loosers (list of nicks), profile - host.bridge.addSignal( - "tarotGameInvalidCards", ".plugin", signature="ssa(ss)a(ss)s" + host.bridge.add_signal( + "tarot_game_invalid_cards", ".plugin", signature="ssa(ss)a(ss)s" ) # args: room_jid, game phase, played_cards, invalid_cards, profile self.deck_ordered = [] for value in ["excuse"] + list(map(str, list(range(1, 22)))): @@ -143,10 +143,10 @@ for suit in ["pique", "coeur", "carreau", "trefle"]: for value in list(map(str, list(range(1, 11)))) + ["valet", "cavalier", "dame", "roi"]: self.deck_ordered.append(TarotCard((suit, value))) - self.__choose_contrat_id = host.registerCallback( - self._contratChoosed, with_data=True + 
self.__choose_contrat_id = host.register_callback( + self._contrat_choosed, with_data=True ) - self.__score_id = host.registerCallback(self._scoreShowed, with_data=True) + self.__score_id = host.register_callback(self._score_showed, with_data=True) def __card_list_to_xml(self, cards_list, elt_name): """Convert a card list to domish element""" @@ -519,13 +519,13 @@ to_jid = jid.JID(room_jid.userhost() + "/" + next_player) # FIXME: gof: self.send(to_jid, "your_turn", profile=profile) - def _contratChoosed(self, raw_data, profile): + def _contrat_choosed(self, raw_data, profile): """Will be called when the contrat is selected @param raw_data: contains the choosed session id and the chosen contrat @param profile_key: profile """ try: - session_data = self._sessions.profileGet(raw_data["session_id"], profile) + session_data = self._sessions.profile_get(raw_data["session_id"], profile) except KeyError: log.warning(_("session id doesn't exist, session has probably expired")) # TODO: send error dialog @@ -533,8 +533,8 @@ room_jid = session_data["room_jid"] referee_jid = self.games[room_jid]["referee"] - player = self.host.plugins["XEP-0045"].getRoomNick(room_jid, profile) - data = xml_tools.XMLUIResult2DataFormResult(raw_data) + player = self.host.plugins["XEP-0045"].get_room_nick(room_jid, profile) + data = xml_tools.xmlui_result_2_data_form_result(raw_data) contrat = data["contrat"] log.debug( _("contrat [%(contrat)s] choosed by %(profile)s") @@ -551,13 +551,13 @@ del self._sessions[raw_data["session_id"]] return d - def _scoreShowed(self, raw_data, profile): + def _score_showed(self, raw_data, profile): """Will be called when the player closes the score dialog @param raw_data: nothing to retrieve from here but the session id @param profile_key: profile """ try: - session_data = self._sessions.profileGet(raw_data["session_id"], profile) + session_data = self._sessions.profile_get(raw_data["session_id"], profile) except KeyError: log.warning(_("session id doesn't exist, 
session has probably expired")) # TODO: send error dialog @@ -565,7 +565,7 @@ room_jid_s = session_data["room_jid"].userhost() # XXX: empty hand means to the frontend "reset the display"... - self.host.bridge.tarotGameNew(room_jid_s, [], profile) + self.host.bridge.tarot_game_new(room_jid_s, [], profile) del self._sessions[raw_data["session_id"]] return defer.succeed({}) @@ -576,7 +576,7 @@ @cards: cards played (list of tuples) @profile_key: profile """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: log.error(_("profile %s is unknown") % profile_key) return @@ -587,7 +587,7 @@ elem = self.__card_list_to_xml(TarotCard.from_tuples(cards), "cards_played") self.send(jid.JID(referee), elem, {"player": player}, profile=profile) - def newRound(self, room_jid, profile): + def new_round(self, room_jid, profile): game_data = self.games[room_jid] players = game_data["players"] game_data["first_player"] = None # first player for the current trick @@ -613,7 +613,7 @@ for player in players: msg_elts[player] = self.__card_list_to_xml(hand[player], "hand") - RoomGame.newRound(self, room_jid, (common_data, msg_elts), profile) + RoomGame.new_round(self, room_jid, (common_data, msg_elts), profile) pl_idx = game_data["current_player"] = (game_data["init_player"] + 1) % len( players @@ -626,14 +626,14 @@ """ @param mess_elt: instance of twisted.words.xish.domish.Element """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) from_jid = jid.JID(mess_elt["from"]) room_jid = jid.JID(from_jid.userhost()) - nick = self.host.plugins["XEP-0045"].getRoomNick(client, room_jid) + nick = self.host.plugins["XEP-0045"].get_room_nick(client, room_jid) game_elt = mess_elt.firstChildElement() game_data = self.games[room_jid] - is_player = self.isPlayer(room_jid, nick) + is_player = self.is_player(room_jid, nick) if "players_data" in game_data: players_data = game_data["players_data"] @@ 
-649,9 +649,9 @@ for player in elt.elements(): players.append(str(player)) signal = ( - self.host.bridge.tarotGameStarted + self.host.bridge.tarot_game_started if elt.name == "started" - else self.host.bridge.tarotGamePlayers + else self.host.bridge.tarot_game_players ) signal(room_jid.userhost(), from_jid.full(), players, profile) @@ -667,21 +667,21 @@ if ( list(status.values()).count("ready") == nb_players ): # everybody is ready, we can start the game - self.newRound(room_jid, profile) + self.new_round(room_jid, profile) elif elt.name == "hand": # a new hand has been received - self.host.bridge.tarotGameNew( + self.host.bridge.tarot_game_new( room_jid.userhost(), self.__xml_to_list(elt), profile ) elif elt.name == "contrat": # it's time to choose contrat form = data_form.Form.fromElement(elt.firstChildElement()) - session_id, session_data = self._sessions.newSession(profile=profile) + session_id, session_data = self._sessions.new_session(profile=profile) session_data["room_jid"] = room_jid - xml_data = xml_tools.dataForm2XMLUI( + xml_data = xml_tools.data_form_2_xmlui( form, self.__choose_contrat_id, session_id ).toXml() - self.host.bridge.tarotGameChooseContrat( + self.host.bridge.tarot_game_choose_contrat( room_jid.userhost(), xml_data, profile ) @@ -752,7 +752,7 @@ data = {"attaquant": elt["attaquant"]} game_data["stage"] = "ecart" game_data["attaquant"] = elt["attaquant"] - self.host.bridge.tarotGameShowCards( + self.host.bridge.tarot_game_show_cards( room_jid.userhost(), "chien", self.__xml_to_list(elt), data, profile ) @@ -790,7 +790,7 @@ cards = TarotCard.from_tuples(self.__xml_to_list(elt)) if mess_elt["type"] == "groupchat": - self.host.bridge.tarotGameCardsPlayed( + self.host.bridge.tarot_game_cards_played( room_jid.userhost(), elt["player"], self.__xml_to_list(elt), @@ -858,7 +858,7 @@ self.send(to_jid, "your_turn", profile=profile) elif elt.name == "your_turn": - self.host.bridge.tarotGameYourTurn(room_jid.userhost(), profile) + 
self.host.bridge.tarot_game_your_turn(room_jid.userhost(), profile) elif elt.name == "score": form_elt = next(elt.elements(name="x", uri="jabber:x:data")) @@ -869,12 +869,12 @@ for looser in elt.elements(name="looser", uri=NS_CG): loosers.append(str(looser)) form = data_form.Form.fromElement(form_elt) - session_id, session_data = self._sessions.newSession(profile=profile) + session_id, session_data = self._sessions.new_session(profile=profile) session_data["room_jid"] = room_jid - xml_data = xml_tools.dataForm2XMLUI( + xml_data = xml_tools.data_form_2_xmlui( form, self.__score_id, session_id ).toXml() - self.host.bridge.tarotGameScore( + self.host.bridge.tarot_game_score( room_jid.userhost(), xml_data, winners, loosers, profile ) elif elt.name == "error": @@ -885,7 +885,7 @@ invalid_cards = self.__xml_to_list( next(elt.elements(name="invalid", uri=NS_CG)) ) - self.host.bridge.tarotGameInvalidCards( + self.host.bridge.tarot_game_invalid_cards( room_jid.userhost(), elt["phase"], played_cards, @@ -897,5 +897,5 @@ else: log.error(_("Unmanaged card game element: %s") % elt.name) - def getSyncDataForPlayer(self, room_jid, nick): + def get_sync_data_for_player(self, room_jid, nick): return [] diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_text_commands.py --- a/sat/plugins/plugin_misc_text_commands.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_text_commands.py Sat Apr 08 13:54:42 2023 +0200 @@ -66,12 +66,12 @@ log.info(_("Text commands initialization")) self.host = host # this is internal command, so we set high priority - host.trigger.add("sendMessage", self.sendMessageTrigger, priority=1000000) + host.trigger.add("sendMessage", self.send_message_trigger, priority=1000000) self._commands = {} self._whois = [] - self.registerTextCommands(self) + self.register_text_commands(self) - def _parseDocString(self, cmd, cmd_name): + def _parse_doc_string(self, cmd, cmd_name): """Parse a docstring to get text command data @param cmd: function or 
method callback for the command, @@ -150,7 +150,7 @@ return data - def registerTextCommands(self, instance): + def register_text_commands(self, instance): """ Add a text command @param instance: instance of a class containing text commands @@ -176,10 +176,10 @@ ) cmd_name = new_name self._commands[cmd_name] = cmd_data = {"callback": cmd} - cmd_data.update(self._parseDocString(cmd, cmd_name)) + cmd_data.update(self._parse_doc_string(cmd, cmd_name)) log.info(_("Registered text command [%s]") % cmd_name) - def addWhoIsCb(self, callback, priority=0): + def add_who_is_cb(self, callback, priority=0): """Add a callback which give information to the /whois command @param callback: a callback which will be called with the following arguments @@ -193,14 +193,14 @@ self._whois.append((priority, callback)) self._whois.sort(key=lambda item: item[0], reverse=True) - def sendMessageTrigger( + def send_message_trigger( self, client, mess_data, pre_xml_treatments, post_xml_treatments ): """Install SendMessage command hook """ - pre_xml_treatments.addCallback(self._sendMessageCmdHook, client) + pre_xml_treatments.addCallback(self._send_message_cmd_hook, client) return True - def _sendMessageCmdHook(self, mess_data, client): + def _send_message_cmd_hook(self, mess_data, client): """ Check text commands in message, and react consequently msg starting with / are potential command. If a command is found, it is executed, @@ -239,7 +239,7 @@ d = None command = msg[1:].partition(" ")[0].lower().strip() if not command.isidentifier(): - self.feedBack( + self.feed_back( client, _("Invalid command /%s. 
") % command + self.HELP_SUGGESTION, mess_data, @@ -247,7 +247,7 @@ raise failure.Failure(exceptions.CancelError()) # looks like an actual command, we try to call the corresponding method - def retHandling(ret): + def ret_handling(ret): """ Handle command return value: if ret is True, normally send message (possibly modified by command) else, abord message sending @@ -258,12 +258,12 @@ log.debug("text command detected ({})".format(command)) raise failure.Failure(exceptions.CancelError()) - def genericErrback(failure): + def generic_errback(failure): try: msg = "with condition {}".format(failure.value.condition) except AttributeError: msg = "with error {}".format(failure.value) - self.feedBack(client, "Command failed {}".format(msg), mess_data) + self.feed_back(client, "Command failed {}".format(msg), mess_data) return False mess_data["unparsed"] = msg[ @@ -272,7 +272,7 @@ try: cmd_data = self._commands[command] except KeyError: - self.feedBack( + self.feed_back( client, _("Unknown command /%s. 
") % command + self.HELP_SUGGESTION, mess_data, @@ -280,7 +280,7 @@ log.debug("text command help message") raise failure.Failure(exceptions.CancelError()) else: - if not self._contextValid(mess_data, cmd_data): + if not self._context_valid(mess_data, cmd_data): # The command is not launched in the right context, we throw a message with help instructions context_txt = ( _("group discussions") @@ -290,23 +290,23 @@ feedback = _("/{command} command only applies in {context}.").format( command=command, context=context_txt ) - self.feedBack( + self.feed_back( client, "{} {}".format(feedback, self.HELP_SUGGESTION), mess_data ) log.debug("text command invalid message") raise failure.Failure(exceptions.CancelError()) else: - d = utils.asDeferred(cmd_data["callback"], client, mess_data) - d.addErrback(genericErrback) - d.addCallback(retHandling) + d = utils.as_deferred(cmd_data["callback"], client, mess_data) + d.addErrback(generic_errback) + d.addCallback(ret_handling) return d - def _contextValid(self, mess_data, cmd_data): + def _context_valid(self, mess_data, cmd_data): """Tell if a command can be used in the given context @param mess_data(dict): message data as given in sendMessage trigger - @param cmd_data(dict): command data as returned by self._parseDocString + @param cmd_data(dict): command data as returned by self._parse_doc_string @return (bool): True if command can be used in this context """ if (cmd_data["type"] == "group" and mess_data["type"] != "groupchat") or ( @@ -315,7 +315,7 @@ return False return True - def getRoomJID(self, arg, service_jid): + def get_room_jid(self, arg, service_jid): """Return a room jid with a shortcut @param arg: argument: can be a full room jid (e.g.: sat@chat.jabberfr.org) @@ -329,7 +329,7 @@ return jid.JID(arg + service_jid) return jid.JID(f"{arg}@{service_jid}") - def feedBack(self, client, message, mess_data, info_type=FEEDBACK_INFO_TYPE): + def feed_back(self, client, message, mess_data, info_type=FEEDBACK_INFO_TYPE): """Give 
a message back to the user""" if mess_data["type"] == "groupchat": to_ = mess_data["to"].userhostJID() @@ -342,7 +342,7 @@ mess_data["type"] = C.MESS_TYPE_INFO mess_data["message"] = {"": message} mess_data["extra"]["info_type"] = info_type - client.messageSendToBridge(mess_data) + client.message_send_to_bridge(mess_data) def cmd_whois(self, client, mess_data): """show informations on entity @@ -358,7 +358,7 @@ if mess_data["type"] == "groupchat": room = mess_data["to"].userhostJID() try: - if self.host.plugins["XEP-0045"].isNickInRoom(client, room, entity): + if self.host.plugins["XEP-0045"].is_nick_in_room(client, room, entity): entity = "%s/%s" % (room, entity) except KeyError: log.warning("plugin XEP-0045 is not present") @@ -371,11 +371,11 @@ if not target_jid.user or not target_jid.host: raise jid.InvalidFormat except (RuntimeError, jid.InvalidFormat, AttributeError): - self.feedBack(client, _("Invalid jid, can't whois"), mess_data) + self.feed_back(client, _("Invalid jid, can't whois"), mess_data) return False if not target_jid.resource: - target_jid.resource = self.host.memory.getMainResource(client, target_jid) + target_jid.resource = self.host.memory.main_resource_get(client, target_jid) whois_msg = [_("whois for %(jid)s") % {"jid": target_jid}] @@ -385,14 +385,14 @@ lambda __: callback(client, whois_msg, mess_data, target_jid) ) - def feedBack(__): - self.feedBack(client, "\n".join(whois_msg), mess_data) + def feed_back(__): + self.feed_back(client, "\n".join(whois_msg), mess_data) return False - d.addCallback(feedBack) + d.addCallback(feed_back) return d - def _getArgsHelp(self, cmd_data): + def _get_args_help(self, cmd_data): """Return help string for args of cmd_name, according to docstring data @param cmd_data: command data @@ -420,7 +420,7 @@ def cmd_whoami(self, client, mess_data): """give your own jid""" - self.feedBack(client, client.jid.full(), mess_data) + self.feed_back(client, client.jid.full(), mess_data) def cmd_help(self, client, 
mess_data): """show help on available commands @@ -432,7 +432,7 @@ if cmd_name and cmd_name[0] == "/": cmd_name = cmd_name[1:] if cmd_name and cmd_name not in self._commands: - self.feedBack( + self.feed_back( client, _("Invalid command name [{}]\n".format(cmd_name)), mess_data ) cmd_name = "" @@ -443,7 +443,7 @@ for command in sorted(self._commands): cmd_data = self._commands[command] - if not self._contextValid(mess_data, cmd_data): + if not self._context_valid(mess_data, cmd_data): continue spaces = (longuest - len(command)) * " " help_cmds.append( @@ -464,8 +464,8 @@ short_help=cmd_data["doc_short_help"], syntax=_(" " * 4 + "syntax: {}\n").format(syntax) if syntax else "", args_help="\n".join( - [" " * 8 + "{}".format(line) for line in self._getArgsHelp(cmd_data)] + [" " * 8 + "{}".format(line) for line in self._get_args_help(cmd_data)] ), ) - self.feedBack(client, help_mess, mess_data) + self.feed_back(client, help_mess, mess_data) diff -r c4464d7ae97b -r 524856bd7b19 sat/plugins/plugin_misc_text_syntaxes.py --- a/sat/plugins/plugin_misc_text_syntaxes.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_text_syntaxes.py Sat Apr 08 13:54:42 2023 +0200 @@ -196,7 +196,7 @@ "syntaxes": self.syntaxes, } - self.addSyntax( + self.add_syntax( self.SYNTAX_XHTML, lambda xhtml: defer.succeed(xhtml), lambda xhtml: defer.succeed(xhtml), @@ -204,10 +204,10 @@ ) # TODO: text => XHTML should add to url like in frontends # it's probably best to move sat_frontends.tools.strings to sat.tools.common or similar - self.addSyntax( + self.add_syntax( self.SYNTAX_TEXT, lambda text: escape(text), - lambda xhtml: self._removeMarkups(xhtml), + lambda xhtml: self._remove_markups(xhtml), [TextSyntaxes.OPT_HIDDEN], ) try: @@ -217,7 +217,7 @@ # XXX: we disable raw HTML parsing by default, to avoid parsing error # when the user is not aware of markdown and HTML class EscapeHTML(Extension): - def extendMarkdown(self, md): + def extend_markdown(self, md): 
md.preprocessors.deregister('html_block') md.inlinePatterns.deregister('html') @@ -226,7 +226,7 @@ h.body_width = 0 # do not truncate the lines, it breaks the long URLs return h.handle(html) - self.addSyntax( + self.add_syntax( self.SYNTAX_MARKDOWN, partial(markdown.markdown, extensions=[ @@ -251,22 +251,22 @@ "You can download/install them from https://pythonhosted.org/Markdown/ " "and https://github.com/Alir3z4/html2text/" ) - host.bridge.addMethod( - "syntaxConvert", + host.bridge.add_method( + "syntax_convert", ".plugin", in_sign="sssbs", out_sign="s", async_=True, method=self.convert, ) - host.bridge.addMethod( - "syntaxGet", ".plugin", in_sign="s", out_sign="s", method=self.getSyntax + host.bridge.add_method( + "syntax_get", ".plugin", in_sign="s", out_sign="s", method=self.get_syntax ) - if xml_tools.cleanXHTML is None: + if xml_tools.clean_xhtml is None: log.debug("Installing cleaning method") - xml_tools.cleanXHTML = self.cleanXHTML + xml_tools.clean_xhtml = self.clean_xhtml - def _updateParamOptions(self): + def _update_param_options(self): data_synt = self.syntaxes default_synt = TextSyntaxes.default_syntax syntaxes = [] @@ -284,23 +284,23 @@ options.append('%s' % (url, match.group(0)) diff -r c4464d7ae97b -r 524856bd7b19 sat_frontends/tools/strings.py --- a/sat_frontends/tools/strings.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/strings.py Sat Apr 08 13:54:42 2023 +0200 @@ -82,7 +82,7 @@ return re.sub(pattern, repl, string) -def fixXHTMLLinks(xhtml): +def fix_xhtml_links(xhtml): """Add http:// if the scheme is missing and force opening in a new window. @param string (unicode): XHTML Content diff -r c4464d7ae97b -r 524856bd7b19 sat_frontends/tools/xmltools.py --- a/sat_frontends/tools/xmltools.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/xmltools.py Sat Apr 08 13:54:42 2023 +0200 @@ -23,7 +23,7 @@ # (e.g.
NativeDOM in Libervia) -def inlineRoot(doc): +def inline_root(doc): """ make the root attribute inline @param root_node: minidom's Document compatible class @return: plain XML diff -r c4464d7ae97b -r 524856bd7b19 sat_frontends/tools/xmlui.py --- a/sat_frontends/tools/xmlui.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/xmlui.py Sat Apr 08 13:54:42 2023 +0200 @@ -40,8 +40,8 @@ pass -# FIXME: this method is duplicated in frontends.tools.xmlui.getText -def getText(node): +# FIXME: this method is duplicated in frontends.tools.xmlui.get_text +def get_text(node): """Get child text nodes @param node: dom Node @return: joined unicode text of all nodes @@ -168,7 +168,7 @@ """Widget which can contain other ones with a specific layout""" @classmethod - def _xmluiAdapt(cls, instance): + def _xmlui_adapt(cls, instance): """Make cls as instance.__class__ cls must inherit from original instance class @@ -217,24 +217,24 @@ def __init__(self, _xmlui_parent): self._xmlui_parent = _xmlui_parent - def _xmluiValidated(self, data=None): + def _xmlui_validated(self, data=None): if data is None: data = {} - self._xmluiSetData(C.XMLUI_STATUS_VALIDATED, data) - self._xmluiSubmit(data) + self._xmlui_set_data(C.XMLUI_STATUS_VALIDATED, data) + self._xmlui_submit(data) - def _xmluiCancelled(self): + def _xmlui_cancelled(self): data = {C.XMLUI_DATA_CANCELLED: C.BOOL_TRUE} - self._xmluiSetData(C.XMLUI_STATUS_CANCELLED, data) - self._xmluiSubmit(data) + self._xmlui_set_data(C.XMLUI_STATUS_CANCELLED, data) + self._xmlui_submit(data) - def _xmluiSubmit(self, data): + def _xmlui_submit(self, data): if self._xmlui_parent.submit_id is None: log.debug(_("Nothing to submit")) else: self._xmlui_parent.submit(data) - def _xmluiSetData(self, status, data): + def _xmlui_set_data(self, status, data): pass @@ -253,7 +253,7 @@ class ConfirmDialog(Dialog): """Dialog with a OK/Cancel type configuration""" - def _xmluiSetData(self, status, data): + def _xmlui_set_data(self, status, data): if status == 
C.XMLUI_STATUS_VALIDATED: data[C.XMLUI_DATA_ANSWER] = C.BOOL_TRUE elif status == C.XMLUI_STATUS_CANCELLED: @@ -283,12 +283,12 @@ - NO_CANCEL: the UI can't be cancelled - FROM_BACKEND: the UI come from backend (i.e. it's not the direct result of user operation) - @param callback(callable, None): if not None, will be used with launchAction: + @param callback(callable, None): if not None, will be used with action_launch: - if None is used, default behaviour will be used (closing the dialog and - calling host.actionManager) + calling host.action_manager) - if a callback is provided, it will be used instead, so you'll have to manage dialog closing or new xmlui to display, or other action (you can call - host.actionManager) + host.action_manager) The callback will have data, callback_id and profile as arguments """ self.host = host @@ -300,20 +300,20 @@ if flags is None: flags = [] self.flags = flags - self.callback = callback or self._defaultCb + self.callback = callback or self._default_cb self.profile = profile @property def user_action(self): return "FROM_BACKEND" not in self.flags - def _defaultCb(self, data, cb_id, profile): - # TODO: when XMLUI updates will be managed, the _xmluiClose + def _default_cb(self, data, cb_id, profile): + # TODO: when XMLUI updates will be managed, the _xmlui_close # must be called only if there is no update - self._xmluiClose() - self.host.actionManager(data, profile=profile) + self._xmlui_close() + self.host.action_manager(data, profile=profile) - def _isAttrSet(self, name, node): + def _is_attr_set(self, name, node): """Return widget boolean attribute status @param name: name of the attribute (e.g. 
"read_only") @@ -323,7 +323,7 @@ read_only = node.getAttribute(name) or C.BOOL_FALSE return read_only.lower().strip() == C.BOOL_TRUE - def _getChildNode(self, node, name): + def _get_child_node(self, node, name): """Return the first child node with the given name @param node: Node instance @@ -337,7 +337,7 @@ return None def submit(self, data): - self._xmluiClose() + self._xmlui_close() if self.submit_id is None: raise ValueError("Can't submit is self.submit_id is not set") if "session_id" in data: @@ -347,14 +347,14 @@ ) if self.session_id is not None: data["session_id"] = self.session_id - self._xmluiLaunchAction(self.submit_id, data) + self._xmlui_launch_action(self.submit_id, data) - def _xmluiLaunchAction(self, action_id, data): - self.host.launchAction( + def _xmlui_launch_action(self, action_id, data): + self.host.action_launch( action_id, data, callback=self.callback, profile=self.profile ) - def _xmluiClose(self): + def _xmlui_close(self): """Close the window/popup/... where the constructor XMLUI is this method must be overrided @@ -427,7 +427,7 @@ self._whitelist = whitelist else: self._whitelist = None - self.constructUI(parsed_dom) + self.construct_ui(parsed_dom) @staticmethod def escape(name): @@ -449,10 +449,10 @@ raise ValueError(_("XMLUI can have only one main container")) self._main_cont = value - def _parseChilds(self, _xmlui_parent, current_node, wanted=("container",), data=None): + def _parse_childs(self, _xmlui_parent, current_node, wanted=("container",), data=None): """Recursively parse childNodes of an element - @param _xmlui_parent: widget container with '_xmluiAppend' method + @param _xmlui_parent: widget container with '_xmlui_append' method @param current_node: element from which childs will be parsed @param wanted: list of tag names that can be present in the childs to be SàT XMLUI compliant @@ -473,16 +473,16 @@ self.main_cont = _xmlui_parent if type_ == "tabs": cont = self.widget_factory.createTabsContainer(_xmlui_parent) - 
self._parseChilds(_xmlui_parent, node, ("tab",), {"tabs_cont": cont}) + self._parse_childs(_xmlui_parent, node, ("tab",), {"tabs_cont": cont}) elif type_ == "vertical": cont = self.widget_factory.createVerticalContainer(_xmlui_parent) - self._parseChilds(cont, node, ("widget", "container")) + self._parse_childs(cont, node, ("widget", "container")) elif type_ == "pairs": cont = self.widget_factory.createPairsContainer(_xmlui_parent) - self._parseChilds(cont, node, ("widget", "container")) + self._parse_childs(cont, node, ("widget", "container")) elif type_ == "label": cont = self.widget_factory.createLabelContainer(_xmlui_parent) - self._parseChilds( + self._parse_childs( # FIXME: the "None" value for CURRENT_LABEL doesn't seem # used or even useful, it should probably be removed # and all "is not None" tests for it should be removed too @@ -507,15 +507,15 @@ "can't have selectable=='no' and callback_id at the same time" ) cont._xmlui_callback_id = callback_id - cont._xmluiOnSelect(self.onAdvListSelect) + cont._xmlui_on_select(self.on_adv_list_select) - self._parseChilds(cont, node, ("row",), data) + self._parse_childs(cont, node, ("row",), data) else: log.warning(_("Unknown container [%s], using default one") % type_) cont = self.widget_factory.createVerticalContainer(_xmlui_parent) - self._parseChilds(cont, node, ("widget", "container")) + self._parse_childs(cont, node, ("widget", "container")) try: - xmluiAppend = _xmlui_parent._xmluiAppend + xmluiAppend = _xmlui_parent._xmlui_append except ( AttributeError, TypeError, @@ -524,7 +524,7 @@ self.main_cont = cont else: raise Exception( - _("Internal Error, container has not _xmluiAppend method") + _("Internal Error, container has not _xmlui_append method") ) else: xmluiAppend(cont) @@ -540,8 +540,8 @@ name ) # XXX: awful hack because params need category and we don't keep parent tab_cont = data["tabs_cont"] - new_tab = tab_cont._xmluiAddTab(label or name, selected) - self._parseChilds(new_tab, node, ("widget", 
"container")) + new_tab = tab_cont._xmlui_add_tab(label or name, selected) + self._parse_childs(new_tab, node, ("widget", "container")) elif node.nodeName == "row": try: @@ -550,8 +550,8 @@ index = node.getAttribute("index") or None else: data["index"] += 1 - _xmlui_parent._xmluiAddRow(index) - self._parseChilds(_xmlui_parent, node, ("widget", "container")) + _xmlui_parent._xmlui_add_row(index) + self._parse_childs(_xmlui_parent, node, ("widget", "container")) elif node.nodeName == "widget": name = node.getAttribute("name") @@ -565,12 +565,12 @@ curr_label = data.pop(CURRENT_LABEL) if curr_label is not None: # if so, we remove it from parent - _xmlui_parent._xmluiRemove(curr_label) + _xmlui_parent._xmlui_remove(curr_label) continue type_ = node.getAttribute("type") - value_elt = self._getChildNode(node, "value") + value_elt = self._get_child_node(node, "value") if value_elt is not None: - value = getText(value_elt) + value = get_text(value_elt) else: value = ( node.getAttribute("value") if node.hasAttribute("value") else "" @@ -597,39 +597,39 @@ ctrl = self.widget_factory.createDividerWidget(_xmlui_parent, style) elif type_ == "string": ctrl = self.widget_factory.createStringWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "jid_input": ctrl = self.widget_factory.createJidInputWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "password": ctrl = self.widget_factory.createPasswordWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "textbox": ctrl = self.widget_factory.createTextBoxWidget( - _xmlui_parent, value, 
self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "xhtmlbox": ctrl = self.widget_factory.createXHTMLBoxWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "bool": ctrl = self.widget_factory.createBoolWidget( _xmlui_parent, value == C.BOOL_TRUE, - self._isAttrSet("read_only", node), + self._is_attr_set("read_only", node), ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "int": ctrl = self.widget_factory.createIntWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "list": @@ -652,7 +652,7 @@ self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "jids_list": style = [] - jids = [getText(jid_) for jid_ in node.getElementsByTagName("jid")] + jids = [get_text(jid_) for jid_ in node.getElementsByTagName("jid")] ctrl = self.widget_factory.createJidsListWidget( _xmlui_parent, jids, style ) @@ -660,7 +660,7 @@ elif type_ == "button": callback_id = node.getAttribute("callback") ctrl = self.widget_factory.createButtonWidget( - _xmlui_parent, value, self.onButtonPress + _xmlui_parent, value, self.on_button_press ) ctrl._xmlui_param_id = ( callback_id, @@ -683,7 +683,7 @@ if self.type == "param" and type_ not in ("text", "button"): try: - ctrl._xmluiOnChange(self.onParamChange) + ctrl._xmlui_on_change(self.on_param_change) ctrl._param_category = self._current_category except ( AttributeError, @@ -702,15 +702,15 @@ field.getAttribute("name") for field in node.getElementsByTagName("internal_field") ] - cb_data = self.getInternalCallbackData(callback, node) + cb_data = self.get_internal_callback_data(callback, node) 
                 ctrl._xmlui_param_internal = (callback, fields, cb_data)
                 if type_ == "button":
-                    ctrl._xmluiOnClick(self.onChangeInternal)
+                    ctrl._xmlui_on_click(self.on_change_internal)
                 else:
-                    ctrl._xmluiOnChange(self.onChangeInternal)
+                    ctrl._xmlui_on_change(self.on_change_internal)
             ctrl._xmlui_name = name
-            _xmlui_parent._xmluiAppend(ctrl)
+            _xmlui_parent._xmlui_append(ctrl)
             if CURRENT_LABEL in data and not isinstance(ctrl, LabelWidget):
                 curr_label = data.pop(CURRENT_LABEL)
                 if curr_label is not None:
@@ -721,7 +721,7 @@
         else:
             raise NotImplementedError(_("Unknown tag [%s]") % node.nodeName)

-    def constructUI(self, parsed_dom, post_treat=None):
+    def construct_ui(self, parsed_dom, post_treat=None):
         """Actually construct the UI

         @param parsed_dom: main parsed dom
@@ -741,17 +741,17 @@
         if self.type == "param":
             self.param_changed = set()

-        self._parseChilds(self, parsed_dom.documentElement)
+        self._parse_childs(self, parsed_dom.documentElement)

         if post_treat is not None:
             post_treat()

-    def _xmluiSetParam(self, name, value, category):
-        self.host.bridge.setParam(name, value, category, profile_key=self.profile)
+    def _xmlui_set_param(self, name, value, category):
+        self.host.bridge.param_set(name, value, category, profile_key=self.profile)

     ##EVENTS##

-    def onParamChange(self, ctrl):
+    def on_param_change(self, ctrl):
         """Called when type is param and a widget to save is modified

         @param ctrl: widget modified
@@ -759,29 +759,29 @@
         assert self.type == "param"
         self.param_changed.add(ctrl)

-    def onAdvListSelect(self, ctrl):
+    def on_adv_list_select(self, ctrl):
         data = {}
-        widgets = ctrl._xmluiGetSelectedWidgets()
+        widgets = ctrl._xmlui_get_selected_widgets()
        for wid in widgets:
             try:
                 name = self.escape(wid._xmlui_name)
-                value = wid._xmluiGetValue()
+                value = wid._xmlui_get_value()
                 data[name] = value
             except (
                 AttributeError,
                 TypeError,
             ):  # XXX: TypeError is here because pyjamas raise a TypeError instead of an AttributeError
                 pass
-        idx = ctrl._xmluiGetSelectedIndex()
+        idx = ctrl._xmlui_get_selected_index()
         if idx is not None:
             data["index"] = idx
         callback_id = ctrl._xmlui_callback_id
         if callback_id is None:
             log.info(_("No callback_id found"))
             return
-        self._xmluiLaunchAction(callback_id, data)
+        self._xmlui_launch_action(callback_id, data)

-    def onButtonPress(self, button):
+    def on_button_press(self, button):
         """Called when an XMLUI button is clicked

         Launch the action associated to the button
@@ -795,16 +795,16 @@
             escaped = self.escape(field)
             ctrl = self.ctrl_list[field]
             if isinstance(ctrl["control"], ListWidget):
-                data[escaped] = "\t".join(ctrl["control"]._xmluiGetSelectedValues())
+                data[escaped] = "\t".join(ctrl["control"]._xmlui_get_selected_values())
             else:
-                data[escaped] = ctrl["control"]._xmluiGetValue()
-        self._xmluiLaunchAction(callback_id, data)
+                data[escaped] = ctrl["control"]._xmlui_get_value()
+        self._xmlui_launch_action(callback_id, data)

-    def onChangeInternal(self, ctrl):
+    def on_change_internal(self, ctrl):
         """Called when a widget that has been bound to an internal callback is changed.

         This is used to perform some UI actions without communicating with the backend.
-        See sat.tools.xml_tools.Widget.setInternalCallback for more details.
+        See sat.tools.xml_tools.Widget.set_internal_callback for more details.
         @param ctrl: widget modified
         """
         action, fields, data = ctrl._xmlui_param_internal
@@ -817,32 +817,32 @@
             """Depending of 'action' value, copy or move from source to target."""
             if isinstance(target, ListWidget):
                 if isinstance(source, ListWidget):
-                    values = source._xmluiGetSelectedValues()
+                    values = source._xmlui_get_selected_values()
                 else:
-                    values = [source._xmluiGetValue()]
+                    values = [source._xmlui_get_value()]
                 if action == "move":
-                    source._xmluiSetValue("")
+                    source._xmlui_set_value("")
                 values = [value for value in values if value]
                 if values:
-                    target._xmluiAddValues(values, select=True)
+                    target._xmlui_add_values(values, select=True)
             else:
                 if isinstance(source, ListWidget):
-                    value = ", ".join(source._xmluiGetSelectedValues())
+                    value = ", ".join(source._xmlui_get_selected_values())
                 else:
-                    value = source._xmluiGetValue()
+                    value = source._xmlui_get_value()
                 if action == "move":
-                    source._xmluiSetValue("")
-                target._xmluiSetValue(value)
+                    source._xmlui_set_value("")
+                target._xmlui_set_value(value)

         def groups_of_contact(source, target):
             """Select in target the groups of the contact which is selected in source."""
             assert isinstance(source, ListWidget)
             assert isinstance(target, ListWidget)
             try:
-                contact_jid_s = source._xmluiGetSelectedValues()[0]
+                contact_jid_s = source._xmlui_get_selected_values()[0]
             except IndexError:
                 return
-            target._xmluiSelectValues(data[contact_jid_s])
+            target._xmlui_select_values(data[contact_jid_s])
             pass

         source = None
@@ -857,11 +857,11 @@
                 groups_of_contact(source, widget)
                 source = None

-    def getInternalCallbackData(self, action, node):
+    def get_internal_callback_data(self, action, node):
         """Retrieve from node the data needed to perform given action.
         @param action (string): a value from the one that can be passed to the
-            'callback' parameter of sat.tools.xml_tools.Widget.setInternalCallback
+            'callback' parameter of sat.tools.xml_tools.Widget.set_internal_callback
         @param node (DOM Element): the node of the widget that triggers the callback
         """
         # TODO: it would be better to not have a specific way to retrieve
@@ -883,7 +883,7 @@
                 data[jid_s].append(value_elt.getAttribute("name"))
         return data

-    def onFormSubmitted(self, ignore=None):
+    def on_form_submitted(self, ignore=None):
         """An XMLUI form has been submited

         call the submit action associated with this form
@@ -894,10 +894,10 @@
             ctrl = self.ctrl_list[ctrl_name]
             if isinstance(ctrl["control"], ListWidget):
                 selected_values.append(
-                    (escaped, "\t".join(ctrl["control"]._xmluiGetSelectedValues()))
+                    (escaped, "\t".join(ctrl["control"]._xmlui_get_selected_values()))
                 )
             else:
-                selected_values.append((escaped, ctrl["control"]._xmluiGetValue()))
+                selected_values.append((escaped, ctrl["control"]._xmlui_get_value()))
         data = dict(selected_values)
         for key, value in self.hidden.items():
             data[self.escape(key)] = value
@@ -908,9 +908,9 @@
             log.warning(
                 _("The form data is not sent back, the type is not managed properly")
             )
-        self._xmluiClose()
+        self._xmlui_close()

-    def onFormCancelled(self, *__):
+    def on_form_cancelled(self, *__):
         """Called when a form is cancelled"""
         log.debug(_("Cancelling form"))
         if self.submit_id is not None:
@@ -920,9 +920,9 @@
             log.warning(
                 _("The form data is not sent back, the type is not managed properly")
             )
-        self._xmluiClose()
+        self._xmlui_close()

-    def onSaveParams(self, ignore=None):
+    def on_save_params(self, ignore=None):
         """Params are saved, we send them to backend

         self.type must be param
@@ -930,13 +930,13 @@
         assert self.type == "param"
         for ctrl in self.param_changed:
             if isinstance(ctrl, ListWidget):
-                value = "\t".join(ctrl._xmluiGetSelectedValues())
+                value = "\t".join(ctrl._xmlui_get_selected_values())
             else:
-                value = ctrl._xmluiGetValue()
+                value = ctrl._xmlui_get_value()
             param_name = ctrl._xmlui_name.split(C.SAT_PARAM_SEPARATOR)[1]
-            self._xmluiSetParam(param_name, value, ctrl._param_category)
+            self._xmlui_set_param(param_name, value, ctrl._param_category)

-        self._xmluiClose()
+        self._xmlui_close()

     def show(self, *args, **kwargs):
         pass
@@ -945,7 +945,7 @@
 class AIOXMLUIPanel(XMLUIPanel):
     """Asyncio compatible version of XMLUIPanel"""

-    async def onFormSubmitted(self, ignore=None):
+    async def on_form_submitted(self, ignore=None):
         """An XMLUI form has been submited

         call the submit action associated with this form
@@ -956,10 +956,10 @@
             ctrl = self.ctrl_list[ctrl_name]
             if isinstance(ctrl["control"], ListWidget):
                 selected_values.append(
-                    (escaped, "\t".join(ctrl["control"]._xmluiGetSelectedValues()))
+                    (escaped, "\t".join(ctrl["control"]._xmlui_get_selected_values()))
                 )
             else:
-                selected_values.append((escaped, ctrl["control"]._xmluiGetValue()))
+                selected_values.append((escaped, ctrl["control"]._xmlui_get_value()))
         data = dict(selected_values)
         for key, value in self.hidden.items():
             data[self.escape(key)] = value
@@ -970,9 +970,9 @@
             log.warning(
                 _("The form data is not sent back, the type is not managed properly")
             )
-        self._xmluiClose()
+        self._xmlui_close()

-    async def onFormCancelled(self, *__):
+    async def on_form_cancelled(self, *__):
         """Called when a form is cancelled"""
         log.debug(_("Cancelling form"))
         if self.submit_id is not None:
@@ -982,10 +982,10 @@
             log.warning(
                 _("The form data is not sent back, the type is not managed properly")
             )
-        self._xmluiClose()
+        self._xmlui_close()

     async def submit(self, data):
-        self._xmluiClose()
+        self._xmlui_close()
         if self.submit_id is None:
             raise ValueError("Can't submit is self.submit_id is not set")
         if "session_id" in data:
@@ -995,10 +995,10 @@
             )
         if self.session_id is not None:
             data["session_id"] = self.session_id
-        await self._xmluiLaunchAction(self.submit_id, data)
+        await self._xmlui_launch_action(self.submit_id, data)

-    async def _xmluiLaunchAction(self, action_id, data):
-        await self.host.launchAction(
+    async def _xmlui_launch_action(self, action_id, data):
+        await self.host.action_launch(
             action_id, data, callback=self.callback, profile=self.profile
         )

@@ -1021,13 +1021,13 @@
             host, parsed_dom, title=title, flags=flags, callback=callback, profile=profile
         )
         top = parsed_dom.documentElement
-        dlg_elt = self._getChildNode(top, "dialog")
+        dlg_elt = self._get_child_node(top, "dialog")
         if dlg_elt is None:
             raise ValueError("Invalid XMLUI: no Dialog element found !")
         dlg_type = dlg_elt.getAttribute("type") or C.XMLUI_DIALOG_MESSAGE
         try:
-            mess_elt = self._getChildNode(dlg_elt, C.XMLUI_DATA_MESS)
-            message = getText(mess_elt)
+            mess_elt = self._get_child_node(dlg_elt, C.XMLUI_DATA_MESS)
+            message = get_text(mess_elt)
         except (
             TypeError,
             AttributeError,
@@ -1045,7 +1045,7 @@
             )
         elif dlg_type == C.XMLUI_DIALOG_CONFIRM:
             try:
-                buttons_elt = self._getChildNode(dlg_elt, "buttons")
+                buttons_elt = self._get_child_node(dlg_elt, "buttons")
                 buttons_set = (
                     buttons_elt.getAttribute("set") or C.XMLUI_DATA_BTNS_SET_DEFAULT
                 )
@@ -1059,7 +1059,7 @@
             )
         elif dlg_type == C.XMLUI_DIALOG_FILE:
             try:
-                file_elt = self._getChildNode(dlg_elt, "file")
+                file_elt = self._get_child_node(dlg_elt, "file")
                 filetype = file_elt.getAttribute("type") or C.XMLUI_DATA_FILETYPE_DEFAULT
             except (
                 TypeError,
@@ -1073,13 +1073,13 @@
             raise ValueError("Unknown dialog type [%s]" % dlg_type)

     def show(self):
-        self.dlg._xmluiShow()
+        self.dlg._xmlui_show()

-    def _xmluiClose(self):
-        self.dlg._xmluiClose()
+    def _xmlui_close(self):
+        self.dlg._xmlui_close()


-def registerClass(type_, class_):
+def register_class(type_, class_):
     """Register the class to use with the factory

     @param type_: one of:
@@ -1090,7 +1090,7 @@
     # TODO: remove this method, as there are seme use cases where different XMLUI
     # classes can be used in the same frontend, so a global value is not good
     assert type_ in (CLASS_PANEL, CLASS_DIALOG)
-    log.warning("registerClass for XMLUI is deprecated, please use partial with "
+    log.warning("register_class for XMLUI is deprecated, please use partial with "
                 "xmlui.create and class_map instead")
     if type_ in _class_map:
         log.debug(_("XMLUI class already registered for {type_}, ignoring").format(
@@ -1132,7 +1132,7 @@
             cls = class_map[CLASS_DIALOG]
         except KeyError:
             raise ClassNotRegistedError(
-                _("You must register classes with registerClass before creating a XMLUI")
+                _("You must register classes with register_class before creating a XMLUI")
             )

     xmlui = cls(

diff -r c4464d7ae97b -r 524856bd7b19 tests/e2e/libervia-cli/test_libervia-cli.py
--- a/tests/e2e/libervia-cli/test_libervia-cli.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/tests/e2e/libervia-cli/test_libervia-cli.py	Sat Apr 08 13:54:42 2023 +0200
@@ -219,7 +219,7 @@
         assert metadata['node'] == self.MICROBLOG_NS
         assert metadata['rsm'].keys() <= {"first", "last", "index", "count"}
         item_id = item['id']
-        expected_uri = uri.buildXMPPUri(
+        expected_uri = uri.build_xmpp_uri(
             'pubsub', subtype="microblog", path="account1@server1.test",
             node=self.MICROBLOG_NS, item=item_id
         )

diff -r c4464d7ae97b -r 524856bd7b19 tests/unit/conftest.py
--- a/tests/unit/conftest.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/tests/unit/conftest.py	Sat Apr 08 13:54:42 2023 +0200
@@ -29,8 +29,8 @@
 @fixture(scope="session")
 def bridge():
     bridge = AsyncMock()
-    bridge.addSignal = MagicMock()
-    bridge.addMethod = MagicMock()
+    bridge.add_signal = MagicMock()
+    bridge.add_method = MagicMock()
     return bridge

@@ -49,21 +49,21 @@
         self.profiles = {}
         self.plugins = {}
         # map for short name to whole namespace,
-        # extended by plugins with registerNamespace
+        # extended by plugins with register_namespace
         self.ns_map = {
             "x-data": xmpp.NS_X_DATA,
             "disco#info": xmpp.NS_DISCO_INFO,
         }
         self.memory = MagicMock()
         self.memory.storage = storage
-        self.memory.getConfig.side_effect = self.get_test_config
+        self.memory.config_get.side_effect = self.get_test_config
         self.trigger = trigger.TriggerManager()
         self.bridge = bridge
         defer.ensureDeferred(self._post_init())
         self.common_cache = AsyncMock()
         self._import_plugins()
-        self._addBaseMenus()
+        self._add_base_menus()
         self.initialised = defer.Deferred()
         self.initialised.callback(None)

diff -r c4464d7ae97b -r 524856bd7b19 tests/unit/test_ap-gateway.py
--- a/tests/unit/test_ap-gateway.py	Fri Apr 07 15:18:39 2023 +0200
+++ b/tests/unit/test_ap-gateway.py	Sat Apr 08 13:54:42 2023 +0200
@@ -348,8 +348,8 @@
     return dict(data)


-async def mock_getItems(client, service, node, *args, **kwargs):
-    """Mock getItems
+async def mock_get_items(client, service, node, *args, **kwargs):
+    """Mock get_items

     special kwargs can be used:
         ret_items (List[Domish.Element]): items to be returned, by default XMPP_ITEMS are
@@ -367,8 +367,8 @@
     return ret_items, {"rsm": rsm_resp.toDict(), "complete": True}


-async def mock_getPubsubNode(client, service, node, with_subscriptions=False, **kwargs):
-    """Mock storage's getPubsubNode
+async def mock_get_pubsub_node(client, service, node, with_subscriptions=False, **kwargs):
+    """Mock storage's get_pubsub_node

     return an MagicMock with subscription attribute set to empty list
     """
@@ -377,7 +377,7 @@
     return fake_cached_node


-def mockClient(jid):
+def mock_client(jid):
     client = MagicMock()
     client.jid = jid
     client.host = "test.example"
@@ -386,8 +386,8 @@
     return client


-def getVirtualClient(jid):
-    return mockClient(jid)
+def get_virtual_client(jid):
+    return mock_client(jid)


 class FakeTReqPostResponse:
@@ -398,10 +398,10 @@
 def ap_gateway(host):
     gateway = plugin_comp_ap_gateway.APGateway(host)
     gateway.initialised = True
-    gateway.isPubsub = AsyncMock()
-    gateway.isPubsub.return_value = False
-    client = mockClient(jid.JID("ap.test.example"))
-    client.getVirtualClient = getVirtualClient
+    gateway.is_pubsub = AsyncMock()
+    gateway.is_pubsub.return_value = False
+    client = mock_client(jid.JID("ap.test.example"))
+    client.get_virtual_client = get_virtual_client
     gateway.client = client
     gateway.local_only = True
     gateway.public_url = PUBLIC_URL
@@ -416,7 +416,7 @@

 class TestActivityPubGateway:

-    def getTitleXHTML(self, item_elt: domish.Element) -> domish.Element:
+    def get_title_xhtml(self, item_elt: domish.Element) -> domish.Element:
         return next(
             t
             for t in item_elt.entry.elements(NS_ATOM, "title")
@@ -426,7 +426,7 @@
     @ed
     async def test_jid_and_node_convert_to_ap_handle(self, ap_gateway):
         """JID and pubsub node are converted correctly to an AP actor handle"""
-        get_account = ap_gateway.getAPAccountFromJidAndNode
+        get_account = ap_gateway.get_ap_account_from_jid_and_node

         # local jid
         assert (
@@ -447,16 +447,16 @@
         )

         # local pubsub node
-        with patch.object(ap_gateway, "isPubsub") as isPubsub:
-            isPubsub.return_value = True
+        with patch.object(ap_gateway, "is_pubsub") as is_pubsub:
+            is_pubsub.return_value = True
             assert (
                 await get_account(jid_=jid.JID("pubsub.test.example"), node="some_node")
                 == "some_node@pubsub.test.example"
             )

         # non local pubsub node
-        with patch.object(ap_gateway, "isPubsub") as isPubsub:
-            isPubsub.return_value = True
+        with patch.object(ap_gateway, "is_pubsub") as is_pubsub:
+            is_pubsub.return_value = True
             assert (
                 await get_account(jid_=jid.JID("pubsub.example.org"), node="some_node")
                 == "___some_node.40pubsub.2eexample.2eorg@ap.test.example"
@@ -465,11 +465,11 @@
     @ed
     async def test_ap_handle_convert_to_jid_and_node(self, ap_gateway, monkeypatch):
         """AP actor handle convert correctly to JID and pubsub node"""
-        get_jid_node = ap_gateway.getJIDAndNode
+        get_jid_node = ap_gateway.get_jid_and_node

         # for following assertion, host is not a pubsub service
-        with patch.object(ap_gateway, "isPubsub") as isPubsub:
-            isPubsub.return_value = False
+        with patch.object(ap_gateway, "is_pubsub") as is_pubsub:
+            is_pubsub.return_value = False

         # simple local jid
         assert await get_jid_node("toto@test.example") == (
@@ -498,8 +498,8 @@
         )

         # for following assertion, host is a pubsub service
-        with patch.object(ap_gateway, "isPubsub") as isPubsub:
-            isPubsub.return_value = True
+        with patch.object(ap_gateway, "is_pubsub") as is_pubsub:
+            is_pubsub.return_value = True

         # simple local node
         assert await get_jid_node("toto@pubsub.test.example") == (
@@ -517,18 +517,18 @@
         """AP requests are converted to pubsub"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        actor_data = await ap_gateway.getAPActorDataFromAccount(TEST_AP_ACCOUNT)
-        outbox = await ap_gateway.apGetObject(actor_data, "outbox")
-        items, rsm_resp = await ap_gateway.getAPItems(outbox, 2)
+        actor_data = await ap_gateway.get_ap_actor_data_from_account(TEST_AP_ACCOUNT)
+        outbox = await ap_gateway.ap_get_object(actor_data, "outbox")
+        items, rsm_resp = await ap_gateway.get_ap_items(outbox, 2)

         assert rsm_resp.count == 4
         assert rsm_resp.index == 0
         assert rsm_resp.first == "https://example.org/users/test_user/statuses/4"
         assert rsm_resp.last == "https://example.org/users/test_user/statuses/3"

-        title_xhtml = self.getTitleXHTML(items[0])
+        title_xhtml = self.get_title_xhtml(items[0])
         assert title_xhtml.toXml() == (
             "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>"
             "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 4</p></div>"
@@ -540,7 +540,7 @@
         assert author_uri == "xmpp:test_user\\40example.org@ap.test.example"
         assert str(items[0].entry.published) == "2021-12-16T17:25:03Z"

-        title_xhtml = self.getTitleXHTML(items[1])
+        title_xhtml = self.get_title_xhtml(items[1])
         assert title_xhtml.toXml() == (
             "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>"
             "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 3</p></div>"
@@ -552,7 +552,7 @@
         assert author_uri == "xmpp:test_user\\40example.org@ap.test.example"
         assert str(items[1].entry.published) == "2021-12-16T17:26:03Z"

-        items, rsm_resp = await ap_gateway.getAPItems(
+        items, rsm_resp = await ap_gateway.get_ap_items(
             outbox,
             max_items=2,
             after_id="https://example.org/users/test_user/statuses/3",
@@ -563,7 +563,7 @@
         assert rsm_resp.first == "https://example.org/users/test_user/statuses/2"
         assert rsm_resp.last == "https://example.org/users/test_user/statuses/1"

-        title_xhtml = self.getTitleXHTML(items[0])
+        title_xhtml = self.get_title_xhtml(items[0])
         assert title_xhtml.toXml() == (
             "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>"
             "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 2</p></div>"
@@ -575,7 +575,7 @@
         assert author_uri == "xmpp:test_user\\40example.org@ap.test.example"
         assert str(items[0].entry.published) == "2021-12-16T17:27:03Z"

-        title_xhtml = self.getTitleXHTML(items[1])
+        title_xhtml = self.get_title_xhtml(items[1])
         assert title_xhtml.toXml() == (
             "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>"
             "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 1</p></div>"
@@ -587,7 +587,7 @@
         assert author_uri == "xmpp:test_user\\40example.org@ap.test.example"
         assert str(items[1].entry.published) == "2021-12-16T17:28:03Z"

-        items, rsm_resp = await ap_gateway.getAPItems(outbox, max_items=1, start_index=2)
+        items, rsm_resp = await ap_gateway.get_ap_items(outbox, max_items=1, start_index=2)

         assert rsm_resp.count == 4
         assert rsm_resp.index == 2
@@ -595,7 +595,7 @@
         assert rsm_resp.last == "https://example.org/users/test_user/statuses/2"
         assert len(items) == 1

-        title_xhtml = self.getTitleXHTML(items[0])
+        title_xhtml = self.get_title_xhtml(items[0])
         assert title_xhtml.toXml() == (
             "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>"
             "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 2</p></div>"
@@ -603,7 +603,7 @@
         )
         assert str(items[0].entry.published) == "2021-12-16T17:27:03Z"

-        items, rsm_resp = await ap_gateway.getAPItems(
+        items, rsm_resp = await ap_gateway.get_ap_items(
             outbox, max_items=3, chronological_pagination=False
         )
         assert rsm_resp.count == 4
@@ -611,13 +611,13 @@
         assert rsm_resp.first == "https://example.org/users/test_user/statuses/3"
         assert rsm_resp.last == "https://example.org/users/test_user/statuses/1"
         assert len(items) == 3

-        title_xhtml = self.getTitleXHTML(items[0])
+        title_xhtml = self.get_title_xhtml(items[0])
         assert title_xhtml.toXml() == (
             "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>"
             "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 3</p></div>"
             "</title>"
         )
-        title_xhtml = self.getTitleXHTML(items[2])
+        title_xhtml = self.get_title_xhtml(items[2])
         assert title_xhtml.toXml() == (
             "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>"
             "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 1</p></div>"
@@ -680,8 +680,8 @@
     @ed
     async def test_pubsub_to_ap_conversion(self, ap_gateway, monkeypatch):
         """Pubsub nodes are converted to AP collections"""
-        monkeypatch.setattr(ap_gateway._p, "getItems", mock_getItems)
-        outbox = await ap_gateway.server.resource.APOutboxRequest(
+        monkeypatch.setattr(ap_gateway._p, "get_items", mock_get_items)
+        outbox = await ap_gateway.server.resource.ap_outbox_request(
             **self.ap_request_params(ap_gateway, "outbox")
         )
         assert outbox["@context"] == ["https://www.w3.org/ns/activitystreams"]
@@ -691,7 +691,7 @@
         assert outbox["first"]
         assert outbox["last"]

-        first_page = await ap_gateway.server.resource.APOutboxPageRequest(
+        first_page = await ap_gateway.server.resource.ap_outbox_page_request(
             **self.ap_request_params(ap_gateway, url=outbox["first"])
         )
         assert first_page["@context"] == ["https://www.w3.org/ns/activitystreams"]
@@ -725,11 +725,11 @@
         """AP following items are converted to Public Pubsub Subscription subscriptions"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

         items, __ = await ap_gateway.pubsub_service.items(
             jid.JID("toto@example.org"),
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT),
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT),
             ap_gateway._pps.subscriptions_node,
             None,
             None,
@@ -749,12 +749,12 @@
         """AP followers items are converted to Public Pubsub Subscription subscribers"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

         items, __ = await ap_gateway.pubsub_service.items(
             jid.JID("toto@example.org"),
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT),
-            ap_gateway._pps.getPublicSubscribersNode(ap_gateway._m.namespace),
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT),
+            ap_gateway._pps.get_public_subscribers_node(ap_gateway._m.namespace),
             None,
             None,
             None,
@@ -773,22 +773,22 @@
         subscriptions = [
             pubsub.Item(
                 id="subscription_1",
-                payload=ap_gateway._pps.buildSubscriptionElt(
+                payload=ap_gateway._pps.build_subscription_elt(
                     ap_gateway._m.namespace, jid.JID("local_user@test.example")
                 ),
             ),
             pubsub.Item(
                 id="subscription_2",
-                payload=ap_gateway._pps.buildSubscriptionElt(
+                payload=ap_gateway._pps.build_subscription_elt(
                     ap_gateway._m.namespace,
                     jid.JID("ext_user\\40example.org@ap.test.example"),
                 ),
             ),
         ]
         monkeypatch.setattr(
-            ap_gateway._p, "getItems", partial(mock_getItems, ret_items=subscriptions)
+            ap_gateway._p, "get_items", partial(mock_get_items, ret_items=subscriptions)
         )
-        following = await ap_gateway.server.resource.APFollowingRequest(
+        following = await ap_gateway.server.resource.ap_following_request(
             **self.ap_request_params(ap_gateway, "following")
         )
         assert following["@context"] == ["https://www.w3.org/ns/activitystreams"]
@@ -812,21 +812,21 @@
         subscribers = [
             pubsub.Item(
                 id="subscriber_1",
-                payload=ap_gateway._pps.buildSubscriberElt(
+                payload=ap_gateway._pps.build_subscriber_elt(
                     jid.JID("local_user@test.example")
                 ),
             ),
             pubsub.Item(
                 id="subscriber_2",
-                payload=ap_gateway._pps.buildSubscriberElt(
+                payload=ap_gateway._pps.build_subscriber_elt(
                     jid.JID("ext_user\\40example.org@ap.test.example")
                 ),
             ),
         ]
         monkeypatch.setattr(
-            ap_gateway._p, "getItems", partial(mock_getItems, ret_items=subscribers)
+            ap_gateway._p, "get_items", partial(mock_get_items, ret_items=subscribers)
         )
-        followers = await ap_gateway.server.resource.APFollowersRequest(
+        followers = await ap_gateway.server.resource.ap_followers_request(
             **self.ap_request_params(ap_gateway, "followers")
         )
         assert followers["@context"] == ["https://www.w3.org/ns/activitystreams"]
@@ -849,17 +849,17 @@
         """XMPP message are sent as AP direct message"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)
         mess_data = {
             "from": TEST_JID,
-            "to": ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT),
+            "to": ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT),
             "type": "chat",
             "message": {"": "This is a test message."},
             "extra": {"origin-id": "123"},
         }
-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
             await ap_gateway.onMessage(ap_gateway.client, mess_data)
-        url, actor_id, doc = signAndPost.call_args[0]
+        url, actor_id, doc = sign_and_post.call_args[0]
         assert url == "https://example.org/users/test_user/inbox"
         assert actor_id == "https://test.example/_ap/actor/some_user@test.example"
         obj = doc["object"]
@@ -883,11 +883,11 @@
         """AP direct message are sent as XMPP message (not Pubsub)"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)
         # we have to patch DeferredList to not wait forever
         monkeypatch.setattr(defer, "DeferredList", AsyncMock())

-        xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())
         direct_ap_message = {
             "attributedTo": TEST_AP_ACTOR_ID,
             "cc": [],
@@ -898,11 +898,11 @@
             "to": [xmpp_actor_id],
             "type": "Note",
         }
-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )
         with patch.object(client, "sendMessage") as sendMessage:
-            await ap_gateway.newAPItem(
+            await ap_gateway.new_ap_item(
                 client, None, ap_gateway._m.namespace, direct_ap_message
             )

@@ -917,38 +917,38 @@
         """Pubsub retract requests are converted to AP delete activity"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)
         retract_id = "retract_123"
         retract_elt = domish.Element((pubsub.NS_PUBSUB_EVENT, "retract"))
         retract_elt["id"] = retract_id
         items_event = pubsub.ItemsEvent(
             sender=TEST_JID,
-            recipient=ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT),
+            recipient=ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT),
             nodeIdentifier=ap_gateway._m.namespace,
             items=[retract_elt],
             headers={},
         )
-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
-            signAndPost.return_value = FakeTReqPostResponse()
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
+            sign_and_post.return_value = FakeTReqPostResponse()
             # we simulate the reception of a retract event
-            await ap_gateway._itemsReceived(ap_gateway.client, items_event)
-        url, actor_id, doc = signAndPost.call_args[0]
-        jid_account = await ap_gateway.getAPAccountFromJidAndNode(TEST_JID, None)
-        jid_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, jid_account)
+            await ap_gateway._items_received(ap_gateway.client, items_event)
+        url, actor_id, doc = sign_and_post.call_args[0]
+        jid_account = await ap_gateway.get_ap_account_from_jid_and_node(TEST_JID, None)
+        jid_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, jid_account)
         assert url == f"{TEST_BASE_URL}/inbox"
         assert actor_id == jid_actor_id
         assert doc["type"] == "Delete"
         assert doc["actor"] == jid_actor_id
         obj = doc["object"]
         assert obj["type"] == ap_const.TYPE_TOMBSTONE
-        url_item_id = ap_gateway.buildAPURL(ap_const.TYPE_ITEM, jid_account, retract_id)
+        url_item_id = ap_gateway.build_apurl(ap_const.TYPE_ITEM, jid_account, retract_id)
         assert obj["id"] == url_item_id

     @ed
     async def test_ap_delete_to_pubsub_retract(self, ap_gateway):
         """AP delete activity is converted to pubsub retract"""
-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )

         ap_item = {
@@ -962,12 +962,12 @@
         with patch.multiple(
             ap_gateway.host.memory.storage,
             get=DEFAULT,
-            getPubsubNode=DEFAULT,
-            deletePubsubItems=DEFAULT,
+            get_pubsub_node=DEFAULT,
+            delete_pubsub_items=DEFAULT,
         ) as mock_objs:
             mock_objs["get"].return_value = None
             cached_node = MagicMock()
-            mock_objs["getPubsubNode"].return_value = cached_node
+            mock_objs["get_pubsub_node"].return_value = cached_node
             subscription = MagicMock()
             subscription.state = SubscriptionState.SUBSCRIBED
             subscription.subscriber = TEST_JID
@@ -976,7 +976,7 @@
                 ap_gateway.pubsub_service, "notifyRetract"
             ) as notifyRetract:
                 # we simulate a received Delete activity
-                await ap_gateway.newAPDeleteItem(
+                await ap_gateway.new_ap_delete_item(
                     client=client,
                     destinee=None,
                     node=ap_gateway._m.namespace,
@@ -984,9 +984,9 @@
                 )

                 # item is deleted from database
-                deletePubsubItems = mock_objs["deletePubsubItems"]
-                assert deletePubsubItems.call_count == 1
-                assert deletePubsubItems.call_args.args[1] == [ap_item["id"]]
+                delete_pubsub_items = mock_objs["delete_pubsub_items"]
+                assert delete_pubsub_items.call_count == 1
+                assert delete_pubsub_items.call_args.args[1] == [ap_item["id"]]

                 # retraction notification is sent to subscribers
                 assert notifyRetract.call_count == 1
@@ -1007,42 +1007,42 @@
         """Message retract requests are converted to AP delete activity"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)
         # origin ID is the ID of the message to retract
         origin_id = "mess_retract_123"
-        # we call retractByOriginId to get the message element of a retraction request
+        # we call retract_by_origin_id to get the message element of a retraction request
         fake_client = MagicMock()
         fake_client.jid = TEST_JID
-        dest_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
-        ap_gateway._r.retractByOriginId(fake_client, dest_jid, origin_id)
+        dest_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
+        ap_gateway._r.retract_by_origin_id(fake_client, dest_jid, origin_id)
         # message_retract_elt is the message which would be sent for a retraction
         message_retract_elt = fake_client.send.call_args.args[0]
         apply_to_elt = next(message_retract_elt.elements(NS_FASTEN, "apply-to"))
         retract_elt = apply_to_elt.retract
-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
-            signAndPost.return_value = FakeTReqPostResponse()
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
+            sign_and_post.return_value = FakeTReqPostResponse()
             fake_fastened_elts = MagicMock()
             fake_fastened_elts.id = origin_id
             # we simulate the reception of a retract event using the message element that
             # we generated above
-            await ap_gateway._onMessageRetract(
+            await ap_gateway._on_message_retract(
                 ap_gateway.client, message_retract_elt, retract_elt, fake_fastened_elts
             )
-        url, actor_id, doc = signAndPost.call_args[0]
+        url, actor_id, doc = sign_and_post.call_args[0]

-        # the AP delete activity must have been sent through signAndPost
+        # the AP delete activity must have been sent through sign_and_post
         # we check its values
-        jid_account = await ap_gateway.getAPAccountFromJidAndNode(TEST_JID, None)
-        jid_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, jid_account)
+        jid_account = await ap_gateway.get_ap_account_from_jid_and_node(TEST_JID, None)
+        jid_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, jid_account)
         assert url == f"{TEST_BASE_URL}/users/{TEST_USER}/inbox"
         assert actor_id == jid_actor_id
         assert doc["type"] == "Delete"
         assert doc["actor"] == jid_actor_id
         obj = doc["object"]
         assert obj["type"] == ap_const.TYPE_TOMBSTONE
-        url_item_id = ap_gateway.buildAPURL(ap_const.TYPE_ITEM, jid_account, origin_id)
+        url_item_id = ap_gateway.build_apurl(ap_const.TYPE_ITEM, jid_account, origin_id)
         assert obj["id"] == url_item_id

     @ed
@@ -1053,11 +1053,11 @@
         # by ``test_ap_delete_to_pubsub_retract``)

         # we don't want actual queries in database
-        retractDBHistory = AsyncMock()
-        monkeypatch.setattr(ap_gateway._r, "retractDBHistory", retractDBHistory)
+        retract_db_history = AsyncMock()
+        monkeypatch.setattr(ap_gateway._r, "retract_db_history", retract_db_history)

-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )
         fake_send = MagicMock()
         monkeypatch.setattr(client, "send", fake_send)
@@ -1077,14 +1077,14 @@
             fake_history.origin_id = ap_item["id"]
             storage_get.return_value = fake_history
             # we simulate a received Delete activity
-            await ap_gateway.newAPDeleteItem(
+            await ap_gateway.new_ap_delete_item(
                 client=client, destinee=None, node=ap_gateway._m.namespace, item=ap_item
             )

         # item is deleted from database
-        assert retractDBHistory.call_count == 1
-        assert retractDBHistory.call_args.args[0] == client
-        assert retractDBHistory.call_args.args[1] == fake_history
+        assert retract_db_history.call_count == 1
+        assert retract_db_history.call_args.args[0] == client
+        assert retract_db_history.call_args.args[1] == fake_history

         # retraction notification is sent to destinee
         assert fake_send.call_count == 1
@@ -1103,11 +1103,11 @@
         """AP actor metadata are converted to XMPP/vCard4"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

         items, __ = await ap_gateway.pubsub_service.items(
             jid.JID("toto@example.org"),
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT),
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT),
             # VCard4 node
             ap_gateway._v.node,
             None,
@@ -1116,7 +1116,7 @@
         )
         assert len(items) == 1
         vcard_elt = next(items[0].elements(ap_gateway._v.namespace, "vcard"))
-        vcard = ap_gateway._v.vcard2Dict(vcard_elt)
+        vcard = ap_gateway._v.vcard_2_dict(vcard_elt)
         assert "test_user nickname" in vcard["nicknames"]
         assert vcard["description"] == "test account"

@@ -1125,12 +1125,12 @@
         """XMPP identity is converted to AP actor metadata"""
         # XXX: XMPP identity is normally an amalgam of metadata from several
         #   XEPs/locations (vCard4, vcard-tmp, etc)
-        with patch.object(ap_gateway._i, "getIdentity") as getIdentity:
-            getIdentity.return_value = {
+        with patch.object(ap_gateway._i, "get_identity") as get_identity:
+            get_identity.return_value = {
                 "nicknames": ["nick1", "nick2"],
                 "description": "test description",
             }
-            actor_data = await ap_gateway.server.resource.APActorRequest(
+            actor_data = await ap_gateway.server.resource.ap_actor_request(
                 **self.ap_request_params(ap_gateway, ap_const.TYPE_ACTOR)
             )

@@ -1143,9 +1143,9 @@
         """AP mentions by direct addressing are converted to XEP-0372 references"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())

         direct_addr_mention = {
             "attributedTo": TEST_AP_ACTOR_ID,
@@ -1156,37 +1156,37 @@
             "to": [ap_const.NS_AP_PUBLIC, xmpp_actor_id],
             "type": "Note",
         }
-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )
         monkeypatch.setattr(client, "sendMessage", MagicMock())
-        with patch.object(ap_gateway._refs, "sendReference") as sendReference:
-            await ap_gateway.newAPItem(
+        with patch.object(ap_gateway._refs, "send_reference") as send_reference:
+            await ap_gateway.new_ap_item(
                 client, None, ap_gateway._m.namespace, direct_addr_mention
             )

-        assert sendReference.call_count == 1
-        assert sendReference.call_args.kwargs["to_jid"] == TEST_JID
+        assert send_reference.call_count == 1
+        assert send_reference.call_args.kwargs["to_jid"] == TEST_JID

-        local_actor_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
-        expected_anchor = xmpp_uri.buildXMPPUri(
+        local_actor_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
+        expected_anchor = xmpp_uri.build_xmpp_uri(
             "pubsub",
             path=local_actor_jid.full(),
             node=ap_gateway._m.namespace,
             item=direct_addr_mention["id"],
         )
-        assert sendReference.call_args.kwargs["anchor"] == expected_anchor
+        assert send_reference.call_args.kwargs["anchor"] == expected_anchor

     @ed
     async def test_tag_mention_to_reference(self, ap_gateway, monkeypatch):
         """AP mentions in "tag" field are converted to XEP-0372 references"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-
monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) - xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost()) + xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost()) direct_addr_mention = { "attributedTo": TEST_AP_ACTOR_ID, @@ -1198,35 +1198,35 @@ "tag": [{"type": "Mention", "href": xmpp_actor_id, "name": f"@{TEST_JID}'"}], "type": "Note", } - client = ap_gateway.client.getVirtualClient( - ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT) + client = ap_gateway.client.get_virtual_client( + ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT) ) monkeypatch.setattr(client, "sendMessage", MagicMock()) - with patch.object(ap_gateway._refs, "sendReference") as sendReference: - await ap_gateway.newAPItem( + with patch.object(ap_gateway._refs, "send_reference") as send_reference: + await ap_gateway.new_ap_item( client, None, ap_gateway._m.namespace, direct_addr_mention ) - assert sendReference.call_count == 1 - assert sendReference.call_args.kwargs["to_jid"] == TEST_JID + assert send_reference.call_count == 1 + assert send_reference.call_args.kwargs["to_jid"] == TEST_JID - local_actor_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT) - expected_anchor = xmpp_uri.buildXMPPUri( + local_actor_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT) + expected_anchor = xmpp_uri.build_xmpp_uri( "pubsub", path=local_actor_jid.full(), node=ap_gateway._m.namespace, item=direct_addr_mention["id"], ) - assert sendReference.call_args.kwargs["anchor"] == expected_anchor + assert send_reference.call_args.kwargs["anchor"] == expected_anchor @ed async def test_auto_mentions(self, ap_gateway, monkeypatch): """Check that mentions in body are converted to AP mentions""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", 
mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) mb_data = { "author_jid": TEST_JID.full(), @@ -1254,7 +1254,7 @@ # in mb_data_2_ap_item monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) mb_data = { "author_jid": TEST_JID.full(), @@ -1275,11 +1275,11 @@ """Check that XEP-0372 references are converted to AP mention""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) - local_actor_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT) + local_actor_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT) item_elt = XMPP_ITEMS[0] - anchor = xmpp_uri.buildXMPPUri( + anchor = xmpp_uri.build_xmpp_uri( "pubsub", path=TEST_JID.full(), node=ap_gateway._m.namespace, @@ -1287,18 +1287,18 @@ ) ref_data: Dict[str, Union[str, int, dict]] = { - "uri": xmpp_uri.buildXMPPUri(None, path=local_actor_jid.full()), + "uri": xmpp_uri.build_xmpp_uri(None, path=local_actor_jid.full()), "type_": "mention", "anchor": anchor, } - reference_elt = ap_gateway._refs.buildRefElement(**ref_data) + reference_elt = ap_gateway._refs.build_ref_element(**ref_data) # we now update ref_data to look like what is received in the trigger - ref_data["parsed_uri"] = xmpp_uri.parseXMPPUri(ref_data["uri"]) - ref_data["parsed_anchor"] = xmpp_uri.parseXMPPUri(ref_data["anchor"]) + ref_data["parsed_uri"] = xmpp_uri.parse_xmpp_uri(ref_data["uri"]) + ref_data["parsed_anchor"] = xmpp_uri.parse_xmpp_uri(ref_data["anchor"]) - # "type" is a builtin function, thus "type_" is used in buildRefElement, but in + # "type" is a builtin function, thus "type_" is used in 
build_ref_element, but in # ref_data is "type" without underscore ref_data["type"] = ref_data["type_"] del ref_data["type_"] @@ -1306,22 +1306,22 @@ message_elt = domish.Element((None, "message")) message_elt.addChild(reference_elt) - with patch.object(ap_gateway.host.memory.storage, "getItems") as getItems: - # getItems returns a sqla_mapping.PubsubItem, thus we need to fake it and set + with patch.object(ap_gateway.host.memory.storage, "get_items") as get_items: + # get_items returns a sqla_mapping.PubsubItem, thus we need to fake it and set # the item_elt we want to use in its "data" attribute mock_pubsub_item = MagicMock mock_pubsub_item.data = item_elt - getItems.return_value = ([mock_pubsub_item], {}) - with patch.object(ap_gateway, "signAndPost") as signAndPost: - signAndPost.return_value.code = 202 - await ap_gateway._onReferenceReceived( + get_items.return_value = ([mock_pubsub_item], {}) + with patch.object(ap_gateway, "sign_and_post") as sign_and_post: + sign_and_post.return_value.code = 202 + await ap_gateway._on_reference_received( ap_gateway.client, message_elt, ref_data ) # when reference is received, the referencing item must be sent to referenced # actor, and they must be in "to" field and in "tag" - assert signAndPost.call_count == 1 - send_ap_item = signAndPost.call_args.args[-1] + assert sign_and_post.call_count == 1 + send_ap_item = sign_and_post.call_args.args[-1] ap_object = send_ap_item["object"] assert TEST_AP_ACTOR_ID in ap_object["to"] expected_mention = { @@ -1338,13 +1338,13 @@ """XEP-0272 post repeat is converted to AP Announce activity""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) # JID repeated AP actor (also the recipient of the message) - recipient_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT) + 
recipient_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT) # repeated item ap_item = TEST_AP_ITEMS[0] - ap_item_url = xmpp_uri.buildXMPPUri( + ap_item_url = xmpp_uri.build_xmpp_uri( "pubsub", path=recipient_jid.full(), node=ap_gateway._m.namespace, @@ -1374,9 +1374,9 @@ ) item_elt.uri = pubsub.NS_PUBSUB_EVENT - with patch.object(ap_gateway, "signAndPost") as signAndPost: - signAndPost.return_value.code = 202 - await ap_gateway.convertAndPostItems( + with patch.object(ap_gateway, "sign_and_post") as sign_and_post: + sign_and_post.return_value.code = 202 + await ap_gateway.convert_and_post_items( ap_gateway.client, TEST_AP_ACCOUNT, TEST_JID, @@ -1384,10 +1384,10 @@ [item_elt], ) - assert signAndPost.called - url, actor_id, doc = signAndPost.call_args.args + assert sign_and_post.called + url, actor_id, doc = sign_and_post.call_args.args assert url == TEST_USER_DATA["endpoints"]["sharedInbox"] - assert actor_id == ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost()) + assert actor_id == ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost()) assert doc["type"] == "Announce" assert ap_const.NS_AP_PUBLIC in doc["to"] assert doc["object"] == ap_item["id"] @@ -1397,12 +1397,12 @@ """AP Announce activity is converted to XEP-0272 post repeat""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) - xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost()) + xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost()) # announced item xmpp_item = XMPP_ITEMS[0] - xmpp_item_url = ap_gateway.buildAPURL( + xmpp_item_url = ap_gateway.build_apurl( ap_const.TYPE_ITEM, TEST_JID.userhost(), xmpp_item["id"] ) announce = { @@ -1415,25 +1415,25 @@ "published": "2022-07-22T09:24:12Z", "to": 
[ap_const.NS_AP_PUBLIC], } - with patch.object(ap_gateway.host.memory.storage, "getItems") as getItems: + with patch.object(ap_gateway.host.memory.storage, "get_items") as get_items: mock_pubsub_item = MagicMock mock_pubsub_item.data = xmpp_item - getItems.return_value = ([mock_pubsub_item], {}) + get_items.return_value = ([mock_pubsub_item], {}) with patch.object( - ap_gateway.host.memory.storage, "cachePubsubItems" - ) as cachePubsubItems: - await ap_gateway.server.resource.handleAnnounceActivity( + ap_gateway.host.memory.storage, "cache_pubsub_items" + ) as cache_pubsub_items: + await ap_gateway.server.resource.handle_announce_activity( Request(MagicMock()), announce, None, None, None, "", TEST_AP_ACTOR_ID ) - assert cachePubsubItems.called + assert cache_pubsub_items.called # the microblog data put in cache correspond to the item sent to subscribers - __, __, __, [mb_data] = cachePubsubItems.call_args.args + __, __, __, [mb_data] = cache_pubsub_items.call_args.args extra = mb_data["extra"] assert "repeated" in extra repeated = extra["repeated"] - assert repeated["by"] == ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT).full() - xmpp_item_xmpp_url = xmpp_uri.buildXMPPUri( + assert repeated["by"] == ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT).full() + xmpp_item_xmpp_url = xmpp_uri.build_xmpp_uri( "pubsub", path=TEST_JID.full(), node=ap_gateway._m.namespace, @@ -1446,12 +1446,12 @@ """Pubsub-attachments ``noticed`` is converted to AP Like activity""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) - recipient_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT) + recipient_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT) # noticed item ap_item = TEST_AP_ITEMS[0] - attachment_node = ap_gateway._pa.getAttachmentNodeName( + 
attachment_node = ap_gateway._pa.get_attachment_node_name( recipient_jid, ap_gateway._m.namespace, ap_item["id"] ) item_elt = xml_tools.parse( @@ -1468,14 +1468,14 @@ TEST_JID, recipient_jid, attachment_node, [item_elt], {} ) - with patch.object(ap_gateway, "signAndPost") as signAndPost: - signAndPost.return_value.code = 202 - await ap_gateway._itemsReceived(ap_gateway.client, items_event) + with patch.object(ap_gateway, "sign_and_post") as sign_and_post: + sign_and_post.return_value.code = 202 + await ap_gateway._items_received(ap_gateway.client, items_event) - assert signAndPost.called - url, actor_id, doc = signAndPost.call_args.args + assert sign_and_post.called + url, actor_id, doc = sign_and_post.call_args.args assert url == TEST_USER_DATA["endpoints"]["sharedInbox"] - assert actor_id == ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost()) + assert actor_id == ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost()) assert doc["type"] == "Like" assert ap_const.NS_AP_PUBLIC in doc["cc"] assert doc["object"] == ap_item["id"] @@ -1485,12 +1485,12 @@ """AP Like activity is converted to ``noticed`` attachment""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) - xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost()) + xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost()) # liked item xmpp_item = XMPP_ITEMS[0] - xmpp_item_url = ap_gateway.buildAPURL( + xmpp_item_url = ap_gateway.build_apurl( ap_const.TYPE_ITEM, TEST_JID.userhost(), xmpp_item["id"] ) like = { @@ -1503,23 +1503,23 @@ "published": "2022-07-22T09:24:12Z", "to": [ap_const.NS_AP_PUBLIC], } - with patch.object(ap_gateway.host.memory.storage, "getItems") as getItems: - getItems.return_value = ([], {}) - with 
patch.object(ap_gateway._p, "sendItems") as sendItems: - await ap_gateway.server.resource.APInboxRequest( + with patch.object(ap_gateway.host.memory.storage, "get_items") as get_items: + get_items.return_value = ([], {}) + with patch.object(ap_gateway._p, "send_items") as send_items: + await ap_gateway.server.resource.ap_inbox_request( **self.ap_request_params( ap_gateway, "inbox", data=like, signing_actor=TEST_AP_ACTOR_ID ) ) - assert sendItems.called - si_client, si_service, si_node, [si_item] = sendItems.call_args.args - assert si_client.jid == ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT) + assert send_items.called + si_client, si_service, si_node, [si_item] = send_items.call_args.args + assert si_client.jid == ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT) assert si_service == TEST_JID - assert si_node == ap_gateway._pa.getAttachmentNodeName( + assert si_node == ap_gateway._pa.get_attachment_node_name( TEST_JID, ap_gateway._m.namespace, xmpp_item["id"] ) - [parsed_item] = ap_gateway._pa.items2attachmentData(si_client, [si_item]) + [parsed_item] = ap_gateway._pa.items_2_attachment_data(si_client, [si_item]) assert parsed_item["from"] == si_client.jid.full() assert "noticed" in parsed_item assert parsed_item["noticed"]["noticed"] == True @@ -1529,18 +1529,18 @@ """Pubsub-attachments ``reactions`` is converted to AP EmojiReact activity""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) - recipient_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT) + recipient_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT) # noticed item ap_item = TEST_AP_ITEMS[0] - ap_item_url = xmpp_uri.buildXMPPUri( + ap_item_url = xmpp_uri.build_xmpp_uri( "pubsub", path=recipient_jid.full(), node=ap_gateway._m.namespace, item=ap_item["id"], ) - 
attachment_node = ap_gateway._pa.getAttachmentNodeName( + attachment_node = ap_gateway._pa.get_attachment_node_name( recipient_jid, ap_gateway._m.namespace, ap_item["id"] ) reactions = ["🦁", "🥜", "🎻"] @@ -1562,15 +1562,15 @@ TEST_JID, recipient_jid, attachment_node, [item_elt], {} ) - with patch.object(ap_gateway, "signAndPost") as signAndPost: - signAndPost.return_value.code = 202 - await ap_gateway._itemsReceived(ap_gateway.client, items_event) + with patch.object(ap_gateway, "sign_and_post") as sign_and_post: + sign_and_post.return_value.code = 202 + await ap_gateway._items_received(ap_gateway.client, items_event) - assert signAndPost.call_count == 3 - for idx, call_args in enumerate(signAndPost.call_args_list): + assert sign_and_post.call_count == 3 + for idx, call_args in enumerate(sign_and_post.call_args_list): url, actor_id, doc = call_args.args assert url == TEST_USER_DATA["endpoints"]["sharedInbox"] - assert actor_id == ap_gateway.buildAPURL( + assert actor_id == ap_gateway.build_apurl( ap_const.TYPE_ACTOR, TEST_JID.userhost() ) assert doc["type"] == "EmojiReact" @@ -1588,12 +1588,12 @@ """AP EmojiReact activity is converted to ``reactions`` attachment""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) - xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost()) + xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost()) # item on which reaction is attached xmpp_item = XMPP_ITEMS[0] - xmpp_item_url = ap_gateway.buildAPURL( + xmpp_item_url = ap_gateway.build_apurl( ap_const.TYPE_ITEM, TEST_JID.userhost(), xmpp_item["id"] ) like = { @@ -1607,23 +1607,23 @@ "published": "2022-07-22T09:24:12Z", "to": [ap_const.NS_AP_PUBLIC], } - with patch.object(ap_gateway.host.memory.storage, "getItems") as getItems: - 
getItems.return_value = ([], {}) - with patch.object(ap_gateway._p, "sendItems") as sendItems: - await ap_gateway.server.resource.APInboxRequest( + with patch.object(ap_gateway.host.memory.storage, "get_items") as get_items: + get_items.return_value = ([], {}) + with patch.object(ap_gateway._p, "send_items") as send_items: + await ap_gateway.server.resource.ap_inbox_request( **self.ap_request_params( ap_gateway, "inbox", data=like, signing_actor=TEST_AP_ACTOR_ID ) ) - assert sendItems.called - si_client, si_service, si_node, [si_item] = sendItems.call_args.args - assert si_client.jid == ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT) + assert send_items.called + si_client, si_service, si_node, [si_item] = send_items.call_args.args + assert si_client.jid == ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT) assert si_service == TEST_JID - assert si_node == ap_gateway._pa.getAttachmentNodeName( + assert si_node == ap_gateway._pa.get_attachment_node_name( TEST_JID, ap_gateway._m.namespace, xmpp_item["id"] ) - [parsed_item] = ap_gateway._pa.items2attachmentData(si_client, [si_item]) + [parsed_item] = ap_gateway._pa.items_2_attachment_data(si_client, [si_item]) assert parsed_item["from"] == si_client.jid.full() assert "reactions" in parsed_item assert parsed_item["reactions"]["reactions"] == ["🐅"] diff -r c4464d7ae97b -r 524856bd7b19 tests/unit/test_pubsub-cache.py --- a/tests/unit/test_pubsub-cache.py Fri Apr 07 15:18:39 2023 +0200 +++ b/tests/unit/test_pubsub-cache.py Sat Apr 08 13:54:42 2023 +0200 @@ -27,22 +27,22 @@ @ed async def test_cache_is_used_transparently(self, host, client): - """Cache is used when a pubsub getItems operation is done""" + """Cache is used when a pubsub get_items operation is done""" items_ret = defer.Deferred() items_ret.callback(([], {})) client.pubsub_client.items = MagicMock(return_value=items_ret) - host.memory.storage.getPubsubNode.return_value = None - pubsub_node = host.memory.storage.setPubsubNode.return_value = PubsubNode( + 
host.memory.storage.get_pubsub_node.return_value = None + pubsub_node = host.memory.storage.set_pubsub_node.return_value = PubsubNode( sync_state = None ) - with patch.object(host.plugins["PUBSUB_CACHE"], "cacheNode") as cacheNode: - await host.plugins["XEP-0060"].getItems( + with patch.object(host.plugins["PUBSUB_CACHE"], "cache_node") as cache_node: + await host.plugins["XEP-0060"].get_items( client, None, "urn:xmpp:microblog:0", ) - assert cacheNode.call_count == 1 - assert cacheNode.call_args.args[-1] == pubsub_node + assert cache_node.call_count == 1 + assert cache_node.call_args.args[-1] == pubsub_node @ed async def test_cache_is_skipped_with_use_cache_false(self, host, client): @@ -50,18 +50,18 @@ items_ret = defer.Deferred() items_ret.callback(([], {})) client.pubsub_client.items = MagicMock(return_value=items_ret) - host.memory.storage.getPubsubNode.return_value = None - host.memory.storage.setPubsubNode.return_value = PubsubNode( + host.memory.storage.get_pubsub_node.return_value = None + host.memory.storage.set_pubsub_node.return_value = PubsubNode( sync_state = None ) - with patch.object(host.plugins["PUBSUB_CACHE"], "cacheNode") as cacheNode: - await host.plugins["XEP-0060"].getItems( + with patch.object(host.plugins["PUBSUB_CACHE"], "cache_node") as cache_node: + await host.plugins["XEP-0060"].get_items( client, None, "urn:xmpp:microblog:0", extra = {C.KEY_USE_CACHE: False} ) - assert not cacheNode.called + assert not cache_node.called @ed async def test_cache_is_not_used_when_no_cache(self, host, client): @@ -70,17 +70,17 @@ items_ret = defer.Deferred() items_ret.callback(([], {})) client.pubsub_client.items = MagicMock(return_value=items_ret) - host.memory.storage.getPubsubNode.return_value = None - host.memory.storage.setPubsubNode.return_value = PubsubNode( + host.memory.storage.get_pubsub_node.return_value = None + host.memory.storage.set_pubsub_node.return_value = PubsubNode( sync_state = None ) - with patch.object(host.plugins["PUBSUB_CACHE"], 
"cacheNode") as cacheNode: - await host.plugins["XEP-0060"].getItems( + with patch.object(host.plugins["PUBSUB_CACHE"], "cache_node") as cache_node: + await host.plugins["XEP-0060"].get_items( client, None, "urn:xmpp:microblog:0", ) - assert not cacheNode.called + assert not cache_node.called @ed @@ -89,20 +89,20 @@ items_ret = defer.Deferred() items_ret.callback(([], {})) client.pubsub_client.items = MagicMock(return_value=items_ret) - host.memory.storage.getPubsubNode.return_value = PubsubNode( + host.memory.storage.get_pubsub_node.return_value = PubsubNode( sync_state = SyncState.COMPLETED ) with patch.object( host.plugins["PUBSUB_CACHE"], - "getItemsFromCache" - ) as getItemsFromCache: - getItemsFromCache.return_value = ([], {}) - await host.plugins["XEP-0060"].getItems( + "get_items_from_cache" + ) as get_items_from_cache: + get_items_from_cache.return_value = ([], {}) + await host.plugins["XEP-0060"].get_items( client, None, "urn:xmpp:microblog:0", ) - assert getItemsFromCache.call_count == 1 + assert get_items_from_cache.call_count == 1 assert not client.pubsub_client.items.called @ed @@ -111,21 +111,21 @@ items_ret = defer.Deferred() items_ret.callback(([], {})) client.pubsub_client.items = MagicMock(return_value=items_ret) - host.memory.storage.getPubsubNode.return_value = PubsubNode( + host.memory.storage.get_pubsub_node.return_value = PubsubNode( sync_state = SyncState.IN_PROGRESS ) - with patch.object(host.plugins["PUBSUB_CACHE"], "analyseNode") as analyseNode: - analyseNode.return_value = {"to_sync": True} + with patch.object(host.plugins["PUBSUB_CACHE"], "analyse_node") as analyse_node: + analyse_node.return_value = {"to_sync": True} with patch.object( host.plugins["PUBSUB_CACHE"], - "getItemsFromCache" - ) as getItemsFromCache: - getItemsFromCache.return_value = ([], {}) + "get_items_from_cache" + ) as get_items_from_cache: + get_items_from_cache.return_value = ([], {}) assert client.pubsub_client.items.call_count == 0 - await 
host.plugins["XEP-0060"].getItems( + await host.plugins["XEP-0060"].get_items( client, None, "urn:xmpp:microblog:0", ) - assert not getItemsFromCache.called + assert not get_items_from_cache.called assert client.pubsub_client.items.call_count == 1 diff -r c4464d7ae97b -r 524856bd7b19 twisted/plugins/sat_plugin.py --- a/twisted/plugins/sat_plugin.py Fri Apr 07 15:18:39 2023 +0200 +++ b/twisted/plugins/sat_plugin.py Sat Apr 08 13:54:42 2023 +0200 @@ -36,7 +36,7 @@ """Method to initialise global modules""" # XXX: We need to configure logs before any log method is used, so here is the best place. from sat.core import log_config - log_config.satConfigure(C.LOG_BACKEND_TWISTED, C, backend_data=options) + log_config.sat_configure(C.LOG_BACKEND_TWISTED, C, backend_data=options) class Options(usage.Options): @@ -50,7 +50,7 @@ description = _("%s XMPP client backend") % C.APP_NAME_FULL options = Options - def setDebugger(self): + def set_debugger(self): from twisted.internet import defer if defer.Deferred.debug: # if we are in debug mode, we want to use ipdb instead of pdb @@ -65,7 +65,7 @@ def makeService(self, options): from twisted.internet import asyncioreactor asyncioreactor.install() - self.setDebugger() + self.set_debugger() # XXX: Libervia must be imported after log configuration, # because it write stuff to logs initialise(options.parent)
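The rename pattern applied throughout this changeset (e.g. ``signAndPost`` → ``sign_and_post``, ``buildAPURL`` → ``build_apurl``, ``getItemsFromCache`` → ``get_items_from_cache``) can be reproduced mechanically. The following is a minimal illustrative sketch, not the tool actually used for this refactoring; the helper name ``camel_to_snake`` is hypothetical:

```python
import re


def camel_to_snake(name: str) -> str:
    """Convert a camelCase identifier to snake_case, keeping acronyms together."""
    # split an acronym from a following capitalized word: "JIDAnd" -> "JID_And"
    name = re.sub(r"([A-Z]+)([A-Z][a-z])", r"\1_\2", name)
    # split a lowercase letter or digit from a following uppercase letter
    name = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", name)
    return name.lower()


# renames taken from this changeset
print(camel_to_snake("signAndPost"))        # sign_and_post
print(camel_to_snake("buildAPURL"))         # build_apurl
print(camel_to_snake("getJIDAndNode"))      # get_jid_and_node
print(camel_to_snake("getItemsFromCache"))  # get_items_from_cache
```

Note that identifiers with embedded digits do not follow this simple rule: the changeset renames ``vcard2Dict`` to ``vcard_2_dict`` and ``items2attachmentData`` to ``items_2_attachment_data``, while the sketch above would produce ``vcard2_dict``, so such names need special-casing or manual review.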