changeset 4037:524856bd7b19
massive refactoring to switch from camelCase to snake_case:
historically, Libervia (formerly SàT) was using camelCase, as PEP 8 allows for code bases
that predate it, in order to keep the same coding style as Twisted.
However, snake_case is more readable and it is better to follow PEP 8 best practices, so it
has been decided to move to full snake_case. Because Libervia has a huge codebase, the
transition had resulted in an ugly mix of camelCase and snake_case.
To fix that, this patch does a big refactoring, renaming every function and method
(including bridge ones) that does not come from Twisted or Wokkel to full snake_case.
This is a massive change, and may result in some bugs.
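As a rough illustration of the naming convention applied here (a hedged sketch only, not
the actual tooling used for this changeset), a mechanical camelCase to snake_case
conversion can be written in a few lines of Python. Note that many bridge names were also
reworded rather than converted literally (e.g. getVersion became version_get and
asyncDeleteProfile became profile_delete_async), so such a helper only covers part of the
renaming:

    import re

    # Hypothetical helper, for illustration only: it performs the purely mechanical
    # camelCase -> snake_case conversion; the actual renames in this changeset also
    # reorder words in many bridge method names.
    _CAMEL_RE = re.compile(r"(?<=[a-z0-9])([A-Z])")

    def camel_to_snake(name: str) -> str:
        """Convert a camelCase identifier to snake_case."""
        return _CAMEL_RE.sub(lambda m: "_" + m.group(1), name).lower()

    if __name__ == "__main__":
        for old in ("messageSend", "profileNameGet", "getVersion"):
            print(f"{old} -> {camel_to_snake(old)}")

Running it prints message_send and profile_name_get, matching the new bridge API, while
getVersion shows the limit of the mechanical rule: it yields get_version, whereas the
bridge method is now named version_get.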
--- a/doc/components.rst Fri Apr 07 15:18:39 2023 +0200 +++ b/doc/components.rst Sat Apr 08 13:54:42 2023 +0200 @@ -384,7 +384,7 @@ The encoding is explained in the documentation of the following method: -.. automethod:: sat.plugins.plugin_comp_ap_gateway.APGateway.getJIDAndNode +.. automethod:: sat.plugins.plugin_comp_ap_gateway.APGateway.get_jid_and_node .. [#AP_chars] Most if not all AP implementations use webfinger `acct` URI as a de-facto @@ -859,10 +859,10 @@ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Publication of AP items can be tested using the following method (with can be accessed -through the ``APSend`` bridge method, client is then replaced by the ``profile`` name, as +through the ``ap_send`` bridge method, client is then replaced by the ``profile`` name, as last argument): -.. automethod:: sat.plugins.plugin_comp_ap_gateway.APGateway.publishMessage +.. automethod:: sat.plugins.plugin_comp_ap_gateway.APGateway.publish_message The method can be used either with CLI's :ref:`debug bridge method <li_debug_bridge_method>` or with any D-Bus tool like ``qdbus`` or ``d-feet`` (only if you @@ -875,7 +875,7 @@ ``https://example.net/@pierre/106986412193109832``. To send a reply to this message, Louise can use the following command:: - $ li debug bridge method -c APSend '"{\"node\": \"https://example.net/@pierre/106986412193109832\", \"content\": \"A lille hello from XMPP\"}","pierre\\40example.net@ap.example.org", "louise"' + $ li debug bridge method -c ap_send '"{\"node\": \"https://example.net/@pierre/106986412193109832\", \"content\": \"A lille hello from XMPP\"}","pierre\\40example.net@ap.example.org", "louise"' Note the double escaping, one for the shell argument, and the other to specify JSON object.
--- a/doc/developer.rst Fri Apr 07 15:18:39 2023 +0200 +++ b/doc/developer.rst Sat Apr 08 13:54:42 2023 +0200 @@ -60,9 +60,9 @@ analysers are checked, and the first one matching is used to determine if the node must be synchronised or not. -Analysers can be registered by any plugins using ``registerAnalyser`` method: +Analysers can be registered by any plugins using ``register_analyser`` method: -.. automethod:: sat.plugins.plugin_pubsub_cache.PubsubCache.registerAnalyser +.. automethod:: sat.plugins.plugin_pubsub_cache.PubsubCache.register_analyser If no analyser is found, ``to_sync`` is false, or an error happens during the caching, the node won't be synchronised and the pubsub service will always be requested.
--- a/doc/libervia-cli/blog.rst Fri Apr 07 15:18:39 2023 +0200 +++ b/doc/libervia-cli/blog.rst Sat Apr 08 13:54:42 2023 +0200 @@ -333,11 +333,11 @@ $ li blog import dotclear -Import a Dotclear blog:: +import a Dotclear blog:: $ li blog import dotclear /path/to/dotclear.dump -Import a Dotclear blog without uploading images:: +import a Dotclear blog without uploading images:: $ li blog import --no-images-upload dotclear /path/to/dotclear.dump
--- a/doc/libervia-cli/debug.rst Fri Apr 07 15:18:39 2023 +0200 +++ b/doc/libervia-cli/debug.rst Sat Apr 08 13:54:42 2023 +0200 @@ -17,7 +17,7 @@ You profile is automatically set if the method requires it (using the value of ``-p PROFILE, --profile PROFILE``), so you must not specify it as an extra argument. -You can refer to `Bridge API documentation`_ to get core methods signatures +You can refer to `bridge API documentation`_ to get core methods signatures .. _Bridge API documentation: https://wiki.goffi.org/wiki/Bridge_API @@ -26,12 +26,12 @@ -------- Send a message using a single shell arguments for all Python arguments. We -use first the method name (``messageSend``), then the required arguments (see `Bridge +use first the method name (``message_send``), then the required arguments (see `bridge API documentation`_ for details), without the profile as it is automatically set. We specify them as Python in one shell argument, so we use single quote (``\```)first for s hell string, and inside it we use double quote (``"``) for Python strings:: - $ li debug bridge method messageSend '"louise@example.org", {"": "test message"}, {}, "auto", {}' + $ li debug bridge method message_send '"louise@example.org", {"": "test message"}, {}, "auto", {}' .. note:: @@ -39,7 +39,7 @@ Get version string of Libervia:: - $ li debug bridge method getVersion + $ li debug bridge method version_get bridge signal @@ -56,7 +56,7 @@ store the level, so we can easily change it if we want to use an other level for tests. Note the use of quotes (to escape both for shell and Python):: - $ LEVEL='info'; li debug bridge signal -c actionNew '{"xmlui": '"'"'<?xml version="1.0" ?><sat_xmlui title="test title" type="dialog"><dialog level="'$LEVEL'" type="note"><message>test message\non\nseveral\nlines</message></dialog></sat_xmlui>'"'"'}' '""' -1 + $ LEVEL='info'; li debug bridge signal -c action_new '{"xmlui": '"'"'<?xml version="1.0" ?><sat_xmlui title="test title" type="dialog"><dialog level="'$LEVEL'" type="note"><message>test message\non\nseveral\nlines</message></dialog></sat_xmlui>'"'"'}' '""' -1 monitor
--- a/doc/libervia-cli/event.rst Fri Apr 07 15:18:39 2023 +0200 +++ b/doc/libervia-cli/event.rst Sat Apr 08 13:54:42 2023 +0200 @@ -35,10 +35,10 @@ If your organsise an item, the ``--rsvp`` flag should be used: it will use the default RSVP form which ask for attendance. If you want to request more information to your guest, ``--rsvp_json JSON`` can be used: the JSON argument is a data dict as described in -``dataDict2dataForm`` function where the ``namespace`` key is not necessary (it's set +``data_dict_2_data_form`` function where the ``namespace`` key is not necessary (it's set automatically): -.. autofunction:: sat.tools.xml_tools.dataDict2dataForm +.. autofunction:: sat.tools.xml_tools.data_dict_2_data_form If the event links to an other one, ``--external JID NODE ITEM`` can be user
--- a/doc/libervia-cli/list.rst Fri Apr 07 15:18:39 2023 +0200 +++ b/doc/libervia-cli/list.rst Sat Apr 08 13:54:42 2023 +0200 @@ -68,7 +68,7 @@ import ====== -Import lists from an external source. This works in the same way as +import lists from an external source. This works in the same way as :ref:`libervia-cli_blog_import`: you need to specify an importer and a data location. If you let both positional argument empty, you'll get list of importers, if you specify importer but not data location, you'll get a description on how the importer works. @@ -99,7 +99,7 @@ $ li list import bugzilla -Import lists from a Bugzilla XML export file at ``~/bugzilla_export.xml`` to the +import lists from a Bugzilla XML export file at ``~/bugzilla_export.xml`` to the ``pubsub.example.org`` PubSub service. We use default lists node and want a progression bar::
--- a/doc/libervia-cli/merge-request.rst Fri Apr 07 15:18:39 2023 +0200 +++ b/doc/libervia-cli/merge-request.rst Sat Apr 08 13:54:42 2023 +0200 @@ -66,7 +66,7 @@ import ====== -Import a merge request into your project. You mainly have to be in the project repository +import a merge request into your project. You mainly have to be in the project repository (or specify it using ``-r PATH, --repository PATH``) and to specify the id of the patch to import (using ``-i ITEM, --item ITEM``). The behaviour depends of the type of the patch, for Mercurial, the patch will be imported as `MQ`_ patch. @@ -76,6 +76,6 @@ example ------- -Import the merge request with id 321:: +import the merge request with id 321:: $ li merge-request import -i 321
--- a/doc/libervia-cli/pubsub_node.rst Fri Apr 07 15:18:39 2023 +0200 +++ b/doc/libervia-cli/pubsub_node.rst Sat Apr 08 13:54:42 2023 +0200 @@ -93,7 +93,7 @@ import ====== -Import a raw XML containing items to create in the node. The path to the XML file is used +import a raw XML containing items to create in the node. The path to the XML file is used as positional argument. The XML file must contain full `<item>` element for each item to import. The output of ``pubsub get`` can be used directly. @@ -103,7 +103,7 @@ example ------- -Import a node backup which has previously been saved using ``li blog get -M -1 -n +import a node backup which has previously been saved using ``li blog get -M -1 -n some_node > some_node_backup.xml``:: $ li pubsub node import -n some_node ~/some_node_backup.xml
--- a/sat/bridge/bridge_constructor/base_constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/base_constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -74,7 +74,7 @@ function[option] = value return function - def getDefault(self, name): + def get_default(self, name): """Return default values of a function in a dict @param name: Name of the function to get @return: dict, each key is the integer param number (no key if no default value)""" @@ -106,7 +106,7 @@ flags.append(option) return flags - def getArgumentsDoc(self, name): + def get_arguments_doc(self, name): """Return documentation of arguments @param name: Name of the function to get @return: dict, each key is the integer param number (no key if no argument doc), value is a tuple (name, doc)""" @@ -131,7 +131,7 @@ doc_dict[idx] = (value_match.group(1), value_match.group(2)) return doc_dict - def getDoc(self, name): + def get_doc(self, name): """Return documentation of the method @param name: Name of the function to get @return: string documentation, or None""" @@ -139,7 +139,7 @@ return self.bridge_template.get(name, "doc") return None - def argumentsParser(self, signature): + def arguments_parser(self, signature): """Generator which return individual arguments signatures from a global signature""" start = 0 i = 0 @@ -176,19 +176,19 @@ yield signature[start:i] start = i - def getArguments(self, signature, name=None, default=None, unicode_protect=False): + def get_arguments(self, signature, name=None, default=None, unicode_protect=False): """Return arguments to user given a signature @param signature: signature in the short form (using s,a,i,b etc) - @param name: dictionary of arguments name like given by getArgumentsDoc - @param default: dictionary of default values, like given by getDefault + @param name: dictionary of arguments name like given by get_arguments_doc + @param default: dictionary of default values, like given by get_default @param unicode_protect: activate unicode protection on strings (return strings as unicode(str)) @return (str): arguments that correspond to a signature (e.g.: "sss" return "arg1, arg2, arg3") """ idx = 0 attr_string = [] - for arg in self.argumentsParser(signature): + for arg in self.arguments_parser(signature): attr_string.append( ( "str(%(name)s)%(default)s" @@ -206,7 +206,7 @@ return ", ".join(attr_string) - def getTemplatePath(self, template_file): + def get_template_path(self, template_file): """return template path corresponding to file name @param template_file(str): name of template file @@ -232,12 +232,12 @@ def generate(self, side): """generate bridge - call generateCoreSide or generateFrontendSide if they exists + call generate_core_side or generateFrontendSide if they exists else call generic self._generate method """ try: if side == "core": - method = self.generateCoreSide + method = self.generate_core_side elif side == "frontend": if not self.FRONTEND_ACTIVATE: print("This constructor only handle core, please use core side") @@ -272,8 +272,8 @@ for section in sections: function = self.getValues(section) print(("Adding %s %s" % (section, function["type"]))) - default = self.getDefault(section) - arg_doc = self.getArgumentsDoc(section) + default = self.get_default(section) + arg_doc = self.get_arguments_doc(section) async_ = "async" in self.getFlags(section) completion = { "sig_in": function["sig_in"] or "", @@ -281,10 +281,10 @@ "category": "plugin" if function["category"] == "plugin" else "core", "name": section, # arguments with default values - "args": 
self.getArguments( + "args": self.get_arguments( function["sig_in"], name=arg_doc, default=default ), - "args_no_default": self.getArguments(function["sig_in"], name=arg_doc), + "args_no_default": self.get_arguments(function["sig_in"], name=arg_doc), } extend_method = getattr( @@ -305,7 +305,7 @@ for env, v in os.environ.items() if env.startswith(C.ENV_OVERRIDE) } - template_path = self.getTemplatePath(TEMPLATE) + template_path = self.get_template_path(TEMPLATE) try: with open(template_path) as template: for line in template: @@ -332,9 +332,9 @@ sys.exit(1) # now we write to final file - self.finalWrite(DEST, bridge) + self.final_write(DEST, bridge) - def finalWrite(self, filename, file_buf): + def final_write(self, filename, file_buf): """Write the final generated file in [dest dir]/filename @param filename: name of the file to generate
--- a/sat/bridge/bridge_constructor/bridge_constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/bridge_constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -32,7 +32,7 @@ class BridgeConstructor(object): - def importConstructors(self): + def import_constructors(self): constructors_dir = os.path.dirname(constructors.__file__) self.protocoles = {} for dir_ in os.listdir(constructors_dir): @@ -120,7 +120,7 @@ return parser.parse_args() def go(self): - self.importConstructors() + self.import_constructors() args = self.parse_args() template_parser = Parser() try:
--- a/sat/bridge/bridge_constructor/bridge_template.ini Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/bridge_template.ini Sat Apr 08 13:54:42 2023 +0200 @@ -20,7 +20,7 @@ doc=Connection is finished or lost doc_param_0=%(doc_profile)s -[newContact] +[contact_new] type=signal category=core sig_in=sa{ss}ass @@ -34,7 +34,7 @@ doc_param_2=groups: Roster's groups where the contact is doc_param_3=%(doc_profile)s -[messageNew] +[message_new] type=signal category=core sig_in=sdssa{ss}a{ss}sss @@ -54,7 +54,7 @@ - info_type: subtype for info messages doc_param_8=%(doc_profile)s -[messageEncryptionStarted] +[message_encryption_started] type=signal category=core sig_in=sss @@ -68,7 +68,7 @@ - directed_devices: list or resource where session is encrypted doc_param_2=%(doc_profile_key)s -[messageEncryptionStopped] +[message_encryption_stopped] type=signal category=core sig_in=sa{ss}s @@ -79,7 +79,7 @@ - namespace: namespace of the encryption plugin doc_param_2=%(doc_profile_key)s -[presenceUpdate] +[presence_update] type=signal category=core sig_in=ssia{ss}s @@ -99,7 +99,7 @@ doc_param_1=entity_jid: JID from which the subscription is coming doc_param_2=%(doc_profile)s -[paramUpdate] +[param_update] type=signal category=core sig_in=ssss @@ -109,7 +109,7 @@ doc_param_2=category: Category of the updated parameter doc_param_3=%(doc_profile)s -[contactDeleted] +[contact_deleted] type=signal category=core sig_in=ss @@ -117,7 +117,7 @@ doc_param_0=entity_jid: JID of the contact removed from roster doc_param_1=%(doc_profile)s -[actionNew] +[action_new] type=signal category=core sig_in=a{ss}sis @@ -137,7 +137,7 @@ doc_param_2=%(doc_security_limit)s doc_param_3=%(doc_profile)s -[entityDataUpdated] +[entity_data_updated] type=signal category=core sig_in=ssss @@ -147,7 +147,7 @@ doc_param_2=value: New value doc_param_3=%(doc_profile)s -[progressStarted] +[progress_started] type=signal category=core sig_in=sa{ss}s @@ -160,7 +160,7 @@ C.META_TYPE_FILE: file transfer doc_param_2=%(doc_profile)s -[progressFinished] +[progress_finished] type=signal category=core sig_in=sa{ss}s @@ -170,11 +170,11 @@ - hash: value of the computed hash - hash_algo: alrorithm used to compute hash - hash_verified: C.BOOL_TRUE if hash is verified and OK - C.BOOL_FALSE if hash was not received ([progressError] will be used if there is a mismatch) + C.BOOL_FALSE if hash was not received ([progress_error] will be used if there is a mismatch) - url: url linked to the progression (e.g. 
download url after a file upload) doc_param_2=%(doc_profile)s -[progressError] +[progress_error] type=signal category=core sig_in=sss @@ -194,7 +194,7 @@ ;methods -[getReady] +[ready_get] async= type=method category=core @@ -202,14 +202,14 @@ sig_out= doc=Return when backend is initialised -[getVersion] +[version_get] type=method category=core sig_in= sig_out=s doc=Get "Salut à Toi" full version -[getFeatures] +[features_get] type=method category=core sig_in=s @@ -221,7 +221,7 @@ plugin import name is used as key, data is an other dict managed by the plugin async= -[profileNameGet] +[profile_name_get] type=method category=core sig_in=s @@ -231,7 +231,7 @@ doc_param_0=%(doc_profile_key)s doc_return=Real profile name -[profilesListGet] +[profiles_list_get] type=method category=core sig_in=bb @@ -242,7 +242,7 @@ doc_param_1=components: get components profiles doc=Get list of profiles -[profileSetDefault] +[profile_set_default] type=method category=core sig_in=s @@ -250,7 +250,7 @@ doc_param_0=%(doc_profile)s doc=Set default profile -[getEntityData] +[entity_data_get] type=method category=core sig_in=sass @@ -262,7 +262,7 @@ doc_return=dictionary of asked key, if key doesn't exist, the resulting dictionary will not have the key -[getEntitiesData] +[entities_data_get] type=method category=core sig_in=asass @@ -275,7 +275,7 @@ values are serialised if key doesn't exist for a jid, the resulting dictionary will not have it -[profileCreate] +[profile_create] async= type=method category=core @@ -293,7 +293,7 @@ - CancelError: profile creation canceled - NotFound: component entry point is not available -[asyncDeleteProfile] +[profile_delete_async] async= type=method category=core @@ -325,7 +325,7 @@ - False if the XMPP connection has been initiated (it may still fail) - failure if the profile authentication failed -[profileStartSession] +[profile_start_session] async= type=method category=core @@ -340,7 +340,7 @@ - True if the profile session was already started - False else -[profileIsSessionStarted] +[profile_is_session_started] type=method category=core sig_in=s @@ -359,7 +359,7 @@ doc=Disconnect a profile doc_param_0=%(doc_profile_key)s -[isConnected] +[is_connected] type=method category=core sig_in=s @@ -368,7 +368,7 @@ doc=Tell if a profile is connected doc_param_0=%(doc_profile_key)s -[contactGet] +[contact_get] async= type=method category=core @@ -378,10 +378,10 @@ doc=Return informations in roster about a contact doc_param_1=%(doc_profile_key)s doc_return=tuple with the following values: - - list of attributes as in [newContact] + - list of attributes as in [contact_new] - groups where the contact is -[getContacts] +[contacts_get] async= type=method category=core @@ -392,10 +392,10 @@ doc_param_0=%(doc_profile_key)s doc_return=array of tuples with the following values: - JID of the contact - - list of attributes as in [newContact] + - list of attributes as in [contact_new] - groups where the contact is -[getContactsFromGroup] +[contacts_get_from_group] type=method category=core sig_in=ss @@ -406,7 +406,7 @@ doc_param_1=%(doc_profile_key)s doc_return=array of jids -[getMainResource] +[main_resource_get] type=method category=core sig_in=ss @@ -417,7 +417,7 @@ doc_param_1=%(doc_profile_key)s doc_return=the resource connected of the contact with highest priority, or "" -[getPresenceStatuses] +[presence_statuses_get] type=method category=core sig_in=s @@ -426,9 +426,9 @@ doc=Return presence information of all contacts doc_param_0=%(doc_profile_key)s doc_return=Dict of presence with bare JID of 
contact as key, and value as follow: - A dict where key is the resource and the value is a tuple with (show, priority, statuses) as for [presenceUpdate] + A dict where key is the resource and the value is a tuple with (show, priority, statuses) as for [presence_update] -[getWaitingSub] +[sub_waiting_get] type=method category=core sig_in=s @@ -438,7 +438,7 @@ doc_param_0=%(doc_profile_key)s doc_return=Dict where contact JID is the key, and value is the subscription type -[messageSend] +[message_send] async= type=method category=core @@ -458,7 +458,7 @@ doc_param_4=extra: (serialised) optional data that can be used by a plugin to build more specific messages doc_param_5=%(doc_profile_key)s -[messageEncryptionStart] +[message_encryption_start] async= type=method category=core @@ -474,7 +474,7 @@ else a ConflictError will be raised doc_param_3=%(doc_profile_key)s -[messageEncryptionStop] +[message_encryption_stop] async= type=method category=core @@ -484,7 +484,7 @@ doc_param_0=to_jid: JID of the recipient (full jid if encryption must be stopped for one device only) doc_param_1=%(doc_profile_key)s -[messageEncryptionGet] +[message_encryption_get] type=method category=core sig_in=ss @@ -499,21 +499,21 @@ following key can be present if suitable: - directed_devices: list or resource where session is encrypted -[encryptionNamespaceGet] +[encryption_namespace_get] type=method category=core sig_in=s sig_out=s doc=Get algorithm namespace from its name -[encryptionPluginsGet] +[encryption_plugins_get] type=method category=core sig_in= sig_out=s doc=Retrieve registered plugins for encryption -[encryptionTrustUIGet] +[encryption_trust_ui_get] async= type=method category=core @@ -525,7 +525,7 @@ doc_param_2=%(doc_profile_key)s doc_return=(XMLUI) UI of the trust management -[setPresence] +[presence_set] type=method category=core sig_in=ssa{ss}s @@ -536,8 +536,8 @@ param_3_default="@DEFAULT@" doc=Set presence information for the profile doc_param_0=to_jid: the JID to who we send the presence data (emtpy string for broadcast) -doc_param_1=show: as for [presenceUpdate] -doc_param_2=statuses: as for [presenceUpdate] +doc_param_1=show: as for [presence_update] +doc_param_2=statuses: as for [presence_update] doc_param_3=%(doc_profile_key)s [subscription] @@ -551,7 +551,7 @@ doc_param_1=entity: as for [subscribe] doc_param_2=%(doc_profile_key)s -[getConfig] +[config_get] type=method category=core sig_in=ss @@ -560,7 +560,7 @@ doc_param_0=section: section of the configuration file (empty string for DEFAULT) doc_param_1=name: name of the option -[setParam] +[param_set] type=method category=core sig_in=sssis @@ -574,7 +574,7 @@ doc_param_3=%(doc_security_limit)s doc_param_4=%(doc_profile_key)s -[getParamA] +[param_get_a] type=method category=core sig_in=ssss @@ -582,12 +582,12 @@ param_2_default="value" param_3_default="@DEFAULT@" doc=Helper method to get a parameter's attribute *when profile is connected* -doc_param_0=name: as for [setParam] -doc_param_1=category: as for [setParam] +doc_param_0=name: as for [param_set] +doc_param_1=category: as for [param_set] doc_param_2=attribute: Name of the attribute doc_param_3=%(doc_profile_key)s -[privateDataGet] +[private_data_get] async= type=method category=core @@ -599,7 +599,7 @@ doc_param_2=%(doc_profile_key)s doc_return=serialised data -[privateDataSet] +[private_data_set] async= type=method category=core @@ -611,7 +611,7 @@ doc_param_2=data: serialised data doc_param_3=%(doc_profile_key)s -[privateDataDelete] +[private_data_delete] async= type=method category=core 
@@ -622,7 +622,7 @@ doc_param_1=key: key of the data to delete doc_param_3=%(doc_profile_key)s -[asyncGetParamA] +[param_get_a_async] async= type=method category=core @@ -632,13 +632,13 @@ param_3_default=-1 param_4_default="@DEFAULT@" doc=Helper method to get a parameter's attribute -doc_param_0=name: as for [setParam] -doc_param_1=category: as for [setParam] +doc_param_0=name: as for [param_set] +doc_param_1=category: as for [param_set] doc_param_2=attribute: Name of the attribute doc_param_3=%(doc_security_limit)s doc_param_4=%(doc_profile_key)s -[asyncGetParamsValuesFromCategory] +[params_values_from_category_get_async] async= type=method category=code @@ -649,13 +649,13 @@ param_3_default="" param_4_default="@DEFAULT@" doc=Get "attribute" for all params of a category -doc_param_0=category: as for [setParam] +doc_param_0=category: as for [param_set] doc_param_1=%(doc_security_limit)s doc_param_2=app: name of the frontend requesting the parameters, or '' to get all parameters doc_param_3=extra: extra options/filters doc_param_4=%(doc_profile_key)s -[getParamsUI] +[param_ui_get] async= type=method category=core @@ -671,7 +671,7 @@ doc_param_2=extra: extra options/filters doc_param_3=%(doc_profile_key)s -[getParamsCategories] +[params_categories_get] type=method category=core sig_in= @@ -679,7 +679,7 @@ doc=Get all categories currently existing in parameters doc_return=list of categories -[paramsRegisterApp] +[params_register_app] type=method category=core sig_in=sis @@ -691,7 +691,7 @@ doc_param_1=%(doc_security_limit)s doc_param_2=app: name of the frontend registering the parameters -[historyGet] +[history_get] async= type=method category=core @@ -712,9 +712,9 @@ - not_types: type must not be one of those, values are separated by spaces - before_uid: check only message received before message with given uid doc_param_5=%(doc_profile)s -doc_return=Ordered list (by timestamp) of data as in [messageNew] (without final profile) +doc_return=Ordered list (by timestamp) of data as in [message_new] (without final profile) -[addContact] +[contact_add] type=method category=core sig_in=ss @@ -724,7 +724,7 @@ doc_param_0=entity_jid: JID to add to roster doc_param_1=%(doc_profile_key)s -[updateContact] +[contact_update] type=method category=core sig_in=ssass @@ -736,7 +736,7 @@ doc_param_2=groups: list of group where the entity is doc_param_3=%(doc_profile_key)s -[delContact] +[contact_del] async= type=method category=core @@ -747,7 +747,7 @@ doc_param_0=entity_jid: JID to remove from roster doc_param_1=%(doc_profile_key)s -[rosterResync] +[roster_resync] async= type=method category=core @@ -757,7 +757,7 @@ doc=Do a full resynchronisation of roster with server doc_param_0=%(doc_profile_key)s -[launchAction] +[action_launch] async= type=method category=core @@ -771,7 +771,7 @@ doc_return=dict where key can be: - xmlui: a XMLUI need to be displayed -[actionsGet] +[actions_get] type=method category=core sig_in=s @@ -779,9 +779,9 @@ param_0_default="@DEFAULT@" doc=Get all not yet answered actions doc_param_0=%(doc_profile_key)s -doc_return=list of data as for [actionNew] (without the profile) +doc_return=list of data as for [action_new] (without the profile) -[progressGet] +[progress_get] type=method category=core sig_in=ss @@ -794,7 +794,7 @@ - size: end position (optional if not known) other metadata may be present -[progressGetAllMetadata] +[progress_get_all_metadata] type=method category=core sig_in=s @@ -803,9 +803,9 @@ doc_param_0=%(doc_profile)s or C.PROF_KEY_ALL for all profiles doc_return= a 
dict which map profile to progress_dict progress_dict map progress_id to progress_metadata - progress_metadata is the same dict as sent by [progressStarted] + progress_metadata is the same dict as sent by [progress_started] -[progressGetAll] +[progress_get_all] type=method category=core sig_in=s @@ -814,9 +814,9 @@ doc_param_0=%(doc_profile)s or C.PROF_KEY_ALL for all profiles doc_return= a dict which map profile to progress_dict progress_dict map progress_id to progress_data - progress_data is the same dict as returned by [progressGet] + progress_data is the same dict as returned by [progress_get] -[menusGet] +[menus_get] type=method category=core sig_in=si @@ -832,7 +832,7 @@ - menu_path_i18n: translated path of the menu - extra: extra data, like icon name -[menuLaunch] +[menu_launch] async= type=method category=core @@ -847,7 +847,7 @@ doc_return=dict where key can be: - xmlui: a XMLUI need to be displayed -[menuHelpGet] +[menu_help_get] type=method category=core sig_in=ss @@ -858,7 +858,7 @@ doc_param_1=language: language in which the menu should be translated (empty string for default) doc_return=Translated help string -[discoInfos] +[disco_infos] async= type=method category=core @@ -884,7 +884,7 @@ * desc - list of values -[discoItems] +[disco_items] async= type=method category=core @@ -900,7 +900,7 @@ doc_param_3=%(doc_profile_key)s doc_return=array of tuple (entity, node identifier, name) -[discoFindByFeatures] +[disco_find_by_features] async= type=method category=core @@ -927,7 +927,7 @@ - own entities (i.e. entities linked to profile's jid) - roster entities -[saveParamsTemplate] +[params_template_save] type=method category=core sig_in=s @@ -936,7 +936,7 @@ doc_param_0=filename: output filename doc_return=boolean (True in case of success) -[loadParamsTemplate] +[params_template_load] type=method category=core sig_in=s @@ -945,7 +945,7 @@ doc_param_0=filename: input filename doc_return=boolean (True in case of success) -[sessionInfosGet] +[session_infos_get] async= type=method category=core @@ -957,7 +957,7 @@ jid: current full jid started: date of creation of the session (Epoch time) -[devicesInfosGet] +[devices_infos_get] async= type=method category=core @@ -970,7 +970,7 @@ doc_return=list of known devices, where each item is a dict with a least following keys: resource: device resource -[namespacesGet] +[namespaces_get] type=method category=core sig_in= @@ -978,7 +978,7 @@ doc=Get a dict to short name => whole namespaces doc_return=namespaces mapping -[imageCheck] +[image_check] type=method category=core sig_in=s @@ -986,7 +986,7 @@ doc=Analyze an image a return a report doc_return=serialized report -[imageResize] +[image_resize] async= type=method category=core @@ -999,7 +999,7 @@ doc_return=path of the new image with desired size the image must be deleted once not needed anymore -[imageGeneratePreview] +[image_generate_preview] async= type=method category=core @@ -1010,7 +1010,7 @@ doc_param_1=%(doc_profile_key)s doc_return=path to the preview in cache -[imageConvert] +[image_convert] async= type=method category=core
--- a/sat/bridge/bridge_constructor/constructors/dbus-xml/constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/dbus-xml/constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -38,9 +38,9 @@ "a{sa{s(sia{ss})}}": "PresenceStatusT", } - def generateCoreSide(self): + def generate_core_side(self): try: - doc = minidom.parse(self.getTemplatePath(self.template)) + doc = minidom.parse(self.get_template_path(self.template)) interface_elt = doc.getElementsByTagName("interface")[0] except IOError: print("Can't access template") @@ -60,8 +60,8 @@ new_elt.setAttribute("name", section) idx = 0 - args_doc = self.getArgumentsDoc(section) - for arg in self.argumentsParser(function["sig_in"] or ""): + args_doc = self.get_arguments_doc(section) + for arg in self.arguments_parser(function["sig_in"] or ""): arg_elt = doc.createElement("arg") arg_elt.setAttribute( "name", args_doc[idx][0] if idx in args_doc else "arg_%i" % idx @@ -99,4 +99,4 @@ interface_elt.appendChild(new_elt) # now we write to final file - self.finalWrite(self.core_dest, [doc.toprettyxml()]) + self.final_write(self.core_dest, [doc.toprettyxml()])
--- a/sat/bridge/bridge_constructor/constructors/dbus/constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/dbus/constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -81,7 +81,7 @@ "debug": "" if not self.args.debug else 'log.debug ("%s")\n%s' % (completion["name"], 8 * " "), - "args_result": self.getArguments(function["sig_in"], name=arg_doc), + "args_result": self.get_arguments(function["sig_in"], name=arg_doc), "async_args": "callback=None, errback=None", "async_comma": ", " if function["sig_in"] else "", "error_handler": """if callback is None:
--- a/sat/bridge/bridge_constructor/constructors/dbus/dbus_core_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/dbus/dbus_core_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -30,8 +30,8 @@ log = getLogger(__name__) # Interface prefix -const_INT_PREFIX = config.getConfig( - config.parseMainConf(), +const_INT_PREFIX = config.config_get( + config.parse_main_conf(), "", "bridge_dbus_int_prefix", "org.libervia.Libervia") @@ -118,13 +118,13 @@ ##METHODS_PART## -class Bridge: +class bridge: def __init__(self): log.info("Init DBus...") self._obj = DBusObject(const_OBJ_PATH) - async def postInit(self): + async def post_init(self): try: conn = await client.connect(reactor) except error.DBusException as e: @@ -145,13 +145,13 @@ log.debug(f"registering DBus bridge method [{name}]") self._obj.register_method(name, callback) - def emitSignal(self, name, *args): + def emit_signal(self, name, *args): self._obj.emitSignal(name, *args) - def addMethod( + def add_method( self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={} ): - """Dynamically add a method to D-Bus Bridge""" + """Dynamically add a method to D-Bus bridge""" # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug(f"Adding method {name!r} to D-Bus bridge") self._obj.plugin_iface.addMethod( @@ -164,8 +164,8 @@ setattr(self._obj, f"dbus_{name}", MethodType(caller, self._obj)) self.register_method(name, method) - def addSignal(self, name, int_suffix, signature, doc={}): - """Dynamically add a signal to D-Bus Bridge""" + def add_signal(self, name, int_suffix, signature, doc={}): + """Dynamically add a signal to D-Bus bridge""" log.debug(f"Adding signal {name!r} to D-Bus bridge") self._obj.plugin_iface.addSignal(Signal(name, signature)) - setattr(Bridge, name, partialmethod(Bridge.emitSignal, name)) + setattr(bridge, name, partialmethod(bridge.emit_signal, name))
--- a/sat/bridge/bridge_constructor/constructors/dbus/dbus_frontend_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/dbus/dbus_frontend_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -32,8 +32,8 @@ # Interface prefix -const_INT_PREFIX = config.getConfig( - config.parseMainConf(), +const_INT_PREFIX = config.config_get( + config.parse_main_conf(), "", "bridge_dbus_int_prefix", "org.libervia.Libervia") @@ -66,9 +66,9 @@ return BridgeException(name, message, condition) -class Bridge: +class bridge: - def bridgeConnect(self, callback, errback): + def bridge_connect(self, callback, errback): try: self.sessions_bus = dbus.SessionBus() self.db_object = self.sessions_bus.get_object(const_INT_PREFIX, @@ -105,7 +105,7 @@ except AttributeError: # The attribute is not found, we try the plugin proxy to find the requested method - def getPluginMethod(*args, **kwargs): + def get_plugin_method(*args, **kwargs): # We first check if we have an async call. We detect this in two ways: # - if we have the 'callback' and 'errback' keyword arguments # - or if the last two arguments are callable @@ -156,11 +156,11 @@ return self.db_plugin_iface.get_dbus_method(name)(*args, **kwargs) raise e - return getPluginMethod + return get_plugin_method ##METHODS_PART## -class AIOBridge(Bridge): +class AIOBridge(bridge): def register_signal(self, functionName, handler, iface="core"): loop = asyncio.get_running_loop() @@ -173,7 +173,7 @@ return object.__getattribute__(self, name) except AttributeError: # The attribute is not found, we try the plugin proxy to find the requested method - def getPluginMethod(*args, **kwargs): + def get_plugin_method(*args, **kwargs): loop = asyncio.get_running_loop() fut = loop.create_future() method = getattr(self.db_plugin_iface, name) @@ -191,7 +191,7 @@ ) except ValueError as e: if e.args[0].startswith("Unable to guess signature"): - # same hack as for Bridge.__getattribute__ + # same hack as for bridge.__getattribute__ log.warning("using hack to work around inspection issue") proxy = self.db_plugin_iface.proxy_object IN_PROGRESS = proxy.INTROSPECT_STATE_INTROSPECT_IN_PROGRESS @@ -209,12 +209,12 @@ raise e return fut - return getPluginMethod + return get_plugin_method - def bridgeConnect(self): + def bridge_connect(self): loop = asyncio.get_running_loop() fut = loop.create_future() - super().bridgeConnect( + super().bridge_connect( callback=lambda: loop.call_soon_threadsafe(fut.set_result, None), errback=lambda e: loop.call_soon_threadsafe(fut.set_exception, e) )
--- a/sat/bridge/bridge_constructor/constructors/embedded/constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/embedded/constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -51,7 +51,7 @@ "debug": "" if not self.args.debug else 'log.debug ("%s")\n%s' % (completion["name"], 8 * " "), - "args_result": self.getArguments(function["sig_in"], name=arg_doc), + "args_result": self.get_arguments(function["sig_in"], name=arg_doc), "args_comma": ", " if function["sig_in"] else "", } ) @@ -96,5 +96,5 @@ def core_completion_signal(self, completion, function, default, arg_doc, async_): completion.update( - {"args_result": self.getArguments(function["sig_in"], name=arg_doc)} + {"args_result": self.get_arguments(function["sig_in"], name=arg_doc)} )
--- a/sat/bridge/bridge_constructor/constructors/embedded/embedded_frontend_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/embedded/embedded_frontend_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -17,4 +17,4 @@ # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see <http://www.gnu.org/licenses/>. -from sat.bridge.embedded import Bridge +from sat.bridge.embedded import bridge
--- a/sat/bridge/bridge_constructor/constructors/embedded/embedded_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/embedded/embedded_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,7 +29,7 @@ self._methods_cbs = {} self._signals_cbs = {"core": {}, "plugin": {}} - def bridgeConnect(self, callback, errback): + def bridge_connect(self, callback, errback): callback() def register_method(self, name, callback): @@ -85,7 +85,7 @@ else: cb(*args, **kwargs) - def addMethod(self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={}): + def add_method(self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={}): # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug("Adding method [{}] to embedded bridge".format(name)) self.register_method(name, method) @@ -97,7 +97,7 @@ ), ) - def addSignal(self, name, int_suffix, signature, doc={}): + def add_signal(self, name, int_suffix, signature, doc={}): setattr( self.__class__, name, @@ -116,7 +116,7 @@ bridge = None -def Bridge(): +def bridge(): global bridge if bridge is None: bridge = _Bridge()
--- a/sat/bridge/bridge_constructor/constructors/mediawiki/constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/mediawiki/constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,7 +29,7 @@ self.core_template = "mediawiki_template.tpl" self.core_dest = "mediawiki.wiki" - def _addTextDecorations(self, text): + def _add_text_decorations(self, text): """Add text decorations like coloration or shortcuts""" def anchor_link(match): @@ -42,43 +42,43 @@ return re.sub(r"\[(\w+)\]", anchor_link, text) - def _wikiParameter(self, name, sig_in): + def _wiki_parameter(self, name, sig_in): """Format parameters with the wiki syntax @param name: name of the function @param sig_in: signature in @return: string of the formated parameters""" - arg_doc = self.getArgumentsDoc(name) - arg_default = self.getDefault(name) - args_str = self.getArguments(sig_in) + arg_doc = self.get_arguments_doc(name) + arg_default = self.get_default(name) + args_str = self.get_arguments(sig_in) args = args_str.split(", ") if args_str else [] # ugly but it works :) wiki = [] for i in range(len(args)): if i in arg_doc: name, doc = arg_doc[i] doc = "\n:".join(doc.rstrip("\n").split("\n")) - wiki.append("; %s: %s" % (name, self._addTextDecorations(doc))) + wiki.append("; %s: %s" % (name, self._add_text_decorations(doc))) else: wiki.append("; arg_%d: " % i) if i in arg_default: wiki.append(":''DEFAULT: %s''" % arg_default[i]) return "\n".join(wiki) - def _wikiReturn(self, name): + def _wiki_return(self, name): """Format return doc with the wiki syntax @param name: name of the function """ - arg_doc = self.getArgumentsDoc(name) + arg_doc = self.get_arguments_doc(name) wiki = [] if "return" in arg_doc: wiki.append("\n|-\n! scope=row | return value\n|") wiki.append( "<br />\n".join( - self._addTextDecorations(arg_doc["return"]).rstrip("\n").split("\n") + self._add_text_decorations(arg_doc["return"]).rstrip("\n").split("\n") ) ) return "\n".join(wiki) - def generateCoreSide(self): + def generate_core_side(self): signals_part = [] methods_part = [] sections = self.bridge_template.sections() @@ -114,13 +114,13 @@ "sig_out": function["sig_out"] or "", "category": function["category"], "name": section, - "doc": self.getDoc(section) or "FIXME: No description available", + "doc": self.get_doc(section) or "FIXME: No description available", "async": async_msg if "async" in self.getFlags(section) else "", "deprecated": deprecated_msg if "deprecated" in self.getFlags(section) else "", - "parameters": self._wikiParameter(section, function["sig_in"]), - "return": self._wikiReturn(section) + "parameters": self._wiki_parameter(section, function["sig_in"]), + "return": self._wiki_return(section) if function["type"] == "method" else "", } @@ -148,7 +148,7 @@ # at this point, signals_part, and methods_part should be filled, # we just have to place them in the right part of the template core_bridge = [] - template_path = self.getTemplatePath(self.core_template) + template_path = self.get_template_path(self.core_template) try: with open(template_path) as core_template: for line in core_template: @@ -165,4 +165,4 @@ sys.exit(1) # now we write to final file - self.finalWrite(self.core_dest, core_bridge) + self.final_write(self.core_dest, core_bridge)
--- a/sat/bridge/bridge_constructor/constructors/pb/constructor.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/pb/constructor.py Sat Apr 08 13:54:42 2023 +0200 @@ -26,7 +26,7 @@ CORE_FORMATS = { "signals": """\ def {name}(self, {args}): - {debug}self.sendSignal("{name}", {args_no_def})\n""" + {debug}self.send_signal("{name}", {args_no_def})\n""" } FRONTEND_TEMPLATE = "pb_frontend_template.py" @@ -49,7 +49,7 @@ } def core_completion_signal(self, completion, function, default, arg_doc, async_): - completion["args_no_def"] = self.getArguments(function["sig_in"], name=arg_doc) + completion["args_no_def"] = self.get_arguments(function["sig_in"], name=arg_doc) completion["debug"] = ( "" if not self.args.debug @@ -60,7 +60,7 @@ completion.update( { "args_comma": ", " if function["sig_in"] else "", - "args_no_def": self.getArguments(function["sig_in"], name=arg_doc), + "args_no_def": self.get_arguments(function["sig_in"], name=arg_doc), "callback": "callback" if function["sig_out"] else "lambda __: callback()",
--- a/sat/bridge/bridge_constructor/constructors/pb/pb_core_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/pb/pb_core_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,17 +55,17 @@ def __init__(self): self.signals_handlers = [] - def remote_initBridge(self, signals_handler): + def remote_init_bridge(self, signals_handler): self.signals_handlers.append(HandlerWrapper(signals_handler)) log.info("registered signal handler") - def sendSignalEb(self, failure_, signal_name): + def send_signal_eb(self, failure_, signal_name): if not failure_.check(pb.PBConnectionLost): log.error( f"Error while sending signal {signal_name}: {failure_}", ) - def sendSignal(self, name, args, kwargs): + def send_signal(self, name, args, kwargs): to_remove = [] for wrapper in self.signals_handlers: handler = wrapper.handler @@ -74,13 +74,13 @@ except pb.DeadReferenceError: to_remove.append(wrapper) else: - d.addErrback(self.sendSignalEb, name) + d.addErrback(self.send_signal_eb, name) if to_remove: for wrapper in to_remove: log.debug("Removing signal handler for dead frontend") self.signals_handlers.remove(wrapper) - def _bridgeDeactivateSignals(self): + def _bridge_deactivate_signals(self): if hasattr(self, "signals_paused"): log.warning("bridge signals already deactivated") if self.signals_handler: @@ -90,7 +90,7 @@ self.signals_handlers = [] log.debug("bridge signals have been deactivated") - def _bridgeReactivateSignals(self): + def _bridge_reactivate_signals(self): try: self.signals_handlers = self.signals_paused except AttributeError: @@ -102,31 +102,31 @@ ##METHODS_PART## -class Bridge(object): +class bridge(object): def __init__(self): log.info("Init Perspective Broker...") self.root = PBRoot() - conf = config.parseMainConf() - getConf = partial(config.getConf, conf, "bridge_pb", "") - conn_type = getConf("connection_type", "unix_socket") + conf = config.parse_main_conf() + get_conf = partial(config.get_conf, conf, "bridge_pb", "") + conn_type = get_conf("connection_type", "unix_socket") if conn_type == "unix_socket": - local_dir = Path(config.getConfig(conf, "", "local_dir")).resolve() + local_dir = Path(config.config_get(conf, "", "local_dir")).resolve() socket_path = local_dir / "bridge_pb" log.info(f"using UNIX Socket at {socket_path}") reactor.listenUNIX( str(socket_path), pb.PBServerFactory(self.root), mode=0o600 ) elif conn_type == "socket": - port = int(getConf("port", 8789)) + port = int(get_conf("port", 8789)) log.info(f"using TCP Socket at port {port}") reactor.listenTCP(port, pb.PBServerFactory(self.root)) else: raise ValueError(f"Unknown pb connection type: {conn_type!r}") - def sendSignal(self, name, *args, **kwargs): - self.root.sendSignal(name, args, kwargs) + def send_signal(self, name, *args, **kwargs): + self.root.send_signal(name, args, kwargs) - def remote_initBridge(self, signals_handler): + def remote_init_bridge(self, signals_handler): self.signals_handlers.append(signals_handler) log.info("registered signal handler") @@ -135,32 +135,32 @@ setattr(self.root, "remote_" + name, callback) # self.root.register_method(name, callback) - def addMethod( + def add_method( self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={} ): - """Dynamically add a method to PB Bridge""" + """Dynamically add a method to PB bridge""" # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug("Adding method {name} to PB bridge".format(name=name)) self.register_method(name, method) - def addSignal(self, name, int_suffix, 
signature, doc={}): + def add_signal(self, name, int_suffix, signature, doc={}): log.debug("Adding signal {name} to PB bridge".format(name=name)) setattr( - self, name, lambda *args, **kwargs: self.sendSignal(name, *args, **kwargs) + self, name, lambda *args, **kwargs: self.send_signal(name, *args, **kwargs) ) - def bridgeDeactivateSignals(self): + def bridge_deactivate_signals(self): """Stop sending signals to bridge Mainly used for mobile frontends, when the frontend is paused """ - self.root._bridgeDeactivateSignals() + self.root._bridge_deactivate_signals() - def bridgeReactivateSignals(self): + def bridge_reactivate_signals(self): """Send again signals to bridge - Should only be used after bridgeDeactivateSignals has been called + Should only be used after bridge_deactivate_signals has been called """ - self.root._bridgeReactivateSignals() + self.root._bridge_reactivate_signals() ##SIGNALS_PART##
--- a/sat/bridge/bridge_constructor/constructors/pb/pb_frontend_template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/bridge_constructor/constructors/pb/pb_frontend_template.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,7 +55,7 @@ setattr(self, method_name, handler) -class Bridge(object): +class bridge(object): def __init__(self): self.signals_handler = SignalsHandler() @@ -75,7 +75,7 @@ ) ) - def remoteCallback(self, result, callback): + def remote_callback(self, result, callback): """call callback with argument or None if result is not None not argument is used, @@ -112,11 +112,11 @@ callback = args.pop() d = self.root.callRemote(name, *args, **kwargs) if callback is not None: - d.addCallback(self.remoteCallback, callback) + d.addCallback(self.remote_callback, callback) if errback is not None: d.addErrback(errback) - def _initBridgeEb(self, failure_): + def _init_bridge_eb(self, failure_): log.error("Can't init bridge: {msg}".format(msg=failure_)) return failure_ @@ -127,28 +127,28 @@ """ self.root = root d = root.callRemote("initBridge", self.signals_handler) - d.addErrback(self._initBridgeEb) + d.addErrback(self._init_bridge_eb) return d - def getRootObjectEb(self, failure_): + def get_root_object_eb(self, failure_): """Call errback with appropriate bridge error""" if failure_.check(ConnectionRefusedError, ConnectError): raise exceptions.BridgeExceptionNoService else: raise failure_ - def bridgeConnect(self, callback, errback): + def bridge_connect(self, callback, errback): factory = pb.PBClientFactory() - conf = config.parseMainConf() - getConf = partial(config.getConf, conf, "bridge_pb", "") - conn_type = getConf("connection_type", "unix_socket") + conf = config.parse_main_conf() + get_conf = partial(config.get_conf, conf, "bridge_pb", "") + conn_type = get_conf("connection_type", "unix_socket") if conn_type == "unix_socket": - local_dir = Path(config.getConfig(conf, "", "local_dir")).resolve() + local_dir = Path(config.config_get(conf, "", "local_dir")).resolve() socket_path = local_dir / "bridge_pb" reactor.connectUNIX(str(socket_path), factory) elif conn_type == "socket": - host = getConf("host", "localhost") - port = int(getConf("port", 8789)) + host = get_conf("host", "localhost") + port = int(get_conf("port", 8789)) reactor.connectTCP(host, port, factory) else: raise ValueError(f"Unknown pb connection type: {conn_type!r}") @@ -156,7 +156,7 @@ d.addCallback(self._set_root) if callback is not None: d.addCallback(lambda __: callback()) - d.addErrback(self.getRootObjectEb) + d.addErrback(self.get_root_object_eb) if errback is not None: d.addErrback(lambda failure_: errback(failure_.value)) return d @@ -175,7 +175,7 @@ return super().register_signal(name, async_handler, iface) -class AIOBridge(Bridge): +class AIOBridge(bridge): def __init__(self): self.signals_handler = AIOSignalsHandler() @@ -192,8 +192,8 @@ d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - async def bridgeConnect(self): - d = super().bridgeConnect(callback=None, errback=None) + async def bridge_connect(self): + d = super().bridge_connect(callback=None, errback=None) return await d.asFuture(asyncio.get_event_loop()) ##ASYNC_METHODS_PART##
--- a/sat/bridge/dbus_bridge.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/dbus_bridge.py Sat Apr 08 13:54:42 2023 +0200 @@ -30,8 +30,8 @@ log = getLogger(__name__) # Interface prefix -const_INT_PREFIX = config.getConfig( - config.parseMainConf(), +const_INT_PREFIX = config.config_get( + config.parse_main_conf(), "", "bridge_dbus_int_prefix", "org.libervia.Libervia") @@ -88,87 +88,87 @@ core_iface = DBusInterface( const_INT_PREFIX + const_CORE_SUFFIX, - Method('actionsGet', arguments='s', returns='a(a{ss}si)'), - Method('addContact', arguments='ss', returns=''), - Method('asyncDeleteProfile', arguments='s', returns=''), - Method('asyncGetParamA', arguments='sssis', returns='s'), - Method('asyncGetParamsValuesFromCategory', arguments='sisss', returns='a{ss}'), + Method('action_launch', arguments='sa{ss}s', returns='a{ss}'), + Method('actions_get', arguments='s', returns='a(a{ss}si)'), + Method('config_get', arguments='ss', returns='s'), Method('connect', arguments='ssa{ss}', returns='b'), - Method('contactGet', arguments='ss', returns='(a{ss}as)'), - Method('delContact', arguments='ss', returns=''), - Method('devicesInfosGet', arguments='ss', returns='s'), - Method('discoFindByFeatures', arguments='asa(ss)bbbbbs', returns='(a{sa(sss)}a{sa(sss)}a{sa(sss)})'), - Method('discoInfos', arguments='ssbs', returns='(asa(sss)a{sa(a{ss}as)})'), - Method('discoItems', arguments='ssbs', returns='a(sss)'), + Method('contact_add', arguments='ss', returns=''), + Method('contact_del', arguments='ss', returns=''), + Method('contact_get', arguments='ss', returns='(a{ss}as)'), + Method('contact_update', arguments='ssass', returns=''), + Method('contacts_get', arguments='s', returns='a(sa{ss}as)'), + Method('contacts_get_from_group', arguments='ss', returns='as'), + Method('devices_infos_get', arguments='ss', returns='s'), + Method('disco_find_by_features', arguments='asa(ss)bbbbbs', returns='(a{sa(sss)}a{sa(sss)}a{sa(sss)})'), + Method('disco_infos', arguments='ssbs', returns='(asa(sss)a{sa(a{ss}as)})'), + Method('disco_items', arguments='ssbs', returns='a(sss)'), Method('disconnect', arguments='s', returns=''), - Method('encryptionNamespaceGet', arguments='s', returns='s'), - Method('encryptionPluginsGet', arguments='', returns='s'), - Method('encryptionTrustUIGet', arguments='sss', returns='s'), - Method('getConfig', arguments='ss', returns='s'), - Method('getContacts', arguments='s', returns='a(sa{ss}as)'), - Method('getContactsFromGroup', arguments='ss', returns='as'), - Method('getEntitiesData', arguments='asass', returns='a{sa{ss}}'), - Method('getEntityData', arguments='sass', returns='a{ss}'), - Method('getFeatures', arguments='s', returns='a{sa{ss}}'), - Method('getMainResource', arguments='ss', returns='s'), - Method('getParamA', arguments='ssss', returns='s'), - Method('getParamsCategories', arguments='', returns='as'), - Method('getParamsUI', arguments='isss', returns='s'), - Method('getPresenceStatuses', arguments='s', returns='a{sa{s(sia{ss})}}'), - Method('getReady', arguments='', returns=''), - Method('getVersion', arguments='', returns='s'), - Method('getWaitingSub', arguments='s', returns='a{ss}'), - Method('historyGet', arguments='ssiba{ss}s', returns='a(sdssa{ss}a{ss}ss)'), - Method('imageCheck', arguments='s', returns='s'), - Method('imageConvert', arguments='ssss', returns='s'), - Method('imageGeneratePreview', arguments='ss', returns='s'), - Method('imageResize', arguments='sii', returns='s'), - Method('isConnected', arguments='s', returns='b'), - Method('launchAction', 
arguments='sa{ss}s', returns='a{ss}'), - Method('loadParamsTemplate', arguments='s', returns='b'), - Method('menuHelpGet', arguments='ss', returns='s'), - Method('menuLaunch', arguments='sasa{ss}is', returns='a{ss}'), - Method('menusGet', arguments='si', returns='a(ssasasa{ss})'), - Method('messageEncryptionGet', arguments='ss', returns='s'), - Method('messageEncryptionStart', arguments='ssbs', returns=''), - Method('messageEncryptionStop', arguments='ss', returns=''), - Method('messageSend', arguments='sa{ss}a{ss}sss', returns=''), - Method('namespacesGet', arguments='', returns='a{ss}'), - Method('paramsRegisterApp', arguments='sis', returns=''), - Method('privateDataDelete', arguments='sss', returns=''), - Method('privateDataGet', arguments='sss', returns='s'), - Method('privateDataSet', arguments='ssss', returns=''), - Method('profileCreate', arguments='sss', returns=''), - Method('profileIsSessionStarted', arguments='s', returns='b'), - Method('profileNameGet', arguments='s', returns='s'), - Method('profileSetDefault', arguments='s', returns=''), - Method('profileStartSession', arguments='ss', returns='b'), - Method('profilesListGet', arguments='bb', returns='as'), - Method('progressGet', arguments='ss', returns='a{ss}'), - Method('progressGetAll', arguments='s', returns='a{sa{sa{ss}}}'), - Method('progressGetAllMetadata', arguments='s', returns='a{sa{sa{ss}}}'), - Method('rosterResync', arguments='s', returns=''), - Method('saveParamsTemplate', arguments='s', returns='b'), - Method('sessionInfosGet', arguments='s', returns='a{ss}'), - Method('setParam', arguments='sssis', returns=''), - Method('setPresence', arguments='ssa{ss}s', returns=''), + Method('encryption_namespace_get', arguments='s', returns='s'), + Method('encryption_plugins_get', arguments='', returns='s'), + Method('encryption_trust_ui_get', arguments='sss', returns='s'), + Method('entities_data_get', arguments='asass', returns='a{sa{ss}}'), + Method('entity_data_get', arguments='sass', returns='a{ss}'), + Method('features_get', arguments='s', returns='a{sa{ss}}'), + Method('history_get', arguments='ssiba{ss}s', returns='a(sdssa{ss}a{ss}ss)'), + Method('image_check', arguments='s', returns='s'), + Method('image_convert', arguments='ssss', returns='s'), + Method('image_generate_preview', arguments='ss', returns='s'), + Method('image_resize', arguments='sii', returns='s'), + Method('is_connected', arguments='s', returns='b'), + Method('main_resource_get', arguments='ss', returns='s'), + Method('menu_help_get', arguments='ss', returns='s'), + Method('menu_launch', arguments='sasa{ss}is', returns='a{ss}'), + Method('menus_get', arguments='si', returns='a(ssasasa{ss})'), + Method('message_encryption_get', arguments='ss', returns='s'), + Method('message_encryption_start', arguments='ssbs', returns=''), + Method('message_encryption_stop', arguments='ss', returns=''), + Method('message_send', arguments='sa{ss}a{ss}sss', returns=''), + Method('namespaces_get', arguments='', returns='a{ss}'), + Method('param_get_a', arguments='ssss', returns='s'), + Method('param_get_a_async', arguments='sssis', returns='s'), + Method('param_set', arguments='sssis', returns=''), + Method('param_ui_get', arguments='isss', returns='s'), + Method('params_categories_get', arguments='', returns='as'), + Method('params_register_app', arguments='sis', returns=''), + Method('params_template_load', arguments='s', returns='b'), + Method('params_template_save', arguments='s', returns='b'), + Method('params_values_from_category_get_async', arguments='sisss', 
returns='a{ss}'), + Method('presence_set', arguments='ssa{ss}s', returns=''), + Method('presence_statuses_get', arguments='s', returns='a{sa{s(sia{ss})}}'), + Method('private_data_delete', arguments='sss', returns=''), + Method('private_data_get', arguments='sss', returns='s'), + Method('private_data_set', arguments='ssss', returns=''), + Method('profile_create', arguments='sss', returns=''), + Method('profile_delete_async', arguments='s', returns=''), + Method('profile_is_session_started', arguments='s', returns='b'), + Method('profile_name_get', arguments='s', returns='s'), + Method('profile_set_default', arguments='s', returns=''), + Method('profile_start_session', arguments='ss', returns='b'), + Method('profiles_list_get', arguments='bb', returns='as'), + Method('progress_get', arguments='ss', returns='a{ss}'), + Method('progress_get_all', arguments='s', returns='a{sa{sa{ss}}}'), + Method('progress_get_all_metadata', arguments='s', returns='a{sa{sa{ss}}}'), + Method('ready_get', arguments='', returns=''), + Method('roster_resync', arguments='s', returns=''), + Method('session_infos_get', arguments='s', returns='a{ss}'), + Method('sub_waiting_get', arguments='s', returns='a{ss}'), Method('subscription', arguments='sss', returns=''), - Method('updateContact', arguments='ssass', returns=''), + Method('version_get', arguments='', returns='s'), Signal('_debug', 'sa{ss}s'), - Signal('actionNew', 'a{ss}sis'), + Signal('action_new', 'a{ss}sis'), Signal('connected', 'ss'), - Signal('contactDeleted', 'ss'), + Signal('contact_deleted', 'ss'), + Signal('contact_new', 'sa{ss}ass'), Signal('disconnected', 's'), - Signal('entityDataUpdated', 'ssss'), - Signal('messageEncryptionStarted', 'sss'), - Signal('messageEncryptionStopped', 'sa{ss}s'), - Signal('messageNew', 'sdssa{ss}a{ss}sss'), - Signal('newContact', 'sa{ss}ass'), - Signal('paramUpdate', 'ssss'), - Signal('presenceUpdate', 'ssia{ss}s'), - Signal('progressError', 'sss'), - Signal('progressFinished', 'sa{ss}s'), - Signal('progressStarted', 'sa{ss}s'), + Signal('entity_data_updated', 'ssss'), + Signal('message_encryption_started', 'sss'), + Signal('message_encryption_stopped', 'sa{ss}s'), + Signal('message_new', 'sdssa{ss}a{ss}sss'), + Signal('param_update', 'ssss'), + Signal('presence_update', 'ssia{ss}s'), + Signal('progress_error', 'sss'), + Signal('progress_finished', 'sa{ss}s'), + Signal('progress_started', 'sa{ss}s'), Signal('subscribe', 'sss'), ) plugin_iface = DBusInterface( @@ -196,212 +196,212 @@ d.addErrback(GenericException.create_and_raise) return d - def dbus_actionsGet(self, profile_key="@DEFAULT@"): - return self._callback("actionsGet", profile_key) - - def dbus_addContact(self, entity_jid, profile_key="@DEFAULT@"): - return self._callback("addContact", entity_jid, profile_key) + def dbus_action_launch(self, callback_id, data, profile_key="@DEFAULT@"): + return self._callback("action_launch", callback_id, data, profile_key) - def dbus_asyncDeleteProfile(self, profile): - return self._callback("asyncDeleteProfile", profile) + def dbus_actions_get(self, profile_key="@DEFAULT@"): + return self._callback("actions_get", profile_key) - def dbus_asyncGetParamA(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@"): - return self._callback("asyncGetParamA", name, category, attribute, security_limit, profile_key) - - def dbus_asyncGetParamsValuesFromCategory(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@"): - return self._callback("asyncGetParamsValuesFromCategory", category, 
security_limit, app, extra, profile_key) + def dbus_config_get(self, section, name): + return self._callback("config_get", section, name) def dbus_connect(self, profile_key="@DEFAULT@", password='', options={}): return self._callback("connect", profile_key, password, options) - def dbus_contactGet(self, arg_0, profile_key="@DEFAULT@"): - return self._callback("contactGet", arg_0, profile_key) + def dbus_contact_add(self, entity_jid, profile_key="@DEFAULT@"): + return self._callback("contact_add", entity_jid, profile_key) + + def dbus_contact_del(self, entity_jid, profile_key="@DEFAULT@"): + return self._callback("contact_del", entity_jid, profile_key) - def dbus_delContact(self, entity_jid, profile_key="@DEFAULT@"): - return self._callback("delContact", entity_jid, profile_key) + def dbus_contact_get(self, arg_0, profile_key="@DEFAULT@"): + return self._callback("contact_get", arg_0, profile_key) - def dbus_devicesInfosGet(self, bare_jid, profile_key): - return self._callback("devicesInfosGet", bare_jid, profile_key) + def dbus_contact_update(self, entity_jid, name, groups, profile_key="@DEFAULT@"): + return self._callback("contact_update", entity_jid, name, groups, profile_key) + + def dbus_contacts_get(self, profile_key="@DEFAULT@"): + return self._callback("contacts_get", profile_key) - def dbus_discoFindByFeatures(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@"): - return self._callback("discoFindByFeatures", namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key) + def dbus_contacts_get_from_group(self, group, profile_key="@DEFAULT@"): + return self._callback("contacts_get_from_group", group, profile_key) + + def dbus_devices_infos_get(self, bare_jid, profile_key): + return self._callback("devices_infos_get", bare_jid, profile_key) - def dbus_discoInfos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): - return self._callback("discoInfos", entity_jid, node, use_cache, profile_key) + def dbus_disco_find_by_features(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@"): + return self._callback("disco_find_by_features", namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key) - def dbus_discoItems(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): - return self._callback("discoItems", entity_jid, node, use_cache, profile_key) + def dbus_disco_infos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + return self._callback("disco_infos", entity_jid, node, use_cache, profile_key) + + def dbus_disco_items(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + return self._callback("disco_items", entity_jid, node, use_cache, profile_key) def dbus_disconnect(self, profile_key="@DEFAULT@"): return self._callback("disconnect", profile_key) - def dbus_encryptionNamespaceGet(self, arg_0): - return self._callback("encryptionNamespaceGet", arg_0) + def dbus_encryption_namespace_get(self, arg_0): + return self._callback("encryption_namespace_get", arg_0) - def dbus_encryptionPluginsGet(self, ): - return self._callback("encryptionPluginsGet", ) + def dbus_encryption_plugins_get(self, ): + return self._callback("encryption_plugins_get", ) - def dbus_encryptionTrustUIGet(self, to_jid, namespace, profile_key): - return self._callback("encryptionTrustUIGet", to_jid, namespace, profile_key) + def 
dbus_encryption_trust_ui_get(self, to_jid, namespace, profile_key): + return self._callback("encryption_trust_ui_get", to_jid, namespace, profile_key) - def dbus_getConfig(self, section, name): - return self._callback("getConfig", section, name) + def dbus_entities_data_get(self, jids, keys, profile): + return self._callback("entities_data_get", jids, keys, profile) - def dbus_getContacts(self, profile_key="@DEFAULT@"): - return self._callback("getContacts", profile_key) + def dbus_entity_data_get(self, jid, keys, profile): + return self._callback("entity_data_get", jid, keys, profile) - def dbus_getContactsFromGroup(self, group, profile_key="@DEFAULT@"): - return self._callback("getContactsFromGroup", group, profile_key) + def dbus_features_get(self, profile_key): + return self._callback("features_get", profile_key) - def dbus_getEntitiesData(self, jids, keys, profile): - return self._callback("getEntitiesData", jids, keys, profile) + def dbus_history_get(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@"): + return self._callback("history_get", from_jid, to_jid, limit, between, filters, profile) - def dbus_getEntityData(self, jid, keys, profile): - return self._callback("getEntityData", jid, keys, profile) + def dbus_image_check(self, arg_0): + return self._callback("image_check", arg_0) - def dbus_getFeatures(self, profile_key): - return self._callback("getFeatures", profile_key) + def dbus_image_convert(self, source, dest, arg_2, extra): + return self._callback("image_convert", source, dest, arg_2, extra) - def dbus_getMainResource(self, contact_jid, profile_key="@DEFAULT@"): - return self._callback("getMainResource", contact_jid, profile_key) + def dbus_image_generate_preview(self, image_path, profile_key): + return self._callback("image_generate_preview", image_path, profile_key) - def dbus_getParamA(self, name, category, attribute="value", profile_key="@DEFAULT@"): - return self._callback("getParamA", name, category, attribute, profile_key) + def dbus_image_resize(self, image_path, width, height): + return self._callback("image_resize", image_path, width, height) - def dbus_getParamsCategories(self, ): - return self._callback("getParamsCategories", ) + def dbus_is_connected(self, profile_key="@DEFAULT@"): + return self._callback("is_connected", profile_key) - def dbus_getParamsUI(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@"): - return self._callback("getParamsUI", security_limit, app, extra, profile_key) + def dbus_main_resource_get(self, contact_jid, profile_key="@DEFAULT@"): + return self._callback("main_resource_get", contact_jid, profile_key) - def dbus_getPresenceStatuses(self, profile_key="@DEFAULT@"): - return self._callback("getPresenceStatuses", profile_key) + def dbus_menu_help_get(self, menu_id, language): + return self._callback("menu_help_get", menu_id, language) - def dbus_getReady(self, ): - return self._callback("getReady", ) + def dbus_menu_launch(self, menu_type, path, data, security_limit, profile_key): + return self._callback("menu_launch", menu_type, path, data, security_limit, profile_key) - def dbus_getVersion(self, ): - return self._callback("getVersion", ) + def dbus_menus_get(self, language, security_limit): + return self._callback("menus_get", language, security_limit) - def dbus_getWaitingSub(self, profile_key="@DEFAULT@"): - return self._callback("getWaitingSub", profile_key) + def dbus_message_encryption_get(self, to_jid, profile_key): + return self._callback("message_encryption_get", to_jid, profile_key) - 
def dbus_historyGet(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@"): - return self._callback("historyGet", from_jid, to_jid, limit, between, filters, profile) + def dbus_message_encryption_start(self, to_jid, namespace='', replace=False, profile_key="@NONE@"): + return self._callback("message_encryption_start", to_jid, namespace, replace, profile_key) - def dbus_imageCheck(self, arg_0): - return self._callback("imageCheck", arg_0) + def dbus_message_encryption_stop(self, to_jid, profile_key): + return self._callback("message_encryption_stop", to_jid, profile_key) - def dbus_imageConvert(self, source, dest, arg_2, extra): - return self._callback("imageConvert", source, dest, arg_2, extra) + def dbus_message_send(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@"): + return self._callback("message_send", to_jid, message, subject, mess_type, extra, profile_key) - def dbus_imageGeneratePreview(self, image_path, profile_key): - return self._callback("imageGeneratePreview", image_path, profile_key) + def dbus_namespaces_get(self, ): + return self._callback("namespaces_get", ) - def dbus_imageResize(self, image_path, width, height): - return self._callback("imageResize", image_path, width, height) - - def dbus_isConnected(self, profile_key="@DEFAULT@"): - return self._callback("isConnected", profile_key) + def dbus_param_get_a(self, name, category, attribute="value", profile_key="@DEFAULT@"): + return self._callback("param_get_a", name, category, attribute, profile_key) - def dbus_launchAction(self, callback_id, data, profile_key="@DEFAULT@"): - return self._callback("launchAction", callback_id, data, profile_key) + def dbus_param_get_a_async(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@"): + return self._callback("param_get_a_async", name, category, attribute, security_limit, profile_key) - def dbus_loadParamsTemplate(self, filename): - return self._callback("loadParamsTemplate", filename) + def dbus_param_set(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@"): + return self._callback("param_set", name, value, category, security_limit, profile_key) - def dbus_menuHelpGet(self, menu_id, language): - return self._callback("menuHelpGet", menu_id, language) + def dbus_param_ui_get(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@"): + return self._callback("param_ui_get", security_limit, app, extra, profile_key) - def dbus_menuLaunch(self, menu_type, path, data, security_limit, profile_key): - return self._callback("menuLaunch", menu_type, path, data, security_limit, profile_key) + def dbus_params_categories_get(self, ): + return self._callback("params_categories_get", ) - def dbus_menusGet(self, language, security_limit): - return self._callback("menusGet", language, security_limit) + def dbus_params_register_app(self, xml, security_limit=-1, app=''): + return self._callback("params_register_app", xml, security_limit, app) - def dbus_messageEncryptionGet(self, to_jid, profile_key): - return self._callback("messageEncryptionGet", to_jid, profile_key) + def dbus_params_template_load(self, filename): + return self._callback("params_template_load", filename) - def dbus_messageEncryptionStart(self, to_jid, namespace='', replace=False, profile_key="@NONE@"): - return self._callback("messageEncryptionStart", to_jid, namespace, replace, profile_key) + def dbus_params_template_save(self, filename): + return self._callback("params_template_save", filename) - def 
dbus_messageEncryptionStop(self, to_jid, profile_key): - return self._callback("messageEncryptionStop", to_jid, profile_key) + def dbus_params_values_from_category_get_async(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@"): + return self._callback("params_values_from_category_get_async", category, security_limit, app, extra, profile_key) - def dbus_messageSend(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@"): - return self._callback("messageSend", to_jid, message, subject, mess_type, extra, profile_key) + def dbus_presence_set(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@"): + return self._callback("presence_set", to_jid, show, statuses, profile_key) - def dbus_namespacesGet(self, ): - return self._callback("namespacesGet", ) + def dbus_presence_statuses_get(self, profile_key="@DEFAULT@"): + return self._callback("presence_statuses_get", profile_key) - def dbus_paramsRegisterApp(self, xml, security_limit=-1, app=''): - return self._callback("paramsRegisterApp", xml, security_limit, app) + def dbus_private_data_delete(self, namespace, key, arg_2): + return self._callback("private_data_delete", namespace, key, arg_2) - def dbus_privateDataDelete(self, namespace, key, arg_2): - return self._callback("privateDataDelete", namespace, key, arg_2) - - def dbus_privateDataGet(self, namespace, key, profile_key): - return self._callback("privateDataGet", namespace, key, profile_key) + def dbus_private_data_get(self, namespace, key, profile_key): + return self._callback("private_data_get", namespace, key, profile_key) - def dbus_privateDataSet(self, namespace, key, data, profile_key): - return self._callback("privateDataSet", namespace, key, data, profile_key) + def dbus_private_data_set(self, namespace, key, data, profile_key): + return self._callback("private_data_set", namespace, key, data, profile_key) - def dbus_profileCreate(self, profile, password='', component=''): - return self._callback("profileCreate", profile, password, component) + def dbus_profile_create(self, profile, password='', component=''): + return self._callback("profile_create", profile, password, component) - def dbus_profileIsSessionStarted(self, profile_key="@DEFAULT@"): - return self._callback("profileIsSessionStarted", profile_key) + def dbus_profile_delete_async(self, profile): + return self._callback("profile_delete_async", profile) - def dbus_profileNameGet(self, profile_key="@DEFAULT@"): - return self._callback("profileNameGet", profile_key) + def dbus_profile_is_session_started(self, profile_key="@DEFAULT@"): + return self._callback("profile_is_session_started", profile_key) - def dbus_profileSetDefault(self, profile): - return self._callback("profileSetDefault", profile) + def dbus_profile_name_get(self, profile_key="@DEFAULT@"): + return self._callback("profile_name_get", profile_key) - def dbus_profileStartSession(self, password='', profile_key="@DEFAULT@"): - return self._callback("profileStartSession", password, profile_key) + def dbus_profile_set_default(self, profile): + return self._callback("profile_set_default", profile) - def dbus_profilesListGet(self, clients=True, components=False): - return self._callback("profilesListGet", clients, components) + def dbus_profile_start_session(self, password='', profile_key="@DEFAULT@"): + return self._callback("profile_start_session", password, profile_key) - def dbus_progressGet(self, id, profile): - return self._callback("progressGet", id, profile) + def dbus_profiles_list_get(self, 
clients=True, components=False): + return self._callback("profiles_list_get", clients, components) - def dbus_progressGetAll(self, profile): - return self._callback("progressGetAll", profile) + def dbus_progress_get(self, id, profile): + return self._callback("progress_get", id, profile) - def dbus_progressGetAllMetadata(self, profile): - return self._callback("progressGetAllMetadata", profile) + def dbus_progress_get_all(self, profile): + return self._callback("progress_get_all", profile) - def dbus_rosterResync(self, profile_key="@DEFAULT@"): - return self._callback("rosterResync", profile_key) + def dbus_progress_get_all_metadata(self, profile): + return self._callback("progress_get_all_metadata", profile) - def dbus_saveParamsTemplate(self, filename): - return self._callback("saveParamsTemplate", filename) + def dbus_ready_get(self, ): + return self._callback("ready_get", ) - def dbus_sessionInfosGet(self, profile_key): - return self._callback("sessionInfosGet", profile_key) + def dbus_roster_resync(self, profile_key="@DEFAULT@"): + return self._callback("roster_resync", profile_key) - def dbus_setParam(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@"): - return self._callback("setParam", name, value, category, security_limit, profile_key) + def dbus_session_infos_get(self, profile_key): + return self._callback("session_infos_get", profile_key) - def dbus_setPresence(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@"): - return self._callback("setPresence", to_jid, show, statuses, profile_key) + def dbus_sub_waiting_get(self, profile_key="@DEFAULT@"): + return self._callback("sub_waiting_get", profile_key) def dbus_subscription(self, sub_type, entity, profile_key="@DEFAULT@"): return self._callback("subscription", sub_type, entity, profile_key) - def dbus_updateContact(self, entity_jid, name, groups, profile_key="@DEFAULT@"): - return self._callback("updateContact", entity_jid, name, groups, profile_key) + def dbus_version_get(self, ): + return self._callback("version_get", ) -class Bridge: +class bridge: def __init__(self): log.info("Init DBus...") self._obj = DBusObject(const_OBJ_PATH) - async def postInit(self): + async def post_init(self): try: conn = await client.connect(reactor) except error.DBusException as e: @@ -420,47 +420,47 @@ def _debug(self, action, params, profile): self._obj.emitSignal("_debug", action, params, profile) - def actionNew(self, action_data, id, security_limit, profile): - self._obj.emitSignal("actionNew", action_data, id, security_limit, profile) + def action_new(self, action_data, id, security_limit, profile): + self._obj.emitSignal("action_new", action_data, id, security_limit, profile) def connected(self, jid_s, profile): self._obj.emitSignal("connected", jid_s, profile) - def contactDeleted(self, entity_jid, profile): - self._obj.emitSignal("contactDeleted", entity_jid, profile) + def contact_deleted(self, entity_jid, profile): + self._obj.emitSignal("contact_deleted", entity_jid, profile) + + def contact_new(self, contact_jid, attributes, groups, profile): + self._obj.emitSignal("contact_new", contact_jid, attributes, groups, profile) def disconnected(self, profile): self._obj.emitSignal("disconnected", profile) - def entityDataUpdated(self, jid, name, value, profile): - self._obj.emitSignal("entityDataUpdated", jid, name, value, profile) + def entity_data_updated(self, jid, name, value, profile): + self._obj.emitSignal("entity_data_updated", jid, name, value, profile) - def messageEncryptionStarted(self, to_jid, 
encryption_data, profile_key): - self._obj.emitSignal("messageEncryptionStarted", to_jid, encryption_data, profile_key) + def message_encryption_started(self, to_jid, encryption_data, profile_key): + self._obj.emitSignal("message_encryption_started", to_jid, encryption_data, profile_key) - def messageEncryptionStopped(self, to_jid, encryption_data, profile_key): - self._obj.emitSignal("messageEncryptionStopped", to_jid, encryption_data, profile_key) + def message_encryption_stopped(self, to_jid, encryption_data, profile_key): + self._obj.emitSignal("message_encryption_stopped", to_jid, encryption_data, profile_key) - def messageNew(self, uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile): - self._obj.emitSignal("messageNew", uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile) - - def newContact(self, contact_jid, attributes, groups, profile): - self._obj.emitSignal("newContact", contact_jid, attributes, groups, profile) + def message_new(self, uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile): + self._obj.emitSignal("message_new", uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile) - def paramUpdate(self, name, value, category, profile): - self._obj.emitSignal("paramUpdate", name, value, category, profile) + def param_update(self, name, value, category, profile): + self._obj.emitSignal("param_update", name, value, category, profile) - def presenceUpdate(self, entity_jid, show, priority, statuses, profile): - self._obj.emitSignal("presenceUpdate", entity_jid, show, priority, statuses, profile) + def presence_update(self, entity_jid, show, priority, statuses, profile): + self._obj.emitSignal("presence_update", entity_jid, show, priority, statuses, profile) - def progressError(self, id, error, profile): - self._obj.emitSignal("progressError", id, error, profile) + def progress_error(self, id, error, profile): + self._obj.emitSignal("progress_error", id, error, profile) - def progressFinished(self, id, metadata, profile): - self._obj.emitSignal("progressFinished", id, metadata, profile) + def progress_finished(self, id, metadata, profile): + self._obj.emitSignal("progress_finished", id, metadata, profile) - def progressStarted(self, id, metadata, profile): - self._obj.emitSignal("progressStarted", id, metadata, profile) + def progress_started(self, id, metadata, profile): + self._obj.emitSignal("progress_started", id, metadata, profile) def subscribe(self, sub_type, entity_jid, profile): self._obj.emitSignal("subscribe", sub_type, entity_jid, profile) @@ -469,13 +469,13 @@ log.debug(f"registering DBus bridge method [{name}]") self._obj.register_method(name, callback) - def emitSignal(self, name, *args): + def emit_signal(self, name, *args): self._obj.emitSignal(name, *args) - def addMethod( + def add_method( self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={} ): - """Dynamically add a method to D-Bus Bridge""" + """Dynamically add a method to D-Bus bridge""" # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug(f"Adding method {name!r} to D-Bus bridge") self._obj.plugin_iface.addMethod( @@ -488,8 +488,8 @@ setattr(self._obj, f"dbus_{name}", MethodType(caller, self._obj)) self.register_method(name, method) - def addSignal(self, name, int_suffix, signature, doc={}): - """Dynamically add a signal to D-Bus Bridge""" + def add_signal(self, name, int_suffix, signature, doc={}): + """Dynamically add a signal to D-Bus bridge""" 
log.debug(f"Adding signal {name!r} to D-Bus bridge") self._obj.plugin_iface.addSignal(Signal(name, signature)) - setattr(Bridge, name, partialmethod(Bridge.emitSignal, name)) \ No newline at end of file + setattr(bridge, name, partialmethod(bridge.emit_signal, name)) \ No newline at end of file
--- a/sat/bridge/pb.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/bridge/pb.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,17 +55,17 @@ def __init__(self): self.signals_handlers = [] - def remote_initBridge(self, signals_handler): + def remote_init_bridge(self, signals_handler): self.signals_handlers.append(HandlerWrapper(signals_handler)) log.info("registered signal handler") - def sendSignalEb(self, failure_, signal_name): + def send_signal_eb(self, failure_, signal_name): if not failure_.check(pb.PBConnectionLost): log.error( f"Error while sending signal {signal_name}: {failure_}", ) - def sendSignal(self, name, args, kwargs): + def send_signal(self, name, args, kwargs): to_remove = [] for wrapper in self.signals_handlers: handler = wrapper.handler @@ -74,13 +74,13 @@ except pb.DeadReferenceError: to_remove.append(wrapper) else: - d.addErrback(self.sendSignalEb, name) + d.addErrback(self.send_signal_eb, name) if to_remove: for wrapper in to_remove: log.debug("Removing signal handler for dead frontend") self.signals_handlers.remove(wrapper) - def _bridgeDeactivateSignals(self): + def _bridge_deactivate_signals(self): if hasattr(self, "signals_paused"): log.warning("bridge signals already deactivated") if self.signals_handler: @@ -90,7 +90,7 @@ self.signals_handlers = [] log.debug("bridge signals have been deactivated") - def _bridgeReactivateSignals(self): + def _bridge_reactivate_signals(self): try: self.signals_handlers = self.signals_paused except AttributeError: @@ -102,31 +102,31 @@ ##METHODS_PART## -class Bridge(object): +class bridge(object): def __init__(self): log.info("Init Perspective Broker...") self.root = PBRoot() - conf = config.parseMainConf() - getConf = partial(config.getConf, conf, "bridge_pb", "") - conn_type = getConf("connection_type", "unix_socket") + conf = config.parse_main_conf() + get_conf = partial(config.get_conf, conf, "bridge_pb", "") + conn_type = get_conf("connection_type", "unix_socket") if conn_type == "unix_socket": - local_dir = Path(config.getConfig(conf, "", "local_dir")).resolve() + local_dir = Path(config.config_get(conf, "", "local_dir")).resolve() socket_path = local_dir / "bridge_pb" log.info(f"using UNIX Socket at {socket_path}") reactor.listenUNIX( str(socket_path), pb.PBServerFactory(self.root), mode=0o600 ) elif conn_type == "socket": - port = int(getConf("port", 8789)) + port = int(get_conf("port", 8789)) log.info(f"using TCP Socket at port {port}") reactor.listenTCP(port, pb.PBServerFactory(self.root)) else: raise ValueError(f"Unknown pb connection type: {conn_type!r}") - def sendSignal(self, name, *args, **kwargs): - self.root.sendSignal(name, args, kwargs) + def send_signal(self, name, *args, **kwargs): + self.root.send_signal(name, args, kwargs) - def remote_initBridge(self, signals_handler): + def remote_init_bridge(self, signals_handler): self.signals_handlers.append(signals_handler) log.info("registered signal handler") @@ -135,78 +135,78 @@ setattr(self.root, "remote_" + name, callback) # self.root.register_method(name, callback) - def addMethod( + def add_method( self, name, int_suffix, in_sign, out_sign, method, async_=False, doc={} ): - """Dynamically add a method to PB Bridge""" + """Dynamically add a method to PB bridge""" # FIXME: doc parameter is kept only temporary, the time to remove it from calls log.debug("Adding method {name} to PB bridge".format(name=name)) self.register_method(name, method) - def addSignal(self, name, int_suffix, signature, doc={}): + def add_signal(self, name, int_suffix, signature, doc={}): 
log.debug("Adding signal {name} to PB bridge".format(name=name)) setattr( - self, name, lambda *args, **kwargs: self.sendSignal(name, *args, **kwargs) + self, name, lambda *args, **kwargs: self.send_signal(name, *args, **kwargs) ) - def bridgeDeactivateSignals(self): + def bridge_deactivate_signals(self): """Stop sending signals to bridge Mainly used for mobile frontends, when the frontend is paused """ - self.root._bridgeDeactivateSignals() + self.root._bridge_deactivate_signals() - def bridgeReactivateSignals(self): + def bridge_reactivate_signals(self): """Send again signals to bridge - Should only be used after bridgeDeactivateSignals has been called + Should only be used after bridge_deactivate_signals has been called """ - self.root._bridgeReactivateSignals() + self.root._bridge_reactivate_signals() def _debug(self, action, params, profile): - self.sendSignal("_debug", action, params, profile) + self.send_signal("_debug", action, params, profile) - def actionNew(self, action_data, id, security_limit, profile): - self.sendSignal("actionNew", action_data, id, security_limit, profile) + def action_new(self, action_data, id, security_limit, profile): + self.send_signal("action_new", action_data, id, security_limit, profile) def connected(self, jid_s, profile): - self.sendSignal("connected", jid_s, profile) + self.send_signal("connected", jid_s, profile) - def contactDeleted(self, entity_jid, profile): - self.sendSignal("contactDeleted", entity_jid, profile) + def contact_deleted(self, entity_jid, profile): + self.send_signal("contact_deleted", entity_jid, profile) + + def contact_new(self, contact_jid, attributes, groups, profile): + self.send_signal("contact_new", contact_jid, attributes, groups, profile) def disconnected(self, profile): - self.sendSignal("disconnected", profile) + self.send_signal("disconnected", profile) - def entityDataUpdated(self, jid, name, value, profile): - self.sendSignal("entityDataUpdated", jid, name, value, profile) + def entity_data_updated(self, jid, name, value, profile): + self.send_signal("entity_data_updated", jid, name, value, profile) - def messageEncryptionStarted(self, to_jid, encryption_data, profile_key): - self.sendSignal("messageEncryptionStarted", to_jid, encryption_data, profile_key) + def message_encryption_started(self, to_jid, encryption_data, profile_key): + self.send_signal("message_encryption_started", to_jid, encryption_data, profile_key) - def messageEncryptionStopped(self, to_jid, encryption_data, profile_key): - self.sendSignal("messageEncryptionStopped", to_jid, encryption_data, profile_key) + def message_encryption_stopped(self, to_jid, encryption_data, profile_key): + self.send_signal("message_encryption_stopped", to_jid, encryption_data, profile_key) - def messageNew(self, uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile): - self.sendSignal("messageNew", uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile) + def message_new(self, uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile): + self.send_signal("message_new", uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra, profile) - def newContact(self, contact_jid, attributes, groups, profile): - self.sendSignal("newContact", contact_jid, attributes, groups, profile) + def param_update(self, name, value, category, profile): + self.send_signal("param_update", name, value, category, profile) - def paramUpdate(self, name, value, category, profile): - self.sendSignal("paramUpdate", name, 
value, category, profile) - - def presenceUpdate(self, entity_jid, show, priority, statuses, profile): - self.sendSignal("presenceUpdate", entity_jid, show, priority, statuses, profile) + def presence_update(self, entity_jid, show, priority, statuses, profile): + self.send_signal("presence_update", entity_jid, show, priority, statuses, profile) - def progressError(self, id, error, profile): - self.sendSignal("progressError", id, error, profile) + def progress_error(self, id, error, profile): + self.send_signal("progress_error", id, error, profile) - def progressFinished(self, id, metadata, profile): - self.sendSignal("progressFinished", id, metadata, profile) + def progress_finished(self, id, metadata, profile): + self.send_signal("progress_finished", id, metadata, profile) - def progressStarted(self, id, metadata, profile): - self.sendSignal("progressStarted", id, metadata, profile) + def progress_started(self, id, metadata, profile): + self.send_signal("progress_started", id, metadata, profile) def subscribe(self, sub_type, entity_jid, profile): - self.sendSignal("subscribe", sub_type, entity_jid, profile) + self.send_signal("subscribe", sub_type, entity_jid, profile)
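In the Perspective Broker flavour, signals added at runtime stay plain attributes: add_signal generates a lambda that forwards to send_signal, which relays the call to every frontend handler registered through remote_init_bridge and removes dead references along the way. A short illustration of that call path (pb_bridge stands for the backend's PB bridge instance and my_plugin_event is a made-up signal name; both are assumptions of this sketch):

    # Illustration only: pb_bridge is assumed to be the PB bridge instance
    # created by the backend; "my_plugin_event" is a made-up signal name.
    pb_bridge.add_signal("my_plugin_event", ".plugin", "ss")

    # Emitting the dynamically added signal...
    pb_bridge.my_plugin_event("some-node", "some-profile")
    # ...is the same as calling send_signal directly; the PBRoot then loops
    # over the registered handlers and drops the dead ones:
    pb_bridge.send_signal("my_plugin_event", "some-node", "some-profile")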
--- a/sat/core/constants.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/constants.py Sat Apr 08 13:54:42 2023 +0200 @@ -435,7 +435,7 @@ return value.lower() in (cls.BOOL_TRUE, "1", "yes", "on") @classmethod - def boolConst(cls, value: bool) -> str: + def bool_const(cls, value: bool) -> str: """@return (str): constant associated to bool value""" assert isinstance(value, bool) return cls.BOOL_TRUE if value else cls.BOOL_FALSE
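bool_const is the writing counterpart of the boolean parser shown as context just above it: it maps a Python bool back to the string constant used in parameters and bridge data. A tiny usage sketch (the literal values of BOOL_TRUE and BOOL_FALSE are assumed to be the usual "true"/"false" strings):

    # C stands for the constants class patched above; the literal values of
    # C.BOOL_TRUE / C.BOOL_FALSE are assumed to be "true" / "false".
    assert C.bool_const(True) == C.BOOL_TRUE
    assert C.bool_const(False) == C.BOOL_FALSE
    assert C.bool(C.bool_const(True)) is True  # round-trips through the parser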
--- a/sat/core/i18n.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/i18n.py Sat Apr 08 13:54:42 2023 +0200 @@ -30,7 +30,7 @@ _ = gettext.translation("sat", "i18n", fallback=True).gettext _translators = {None: gettext.NullTranslations()} - def languageSwitch(lang=None): + def language_switch(lang=None): if not lang in _translators: _translators[lang] = gettext.translation( "sat", languages=[lang], fallback=True @@ -43,7 +43,7 @@ log.warning("gettext support disabled") _ = cast(Callable[[str], str], lambda msg: msg) # Libervia doesn't support gettext - def languageSwitch(lang=None): + def language_switch(lang=None): pass
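language_switch (previously languageSwitch) swaps the gettext catalogue used by _ at runtime, caching one translator per language and degrading to a no-op when gettext support is disabled. A small sketch of the intended use (the "fr" locale is only an example):

    # Sketch: switching the backend's translation catalogue at runtime.
    from sat.core.i18n import _, language_switch

    language_switch("fr")   # install the French catalogue if it is available
    print(_("some translatable message"))
    language_switch()       # lang=None installs the null translator again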
--- a/sat/core/log.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/log.py Sat Apr 08 13:54:42 2023 +0200 @@ -121,7 +121,7 @@ 'levelname': level, } try: - if not self.filter_name.dictFilter(record): + if not self.filter_name.dict_filter(record): raise Filtered except (AttributeError, TypeError): # XXX: TypeError is here because of a pyjamas bug which need to be fixed (TypeError is raised instead of AttributeError) if self.filter_name is not None: @@ -133,7 +133,7 @@ except KeyError as e: if e.args[0] == 'profile': # XXX: %(profile)s use some magic with introspection, for debugging purpose only *DO NOT* use in production - record['profile'] = configure_cls[backend].getProfile() + record['profile'] = configure_cls[backend].get_profile() return self.fmt % record else: raise e @@ -174,7 +174,7 @@ return 1 return 0 - def dictFilter(self, dict_record): + def dict_filter(self, dict_record): """Filter using a dictionary record @param dict_record: dictionary with at list a key "name" with logger name @@ -208,26 +208,26 @@ @param force_colors: if True ANSI colors are used even if stdout is not a tty """ self.backend_data = backend_data - self.preTreatment() - self.configureLevel(level) - self.configureFormat(fmt) - self.configureOutput(output) - self.configureLogger(logger) - self.configureColors(colors, force_colors, levels_taints_dict) - self.postTreatment() - self.updateCurrentLogger() + self.pre_treatment() + self.configure_level(level) + self.configure_format(fmt) + self.configure_output(output) + self.configure_logger(logger) + self.configure_colors(colors, force_colors, levels_taints_dict) + self.post_treatment() + self.update_current_logger() - def updateCurrentLogger(self): + def update_current_logger(self): """update existing logger to the class needed for this backend""" if self.LOGGER_CLASS is None: return for name, logger in list(_loggers.items()): _loggers[name] = self.LOGGER_CLASS(logger) - def preTreatment(self): + def pre_treatment(self): pass - def configureLevel(self, level): + def configure_level(self, level): if level is not None: # we deactivate methods below level level_idx = C.LOG_LEVELS.index(level) @@ -236,7 +236,7 @@ for _level in C.LOG_LEVELS[:level_idx]: setattr(Logger, _level.lower(), dev_null) - def configureFormat(self, fmt): + def configure_format(self, fmt): if fmt is not None: if fmt != '%(message)s': # %(message)s is the same as None Logger.fmt = fmt @@ -246,17 +246,17 @@ # color_start not followed by an end, we add it Logger.fmt += COLOR_END - def configureOutput(self, output): + def configure_output(self, output): if output is not None: if output != C.LOG_OPT_OUTPUT_SEP + C.LOG_OPT_OUTPUT_DEFAULT: # TODO: manage other outputs raise NotImplementedError("Basic backend only manage default output yet") - def configureLogger(self, logger): + def configure_logger(self, logger): if logger: Logger.filter_name = FilterName(logger) - def configureColors(self, colors, force_colors, levels_taints_dict): + def configure_colors(self, colors, force_colors, levels_taints_dict): if colors: # if color are used, we need to handle levels_taints_dict for level in list(levels_taints_dict.keys()): @@ -280,10 +280,10 @@ ansi_list.append(ansi) taints[level] = ''.join(ansi_list) - def postTreatment(self): + def post_treatment(self): pass - def manageOutputs(self, outputs_raw): + def manage_outputs(self, outputs_raw): """ Parse output option in a backend agnostic way, and fill handlers consequently @param outputs_raw: output option as enterred in environment variable or in configuration 
@@ -330,7 +330,7 @@ raise ValueError("options [{options}] are not supported for {handler} output".format(options=options, handler=output)) @staticmethod - def memoryGet(size=None): + def memory_get(size=None): """Return buffered logs @param size: number of logs to return @@ -338,7 +338,7 @@ raise NotImplementedError @classmethod - def ansiColors(cls, level, message): + def ansi_colors(cls, level, message): """Colorise message depending on level for terminals @param level: one of C.LOG_LEVELS @@ -358,7 +358,7 @@ return '%s%s%s' % (start, message, A.RESET) @staticmethod - def getProfile(): + def get_profile(): """Try to find profile value using introspection""" raise NotImplementedError @@ -396,10 +396,10 @@ else: configure_class(**options) -def memoryGet(size=None): +def memory_get(size=None): if not C.LOG_OPT_OUTPUT_MEMORY in handlers: raise ValueError('memory output is not used') - return configure_cls[backend].memoryGet(size) + return configure_cls[backend].memory_get(size) def getLogger(name=C.LOG_BASE_LOGGER) -> Logger: try:
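The configuration hooks of ConfigureBase keep their roles, only their names change: a logging backend still overrides the same steps, which are invoked in the order visible in the hunk (pre_treatment, configure_level, configure_format, configure_output, configure_logger, configure_colors, post_treatment, then update_current_logger). A skeletal, purely hypothetical backend written against the new names:

    # Hypothetical backend: only the renamed hook points from the hunk above
    # are shown, and their bodies are placeholders.
    from sat.core import log

    class ConfigureMyBackend(log.ConfigureBase):
        def pre_treatment(self):
            pass  # import / prepare the underlying logging library here

        def configure_level(self, level):
            self.level = level  # one of C.LOG_LEVELS, or None

        def configure_output(self, output):
            self.manage_outputs(output)  # reuse the generic output parser

        def post_treatment(self):
            pass  # wire handlers together once every option is known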
--- a/sat/core/log_config.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/log_config.py Sat Apr 08 13:54:42 2023 +0200 @@ -47,8 +47,8 @@ class ConfigureBasic(log.ConfigureBase): - def configureColors(self, colors, force_colors, levels_taints_dict): - super(ConfigureBasic, self).configureColors( + def configure_colors(self, colors, force_colors, levels_taints_dict): + super(ConfigureBasic, self).configure_colors( colors, force_colors, levels_taints_dict ) if colors: @@ -61,14 +61,14 @@ # FIXME: isatty should be tested on each handler, not globaly if (force_colors or isatty): # we need colors - log.Logger.post_treat = lambda logger, level, message: self.ansiColors( + log.Logger.post_treat = lambda logger, level, message: self.ansi_colors( level, message ) elif force_colors: raise ValueError("force_colors can't be used if colors is False") @staticmethod - def getProfile(): + def get_profile(): """Try to find profile value using introspection""" import inspect @@ -107,7 +107,7 @@ class ConfigureTwisted(ConfigureBasic): LOGGER_CLASS = TwistedLogger - def preTreatment(self): + def pre_treatment(self): from twisted import logger global logger self.level_map = { @@ -119,17 +119,17 @@ } self.LOGGER_CLASS.level_map = self.level_map - def configureLevel(self, level): + def configure_level(self, level): self.level = self.level_map[level] - def configureOutput(self, output): + def configure_output(self, output): import sys from twisted.python import logfile self.log_publisher = logger.LogPublisher() if output is None: output = C.LOG_OPT_OUTPUT_SEP + C.LOG_OPT_OUTPUT_DEFAULT - self.manageOutputs(output) + self.manage_outputs(output) if C.LOG_OPT_OUTPUT_DEFAULT in log.handlers: if self.backend_data is None: @@ -139,11 +139,11 @@ options = self.backend_data log_file = logfile.LogFile.fromFullPath(options['logfile']) self.log_publisher.addObserver( - logger.FileLogObserver(log_file, self.textFormatter)) + logger.FileLogObserver(log_file, self.text_formatter)) # we also want output to stdout if we are in debug or nodaemon mode if options.get("nodaemon", False) or options.get("debug", False): self.log_publisher.addObserver( - logger.FileLogObserver(sys.stdout, self.textFormatter)) + logger.FileLogObserver(sys.stdout, self.text_formatter)) if C.LOG_OPT_OUTPUT_FILE in log.handlers: @@ -152,15 +152,15 @@ sys.stdout if path == "-" else logfile.LogFile.fromFullPath(path) ) self.log_publisher.addObserver( - logger.FileLogObserver(log_file, self.textFormatter)) + logger.FileLogObserver(log_file, self.text_formatter)) if C.LOG_OPT_OUTPUT_MEMORY in log.handlers: raise NotImplementedError( "Memory observer is not implemented in Twisted backend" ) - def configureColors(self, colors, force_colors, levels_taints_dict): - super(ConfigureTwisted, self).configureColors( + def configure_colors(self, colors, force_colors, levels_taints_dict): + super(ConfigureTwisted, self).configure_colors( colors, force_colors, levels_taints_dict ) self.LOGGER_CLASS.colors = colors @@ -168,7 +168,7 @@ if force_colors and not colors: raise ValueError("colors must be True if force_colors is True") - def postTreatment(self): + def post_treatment(self): """Install twistedObserver which manage non SàT logs""" # from twisted import logger import sys @@ -180,7 +180,7 @@ ) logger.globalLogBeginner.beginLoggingTo([filtering_obs]) - def textFormatter(self, event): + def text_formatter(self, event): if event.get('sat_logged', False): timestamp = ''.join([logger.formatTime(event.get("log_time", None)), " "]) return 
f"{timestamp}{event.get('log_format', '')}\n" @@ -219,7 +219,7 @@ backend_data, ) - def preTreatment(self): + def pre_treatment(self): """We use logging methods directly, instead of using Logger""" import logging @@ -230,13 +230,13 @@ log.error = logging.error log.critical = logging.critical - def configureLevel(self, level): + def configure_level(self, level): if level is None: level = C.LOG_LVL_DEBUG self.level = level - def configureFormat(self, fmt): - super(ConfigureStandard, self).configureFormat(fmt) + def configure_format(self, fmt): + super(ConfigureStandard, self).configure_format(fmt) import logging class SatFormatter(logging.Formatter): @@ -250,11 +250,11 @@ def format(self, record): if self._with_profile: - record.profile = ConfigureStandard.getProfile() + record.profile = ConfigureStandard.get_profile() do_color = self.with_colors and (self.can_colors or self.force_colors) if ConfigureStandard._color_location: # we copy raw formatting strings for color_* - # as formatting is handled in ansiColors in this case + # as formatting is handled in ansi_colors in this case if do_color: record.color_start = log.COLOR_START record.color_end = log.COLOR_END @@ -262,19 +262,19 @@ record.color_start = record.color_end = "" s = super(SatFormatter, self).format(record) if do_color: - s = ConfigureStandard.ansiColors(record.levelname, s) + s = ConfigureStandard.ansi_colors(record.levelname, s) return s self.formatterClass = SatFormatter - def configureOutput(self, output): - self.manageOutputs(output) + def configure_output(self, output): + self.manage_outputs(output) - def configureLogger(self, logger): + def configure_logger(self, logger): self.name_filter = log.FilterName(logger) if logger else None - def configureColors(self, colors, force_colors, levels_taints_dict): - super(ConfigureStandard, self).configureColors( + def configure_colors(self, colors, force_colors, levels_taints_dict): + super(ConfigureStandard, self).configure_colors( colors, force_colors, levels_taints_dict ) self.formatterClass.with_colors = colors @@ -282,14 +282,14 @@ if not colors and force_colors: raise ValueError("force_colors can't be used if colors is False") - def _addHandler(self, root_logger, hdlr, can_colors=False): + def _add_handler(self, root_logger, hdlr, can_colors=False): hdlr.setFormatter(self.formatterClass(can_colors)) root_logger.addHandler(hdlr) root_logger.setLevel(self.level) if self.name_filter is not None: hdlr.addFilter(self.name_filter) - def postTreatment(self): + def post_treatment(self): import logging root_logger = logging.getLogger() @@ -301,7 +301,7 @@ can_colors = hdlr.stream.isatty() except AttributeError: can_colors = False - self._addHandler(root_logger, hdlr, can_colors=can_colors) + self._add_handler(root_logger, hdlr, can_colors=can_colors) elif handler == C.LOG_OPT_OUTPUT_MEMORY: from logging.handlers import BufferingHandler @@ -315,20 +315,20 @@ ] = ( hdlr ) # we keep a reference to the handler to read the buffer later - self._addHandler(root_logger, hdlr, can_colors=False) + self._add_handler(root_logger, hdlr, can_colors=False) elif handler == C.LOG_OPT_OUTPUT_FILE: import os.path for path in options: hdlr = logging.FileHandler(os.path.expanduser(path)) - self._addHandler(root_logger, hdlr, can_colors=False) + self._add_handler(root_logger, hdlr, can_colors=False) else: raise ValueError("Unknown handler type") else: root_logger.warning("Handlers already set on root logger") @staticmethod - def memoryGet(size=None): + def memory_get(size=None): """Return buffered logs 
@param size: number of logs to return @@ -355,7 +355,7 @@ return log.configure(backend, **options) -def _parseOptions(options): +def _parse_options(options): """Parse string options as given in conf or environment variable, and return expected python value @param options (dict): options with (key: name, value: string value) @@ -378,7 +378,7 @@ options[LEVEL] = level -def satConfigure(backend=C.LOG_BACKEND_STANDARD, const=None, backend_data=None): +def sat_configure(backend=C.LOG_BACKEND_STANDARD, const=None, backend_data=None): """Configure logging system for SàT, can be used by frontends logs conf is read in SàT conf, then in environment variables. It must be done before Memory init @@ -396,16 +396,16 @@ import os log_conf = {} - sat_conf = config.parseMainConf() + sat_conf = config.parse_main_conf() for opt_name, opt_default in C.LOG_OPTIONS(): try: log_conf[opt_name] = os.environ[ "".join((C.ENV_PREFIX, C.LOG_OPT_PREFIX.upper(), opt_name.upper())) ] except KeyError: - log_conf[opt_name] = config.getConfig( + log_conf[opt_name] = config.config_get( sat_conf, C.LOG_OPT_SECTION, C.LOG_OPT_PREFIX + opt_name, opt_default ) - _parseOptions(log_conf) + _parse_options(log_conf) configure(backend, backend_data=backend_data, **log_conf)
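For frontends the entry point stays sat_configure (previously satConfigure): it reads the logging options from the main configuration file and from environment variables built with C.ENV_PREFIX and C.LOG_OPT_PREFIX, normalises them with _parse_options, then hands everything to log.configure. A minimal call, relying on the default standard backend shown in the signature above:

    # Sketch: configuring logging early in a frontend's start-up, before the
    # Memory initialisation, with the default backend.
    from sat.core import log_config

    log_config.sat_configure()  # sat conf + environment, then log.configure()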
--- a/sat/core/patches.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/patches.py Sat Apr 08 13:54:42 2023 +0200 @@ -71,7 +71,7 @@ self._onElementHooks = [] self._sendHooks = [] - def addHook(self, hook_type, callback): + def add_hook(self, hook_type, callback): """Add a send or receive hook""" conflict_msg = f"Hook conflict: can't add {hook_type} hook {callback}" if hook_type == C.STREAM_HOOK_RECEIVE:
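add_hook (previously addHook) lets a plugin attach send or receive hooks to the patched XML stream; the hook type is one of the C.STREAM_HOOK_* constants checked in the method (only C.STREAM_HOOK_RECEIVE is visible in this hunk, the send counterpart is implied by self._sendHooks). A deliberately hedged sketch of registering a receive hook, where both the way the stream instance is reached and the callback's exact signature are assumptions:

    # Assumption-heavy sketch: client.xmlstream is taken to be the patched
    # stream, and the receive hook is assumed to be called with the incoming
    # domish element.
    def log_incoming(element):
        log.debug(f"<< {element.toXml()}")

    client.xmlstream.add_hook(C.STREAM_HOOK_RECEIVE, log_incoming)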
--- a/sat/core/sat_main.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/sat_main.py Sat Apr 08 13:54:42 2023 +0200 @@ -26,7 +26,7 @@ from wokkel.data_form import Option import sat -from sat.core.i18n import _, D_, languageSwitch +from sat.core.i18n import _, D_, language_switch from sat.core import patches patches.apply() from twisted.application import service @@ -69,7 +69,7 @@ self.profiles = {} self.plugins = {} # map for short name to whole namespace, - # extended by plugins with registerNamespace + # extended by plugins with register_namespace self.ns_map = { "x-data": xmpp.NS_X_DATA, "disco#info": xmpp.NS_DISCO_INFO, @@ -84,7 +84,7 @@ bridge_name = ( os.getenv("LIBERVIA_BRIDGE_NAME") - or self.memory.getConfig("", "bridge", "dbus") + or self.memory.config_get("", "bridge", "dbus") ) bridge_module = dynamic_import.bridge(bridge_name) @@ -93,9 +93,9 @@ sys.exit(1) log.info(f"using {bridge_name} bridge") try: - self.bridge = bridge_module.Bridge() + self.bridge = bridge_module.bridge() except exceptions.BridgeInitError: - log.exception("Bridge can't be initialised, can't start Libervia Backend") + log.exception("bridge can't be initialised, can't start Libervia Backend") sys.exit(1) defer.ensureDeferred(self._post_init()) @@ -118,7 +118,7 @@ return self._version_cache except AttributeError: self._version_cache = "{} « {} » ({})".format( - version, C.APP_RELEASE_NAME, utils.getRepositoryData(sat) + version, C.APP_RELEASE_NAME, utils.get_repository_data(sat) ) return self._version_cache else: @@ -130,7 +130,7 @@ async def _post_init(self): try: - bridge_pi = self.bridge.postInit + bridge_pi = self.bridge.post_init except AttributeError: pass else: @@ -142,84 +142,84 @@ reactor.callLater(0, self.stop) return - self.bridge.register_method("getReady", lambda: self.initialised) - self.bridge.register_method("getVersion", lambda: self.full_version) - self.bridge.register_method("getFeatures", self.getFeatures) - self.bridge.register_method("profileNameGet", self.memory.getProfileName) - self.bridge.register_method("profilesListGet", self.memory.getProfilesList) - self.bridge.register_method("getEntityData", self.memory._getEntityData) - self.bridge.register_method("getEntitiesData", self.memory._getEntitiesData) - self.bridge.register_method("profileCreate", self.memory.createProfile) - self.bridge.register_method("asyncDeleteProfile", self.memory.asyncDeleteProfile) - self.bridge.register_method("profileStartSession", self.memory.startSession) + self.bridge.register_method("ready_get", lambda: self.initialised) + self.bridge.register_method("version_get", lambda: self.full_version) + self.bridge.register_method("features_get", self.features_get) + self.bridge.register_method("profile_name_get", self.memory.get_profile_name) + self.bridge.register_method("profiles_list_get", self.memory.get_profiles_list) + self.bridge.register_method("entity_data_get", self.memory._get_entity_data) + self.bridge.register_method("entities_data_get", self.memory._get_entities_data) + self.bridge.register_method("profile_create", self.memory.create_profile) + self.bridge.register_method("profile_delete_async", self.memory.profile_delete_async) + self.bridge.register_method("profile_start_session", self.memory.start_session) self.bridge.register_method( - "profileIsSessionStarted", self.memory._isSessionStarted + "profile_is_session_started", self.memory._is_session_started ) - self.bridge.register_method("profileSetDefault", self.memory.profileSetDefault) + self.bridge.register_method("profile_set_default", 
self.memory.profile_set_default) self.bridge.register_method("connect", self._connect) self.bridge.register_method("disconnect", self.disconnect) - self.bridge.register_method("contactGet", self._contactGet) - self.bridge.register_method("getContacts", self.getContacts) - self.bridge.register_method("getContactsFromGroup", self.getContactsFromGroup) - self.bridge.register_method("getMainResource", self.memory._getMainResource) + self.bridge.register_method("contact_get", self._contact_get) + self.bridge.register_method("contacts_get", self.contacts_get) + self.bridge.register_method("contacts_get_from_group", self.contacts_get_from_group) + self.bridge.register_method("main_resource_get", self.memory._get_main_resource) self.bridge.register_method( - "getPresenceStatuses", self.memory._getPresenceStatuses + "presence_statuses_get", self.memory._get_presence_statuses ) - self.bridge.register_method("getWaitingSub", self.memory.getWaitingSub) - self.bridge.register_method("messageSend", self._messageSend) - self.bridge.register_method("messageEncryptionStart", - self._messageEncryptionStart) - self.bridge.register_method("messageEncryptionStop", - self._messageEncryptionStop) - self.bridge.register_method("messageEncryptionGet", - self._messageEncryptionGet) - self.bridge.register_method("encryptionNamespaceGet", - self._encryptionNamespaceGet) - self.bridge.register_method("encryptionPluginsGet", self._encryptionPluginsGet) - self.bridge.register_method("encryptionTrustUIGet", self._encryptionTrustUIGet) - self.bridge.register_method("getConfig", self._getConfig) - self.bridge.register_method("setParam", self.setParam) - self.bridge.register_method("getParamA", self.memory.getStringParamA) - self.bridge.register_method("privateDataGet", self.memory._privateDataGet) - self.bridge.register_method("privateDataSet", self.memory._privateDataSet) - self.bridge.register_method("privateDataDelete", self.memory._privateDataDelete) - self.bridge.register_method("asyncGetParamA", self.memory.asyncGetStringParamA) + self.bridge.register_method("sub_waiting_get", self.memory.sub_waiting_get) + self.bridge.register_method("message_send", self._message_send) + self.bridge.register_method("message_encryption_start", + self._message_encryption_start) + self.bridge.register_method("message_encryption_stop", + self._message_encryption_stop) + self.bridge.register_method("message_encryption_get", + self._message_encryption_get) + self.bridge.register_method("encryption_namespace_get", + self._encryption_namespace_get) + self.bridge.register_method("encryption_plugins_get", self._encryption_plugins_get) + self.bridge.register_method("encryption_trust_ui_get", self._encryption_trust_ui_get) + self.bridge.register_method("config_get", self._get_config) + self.bridge.register_method("param_set", self.param_set) + self.bridge.register_method("param_get_a", self.memory.get_string_param_a) + self.bridge.register_method("private_data_get", self.memory._private_data_get) + self.bridge.register_method("private_data_set", self.memory._private_data_set) + self.bridge.register_method("private_data_delete", self.memory._private_data_delete) + self.bridge.register_method("param_get_a_async", self.memory.async_get_string_param_a) self.bridge.register_method( - "asyncGetParamsValuesFromCategory", - self.memory._getParamsValuesFromCategory, + "params_values_from_category_get_async", + self.memory._get_params_values_from_category, ) - self.bridge.register_method("getParamsUI", self.memory._getParamsUI) + 
self.bridge.register_method("param_ui_get", self.memory._get_params_ui) self.bridge.register_method( - "getParamsCategories", self.memory.getParamsCategories + "params_categories_get", self.memory.params_categories_get ) - self.bridge.register_method("paramsRegisterApp", self.memory.paramsRegisterApp) - self.bridge.register_method("historyGet", self.memory._historyGet) - self.bridge.register_method("setPresence", self._setPresence) + self.bridge.register_method("params_register_app", self.memory.params_register_app) + self.bridge.register_method("history_get", self.memory._history_get) + self.bridge.register_method("presence_set", self._set_presence) self.bridge.register_method("subscription", self.subscription) - self.bridge.register_method("addContact", self._addContact) - self.bridge.register_method("updateContact", self._updateContact) - self.bridge.register_method("delContact", self._delContact) - self.bridge.register_method("rosterResync", self._rosterResync) - self.bridge.register_method("isConnected", self.isConnected) - self.bridge.register_method("launchAction", self.launchCallback) - self.bridge.register_method("actionsGet", self.actionsGet) - self.bridge.register_method("progressGet", self._progressGet) - self.bridge.register_method("progressGetAll", self._progressGetAll) - self.bridge.register_method("menusGet", self.getMenus) - self.bridge.register_method("menuHelpGet", self.getMenuHelp) - self.bridge.register_method("menuLaunch", self._launchMenu) - self.bridge.register_method("discoInfos", self.memory.disco._discoInfos) - self.bridge.register_method("discoItems", self.memory.disco._discoItems) - self.bridge.register_method("discoFindByFeatures", self._findByFeatures) - self.bridge.register_method("saveParamsTemplate", self.memory.save_xml) - self.bridge.register_method("loadParamsTemplate", self.memory.load_xml) - self.bridge.register_method("sessionInfosGet", self.getSessionInfos) - self.bridge.register_method("devicesInfosGet", self._getDevicesInfos) - self.bridge.register_method("namespacesGet", self.getNamespaces) - self.bridge.register_method("imageCheck", self._imageCheck) - self.bridge.register_method("imageResize", self._imageResize) - self.bridge.register_method("imageGeneratePreview", self._imageGeneratePreview) - self.bridge.register_method("imageConvert", self._imageConvert) + self.bridge.register_method("contact_add", self._add_contact) + self.bridge.register_method("contact_update", self._update_contact) + self.bridge.register_method("contact_del", self._del_contact) + self.bridge.register_method("roster_resync", self._roster_resync) + self.bridge.register_method("is_connected", self.is_connected) + self.bridge.register_method("action_launch", self.launch_callback) + self.bridge.register_method("actions_get", self.actions_get) + self.bridge.register_method("progress_get", self._progress_get) + self.bridge.register_method("progress_get_all", self._progress_get_all) + self.bridge.register_method("menus_get", self.get_menus) + self.bridge.register_method("menu_help_get", self.get_menu_help) + self.bridge.register_method("menu_launch", self._launch_menu) + self.bridge.register_method("disco_infos", self.memory.disco._disco_infos) + self.bridge.register_method("disco_items", self.memory.disco._disco_items) + self.bridge.register_method("disco_find_by_features", self._find_by_features) + self.bridge.register_method("params_template_save", self.memory.save_xml) + self.bridge.register_method("params_template_load", self.memory.load_xml) + 
self.bridge.register_method("session_infos_get", self.get_session_infos) + self.bridge.register_method("devices_infos_get", self._get_devices_infos) + self.bridge.register_method("namespaces_get", self.get_namespaces) + self.bridge.register_method("image_check", self._image_check) + self.bridge.register_method("image_resize", self._image_resize) + self.bridge.register_method("image_generate_preview", self._image_generate_preview) + self.bridge.register_method("image_convert", self._image_convert) await self.memory.initialise() @@ -232,14 +232,14 @@ except Exception as e: log.error(f"Could not initialize backend: {e}") sys.exit(1) - self._addBaseMenus() + self._add_base_menus() self.initialised.callback(None) log.info(_("Backend is ready")) # profile autoconnection must be done after self.initialised is called because - # startSession waits for it. - autoconnect_dict = await self.memory.storage.getIndParamValues( + # start_session waits for it. + autoconnect_dict = await self.memory.storage.get_ind_param_values( category='Connection', name='autoconnect_backend', ) profiles_autoconnect = [p for p, v in autoconnect_dict.items() if C.bool(v)] @@ -264,9 +264,9 @@ reason = result) ) - def _addBaseMenus(self): + def _add_base_menus(self): """Add base menus""" - encryption.EncryptionHandler._importMenus(self) + encryption.EncryptionHandler._import_menus(self) def _unimport_plugin(self, plugin_path): """remove a plugin from sys.modules if it is there""" @@ -276,7 +276,7 @@ pass def _import_plugins(self): - """Import all plugins found in plugins directory""" + """import all plugins found in plugins directory""" # FIXME: module imported but cancelled should be deleted # TODO: make this more generic and reusable in tools.common # FIXME: should use imp @@ -446,7 +446,7 @@ self.plugins[import_name]._info = plugin_info # TODO: test xmppclient presence and register handler parent - def pluginsUnload(self): + def plugins_unload(self): """Call unload method on every loaded plugin, if exists @return (D): A deferred which return None when all method have been called @@ -461,11 +461,11 @@ except AttributeError: continue else: - defers_list.append(utils.asDeferred(unload)) + defers_list.append(utils.as_deferred(unload)) return defers_list def _connect(self, profile_key, password="", options=None): - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) return defer.ensureDeferred(self.connect(profile, password, options)) async def connect( @@ -487,16 +487,16 @@ if options is None: options = {} - await self.memory.startSession(password, profile) + await self.memory.start_session(password, profile) - if self.isConnected(profile): + if self.is_connected(profile): log.info(_("already connected !")) return True - if self.memory.isComponent(profile): - await xmpp.SatXMPPComponent.startConnection(self, profile, max_retries) + if self.memory.is_component(profile): + await xmpp.SatXMPPComponent.start_connection(self, profile, max_retries) else: - await xmpp.SatXMPPClient.startConnection(self, profile, max_retries) + await xmpp.SatXMPPClient.start_connection(self, profile, max_retries) return False @@ -504,15 +504,15 @@ """disconnect from jabber server""" # FIXME: client should not be deleted if only disconnected # it shoud be deleted only when session is finished - if not self.isConnected(profile_key): - # isConnected is checked here and not on client + if not self.is_connected(profile_key): + # is_connected is checked here and not on client # because client is deleted 
when session is ended log.info(_("not connected !")) return defer.succeed(None) - client = self.getClient(profile_key) - return client.entityDisconnect() + client = self.get_client(profile_key) + return client.entity_disconnect() - def getFeatures(self, profile_key=C.PROF_KEY_NONE): + def features_get(self, profile_key=C.PROF_KEY_NONE): """Get available features Return list of activated plugins and plugin specific data @@ -528,7 +528,7 @@ try: # FIXME: there is no method yet to check profile session # as soon as one is implemented, it should be used here - self.getClient(profile_key) + self.get_client(profile_key) except KeyError: log.warning("Requesting features for a profile outside a session") profile_key = C.PROF_KEY_NONE @@ -538,14 +538,14 @@ features = [] for import_name, plugin in self.plugins.items(): try: - features_d = utils.asDeferred(plugin.getFeatures, profile_key) + features_d = utils.as_deferred(plugin.features_get, profile_key) except AttributeError: features_d = defer.succeed({}) features.append(features_d) d_list = defer.DeferredList(features) - def buildFeatures(result, import_names): + def build_features(result, import_names): assert len(result) == len(import_names) ret = {} for name, (success, data) in zip(import_names, result): @@ -560,30 +560,30 @@ ret[name] = {} return ret - d_list.addCallback(buildFeatures, list(self.plugins.keys())) + d_list.addCallback(build_features, list(self.plugins.keys())) return d_list - def _contactGet(self, entity_jid_s, profile_key): - client = self.getClient(profile_key) + def _contact_get(self, entity_jid_s, profile_key): + client = self.get_client(profile_key) entity_jid = jid.JID(entity_jid_s) - return defer.ensureDeferred(self.getContact(client, entity_jid)) + return defer.ensureDeferred(self.get_contact(client, entity_jid)) - async def getContact(self, client, entity_jid): + async def get_contact(self, client, entity_jid): # we want to be sure that roster has been received await client.roster.got_roster - item = client.roster.getItem(entity_jid) + item = client.roster.get_item(entity_jid) if item is None: raise exceptions.NotFound(f"{entity_jid} is not in roster!") - return (client.roster.getAttributes(item), list(item.groups)) + return (client.roster.get_attributes(item), list(item.groups)) - def getContacts(self, profile_key): - client = self.getClient(profile_key) + def contacts_get(self, profile_key): + client = self.get_client(profile_key) def got_roster(__): ret = [] - for item in client.roster.getItems(): # we get all items for client's roster + for item in client.roster.get_items(): # we get all items for client's roster # and convert them to expected format - attr = client.roster.getAttributes(item) + attr = client.roster.get_attributes(item) # we use full() and not userhost() because jid with resources are allowed # in roster, even if it's not common. 
ret.append([item.entity.full(), attr, list(item.groups)]) @@ -591,11 +591,11 @@ return client.roster.got_roster.addCallback(got_roster) - def getContactsFromGroup(self, group, profile_key): - client = self.getClient(profile_key) - return [jid_.full() for jid_ in client.roster.getJidsFromGroup(group)] + def contacts_get_from_group(self, group, profile_key): + client = self.get_client(profile_key) + return [jid_.full() for jid_ in client.roster.get_jids_from_group(group)] - def purgeEntity(self, profile): + def purge_entity(self, profile): """Remove reference to a profile client/component and purge cache the garbage collector can then free the memory @@ -605,7 +605,7 @@ except KeyError: log.error(_("Trying to remove reference to a client not referenced")) else: - self.memory.purgeProfileSession(profile) + self.memory.purge_profile_session(profile) def startService(self): self._init() @@ -613,7 +613,7 @@ def stopService(self): log.info("Salut aussi à Rantanplan") - return self.pluginsUnload() + return self.plugins_unload() def run(self): log.debug(_("running app")) @@ -625,16 +625,16 @@ ## Misc methods ## - def getJidNStream(self, profile_key): + def get_jid_n_stream(self, profile_key): """Convenient method to get jid and stream from profile key @return: tuple (jid, xmlstream) from profile, can be None""" - # TODO: deprecate this method (getClient is enough) - profile = self.memory.getProfileName(profile_key) - if not profile or not self.profiles[profile].isConnected(): + # TODO: deprecate this method (get_client is enough) + profile = self.memory.get_profile_name(profile_key) + if not profile or not self.profiles[profile].is_connected(): return (None, None) return (self.profiles[profile].jid, self.profiles[profile].xmlstream) - def getClient(self, profile_key: str) -> xmpp.SatXMPPClient: + def get_client(self, profile_key: str) -> xmpp.SatXMPPClient: """Convenient method to get client from profile key @return: the client @@ -642,7 +642,7 @@ @raise exceptions.NotFound: client is not available This happen if profile has not been used yet """ - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) if not profile: raise exceptions.ProfileKeyUnknown try: @@ -650,7 +650,7 @@ except KeyError: raise exceptions.NotFound(profile_key) - def getClients(self, profile_key): + def get_clients(self, profile_key): """Convenient method to get list of clients from profile key Manage list through profile_key like C.PROF_KEY_ALL @@ -660,7 +660,7 @@ if not profile_key: raise exceptions.DataError(_("profile_key must not be empty")) try: - profile = self.memory.getProfileName(profile_key, True) + profile = self.memory.get_profile_name(profile_key, True) except exceptions.ProfileUnknownError: return [] if profile == C.PROF_KEY_ALL: @@ -669,16 +669,16 @@ raise exceptions.ProfileKeyUnknown return [self.profiles[profile]] - def _getConfig(self, section, name): + def _get_config(self, section, name): """Get the main configuration option @param section: section of the config file (None or '' for DEFAULT) @param name: name of the option @return: unicode representation of the option """ - return str(self.memory.getConfig(section, name, "")) + return str(self.memory.config_get(section, name, "")) - def logErrback(self, failure_, msg=_("Unexpected error: {failure_}")): + def log_errback(self, failure_, msg=_("Unexpected error: {failure_}")): """Generic errback logging @param msg(unicode): error message ("failure_" key will be use for format) @@ -689,41 +689,41 @@ # namespaces 
- def registerNamespace(self, short_name, namespace): + def register_namespace(self, short_name, namespace): """associate a namespace to a short name""" if short_name in self.ns_map: raise exceptions.ConflictError("this short name is already used") log.debug(f"registering namespace {short_name} => {namespace}") self.ns_map[short_name] = namespace - def getNamespaces(self): + def get_namespaces(self): return self.ns_map - def getNamespace(self, short_name): + def get_namespace(self, short_name): try: return self.ns_map[short_name] except KeyError: raise exceptions.NotFound("namespace {short_name} is not registered" .format(short_name=short_name)) - def getSessionInfos(self, profile_key): + def get_session_infos(self, profile_key): """compile interesting data on current profile session""" - client = self.getClient(profile_key) + client = self.get_client(profile_key) data = { "jid": client.jid.full(), "started": str(int(client.started)) } return defer.succeed(data) - def _getDevicesInfos(self, bare_jid, profile_key): - client = self.getClient(profile_key) + def _get_devices_infos(self, bare_jid, profile_key): + client = self.get_client(profile_key) if not bare_jid: bare_jid = None - d = defer.ensureDeferred(self.getDevicesInfos(client, bare_jid)) + d = defer.ensureDeferred(self.get_devices_infos(client, bare_jid)) d.addCallback(lambda data: data_format.serialise(data)) return d - async def getDevicesInfos(self, client, bare_jid=None): + async def get_devices_infos(self, client, bare_jid=None): """compile data on an entity devices @param bare_jid(jid.JID, None): bare jid of entity to check @@ -737,7 +737,7 @@ bare_jid = own_jid else: bare_jid = jid.JID(bare_jid) - resources = self.memory.getAllResources(client, bare_jid) + resources = self.memory.get_all_resources(client, bare_jid) if bare_jid == own_jid: # our own jid is not stored in memory's cache resources.add(client.jid.resource) @@ -745,7 +745,7 @@ for resource in resources: res_jid = copy.copy(bare_jid) res_jid.resource = resource - cache_data = self.memory.getEntityData(client, res_jid) + cache_data = self.memory.entity_data_get(client, res_jid) res_data = { "resource": resource, } @@ -760,7 +760,7 @@ "statuses": presence.statuses, } - disco = await self.getDiscoInfos(client, res_jid) + disco = await self.get_disco_infos(client, res_jid) for (category, type_), name in disco.identities.items(): identities = res_data.setdefault('identities', []) @@ -776,22 +776,22 @@ # images - def _imageCheck(self, path): + def _image_check(self, path): report = image.check(self, path) return data_format.serialise(report) - def _imageResize(self, path, width, height): + def _image_resize(self, path, width, height): d = image.resize(path, (width, height)) d.addCallback(lambda new_image_path: str(new_image_path)) return d - def _imageGeneratePreview(self, path, profile_key): - client = self.getClient(profile_key) - d = defer.ensureDeferred(self.imageGeneratePreview(client, Path(path))) + def _image_generate_preview(self, path, profile_key): + client = self.get_client(profile_key) + d = defer.ensureDeferred(self.image_generate_preview(client, Path(path))) d.addCallback(lambda preview_path: str(preview_path)) return d - async def imageGeneratePreview(self, client, path): + async def image_generate_preview(self, client, path): """Helper method to generate in cache a preview of an image @param path(Path): path to the image @@ -807,11 +807,11 @@ path_hash = hashlib.sha256(str(path).encode()).hexdigest() uid = f"{path.stem}_{path_hash}_preview" filename = 
f"{uid}{path.suffix.lower()}" - metadata = client.cache.getMetadata(uid=uid) + metadata = client.cache.get_metadata(uid=uid) if metadata is not None: preview_path = metadata['path'] else: - with client.cache.cacheData( + with client.cache.cache_data( source='HOST_PREVIEW', uid=uid, filename=filename) as cache_f: @@ -824,16 +824,16 @@ return preview_path - def _imageConvert(self, source, dest, extra, profile_key): - client = self.getClient(profile_key) if profile_key else None + def _image_convert(self, source, dest, extra, profile_key): + client = self.get_client(profile_key) if profile_key else None source = Path(source) dest = None if not dest else Path(dest) extra = data_format.deserialise(extra) - d = defer.ensureDeferred(self.imageConvert(client, source, dest, extra)) + d = defer.ensureDeferred(self.image_convert(client, source, dest, extra)) d.addCallback(lambda dest_path: str(dest_path)) return d - async def imageConvert(self, client, source, dest=None, extra=None): + async def image_convert(self, client, source, dest=None, extra=None): """Helper method to convert an image from one format to an other @param client(SatClient, None): client to use for caching @@ -861,12 +861,12 @@ cache = self.common_cache else: cache = client.cache - metadata = cache.getMetadata(uid=uid) + metadata = cache.get_metadata(uid=uid) if metadata is not None: # there is already a conversion for this image in cache return metadata['path'] else: - with cache.cacheData( + with cache.cache_data( source='HOST_IMAGE_CONVERT', uid=uid, filename=filename) as cache_f: @@ -900,69 +900,69 @@ @param component: if True, path will be prefixed with C.COMPONENTS_DIR @return: path """ - local_dir = self.memory.getConfig("", "local_dir") + local_dir = self.memory.config_get("", "local_dir") if not local_dir: raise exceptions.InternalError("local_dir must be set") path_elts = [] if component: path_elts.append(C.COMPONENTS_DIR) - path_elts.append(regex.pathEscape(dir_name)) + path_elts.append(regex.path_escape(dir_name)) if extra_path: - path_elts.extend([regex.pathEscape(p) for p in extra_path]) + path_elts.extend([regex.path_escape(p) for p in extra_path]) if client is not None: - path_elts.append(regex.pathEscape(client.profile)) + path_elts.append(regex.path_escape(client.profile)) local_path = Path(*path_elts) local_path.mkdir(0o700, parents=True, exist_ok=True) return local_path ## Client management ## - def setParam(self, name, value, category, security_limit, profile_key): + def param_set(self, name, value, category, security_limit, profile_key): """set wanted paramater and notice observers""" - self.memory.setParam(name, value, category, security_limit, profile_key) + self.memory.param_set(name, value, category, security_limit, profile_key) - def isConnected(self, profile_key): + def is_connected(self, profile_key): """Return connection status of profile @param profile_key: key_word or profile name to determine profile name @return: True if connected """ - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) if not profile: log.error(_("asking connection status for a non-existant profile")) raise exceptions.ProfileUnknownError(profile_key) if profile not in self.profiles: return False - return self.profiles[profile].isConnected() + return self.profiles[profile].is_connected() ## Encryption ## - def registerEncryptionPlugin(self, *args, **kwargs): - return encryption.EncryptionHandler.registerPlugin(*args, **kwargs) + def register_encryption_plugin(self, *args, 
**kwargs): + return encryption.EncryptionHandler.register_plugin(*args, **kwargs) - def _messageEncryptionStart(self, to_jid_s, namespace, replace=False, + def _message_encryption_start(self, to_jid_s, namespace, replace=False, profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) return defer.ensureDeferred( client.encryption.start(to_jid, namespace or None, replace)) - def _messageEncryptionStop(self, to_jid_s, profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) + def _message_encryption_stop(self, to_jid_s, profile_key=C.PROF_KEY_NONE): + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) return defer.ensureDeferred( client.encryption.stop(to_jid)) - def _messageEncryptionGet(self, to_jid_s, profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) + def _message_encryption_get(self, to_jid_s, profile_key=C.PROF_KEY_NONE): + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) session_data = client.encryption.getSession(to_jid) - return client.encryption.getBridgeData(session_data) + return client.encryption.get_bridge_data(session_data) - def _encryptionNamespaceGet(self, name): - return encryption.EncryptionHandler.getNSFromName(name) + def _encryption_namespace_get(self, name): + return encryption.EncryptionHandler.get_ns_from_name(name) - def _encryptionPluginsGet(self): + def _encryption_plugins_get(self): plugins = encryption.EncryptionHandler.getPlugins() ret = [] for p in plugins: @@ -974,20 +974,20 @@ }) return data_format.serialise(ret) - def _encryptionTrustUIGet(self, to_jid_s, namespace, profile_key): - client = self.getClient(profile_key) + def _encryption_trust_ui_get(self, to_jid_s, namespace, profile_key): + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) d = defer.ensureDeferred( - client.encryption.getTrustUI(to_jid, namespace=namespace or None)) + client.encryption.get_trust_ui(to_jid, namespace=namespace or None)) d.addCallback(lambda xmlui: xmlui.toXml()) return d ## XMPP methods ## - def _messageSend( + def _message_send( self, to_jid_s, message, subject=None, mess_type="auto", extra_s="", profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) + client = self.get_client(profile_key) to_jid = jid.JID(to_jid_s) return client.sendMessage( to_jid, @@ -997,25 +997,25 @@ data_format.deserialise(extra_s) ) - def _setPresence(self, to="", show="", statuses=None, profile_key=C.PROF_KEY_NONE): - return self.setPresence(jid.JID(to) if to else None, show, statuses, profile_key) + def _set_presence(self, to="", show="", statuses=None, profile_key=C.PROF_KEY_NONE): + return self.presence_set(jid.JID(to) if to else None, show, statuses, profile_key) - def setPresence(self, to_jid=None, show="", statuses=None, + def presence_set(self, to_jid=None, show="", statuses=None, profile_key=C.PROF_KEY_NONE): """Send our presence information""" if statuses is None: statuses = {} - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) assert profile priority = int( - self.memory.getParamA("Priority", "Connection", profile_key=profile) + self.memory.param_get_a("Priority", "Connection", profile_key=profile) ) self.profiles[profile].presence.available(to_jid, show, statuses, priority) # XXX: FIXME: temporary fix to work around openfire 3.7.0 bug (presence is not # broadcasted to generating resource) if "" in statuses: statuses[C.PRESENCE_STATUSES_DEFAULT] = statuses.pop("") - 
self.bridge.presenceUpdate( + self.bridge.presence_update( self.profiles[profile].jid.full(), show, int(priority), statuses, profile ) @@ -1024,7 +1024,7 @@ @param subs_type: subsciption type (cf RFC 3921) @param raw_jid: unicode entity's jid @param profile_key: profile""" - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) assert profile to_jid = jid.JID(raw_jid) log.debug( @@ -1040,22 +1040,22 @@ elif subs_type == "unsubscribed": self.profiles[profile].presence.unsubscribed(to_jid) - def _addContact(self, to_jid_s, profile_key): - return self.addContact(jid.JID(to_jid_s), profile_key) + def _add_contact(self, to_jid_s, profile_key): + return self.contact_add(jid.JID(to_jid_s), profile_key) - def addContact(self, to_jid, profile_key): + def contact_add(self, to_jid, profile_key): """Add a contact in roster list""" - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) assert profile # presence is sufficient, as a roster push will be sent according to # RFC 6121 §3.1.2 self.profiles[profile].presence.subscribe(to_jid) - def _updateContact(self, to_jid_s, name, groups, profile_key): - client = self.getClient(profile_key) - return self.updateContact(client, jid.JID(to_jid_s), name, groups) + def _update_contact(self, to_jid_s, name, groups, profile_key): + client = self.get_client(profile_key) + return self.contact_update(client, jid.JID(to_jid_s), name, groups) - def updateContact(self, client, to_jid, name, groups): + def contact_update(self, client, to_jid, name, groups): """update a contact in roster list""" roster_item = RosterItem(to_jid) roster_item.name = name or u'' @@ -1064,18 +1064,18 @@ return return client.roster.setItem(roster_item) - def _delContact(self, to_jid_s, profile_key): - return self.delContact(jid.JID(to_jid_s), profile_key) + def _del_contact(self, to_jid_s, profile_key): + return self.contact_del(jid.JID(to_jid_s), profile_key) - def delContact(self, to_jid, profile_key): + def contact_del(self, to_jid, profile_key): """Remove contact from roster list""" - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) assert profile self.profiles[profile].presence.unsubscribe(to_jid) # is not asynchronous return self.profiles[profile].roster.removeItem(to_jid) - def _rosterResync(self, profile_key): - client = self.getClient(profile_key) + def _roster_resync(self, profile_key): + client = self.get_client(profile_key) return client.roster.resync() ## Discovery ## @@ -1085,39 +1085,39 @@ def hasFeature(self, *args, **kwargs): return self.memory.disco.hasFeature(*args, **kwargs) - def checkFeature(self, *args, **kwargs): - return self.memory.disco.checkFeature(*args, **kwargs) + def check_feature(self, *args, **kwargs): + return self.memory.disco.check_feature(*args, **kwargs) - def checkFeatures(self, *args, **kwargs): - return self.memory.disco.checkFeatures(*args, **kwargs) + def check_features(self, *args, **kwargs): + return self.memory.disco.check_features(*args, **kwargs) - def hasIdentity(self, *args, **kwargs): - return self.memory.disco.hasIdentity(*args, **kwargs) + def has_identity(self, *args, **kwargs): + return self.memory.disco.has_identity(*args, **kwargs) - def getDiscoInfos(self, *args, **kwargs): - return self.memory.disco.getInfos(*args, **kwargs) + def get_disco_infos(self, *args, **kwargs): + return self.memory.disco.get_infos(*args, **kwargs) def getDiscoItems(self, *args, **kwargs): - return 
self.memory.disco.getItems(*args, **kwargs) + return self.memory.disco.get_items(*args, **kwargs) - def findServiceEntity(self, *args, **kwargs): - return self.memory.disco.findServiceEntity(*args, **kwargs) + def find_service_entity(self, *args, **kwargs): + return self.memory.disco.find_service_entity(*args, **kwargs) - def findServiceEntities(self, *args, **kwargs): - return self.memory.disco.findServiceEntities(*args, **kwargs) + def find_service_entities(self, *args, **kwargs): + return self.memory.disco.find_service_entities(*args, **kwargs) - def findFeaturesSet(self, *args, **kwargs): - return self.memory.disco.findFeaturesSet(*args, **kwargs) + def find_features_set(self, *args, **kwargs): + return self.memory.disco.find_features_set(*args, **kwargs) - def _findByFeatures(self, namespaces, identities, bare_jids, service, roster, own_jid, + def _find_by_features(self, namespaces, identities, bare_jids, service, roster, own_jid, local_device, profile_key): - client = self.getClient(profile_key) + client = self.get_client(profile_key) identities = [tuple(i) for i in identities] if identities else None - return defer.ensureDeferred(self.findByFeatures( + return defer.ensureDeferred(self.find_by_features( client, namespaces, identities, bare_jids, service, roster, own_jid, local_device)) - async def findByFeatures( + async def find_by_features( self, client: xmpp.SatXMPPEntity, namespaces: List[str], @@ -1164,10 +1164,10 @@ found_own = {} found_roster = {} if service: - services_jids = await self.findFeaturesSet(client, namespaces) + services_jids = await self.find_features_set(client, namespaces) services_jids = list(services_jids) # we need a list to map results below services_infos = await defer.DeferredList( - [self.getDiscoInfos(client, service_jid) for service_jid in services_jids] + [self.get_disco_infos(client, service_jid) for service_jid in services_jids] ) for idx, (success, infos) in enumerate(services_infos): @@ -1190,7 +1190,7 @@ if own_jid: to_find.append((found_own, [client.jid.userhostJID()])) if roster: - to_find.append((found_roster, client.roster.getJids())) + to_find.append((found_roster, client.roster.get_jids())) for found, jids in to_find: full_jids = [] @@ -1206,7 +1206,7 @@ resources = [None] else: try: - resources = self.memory.getAvailableResources(client, jid_) + resources = self.memory.get_available_resources(client, jid_) except exceptions.UnknownEntityError: continue if not resources and jid_ == client.jid.userhostJID() and own_jid: @@ -1220,7 +1220,7 @@ continue full_jids.append(full_jid) - disco_defers.append(self.getDiscoInfos(client, full_jid)) + disco_defers.append(self.get_disco_infos(client, full_jid)) d_list = defer.DeferredList(disco_defers) # XXX: 10 seconds may be too low for slow connections (e.g. 
mobiles) @@ -1251,18 +1251,18 @@ ## Generic HMI ## - def _killAction(self, keep_id, client): + def _kill_action(self, keep_id, client): log.debug("Killing action {} for timeout".format(keep_id)) client.actions[keep_id] - def actionNew( + def action_new( self, action_data, security_limit=C.NO_SECURITY_LIMIT, keep_id=None, profile=C.PROF_KEY_NONE, ): - """Shortcut to bridge.actionNew which generate and id and keep for retrieval + """Shortcut to bridge.action_new which generate and id and keep for retrieval @param action_data(dict): action data (see bridge documentation) @param security_limit: %(doc_security_limit)s @@ -1273,44 +1273,44 @@ """ id_ = str(uuid.uuid4()) if keep_id is not None: - client = self.getClient(profile) - action_timer = reactor.callLater(60 * 30, self._killAction, keep_id, client) + client = self.get_client(profile) + action_timer = reactor.callLater(60 * 30, self._kill_action, keep_id, client) client.actions[keep_id] = (action_data, id_, security_limit, action_timer) - self.bridge.actionNew(action_data, id_, security_limit, profile) + self.bridge.action_new(action_data, id_, security_limit, profile) - def actionsGet(self, profile): + def actions_get(self, profile): """Return current non answered actions @param profile: %(doc_profile)s """ - client = self.getClient(profile) + client = self.get_client(profile) return [action_tuple[:-1] for action_tuple in client.actions.values()] - def registerProgressCb( + def register_progress_cb( self, progress_id, callback, metadata=None, profile=C.PROF_KEY_NONE ): """Register a callback called when progress is requested for id""" if metadata is None: metadata = {} - client = self.getClient(profile) + client = self.get_client(profile) if progress_id in client._progress_cb: raise exceptions.ConflictError("Progress ID is not unique !") client._progress_cb[progress_id] = (callback, metadata) - def removeProgressCb(self, progress_id, profile): + def remove_progress_cb(self, progress_id, profile): """Remove a progress callback""" - client = self.getClient(profile) + client = self.get_client(profile) try: del client._progress_cb[progress_id] except KeyError: log.error(_("Trying to remove an unknow progress callback")) - def _progressGet(self, progress_id, profile): - data = self.progressGet(progress_id, profile) + def _progress_get(self, progress_id, profile): + data = self.progress_get(progress_id, profile) return {k: str(v) for k, v in data.items()} - def progressGet(self, progress_id, profile): + def progress_get(self, progress_id, profile): """Return a dict with progress information @param progress_id(unicode): unique id of the progressing element @@ -1321,22 +1321,22 @@ if id doesn't exists (may be a finished progression), and empty dict is returned """ - client = self.getClient(profile) + client = self.get_client(profile) try: data = client._progress_cb[progress_id][0](progress_id, profile) except KeyError: data = {} return data - def _progressGetAll(self, profile_key): - progress_all = self.progressGetAll(profile_key) + def _progress_get_all(self, profile_key): + progress_all = self.progress_get_all(profile_key) for profile, progress_dict in progress_all.items(): for progress_id, data in progress_dict.items(): for key, value in data.items(): data[key] = str(value) return progress_all - def progressGetAllMetadata(self, profile_key): + def progress_get_all_metadata(self, profile_key): """Return all progress metadata at once @param profile_key: %(doc_profile)s @@ -1344,9 +1344,9 @@ returned @return (dict[dict[dict]]): a dict which map 
profile to progress_dict progress_dict map progress_id to progress_data - progress_metadata is the same dict as sent by [progressStarted] + progress_metadata is the same dict as sent by [progress_started] """ - clients = self.getClients(profile_key) + clients = self.get_clients(profile_key) progress_all = {} for client in clients: profile = client.profile @@ -1359,16 +1359,16 @@ progress_dict[progress_id] = progress_metadata return progress_all - def progressGetAll(self, profile_key): + def progress_get_all(self, profile_key): """Return all progress status at once @param profile_key: %(doc_profile)s if C.PROF_KEY_ALL is used, all progress status from all profiles are returned @return (dict[dict[dict]]): a dict which map profile to progress_dict progress_dict map progress_id to progress_data - progress_data is the same dict as returned by [progressGet] + progress_data is the same dict as returned by [progress_get] """ - clients = self.getClients(profile_key) + clients = self.get_clients(profile_key) progress_all = {} for client in clients: profile = client.profile @@ -1378,7 +1378,7 @@ progress_dict[progress_id] = progress_cb(progress_id, profile) return progress_all - def registerCallback(self, callback, *args, **kwargs): + def register_callback(self, callback, *args, **kwargs): """Register a callback. @param callback(callable): method to call @@ -1399,23 +1399,23 @@ if "one_shot" in kwargs: # One Shot callback are removed after 30 min - def purgeCallback(): + def purge_callback(): try: self.removeCallback(callback_id) except KeyError: pass - reactor.callLater(1800, purgeCallback) + reactor.callLater(1800, purge_callback) return callback_id def removeCallback(self, callback_id): """ Remove a previously registered callback - @param callback_id: id returned by [registerCallback] """ + @param callback_id: id returned by [register_callback] """ log.debug("Removing callback [%s]" % callback_id) del self._cb_map[callback_id] - def launchCallback(self, callback_id, data=None, profile_key=C.PROF_KEY_NONE): + def launch_callback(self, callback_id, data=None, profile_key=C.PROF_KEY_NONE): """Launch a specific callback @param callback_id: id of the action (callback) to launch @@ -1430,10 +1430,10 @@ """ # FIXME: security limit need to be checked here try: - client = self.getClient(profile_key) + client = self.get_client(profile_key) except exceptions.NotFound: # client is not available yet - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) if not profile: raise exceptions.ProfileUnknownError( _("trying to launch action with a non-existant profile") @@ -1468,11 +1468,11 @@ if kwargs.pop("one_shot", False): self.removeCallback(callback_id) - return utils.asDeferred(callback, *args, **kwargs) + return utils.as_deferred(callback, *args, **kwargs) # Menus management - def _getMenuCanonicalPath(self, path): + def _get_menu_canonical_path(self, path): """give canonical form of path canonical form is a tuple of the path were every element is stripped and lowercase @@ -1481,7 +1481,7 @@ """ return tuple((p.lower().strip() for p in path)) - def importMenu(self, path, callback, security_limit=C.NO_SECURITY_LIMIT, + def import_menu(self, path, callback, security_limit=C.NO_SECURITY_LIMIT, help_string="", type_=C.MENU_GLOBAL): r"""register a new menu for frontends @@ -1491,9 +1491,9 @@ untranslated/lower case path can be used to identity a menu, for this reason it must be unique independently of case. 
@param callback(callable): method to be called when menuitem is selected, callable - or a callback id (string) as returned by [registerCallback] + or a callback id (string) as returned by [register_callback] @param security_limit(int): %(doc_security_limit)s - /!\ security_limit MUST be added to data in launchCallback if used #TODO + /!\ security_limit MUST be added to data in launch_callback if used #TODO @param help_string(unicode): string used to indicate what the menu do (can be show as a tooltip). /!\ use D_() instead of _() for translations @@ -1517,7 +1517,7 @@ """ if callable(callback): - callback_id = self.registerCallback(callback, with_data=True) + callback_id = self.register_callback(callback, with_data=True) elif isinstance(callback, str): # The callback is already registered callback_id = callback @@ -1535,7 +1535,7 @@ _("A menu with the same path and type already exists") ) - path_canonical = self._getMenuCanonicalPath(path) + path_canonical = self._get_menu_canonical_path(path) menu_key = (type_, path_canonical) if menu_key in self._menus_paths: @@ -1558,7 +1558,7 @@ return callback_id - def getMenus(self, language="", security_limit=C.NO_SECURITY_LIMIT): + def get_menus(self, language="", security_limit=C.NO_SECURITY_LIMIT): """Return all menus registered @param language: language used for translation, or empty string for default @@ -1582,20 +1582,20 @@ or menu_security_limit > security_limit ): continue - languageSwitch(language) + language_switch(language) path_i18n = [_(elt) for elt in path] - languageSwitch() + language_switch() extra = {} # TODO: manage extra data like icon ret.append((menu_id, type_, path, path_i18n, extra)) return ret - def _launchMenu(self, menu_type, path, data=None, security_limit=C.NO_SECURITY_LIMIT, + def _launch_menu(self, menu_type, path, data=None, security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE): - client = self.getClient(profile_key) - return self.launchMenu(client, menu_type, path, data, security_limit) + client = self.get_client(profile_key) + return self.launch_menu(client, menu_type, path, data, security_limit) - def launchMenu(self, client, menu_type, path, data=None, + def launch_menu(self, client, menu_type, path, data=None, security_limit=C.NO_SECURITY_LIMIT): """launch action a menu action @@ -1606,7 +1606,7 @@ """ # FIXME: manage security_limit here # defaut security limit should be high instead of C.NO_SECURITY_LIMIT - canonical_path = self._getMenuCanonicalPath(path) + canonical_path = self._get_menu_canonical_path(path) menu_key = (menu_type, canonical_path) try: callback_id = self._menus_paths[menu_key] @@ -1616,9 +1616,9 @@ path=canonical_path, menu_type=menu_type ) ) - return self.launchCallback(callback_id, data, client.profile) + return self.launch_callback(callback_id, data, client.profile) - def getMenuHelp(self, menu_id, language=""): + def get_menu_help(self, menu_id, language=""): """return the help string of the menu @param menu_id: id of the menu (same as callback_id) @@ -1630,7 +1630,7 @@ menu_data = self._menus[menu_id] except KeyError: raise exceptions.DataError("Trying to access an unknown menu") - languageSwitch(language) + language_switch(language) help_string = _(menu_data["help_string"]) - languageSwitch() + language_switch() return help_string
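The `sat/core/sat_main.py` hunk above is almost entirely mechanical: every bridge method keeps its signature and behaviour, only the name registered on the bridge changes. For an external script or frontend still using the old camelCase names, the correspondence can be kept in a small table. The sketch below is illustrative only and is not part of this changeset; the `OLD_TO_NEW_BRIDGE_NAMES` table and the helper are hypothetical, but every name pair in it is taken verbatim from the `register_method` calls above (the full list is the set of registrations shown there):

# hypothetical migration helper, not part of the changeset
OLD_TO_NEW_BRIDGE_NAMES = {
    "contactGet": "contact_get",
    "getContacts": "contacts_get",
    "getContactsFromGroup": "contacts_get_from_group",
    "getMainResource": "main_resource_get",
    "messageSend": "message_send",
    "setParam": "param_set",
    "getParamA": "param_get_a",
    "asyncGetParamA": "param_get_a_async",
    "isConnected": "is_connected",
    "launchAction": "action_launch",
    "actionsGet": "actions_get",
    "progressGet": "progress_get",
    "progressGetAll": "progress_get_all",
    "menusGet": "menus_get",
    "menuLaunch": "menu_launch",
    "discoInfos": "disco_infos",
    "discoItems": "disco_items",
    "sessionInfosGet": "session_infos_get",
    "namespacesGet": "namespaces_get",
}


def to_snake_case_name(old_name: str) -> str:
    """Return the post-refactoring bridge method name for an old camelCase one.

    Names missing from the table (e.g. already migrated callers) are returned
    unchanged.
    """
    return OLD_TO_NEW_BRIDGE_NAMES.get(old_name, old_name)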
--- a/sat/core/xmpp.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/core/xmpp.py Sat Apr 08 13:54:42 2023 +0200 @@ -89,20 +89,20 @@ class SatXMPPEntity(core_types.SatXMPPEntity): """Common code for Client and Component""" - # profile is added there when startConnection begins and removed when it is finished + # profile is added there when start_connection begins and removed when it is finished profiles_connecting = set() def __init__(self, host_app, profile, max_retries): factory = self.factory - # we monkey patch clientConnectionLost to handle networkEnabled/networkDisabled + # we monkey patch clientConnectionLost to handle network_enabled/network_disabled # and to allow plugins to tune reconnection mechanism clientConnectionFailed_ori = factory.clientConnectionFailed clientConnectionLost_ori = factory.clientConnectionLost factory.clientConnectionFailed = partial( - self.connectionTerminated, term_type="failed", cb=clientConnectionFailed_ori) + self.connection_terminated, term_type="failed", cb=clientConnectionFailed_ori) factory.clientConnectionLost = partial( - self.connectionTerminated, term_type="lost", cb=clientConnectionLost_ori) + self.connection_terminated, term_type="lost", cb=clientConnectionLost_ori) factory.maxRetries = max_retries factory.maxDelay = 30 @@ -129,41 +129,41 @@ ## initialisation ## - async def _callConnectionTriggers(self, connection_timer): + async def _call_connection_triggers(self, connection_timer): """Call conneting trigger prepare connected trigger @param plugins(iterable): plugins to use @return (list[object, callable]): plugin to trigger tuples with: - plugin instance - - profileConnected* triggers (to call after connection) + - profile_connected* triggers (to call after connection) """ plugin_conn_cb = [] - for plugin in self._getPluginsList(): + for plugin in self._get_plugins_list(): # we check if plugin handle client mode if plugin.is_handler: - plugin.getHandler(self).setHandlerParent(self) + plugin.get_handler(self).setHandlerParent(self) - # profileConnecting/profileConnected methods handling + # profile_connecting/profile_connected methods handling timer = connection_timer[plugin] = { "total": 0 } # profile connecting is called right now (before actually starting client) - connecting_cb = getattr(plugin, "profileConnecting", None) + connecting_cb = getattr(plugin, "profile_connecting", None) if connecting_cb is not None: connecting_start = time.time() - await utils.asDeferred(connecting_cb, self) + await utils.as_deferred(connecting_cb, self) timer["connecting"] = time.time() - connecting_start timer["total"] += timer["connecting"] # profile connected is called after client is ready and roster is got - connected_cb = getattr(plugin, "profileConnected", None) + connected_cb = getattr(plugin, "profile_connected", None) if connected_cb is not None: plugin_conn_cb.append((plugin, connected_cb)) return plugin_conn_cb - def _getPluginsList(self): + def _get_plugins_list(self): """Return list of plugin to use need to be implemented by subclasses @@ -172,10 +172,10 @@ """ raise NotImplementedError - def _createSubProtocols(self): + def _create_sub_protocols(self): return - def entityConnected(self): + def entity_connected(self): """Called once connection is done may return a Deferred, to perform initialisation tasks @@ -189,12 +189,12 @@ timer: Dict[str, float] ) -> None: connected_start = time.time() - await utils.asDeferred(callback, entity) + await utils.as_deferred(callback, entity) timer["connected"] = time.time() - connected_start timer["total"] += 
timer["connected"] @classmethod - async def startConnection(cls, host, profile, max_retries): + async def start_connection(cls, host, profile, max_retries): """instantiate the entity and start the connection""" # FIXME: reconnection doesn't seems to be handled correclty # (client is deleted then recreated from scratch) @@ -208,7 +208,7 @@ try: try: port = int( - host.memory.getParamA( + host.memory.param_get_a( C.FORCE_PORT_PARAM, "Connection", profile_key=profile ) ) @@ -218,11 +218,11 @@ None ) # will use default value 5222 or be retrieved from a DNS SRV record - password = await host.memory.asyncGetParamA( + password = await host.memory.param_get_a_async( "Password", "Connection", profile_key=profile ) - entity_jid_s = await host.memory.asyncGetParamA( + entity_jid_s = await host.memory.param_get_a_async( "JabberID", "Connection", profile_key=profile) entity_jid = jid.JID(entity_jid_s) @@ -231,13 +231,13 @@ # server returned one, as it will then stay stable in case of # reconnection. we only do that for client and if there is a user part, to # let server decide for anonymous login - resource_dict = await host.memory.storage.getPrivates( + resource_dict = await host.memory.storage.get_privates( "core:xmpp", ["resource"] , profile=profile) try: resource = resource_dict["resource"] except KeyError: resource = f"{C.APP_NAME_FILE}.{shortuuid.uuid()}" - await host.memory.storage.setPrivateValue( + await host.memory.storage.set_private_value( "core:xmpp", "resource", resource, profile=profile) log.info(_("We'll use the stable resource {resource}").format( @@ -245,7 +245,7 @@ entity_jid.resource = resource if profile in host.profiles: - if host.profiles[profile].isConnected(): + if host.profiles[profile].is_connected(): raise exceptions.InternalError( f"There is already a connected profile of name {profile!r} in " f"host") @@ -254,14 +254,14 @@ del host.profiles[profile] entity = host.profiles[profile] = cls( host, profile, entity_jid, password, - host.memory.getParamA(C.FORCE_SERVER_PARAM, "Connection", + host.memory.param_get_a(C.FORCE_SERVER_PARAM, "Connection", profile_key=profile) or None, port, max_retries, ) - await entity.encryption.loadSessions() + await entity.encryption.load_sessions() - entity._createSubProtocols() + entity._create_sub_protocols() entity.fallBack = SatFallbackHandler(host) entity.fallBack.setHandlerParent(entity) @@ -275,15 +275,15 @@ log.debug(_("setting plugins parents")) connection_timer: Dict[str, Dict[str, float]] = {} - plugin_conn_cb = await entity._callConnectionTriggers(connection_timer) + plugin_conn_cb = await entity._call_connection_triggers(connection_timer) entity.startService() await entity.conn_deferred - await defer.maybeDeferred(entity.entityConnected) + await defer.maybeDeferred(entity.entity_connected) - # Call profileConnected callback for all plugins, + # Call profile_connected callback for all plugins, # and print error message if any of them fails conn_cb_list = [] for plugin, callback in plugin_conn_cb: @@ -296,7 +296,7 @@ ) list_d = defer.DeferredList(conn_cb_list) - def logPluginResults(results): + def log_plugin_results(results): if not results: log.info("no plugin loaded") return @@ -347,22 +347,22 @@ ) await list_d.addCallback( - logPluginResults + log_plugin_results ) # FIXME: we should have a timeout here, and a way to know if a plugin freeze # TODO: mesure launch time of each plugin finally: cls.profiles_connecting.remove(profile) - def _disconnectionCb(self, __): + def _disconnection_cb(self, __): self._connected_d = None - def 
_disconnectionEb(self, failure_): + def _disconnection_eb(self, failure_): log.error(_("Error while disconnecting: {}".format(failure_))) def _authd(self, xmlstream): super(SatXMPPEntity, self)._authd(xmlstream) log.debug(_("{profile} identified").format(profile=self.profile)) - self.streamInitialized() + self.stream_initialized() def _finish_connection(self, __): if self.conn_deferred.called: @@ -371,14 +371,14 @@ else: self.conn_deferred.callback(None) - def streamInitialized(self): + def stream_initialized(self): """Called after _authd""" log.debug(_("XML stream is initialized")) if not self.host_app.trigger.point("xml_init", self): return - self.postStreamInit() + self.post_stream_init() - def postStreamInit(self): + def post_stream_init(self): """Workflow after stream initalisation.""" log.info( _("********** [{profile}] CONNECTED **********").format(profile=self.profile) @@ -387,9 +387,9 @@ # the following Deferred is used to know when we are connected # so we need to be set it to None when connection is lost self._connected_d = defer.Deferred() - self._connected_d.addCallback(self._cleanConnection) - self._connected_d.addCallback(self._disconnectionCb) - self._connected_d.addErrback(self._disconnectionEb) + self._connected_d.addCallback(self._clean_connection) + self._connected_d.addCallback(self._disconnection_cb) + self._connected_d.addErrback(self._disconnection_eb) # we send the signal to the clients self.host_app.bridge.connected(self.jid.full(), self.profile) @@ -421,7 +421,7 @@ ## connection ## - def connectionTerminated(self, connector, reason, term_type, cb): + def connection_terminated(self, connector, reason, term_type, cb): """Display disconnection reason, and call factory method This method is monkey patched to factory, allowing plugins to handle finely @@ -453,7 +453,7 @@ return return cb(connector, reason) - def networkDisabled(self): + def network_disabled(self): """Indicate that network has been completely disabled In other words, internet is not available anymore and transport must be stopped. @@ -466,7 +466,7 @@ if self.xmlstream is not None: self.xmlstream.transport.abortConnection() - def networkEnabled(self): + def network_enabled(self): """Indicate that network has been (re)enabled This happens when e.g. user activate WIFI connection. 
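In `sat/core/xmpp.py`, the per-plugin lifecycle hooks that `_call_connection_triggers` looks up by name are renamed as well: `profileConnecting`, `profileConnected` and (in `_clean_connection`, further down) `profileDisconnected` become `profile_connecting`, `profile_connected` and `profile_disconnected`. They remain optional and may return a plain value, a Deferred or a coroutine, since they are wrapped with `utils.as_deferred`. A minimal, hypothetical plugin sketch showing the hook names a plugin must now expose (the class, its print statements and the `features_get` return value are invented for illustration; only the hook names and the `client`/`profile_key` arguments come from the code above):

class HypotheticalPlugin:
    """Sketch of a plugin implementing the renamed lifecycle hooks."""

    def profile_connecting(self, client):
        # called by _call_connection_triggers before the connection is started;
        # may return a Deferred or be a coroutine (wrapped with utils.as_deferred)
        print(f"preparing session for {client.profile}")

    async def profile_connected(self, client):
        # called once the client is ready (and, for clients, the roster is received)
        print(f"{client.profile} connected as {client.jid.full()}")

    def profile_disconnected(self, client):
        # called from _clean_connection when the session ends
        print(f"{client.profile} disconnected")

    def features_get(self, profile_key):
        # renamed from getFeatures; Host.features_get collects this dict per plugin
        return {"version": "0.1"}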
@@ -475,7 +475,7 @@ connector = self._saved_connector network_disabled = self._network_disabled except AttributeError: - # connection has not been stopped by networkDisabled + # connection has not been stopped by network_disabled # we don't have to restart it log.debug(f"no connection to restart [{self.profile}]") return @@ -496,12 +496,12 @@ self.host_app.trigger.point( "stream_hooks", self, receive_hooks, send_hooks) for hook in receive_hooks: - xs.addHook(C.STREAM_HOOK_RECEIVE, hook) + xs.add_hook(C.STREAM_HOOK_RECEIVE, hook) for hook in send_hooks: - xs.addHook(C.STREAM_HOOK_SEND, hook) + xs.add_hook(C.STREAM_HOOK_SEND, hook) super(SatXMPPEntity, self)._connected(xs) - def disconnectProfile(self, reason): + def disconnect_profile(self, reason): if self._connected_d is not None: self.host_app.bridge.disconnected( self.profile @@ -516,7 +516,7 @@ log.debug("continueTrying not set, purging entity") self._connected_d.callback(None) # and we remove references to this client - self.host_app.purgeEntity(self.profile) + self.host_app.purge_entity(self.profile) if not self.conn_deferred.called: if reason is None: @@ -535,7 +535,7 @@ try: # with invalid certificate, we should not retry to connect # so we delete saved connector to avoid reconnection if - # networkEnabled is called. + # network_enabled is called. del self._saved_connector except AttributeError: pass @@ -547,21 +547,21 @@ super(SatXMPPEntity, self)._disconnected(reason) if not self.host_app.trigger.point("disconnected", self, reason): return - self.disconnectProfile(reason) + self.disconnect_profile(reason) @defer.inlineCallbacks - def _cleanConnection(self, __): + def _clean_connection(self, __): """method called on disconnection - used to call profileDisconnected* triggers + used to call profile_disconnected* triggers """ - trigger_name = "profileDisconnected" - for plugin in self._getPluginsList(): + trigger_name = "profile_disconnected" + for plugin in self._get_plugins_list(): disconnected_cb = getattr(plugin, trigger_name, None) if disconnected_cb is not None: yield disconnected_cb(self) - def isConnected(self): + def is_connected(self): """Return True is client is fully connected client is considered fully connected if transport is started and all plugins @@ -574,7 +574,7 @@ return self._connected_d is not None and transport_connected - def entityDisconnect(self): + def entity_disconnect(self): if not self.host_app.trigger.point("disconnecting", self): return log.info(_("Disconnecting...")) @@ -609,7 +609,7 @@ ).toResponse(iq_elt) self.xmlstream.send(iq_error_elt) - def generateMessageXML( + def generate_message_xml( self, data: core_types.MessageData, post_xml_treatments: Optional[defer.Deferred] = None @@ -666,9 +666,9 @@ @property def is_admin(self) -> bool: """True if a client is an administrator with extra privileges""" - return self.host_app.memory.isAdmin(self.profile) + return self.host_app.memory.is_admin(self.profile) - def addPostXmlCallbacks(self, post_xml_treatments): + def add_post_xml_callbacks(self, post_xml_treatments): """Used to add class level callbacks at the end of the workflow @param post_xml_treatments(D): the same Deferred as in sendMessage trigger @@ -685,20 +685,20 @@ # (out of band transmission for instance). 
# e2e should have a priority of 0 here, and out of band transmission # a lower priority - if not (await self.host_app.trigger.asyncPoint("send", self, obj)): + if not (await self.host_app.trigger.async_point("send", self, obj)): return super().send(obj) def send(self, obj): defer.ensureDeferred(self.a_send(obj)) - async def sendMessageData(self, mess_data): + async def send_message_data(self, mess_data): """Convenient method to send message data to stream This method will send mess_data[u'xml'] to stream, but a trigger is there The trigger can't be cancelled, it's a good place for e2e encryption which don't handle full stanza encryption - This trigger can return a Deferred (it's an asyncPoint) + This trigger can return a Deferred (it's an async_point) @param mess_data(dict): message data as constructed by onMessage workflow @return (dict): mess_data (so it can be used in a deferred chain) """ @@ -707,7 +707,7 @@ # This is intented for e2e encryption which doesn't do full stanza # encryption (e.g. OTR) # This trigger point can't cancel the method - await self.host_app.trigger.asyncPoint("sendMessageData", self, mess_data, + await self.host_app.trigger.async_point("send_message_data", self, mess_data, triggers_no_cancel=True) await self.a_send(mess_data["xml"]) return mess_data @@ -762,7 +762,7 @@ elif not data["to"].resource: # we may have a groupchat message, we check if the we know this jid try: - entity_type = self.host_app.memory.getEntityDatum( + entity_type = self.host_app.memory.get_entity_datum( self, data["to"], C.ENTITY_TYPE ) # FIXME: should entity_type manage resources ? @@ -783,7 +783,7 @@ if not no_trigger and not send_only: # is the session encrypted? If so we indicate it in data - self.encryption.setEncryptionFlag(data) + self.encryption.set_encryption_flag(data) if not self.host_app.trigger.point( "sendMessage" + self.trigger_suffix, @@ -797,27 +797,27 @@ log.debug(_("Sending message (type {type}, to {to})") .format(type=data["type"], to=to_jid.full())) - pre_xml_treatments.addCallback(lambda __: self.generateMessageXML(data, post_xml_treatments)) + pre_xml_treatments.addCallback(lambda __: self.generate_message_xml(data, post_xml_treatments)) pre_xml_treatments.addCallback(lambda __: post_xml_treatments) - pre_xml_treatments.addErrback(self._cancelErrorTrap) + pre_xml_treatments.addErrback(self._cancel_error_trap) post_xml_treatments.addCallback( - lambda __: defer.ensureDeferred(self.sendMessageData(data)) + lambda __: defer.ensureDeferred(self.send_message_data(data)) ) if send_only: log.debug(_("Triggers, storage and echo have been inhibited by the " "'send_only' parameter")) else: - self.addPostXmlCallbacks(post_xml_treatments) - post_xml_treatments.addErrback(self._cancelErrorTrap) - post_xml_treatments.addErrback(self.host_app.logErrback) + self.add_post_xml_callbacks(post_xml_treatments) + post_xml_treatments.addErrback(self._cancel_error_trap) + post_xml_treatments.addErrback(self.host_app.log_errback) pre_xml_treatments.callback(data) return pre_xml_treatments - def _cancelErrorTrap(self, failure): + def _cancel_error_trap(self, failure): """A message sending can be cancelled by a plugin treatment""" failure.trap(exceptions.CancelError) - def isMessagePrintable(self, mess_data): + def is_message_printable(self, mess_data): """Return True if a message contain payload to show in frontends""" return ( mess_data["message"] or mess_data["subject"] @@ -825,7 +825,7 @@ or mess_data["type"] == C.MESS_TYPE_INFO ) - async def messageAddToHistory(self, data): + async def 
message_add_to_history(self, data): """Store message into database (for local history) @param data: message data dictionnary @@ -836,22 +836,22 @@ # and they will be added then # we need a message to store - if self.isMessagePrintable(data): - await self.host_app.memory.addToHistory(self, data) + if self.is_message_printable(data): + await self.host_app.memory.add_to_history(self, data) else: log.warning( "No message found" ) # empty body should be managed by plugins before this point return data - def messageGetBridgeArgs(self, data): + def message_get_bridge_args(self, data): """Generate args to use with bridge from data dict""" return (data["uid"], data["timestamp"], data["from"].full(), data["to"].full(), data["message"], data["subject"], data["type"], data_format.serialise(data["extra"])) - def messageSendToBridge(self, data): + def message_send_to_bridge(self, data): """Send message to bridge, so frontends can display it @param data: message data dictionnary @@ -862,11 +862,11 @@ # and they will be added the # we need a message to send something - if self.isMessagePrintable(data): + if self.is_message_printable(data): # We send back the message, so all frontends are aware of it - self.host_app.bridge.messageNew( - *self.messageGetBridgeArgs(data), + self.host_app.bridge.message_new( + *self.message_get_bridge_args(data), profile=self.profile ) else: @@ -913,7 +913,7 @@ # for now we consider Android devices to be always phones self.identities = [disco.DiscoIdentity("client", "phone", C.APP_NAME)] - hosts_map = host_app.memory.getConfig(None, "hosts_dict", {}) + hosts_map = host_app.memory.config_get(None, "hosts_dict", {}) if host is None and user_jid.host in hosts_map: host_data = hosts_map[user_jid.host] if isinstance(host_data, str): @@ -934,7 +934,7 @@ .format(host_ori=user_jid.host, host=host, port=port) ) - self.check_certificate = host_app.memory.getParamA( + self.check_certificate = host_app.memory.param_get_a( "check_certificate", "Connection", profile_key=profile) if self.check_certificate: @@ -954,19 +954,19 @@ "somebody may be spying on you. 
If you have no good reason to disable " "certificate validation, please activate \"Check certificate\" in your " "settings in \"Connection\" tab.")) - xml_tools.quickNote(host_app, self, msg, _("Security notice"), + xml_tools.quick_note(host_app, self, msg, _("Security notice"), level = C.XMLUI_DATA_LVL_WARNING) @property def server_jid(self): return jid.JID(self.jid.host) - def _getPluginsList(self): + def _get_plugins_list(self): for p in self.host_app.plugins.values(): if C.PLUG_MODE_CLIENT in p._info["modes"]: yield p - def _createSubProtocols(self): + def _create_sub_protocols(self): self.messageProt = SatMessageProtocol(self.host_app) self.messageProt.setHandlerParent(self) @@ -977,26 +977,26 @@ self.presence.setHandlerParent(self) @classmethod - async def startConnection(cls, host, profile, max_retries): + async def start_connection(cls, host, profile, max_retries): try: - await super(SatXMPPClient, cls).startConnection(host, profile, max_retries) + await super(SatXMPPClient, cls).start_connection(host, profile, max_retries) except exceptions.CancelError as e: - log.warning(f"startConnection cancelled: {e}") + log.warning(f"start_connection cancelled: {e}") return entity = host.profiles[profile] # we finally send our presence entity.presence.available() - def entityConnected(self): + def entity_connected(self): # we want to be sure that we got the roster return self.roster.got_roster - def addPostXmlCallbacks(self, post_xml_treatments): - post_xml_treatments.addCallback(self.messageProt.completeAttachments) + def add_post_xml_callbacks(self, post_xml_treatments): + post_xml_treatments.addCallback(self.messageProt.complete_attachments) post_xml_treatments.addCallback( - lambda ret: defer.ensureDeferred(self.messageAddToHistory(ret)) + lambda ret: defer.ensureDeferred(self.message_add_to_history(ret)) ) - post_xml_treatments.addCallback(self.messageSendToBridge) + post_xml_treatments.addCallback(self.message_send_to_bridge) def feedback( self, @@ -1015,7 +1015,7 @@ """ if extra is None: extra = {} - self.host_app.bridge.messageNew( + self.host_app.bridge.message_new( uid=str(uuid.uuid4()), timestamp=time.time(), from_jid=self.jid.full(), @@ -1028,7 +1028,7 @@ ) def _finish_connection(self, __): - d = self.roster.requestRoster() + d = self.roster.request_roster() d.addCallback(lambda __: super(SatXMPPClient, self)._finish_connection(__)) @@ -1057,7 +1057,7 @@ port = C.XMPP_COMPONENT_PORT ## entry point ## - entry_point = host_app.memory.getEntryPoint(profile) + entry_point = host_app.memory.get_entry_point(profile) try: self.entry_plugin = host_app.plugins[entry_point] except KeyError: @@ -1090,11 +1090,11 @@ def is_admin(self) -> bool: return False - def _createSubProtocols(self): + def _create_sub_protocols(self): self.messageProt = SatMessageProtocol(self.host_app) self.messageProt.setHandlerParent(self) - def _buildDependencies(self, current, plugins, required=True): + def _build_dependencies(self, current, plugins, required=True): """build recursively dependencies needed for a plugin this method build list of plugin needed for a component and raises @@ -1128,7 +1128,7 @@ # plugins are already loaded as dependencies # so we know they are in self.host_app.plugins dep = self.host_app.plugins[import_name] - self._buildDependencies(dep, plugins) + self._build_dependencies(dep, plugins) for import_name in current._info.get(C.PI_RECOMMENDATIONS, []): # here plugins are only recommendations, @@ -1137,21 +1137,21 @@ dep = self.host_app.plugins[import_name] except KeyError: continue - 
self._buildDependencies(dep, plugins, required=False) + self._build_dependencies(dep, plugins, required=False) if current not in plugins: # current can be required for several plugins and so # it can already be present in the list plugins.append(current) - def _getPluginsList(self): + def _get_plugins_list(self): # XXX: for component we don't launch all plugins triggers # but only the ones from which there is a dependency plugins = [] - self._buildDependencies(self.entry_plugin, plugins) + self._build_dependencies(self.entry_plugin, plugins) return plugins - def entityConnected(self): + def entity_connected(self): # we can now launch entry point try: start_cb = self.entry_plugin.componentStart @@ -1160,13 +1160,13 @@ else: return start_cb(self) - def addPostXmlCallbacks(self, post_xml_treatments): + def add_post_xml_callbacks(self, post_xml_treatments): if self.sendHistory: post_xml_treatments.addCallback( - lambda ret: defer.ensureDeferred(self.messageAddToHistory(ret)) + lambda ret: defer.ensureDeferred(self.message_add_to_history(ret)) ) - def getOwnerFromJid(self, to_jid: jid.JID) -> jid.JID: + def get_owner_from_jid(self, to_jid: jid.JID) -> jid.JID: """Retrieve "owner" of a component resource from the destination jid of the request This method needs plugin XEP-0106 for unescaping, if you use it you must add the @@ -1187,7 +1187,7 @@ # only user part is specified, we use our own host to build the full jid return jid.JID(None, (user, self.host, None)) - def getOwnerAndPeer(self, iq_elt: domish.Element) -> Tuple[jid.JID, jid.JID]: + def get_owner_and_peer(self, iq_elt: domish.Element) -> Tuple[jid.JID, jid.JID]: """Retrieve owner of a component jid, and the jid of the requesting peer "owner" is found by either unescaping full jid from node, or by combining node @@ -1198,14 +1198,14 @@ """ to_jid = jid.JID(iq_elt['to']) if to_jid.user: - owner = self.getOwnerFromJid(to_jid) + owner = self.get_owner_from_jid(to_jid) else: owner = jid.JID(iq_elt["from"]).userhostJID() peer_jid = jid.JID(iq_elt["from"]) return peer_jid, owner - def getVirtualClient(self, jid_: jid.JID) -> SatXMPPEntity: + def get_virtual_client(self, jid_: jid.JID) -> SatXMPPEntity: """Get client for this component with a specified jid This is needed to perform operations with a virtual JID corresponding to a virtual @@ -1229,13 +1229,13 @@ def client(self): return self.parent - def normalizeNS(self, elt: domish.Element, namespace: Optional[str]) -> None: + def normalize_ns(self, elt: domish.Element, namespace: Optional[str]) -> None: if elt.uri == namespace: elt.defaultUri = elt.uri = C.NS_CLIENT for child in elt.elements(): - self.normalizeNS(child, namespace) + self.normalize_ns(child, namespace) - def parseMessage(self, message_elt): + def parse_message(self, message_elt): """Parse a message XML and return message_data @param message_elt(domish.Element): raw <message> xml @@ -1245,13 +1245,13 @@ """ if message_elt.name != "message": log.warning(_( - "parseMessage used with a non <message/> stanza, ignoring: {xml}" + "parse_message used with a non <message/> stanza, ignoring: {xml}" .format(xml=message_elt.toXml()))) return {} if message_elt.uri == None: # xmlns may be None when wokkel element parsing strip out root namespace - self.normalizeNS(message_elt, None) + self.normalize_ns(message_elt, None) elif message_elt.uri != C.NS_CLIENT: log.warning(_( "received <message> with a wrong namespace: {xml}" @@ -1297,7 +1297,7 @@ received_timestamp = message_elt._received_timestamp except AttributeError: # 
message_elt._received_timestamp should have been set in onMessage - # but if parseMessage is called directly, it can be missing + # but if parse_message is called directly, it can be missing log.debug("missing received timestamp for {message_elt}".format( message_elt=message_elt)) received_timestamp = time.time() @@ -1316,7 +1316,7 @@ self.host.trigger.point("message_parse", client, message_elt, data) return data - def _onMessageStartWorkflow(self, cont, client, message_elt, post_treat): + def _on_message_start_workflow(self, cont, client, message_elt, post_treat): """Parse message and do post treatments It is the first callback called after messageReceived trigger @@ -1327,16 +1327,16 @@ """ if not cont: return - data = self.parseMessage(message_elt) - post_treat.addCallback(self.completeAttachments) - post_treat.addCallback(self.skipEmptyMessage) + data = self.parse_message(message_elt) + post_treat.addCallback(self.complete_attachments) + post_treat.addCallback(self.skip_empty_message) if not client.is_component or client.receiveHistory: post_treat.addCallback( - lambda ret: defer.ensureDeferred(self.addToHistory(ret)) + lambda ret: defer.ensureDeferred(self.add_to_history(ret)) ) if not client.is_component: - post_treat.addCallback(self.bridgeSignal, data) - post_treat.addErrback(self.cancelErrorTrap) + post_treat.addCallback(self.bridge_signal, data) + post_treat.addErrback(self.cancel_error_trap) post_treat.callback(data) def onMessage(self, message_elt): @@ -1348,18 +1348,18 @@ log.debug(_("got message from: {from_}").format(from_=message_elt["from"])) if self.client.is_component and message_elt.uri == component.NS_COMPONENT_ACCEPT: # we use client namespace all the time to simplify parsing - self.normalizeNS(message_elt, component.NS_COMPONENT_ACCEPT) + self.normalize_ns(message_elt, component.NS_COMPONENT_ACCEPT) # plugin can add their treatments to this deferred post_treat = defer.Deferred() - d = self.host.trigger.asyncPoint( + d = self.host.trigger.async_point( "messageReceived", client, message_elt, post_treat ) - d.addCallback(self._onMessageStartWorkflow, client, message_elt, post_treat) + d.addCallback(self._on_message_start_workflow, client, message_elt, post_treat) - def completeAttachments(self, data): + def complete_attachments(self, data): """Complete missing metadata of attachments""" for attachment in data['extra'].get(C.KEY_ATTACHMENTS, []): if "name" not in attachment and "url" in attachment: @@ -1374,24 +1374,24 @@ return data - def skipEmptyMessage(self, data): + def skip_empty_message(self, data): if not data["message"] and not data["extra"] and not data["subject"]: raise failure.Failure(exceptions.CancelError("Cancelled empty message")) return data - async def addToHistory(self, data): + async def add_to_history(self, data): if data.pop("history", None) == C.HISTORY_SKIP: log.debug("history is skipped as requested") data["extra"]["history"] = C.HISTORY_SKIP else: # we need a message to store - if self.parent.isMessagePrintable(data): - return await self.host.memory.addToHistory(self.parent, data) + if self.parent.is_message_printable(data): + return await self.host.memory.add_to_history(self.parent, data) else: log.debug("not storing empty message to history: {data}" .format(data=data)) - def bridgeSignal(self, __, data): + def bridge_signal(self, __, data): try: data["extra"]["received_timestamp"] = str(data["received_timestamp"]) data["extra"]["delay_sender"] = data["delay_sender"] @@ -1400,8 +1400,8 @@ if self.client.encryption.isEncrypted(data): 
data["extra"]["encrypted"] = True if data is not None: - if self.parent.isMessagePrintable(data): - self.host.bridge.messageNew( + if self.parent.is_message_printable(data): + self.host.bridge.message_new( data["uid"], data["timestamp"], data["from"].full(), @@ -1417,7 +1417,7 @@ data=data)) return data - def cancelErrorTrap(self, failure_): + def cancel_error_trap(self, failure_): """A message sending can be cancelled by a plugin treatment""" failure_.trap(exceptions.CancelError) @@ -1433,7 +1433,7 @@ self._groups = {} # map from groups to jids: key=group value=set of jids def __contains__(self, entity_jid): - return self.isJidInRoster(entity_jid) + return self.is_jid_in_roster(entity_jid) @property def versioning(self): @@ -1449,7 +1449,7 @@ """ return persistent.PersistentDict(NS_ROSTER_VER, self.parent.profile) - def _registerItem(self, item): + def _register_item(self, item): """Register item in local cache item must be already registered in self._jids before this method is called @@ -1477,7 +1477,7 @@ self._groups.setdefault(group, set()).add(item.entity) @defer.inlineCallbacks - def _cacheRoster(self, version): + def _cache_roster(self, version): """Serialise local roster and save it to storage @param version(unicode): version of roster in local cache @@ -1501,10 +1501,10 @@ yield roster_cache.clear() self._jids.clear() self._groups.clear() - yield self.requestRoster() + yield self.request_roster() @defer.inlineCallbacks - def requestRoster(self): + def request_roster(self): """Ask the server for Roster list """ if self.versioning: log.info(_("our server support roster versioning, we use it")) @@ -1526,7 +1526,7 @@ roster_item_elt = generic.parseXml(roster_item_elt_s.encode('utf-8')) roster_item = xmppim.RosterItem.fromElement(roster_item_elt) self._jids[roster_jid] = roster_item - self._registerItem(roster_item) + self._register_item(roster_item) else: log.warning(_("our server doesn't support roster versioning")) version = None @@ -1553,8 +1553,8 @@ ) self.removeItem(item.entity) # FIXME: to be checked else: - self._registerItem(item) - yield self._cacheRoster(roster.version) + self._register_item(item) + yield self._cache_roster(roster.version) if not self.got_roster.called: # got_roster may already be called if we use resync() @@ -1567,7 +1567,7 @@ """ return xmppim.RosterClientProtocol.removeItem(self, to_jid) - def getAttributes(self, item): + def get_attributes(self, item): """Return dictionary of attributes as used in bridge from a RosterItem @param item: RosterItem @@ -1602,9 +1602,9 @@ except KeyError: pass # no previous item registration (or it's been cleared) self._jids[entity] = item - self._registerItem(item) - self.host.bridge.newContact( - entity.full(), self.getAttributes(item), list(item.groups), + self._register_item(item) + self.host.bridge.contact_new( + entity.full(), self.get_attributes(item), list(item.groups), self.parent.profile ) @@ -1645,13 +1645,13 @@ ) # then we send the bridge signal - self.host.bridge.contactDeleted(entity.full(), self.parent.profile) + self.host.bridge.contact_deleted(entity.full(), self.parent.profile) - def getGroups(self): + def get_groups(self): """Return a list of groups""" return list(self._groups.keys()) - def getItem(self, entity_jid): + def get_item(self, entity_jid): """Return RosterItem for a given jid @param entity_jid(jid.JID): jid of the contact @@ -1660,18 +1660,18 @@ """ return self._jids.get(entity_jid, None) - def getJids(self): + def get_jids(self): """Return all jids of the roster""" return list(self._jids.keys()) 
- def isJidInRoster(self, entity_jid): + def is_jid_in_roster(self, entity_jid): """Return True if jid is in roster""" if not isinstance(entity_jid, jid.JID): raise exceptions.InternalError( f"a JID is expected, not {type(entity_jid)}: {entity_jid!r}") return entity_jid in self._jids - def isSubscribedFrom(self, entity_jid: jid.JID) -> bool: + def is_subscribed_from(self, entity_jid: jid.JID) -> bool: """Return True if entity is authorised to see our presence""" try: item = self._jids[entity_jid.userhostJID()] @@ -1679,7 +1679,7 @@ return False return item.subscriptionFrom - def isSubscribedTo(self, entity_jid: jid.JID) -> bool: + def is_subscribed_to(self, entity_jid: jid.JID) -> bool: """Return True if we are subscribed to entity""" try: item = self._jids[entity_jid.userhostJID()] @@ -1687,17 +1687,17 @@ return False return item.subscriptionTo - def getItems(self): + def get_items(self): """Return all items of the roster""" return list(self._jids.values()) - def getJidsFromGroup(self, group): + def get_jids_from_group(self, group): try: return self._groups[group] except KeyError: raise exceptions.UnknownGroupError(group) - def getJidsSet(self, type_, groups=None): + def get_jids_set(self, type_, groups=None): """Helper method to get a set of jids @param type_(unicode): one of: @@ -1710,22 +1710,22 @@ raise ValueError("groups must not be set for {} type".format(C.ALL)) if type_ == C.ALL: - return set(self.getJids()) + return set(self.get_jids()) elif type_ == C.GROUP: jids = set() for group in groups: - jids.update(self.getJidsFromGroup(group)) + jids.update(self.get_jids_from_group(group)) return jids else: raise ValueError("Unexpected type_ {}".format(type_)) - def getNick(self, entity_jid): + def get_nick(self, entity_jid): """Return a nick name for an entity return nick choosed by user if available else return user part of entity_jid """ - item = self.getItem(entity_jid) + item = self.get_item(entity_jid) if item is None: return entity_jid.user else: @@ -1761,12 +1761,12 @@ ): return - self.host.memory.setPresenceStatus( + self.host.memory.set_presence_status( entity, show or "", int(priority), statuses, self.parent.profile ) # now it's time to notify frontends - self.host.bridge.presenceUpdate( + self.host.bridge.presence_update( entity.full(), show or "", int(priority), statuses, self.parent.profile ) @@ -1791,7 +1791,7 @@ # if the entity is not known yet in this session or is already unavailable, # there is no need to send an unavailable signal try: - presence = self.host.memory.getEntityDatum( + presence = self.host.memory.get_entity_datum( self.client, entity, "presence" ) except (KeyError, exceptions.UnknownEntityError): @@ -1799,7 +1799,7 @@ pass else: if presence.show != C.PRESENCE_UNAVAILABLE: - self.host.bridge.presenceUpdate( + self.host.bridge.presence_update( entity.full(), C.PRESENCE_UNAVAILABLE, 0, @@ -1807,7 +1807,7 @@ self.parent.profile, ) - self.host.memory.setPresenceStatus( + self.host.memory.set_presence_status( entity, C.PRESENCE_UNAVAILABLE, 0, statuses, self.parent.profile ) @@ -1822,7 +1822,7 @@ if priority is None: try: priority = int( - self.host.memory.getParamA( + self.host.memory.param_get_a( "Priority", "Connection", profile_key=self.parent.profile ) ) @@ -1851,8 +1851,8 @@ def subscribed(self, entity): yield self.parent.roster.got_roster xmppim.PresenceClientProtocol.subscribed(self, entity) - self.host.memory.delWaitingSub(entity.userhost(), self.parent.profile) - item = self.parent.roster.getItem(entity) + 
self.host.memory.del_waiting_sub(entity.userhost(), self.parent.profile) + item = self.parent.roster.get_item(entity) if ( not item or not item.subscriptionTo ): # we automatically subscribe to 'to' presence @@ -1861,7 +1861,7 @@ def unsubscribed(self, entity): xmppim.PresenceClientProtocol.unsubscribed(self, entity) - self.host.memory.delWaitingSub(entity.userhost(), self.parent.profile) + self.host.memory.del_waiting_sub(entity.userhost(), self.parent.profile) def subscribedReceived(self, entity): log.debug(_("subscription approved for [%s]") % entity.userhost()) @@ -1875,14 +1875,14 @@ def subscribeReceived(self, entity): log.debug(_("subscription request from [%s]") % entity.userhost()) yield self.parent.roster.got_roster - item = self.parent.roster.getItem(entity) + item = self.parent.roster.get_item(entity) if item and item.subscriptionTo: # We automatically accept subscription if we are already subscribed to # contact presence log.debug(_("sending automatic subscription acceptance")) self.subscribed(entity) else: - self.host.memory.addWaitingSub( + self.host.memory.add_waiting_sub( "subscribe", entity.userhost(), self.parent.profile ) self.host.bridge.subscribe( @@ -1893,10 +1893,10 @@ def unsubscribeReceived(self, entity): log.debug(_("unsubscription asked for [%s]") % entity.userhost()) yield self.parent.roster.got_roster - item = self.parent.roster.getItem(entity) + item = self.parent.roster.get_item(entity) if item and item.subscriptionFrom: # we automatically remove contact log.debug(_("automatic contact deletion")) - self.host.delContact(entity, self.parent.profile) + self.host.contact_del(entity, self.parent.profile) self.host.bridge.subscribe("unsubscribe", entity.userhost(), self.parent.profile)
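The hunks above rename the roster helpers (get_item, get_nick, get_groups, is_jid_in_roster, is_subscribed_to, ...) and the bridge signals emitted from xmpp.py (message_new, contact_new, contact_deleted, presence_update), without changing their behaviour. A minimal sketch of how calling code is adapted, assuming a plugin that already holds a connected client (SatXMPPEntity) and an entity_jid (jid.JID); the helper itself is illustrative and not part of this changeset:

    def get_contact_label(client, entity_jid):
        # old names: isJidInRoster / isSubscribedTo / getNick
        bare_jid = entity_jid.userhostJID()
        if not client.roster.is_jid_in_roster(bare_jid):
            return entity_jid.full()
        if client.roster.is_subscribed_to(bare_jid):
            # nick chosen by the user if available, else the user part of the JID
            return client.roster.get_nick(bare_jid)
        return bare_jid.user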
--- a/sat/memory/cache.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/cache.py Sat Apr 08 13:54:42 2023 +0200 @@ -45,9 +45,9 @@ if None, the cache will be common for all profiles """ self.profile = profile - path_elts = [host.memory.getConfig("", "local_dir"), C.CACHE_DIR] + path_elts = [host.memory.config_get("", "local_dir"), C.CACHE_DIR] if profile: - path_elts.extend(["profiles", regex.pathEscape(profile)]) + path_elts.extend(["profiles", regex.path_escape(profile)]) else: path_elts.append("common") self.cache_dir = Path(*path_elts) @@ -121,14 +121,14 @@ raise exceptions.DataError("Invalid char found") return self.cache_dir / filename - def getMetadata(self, uid: str, update_eol: bool = True) -> Optional[Dict[str, Any]]: + def get_metadata(self, uid: str, update_eol: bool = True) -> Optional[Dict[str, Any]]: """Retrieve metadata for cached data @param uid(unicode): unique identifier of file @param update_eol(bool): True if eol must extended if True, max_age will be added to eol (only if it is not already expired) @return (dict, None): metadata with following keys: - see [cacheData] for data details, an additional "path" key is the full path to + see [cache_data] for data details, an additional "path" key is the full path to cached file. None if file is not in cache (or cache is invalid) """ @@ -176,23 +176,23 @@ cache_data["path"] = self.getPath(cache_data["filename"]) return cache_data - def getFilePath(self, uid: str) -> Path: + def get_file_path(self, uid: str) -> Path: """Retrieve absolute path to file @param uid(unicode): unique identifier of file @return (unicode, None): absolute path to cached file None if file is not in cache (or cache is invalid) """ - metadata = self.getMetadata(uid) + metadata = self.get_metadata(uid) if metadata is not None: return metadata["path"] - def removeFromCache(self, uid, metadata=None): + def remove_from_cache(self, uid, metadata=None): """Remove data from cache @param uid(unicode): unique identifier cache file """ - cache_data = self.getMetadata(uid, update_eol=False) + cache_data = self.get_metadata(uid, update_eol=False) if cache_data is None: log.debug(f"cache with uid {uid!r} has already expired or been removed") return @@ -215,7 +215,7 @@ cache_file.unlink() log.debug(f"cache with uid {uid!r} has been removed") - def cacheData( + def cache_data( self, source: str, uid: str,
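In sat/memory/cache.py the public accessors become get_metadata, get_file_path, remove_from_cache and cache_data. A short usage sketch of the renamed API, assuming cache is an instance of the Cache class patched above and uid an identifier previously passed to cache_data; the helper below is illustrative only:

    def read_cached_file(cache, uid):
        # old names: getMetadata / removeFromCache
        metadata = cache.get_metadata(uid, update_eol=True)
        if metadata is None:
            # not cached, or the cache entry has expired
            return None
        path = metadata["path"]  # "path" (a pathlib.Path) is added by get_metadata
        try:
            return path.read_bytes()
        except FileNotFoundError:
            # file vanished on disk, drop the stale cache entry
            cache.remove_from_cache(uid, metadata)
            return None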
--- a/sat/memory/crypto.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/crypto.py Sat Apr 08 13:54:42 2023 +0200 @@ -47,7 +47,7 @@ """ if leave_empty and text == "": return "" - iv = BlockCipher.getRandomKey() + iv = BlockCipher.get_random_key() key = key.encode() key = ( key[: BlockCipher.MAX_KEY_SIZE] @@ -91,7 +91,7 @@ return BlockCipher.unpad(decrypted) @staticmethod - def getRandomKey(size=None, base64=False): + def get_random_key(size=None, base64=False): """Return a random key suitable for block cipher encryption. Note: a good value for the key length is to make it as long as the block size.
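sat/memory/crypto.py only renames BlockCipher.getRandomKey. Call sites elsewhere in this patch (for instance the personal key generation in memory.py below) now read as follows; the method name and arguments are taken from the hunks, the import path and surrounding lines are only a sketch:

    from sat.memory.crypto import BlockCipher

    # base64-encoded random key, as used for the profile personal key
    personal_key = BlockCipher.get_random_key(base64=True).decode("utf-8")
    # raw random bytes, as used internally for the IV
    iv = BlockCipher.get_random_key()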
--- a/sat/memory/disco.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/disco.py Sat Apr 08 13:54:42 2023 +0200 @@ -87,7 +87,7 @@ return self.hashes.__contains__(hash_) def load(self): - def fillHashes(hashes): + def fill_hashes(hashes): for hash_, xml in hashes.items(): element = xml_tools.ElementParser()(xml) disco_info = disco.DiscoInfo.fromElement(element) @@ -106,7 +106,7 @@ log.info("Disco hashes loaded") d = self.persistent.load() - d.addCallback(fillHashes) + d.addCallback(fill_hashes) return d @@ -131,11 +131,11 @@ @param node(unicode): optional node to use for disco request @return: a Deferred which fire a boolean (True if feature is available) """ - disco_infos = yield self.getInfos(client, jid_, node) + disco_infos = yield self.get_infos(client, jid_, node) defer.returnValue(feature in disco_infos.features) @defer.inlineCallbacks - def checkFeature(self, client, feature, jid_=None, node=""): + def check_feature(self, client, feature, jid_=None, node=""): """Like hasFeature, but raise an exception is feature is not Found @param feature: feature namespace @@ -144,13 +144,13 @@ @raise: exceptions.FeatureNotFound """ - disco_infos = yield self.getInfos(client, jid_, node) + disco_infos = yield self.get_infos(client, jid_, node) if not feature in disco_infos.features: raise failure.Failure(exceptions.FeatureNotFound()) @defer.inlineCallbacks - def checkFeatures(self, client, features, jid_=None, identity=None, node=""): - """Like checkFeature, but check several features at once, and check also identity + def check_features(self, client, features, jid_=None, identity=None, node=""): + """Like check_feature, but check several features at once, and check also identity @param features(iterable[unicode]): features to check @param jid_(jid.JID): jid of the target, or None for profile's server @@ -159,14 +159,14 @@ @raise: exceptions.FeatureNotFound """ - disco_infos = yield self.getInfos(client, jid_, node) + disco_infos = yield self.get_infos(client, jid_, node) if not set(features).issubset(disco_infos.features): raise failure.Failure(exceptions.FeatureNotFound()) if identity is not None and identity not in disco_infos.identities: raise failure.Failure(exceptions.FeatureNotFound()) - async def hasIdentity( + async def has_identity( self, client: SatXMPPEntity, category: str, @@ -182,10 +182,10 @@ @param node(unicode): optional node to use for disco request @return: True if the entity has the given identity """ - disco_infos = await self.getInfos(client, jid_, node) + disco_infos = await self.get_infos(client, jid_, node) return (category, type_) in disco_infos.identities - def getInfos(self, client, jid_=None, node="", use_cache=True): + def get_infos(self, client, jid_=None, node="", use_cache=True): """get disco infos from jid_, filling capability hash if needed @param jid_: jid of the target, or None for profile's server @@ -199,13 +199,13 @@ if not use_cache: # we ignore cache, so we pretend we haven't found it raise KeyError - cap_hash = self.host.memory.getEntityData( + cap_hash = self.host.memory.entity_data_get( client, jid_, [C.ENTITY_CAP_HASH] )[C.ENTITY_CAP_HASH] except (KeyError, exceptions.UnknownEntityError): # capability hash is not available, we'll compute one - def infosCb(disco_infos): - cap_hash = self.generateHash(disco_infos) + def infos_cb(disco_infos): + cap_hash = self.generate_hash(disco_infos) for ext_form in disco_infos.extensions.values(): # wokkel doesn't call typeCheck on reception, so we do it here # to avoid ending up with incorrect types. 
We have to do it after @@ -213,12 +213,12 @@ # hash) ext_form.typeCheck() self.hashes[cap_hash] = disco_infos - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, jid_, C.ENTITY_CAP_HASH, cap_hash ) return disco_infos - def infosEb(fail): + def infos_eb(fail): if fail.check(defer.CancelledError): reason = "request time-out" fail = failure.Failure(exceptions.TimeOutError(str(fail.value))) @@ -236,21 +236,21 @@ # XXX we set empty disco in cache, to avoid getting an error or waiting # for a timeout again the next time - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, jid_, C.ENTITY_CAP_HASH, CAP_HASH_ERROR ) raise fail d = client.disco.requestInfo(jid_, nodeIdentifier=node) - d.addCallback(infosCb) - d.addErrback(infosEb) + d.addCallback(infos_cb) + d.addErrback(infos_eb) return d else: disco_infos = self.hashes[cap_hash] return defer.succeed(disco_infos) @defer.inlineCallbacks - def getItems(self, client, jid_=None, node="", use_cache=True): + def get_items(self, client, jid_=None, node="", use_cache=True): """get disco items from jid_, cache them for our own server @param jid_(jid.JID): jid of the target, or None for profile's server @@ -264,7 +264,7 @@ if jid_ == client.server_jid and not node: # we cache items only for our own server and if node is not set try: - items = self.host.memory.getEntityData( + items = self.host.memory.entity_data_get( client, jid_, ["DISCO_ITEMS"] )["DISCO_ITEMS"] log.debug("[%s] disco items are in cache" % jid_.full()) @@ -274,7 +274,7 @@ except (KeyError, exceptions.UnknownEntityError): log.debug("Caching [%s] disco items" % jid_.full()) items = yield client.disco.requestItems(jid_, nodeIdentifier=node) - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, jid_, "DISCO_ITEMS", items ) else: @@ -290,24 +290,24 @@ defer.returnValue(items) - def _infosEb(self, failure_, entity_jid): + def _infos_eb(self, failure_, entity_jid): failure_.trap(StanzaError) log.warning( _("Error while requesting [%(jid)s]: %(error)s") % {"jid": entity_jid.full(), "error": failure_.getErrorMessage()} ) - def findServiceEntity(self, client, category, type_, jid_=None): - """Helper method to find first available entity from findServiceEntities + def find_service_entity(self, client, category, type_, jid_=None): + """Helper method to find first available entity from find_service_entities - args are the same as for [findServiceEntities] + args are the same as for [find_service_entities] @return (jid.JID, None): found entity """ - d = self.host.findServiceEntities(client, category, type_) + d = self.host.find_service_entities(client, category, type_) d.addCallback(lambda entities: entities.pop() if entities else None) return d - def findServiceEntities(self, client, category, type_, jid_=None): + def find_service_entities(self, client, category, type_, jid_=None): """Return all available items of an entity which correspond to (category, type_) @param category: identity's category @@ -318,29 +318,29 @@ """ found_entities = set() - def infosCb(infos, entity_jid): + def infos_cb(infos, entity_jid): if (category, type_) in infos.identities: found_entities.add(entity_jid) - def gotItems(items): + def got_items(items): defers_list = [] for item in items: - info_d = self.getInfos(client, item.entity) + info_d = self.get_infos(client, item.entity) info_d.addCallbacks( - infosCb, self._infosEb, [item.entity], None, [item.entity] + infos_cb, self._infos_eb, [item.entity], None, [item.entity] ) 
defers_list.append(info_d) return defer.DeferredList(defers_list) - d = self.getItems(client, jid_) - d.addCallback(gotItems) + d = self.get_items(client, jid_) + d.addCallback(got_items) d.addCallback(lambda __: found_entities) reactor.callLater( TIMEOUT, d.cancel ) # FIXME: one bad service make a general timeout return d - def findFeaturesSet(self, client, features, identity=None, jid_=None): + def find_features_set(self, client, features, identity=None, jid_=None): """Return entities (including jid_ and its items) offering features @param features: iterable of features which must be present @@ -355,7 +355,7 @@ features = set(features) found_entities = set() - def infosCb(infos, entity): + def infos_cb(infos, entity): if entity is None: log.warning(_("received an item without jid")) return @@ -364,23 +364,23 @@ if features.issubset(infos.features): found_entities.add(entity) - def gotItems(items): + def got_items(items): defer_list = [] for entity in [jid_] + [item.entity for item in items]: - infos_d = self.getInfos(client, entity) - infos_d.addCallbacks(infosCb, self._infosEb, [entity], None, [entity]) + infos_d = self.get_infos(client, entity) + infos_d.addCallbacks(infos_cb, self._infos_eb, [entity], None, [entity]) defer_list.append(infos_d) return defer.DeferredList(defer_list) - d = self.getItems(client, jid_) - d.addCallback(gotItems) + d = self.get_items(client, jid_) + d.addCallback(got_items) d.addCallback(lambda __: found_entities) reactor.callLater( TIMEOUT, d.cancel ) # FIXME: one bad service make a general timeout return d - def generateHash(self, services): + def generate_hash(self, services): """ Generate a unique hash for given service hash algorithm is the one described in XEP-0115 @@ -433,7 +433,7 @@ return cap_hash @defer.inlineCallbacks - def _discoInfos( + def _disco_infos( self, entity_jid_s, node="", use_cache=True, profile_key=C.PROF_KEY_NONE ): """Discovery method for the bridge @@ -443,9 +443,9 @@ @return: list of tuples """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) entity = jid.JID(entity_jid_s) - disco_infos = yield self.getInfos(client, entity, node, use_cache) + disco_infos = yield self.get_infos(client, entity, node, use_cache) extensions = {} # FIXME: should extensions be serialised using tools.common.data_format? for form_type, form in list(disco_infos.extensions.items()): @@ -459,7 +459,7 @@ values = [field.value] if field.value is not None else field.values if field.fieldType == "boolean": - values = [C.boolConst(v) for v in values] + values = [C.bool_const(v) for v in values] fields.append((data, values)) extensions[form_type or ""] = fields @@ -483,7 +483,7 @@ yield (item.entity.full(), item.nodeIdentifier or "", item.name or "") @defer.inlineCallbacks - def _discoItems( + def _disco_items( self, entity_jid_s, node="", use_cache=True, profile_key=C.PROF_KEY_NONE ): """ Discovery method for the bridge @@ -492,8 +492,8 @@ @param node(unicode): optional node to use @param use_cache(bool): if True, use cached data if available @return: list of tuples""" - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) entity = jid.JID(entity_jid_s) - disco_items = yield self.getItems(client, entity, node, use_cache) + disco_items = yield self.get_items(client, entity, node, use_cache) ret = list(self.items2tuples(disco_items)) defer.returnValue(ret)
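The Discovery helpers in sat/memory/disco.py keep their semantics, only the names change (check_feature, check_features, has_identity, get_infos, get_items, find_service_entity, find_service_entities, find_features_set, generate_hash). A hypothetical plugin snippet using the renamed methods; it assumes the Discovery instance is reachable as host.memory.disco (as set up in memory.py below), client is a connected SatXMPPEntity, and NS_PUBSUB with the ("pubsub", "service") identity is only an illustrative target:

    from twisted.internet import defer

    NS_PUBSUB = "http://jabber.org/protocol/pubsub"

    @defer.inlineCallbacks
    def find_pubsub_service(host, client):
        # old names: checkFeature / findServiceEntity
        # raises exceptions.FeatureNotFound if our server lacks the feature
        yield host.memory.disco.check_feature(client, NS_PUBSUB)
        # first entity advertising the ("pubsub", "service") identity, or None
        service = yield host.memory.disco.find_service_entity(
            client, "pubsub", "service"
        )
        defer.returnValue(service)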
--- a/sat/memory/encryption.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/encryption.py Sat Apr 08 13:54:42 2023 +0200 @@ -50,7 +50,7 @@ def host(self): return self.client.host_app - async def loadSessions(self): + async def load_sessions(self): """Load persistent sessions""" await self._stored_session.load() start_d_list = [] @@ -69,18 +69,18 @@ log.info(_("encryption sessions restored")) @classmethod - def registerPlugin(cls, plg_instance, name, namespace, priority=0, directed=False): + def register_plugin(cls, plg_instance, name, namespace, priority=0, directed=False): """Register a plugin handling an encryption algorithm @param plg_instance(object): instance of the plugin it must have the following methods: - - getTrustUI(entity): return a XMLUI for trust management + - get_trust_ui(entity): return a XMLUI for trust management entity(jid.JID): entity to manage The returned XMLUI must be a form if may have the following methods: - - startEncryption(entity): start encrypted session + - start_encryption(entity): start encrypted session entity(jid.JID): entity to start encrypted session with - - stopEncryption(entity): start encrypted session + - stop_encryption(entity): start encrypted session entity(jid.JID): entity to stop encrypted session with if they don't exists, those 2 methods will be ignored. @@ -115,7 +115,7 @@ return cls.plugins @classmethod - def getPlugin(cls, namespace): + def get_plugin(cls, namespace): try: return next(p for p in cls.plugins if p.namespace == namespace) except StopIteration: @@ -124,12 +124,12 @@ namespace=namespace)) @classmethod - def getNamespaces(cls): + def get_namespaces(cls): """Get available plugin namespaces""" return {p.namespace for p in cls.getPlugins()} @classmethod - def getNSFromName(cls, name): + def get_ns_from_name(cls, name): """Retrieve plugin namespace from its name @param name(unicode): name of the plugin (case insensitive) @@ -143,7 +143,7 @@ "Can't find a plugin with the name \"{name}\".".format( name=name))) - def getBridgeData(self, session): + def get_bridge_data(self, session): """Retrieve session data serialized for bridge. @param session(dict): encryption session @@ -159,7 +159,7 @@ return data_format.serialise(bridge_data) - async def _startEncryption(self, plugin, entity): + async def _start_encryption(self, plugin, entity): """Start encryption with a plugin This method must be called just before adding a plugin session. @@ -168,14 +168,14 @@ if not plugin.directed: await self._stored_session.aset(entity.userhost(), plugin.namespace) try: - start_encryption = plugin.instance.startEncryption + start_encryption = plugin.instance.start_encryption except AttributeError: - log.debug(f"No startEncryption method found for {plugin.namespace}") + log.debug(f"No start_encryption method found for {plugin.namespace}") else: # we copy entity to avoid having the resource changed by stop_encryption - await utils.asDeferred(start_encryption, self.client, copy.copy(entity)) + await utils.as_deferred(start_encryption, self.client, copy.copy(entity)) - async def _stopEncryption(self, plugin, entity): + async def _stop_encryption(self, plugin, entity): """Stop encryption with a plugin This method must be called just before removing a plugin session. 
@@ -186,12 +186,12 @@ except KeyError: pass try: - stop_encryption = plugin.instance.stopEncryption + stop_encryption = plugin.instance.stop_encryption except AttributeError: - log.debug(f"No stopEncryption method found for {plugin.namespace}") + log.debug(f"No stop_encryption method found for {plugin.namespace}") else: # we copy entity to avoid having the resource changed by stop_encryption - return utils.asDeferred(stop_encryption, self.client, copy.copy(entity)) + return utils.as_deferred(stop_encryption, self.client, copy.copy(entity)) async def start(self, entity, namespace=None, replace=False): """Start an encryption session with an entity @@ -211,7 +211,7 @@ if namespace is None: plugin = self.plugins[0] else: - plugin = self.getPlugin(namespace) + plugin = self.get_plugin(namespace) bare_jid = entity.userhostJID() if bare_jid in self._sessions: @@ -227,7 +227,7 @@ # there is a conflict, but replacement is requested # so we stop previous encryption to use new one del self._sessions[bare_jid] - await self._stopEncryption(former_plugin, entity) + await self._stop_encryption(former_plugin, entity) else: msg = (_("Session with {bare_jid} is already encrypted with {name}. " "Please stop encryption session before changing algorithm.") @@ -238,7 +238,7 @@ data = {"plugin": plugin} if plugin.directed: if not entity.resource: - entity.resource = self.host.memory.getMainResource(self.client, entity) + entity.resource = self.host.memory.main_resource_get(self.client, entity) if not entity.resource: raise exceptions.NotFound( _("No resource found for {destinee}, can't encrypt with {name}") @@ -251,14 +251,14 @@ elif entity.resource: raise ValueError(_("{name} encryption must be used with bare jids.")) - await self._startEncryption(plugin, entity) + await self._start_encryption(plugin, entity) self._sessions[entity.userhostJID()] = data log.info(_("Encryption session has been set for {entity_jid} with " "{encryption_name}").format( entity_jid=entity.full(), encryption_name=plugin.name)) - self.host.bridge.messageEncryptionStarted( + self.host.bridge.message_encryption_started( entity.full(), - self.getBridgeData(data), + self.get_bridge_data(data), self.client.profile) msg = D_("Encryption session started: your messages with {destinee} are " "now end to end encrypted using {name} algorithm.").format( @@ -312,16 +312,16 @@ # we stop the whole session # see comment below for deleting session before stopping encryption del self._sessions[entity.userhostJID()] - await self._stopEncryption(plugin, entity) + await self._stop_encryption(plugin, entity) else: - # plugin's stopEncryption may call stop again (that's the case with OTR) - # so we need to remove plugin from session before calling self._stopEncryption + # plugin's stop_encryption may call stop again (that's the case with OTR) + # so we need to remove plugin from session before calling self._stop_encryption del self._sessions[entity.userhostJID()] - await self._stopEncryption(plugin, entity) + await self._stop_encryption(plugin, entity) log.info(_("encryption session stopped with entity {entity}").format( entity=entity.full())) - self.host.bridge.messageEncryptionStopped( + self.host.bridge.message_encryption_stopped( entity.full(), {'name': plugin.name, 'namespace': plugin.namespace, @@ -358,7 +358,7 @@ return None return session["plugin"].namespace - def getTrustUI(self, entity_jid, namespace=None): + def get_trust_ui(self, entity_jid, namespace=None): """Retrieve encryption UI @param entity_jid(jid.JID): get the UI for this entity @@ 
-379,53 +379,53 @@ .format(entity_jid=entity_jid.full())) plugin = session['plugin'] else: - plugin = self.getPlugin(namespace) + plugin = self.get_plugin(namespace) try: - get_trust_ui = plugin.instance.getTrustUI + get_trust_ui = plugin.instance.get_trust_ui except AttributeError: raise NotImplementedError( "Encryption plugin doesn't handle trust management UI") else: - return utils.asDeferred(get_trust_ui, self.client, entity_jid) + return utils.as_deferred(get_trust_ui, self.client, entity_jid) ## Menus ## @classmethod - def _importMenus(cls, host): - host.importMenu( + def _import_menus(cls, host): + host.import_menu( (D_("Encryption"), D_("unencrypted (plain text)")), - partial(cls._onMenuUnencrypted, host=host), + partial(cls._on_menu_unencrypted, host=host), security_limit=0, help_string=D_("End encrypted session"), type_=C.MENU_SINGLE, ) for plg in cls.getPlugins(): - host.importMenu( + host.import_menu( (D_("Encryption"), plg.name), - partial(cls._onMenuName, host=host, plg=plg), + partial(cls._on_menu_name, host=host, plg=plg), security_limit=0, help_string=D_("Start {name} session").format(name=plg.name), type_=C.MENU_SINGLE, ) - host.importMenu( + host.import_menu( (D_("Encryption"), D_("⛨ {name} trust").format(name=plg.name)), - partial(cls._onMenuTrust, host=host, plg=plg), + partial(cls._on_menu_trust, host=host, plg=plg), security_limit=0, help_string=D_("Manage {name} trust").format(name=plg.name), type_=C.MENU_SINGLE, ) @classmethod - def _onMenuUnencrypted(cls, data, host, profile): - client = host.getClient(profile) + def _on_menu_unencrypted(cls, data, host, profile): + client = host.get_client(profile) peer_jid = jid.JID(data['jid']).userhostJID() d = defer.ensureDeferred(client.encryption.stop(peer_jid)) d.addCallback(lambda __: {}) return d @classmethod - def _onMenuName(cls, data, host, plg, profile): - client = host.getClient(profile) + def _on_menu_name(cls, data, host, plg, profile): + client = host.get_client(profile) peer_jid = jid.JID(data['jid']) if not plg.directed: peer_jid = peer_jid.userhostJID() @@ -436,15 +436,15 @@ @classmethod @defer.inlineCallbacks - def _onMenuTrust(cls, data, host, plg, profile): - client = host.getClient(profile) + def _on_menu_trust(cls, data, host, plg, profile): + client = host.get_client(profile) peer_jid = jid.JID(data['jid']).userhostJID() - ui = yield client.encryption.getTrustUI(peer_jid, plg.namespace) + ui = yield client.encryption.get_trust_ui(peer_jid, plg.namespace) defer.returnValue({'xmlui': ui.toXml()}) ## Triggers ## - def setEncryptionFlag(self, mess_data): + def set_encryption_flag(self, mess_data): """Set "encryption" key in mess_data if session with destinee is encrypted""" to_jid = mess_data['to'] encryption = self._sessions.get(to_jid.userhostJID()) @@ -455,11 +455,11 @@ f"encryption flag must not be set for groupchat if encryption algorithm " f"({encryption['plugin'].name}) is directed!") mess_data[C.MESS_KEY_ENCRYPTION] = encryption - self.markAsEncrypted(mess_data, plugin.namespace) + self.mark_as_encrypted(mess_data, plugin.namespace) ## Misc ## - def markAsEncrypted(self, mess_data, namespace): + def mark_as_encrypted(self, mess_data, namespace): """Helper method to mark a message as having been e2e encrypted. 
This should be used in the post_treat workflow of messageReceived trigger of @@ -483,7 +483,7 @@ return mess_data - def isEncryptionRequested( + def is_encryption_requested( self, mess_data: MessageData, namespace: Optional[str] = None @@ -513,7 +513,7 @@ return mess_data['extra'].get(C.MESS_KEY_ENCRYPTED, False) - def markAsTrusted(self, mess_data): + def mark_as_trusted(self, mess_data): """Helper methor to mark a message as sent from a trusted entity. This should be used in the post_treat workflow of messageReceived trigger of @@ -523,7 +523,7 @@ mess_data[C.MESS_KEY_TRUSTED] = True return mess_data - def markAsUntrusted(self, mess_data): + def mark_as_untrusted(self, mess_data): """Helper methor to mark a message as sent from an untrusted entity. This should be used in the post_treat workflow of messageReceived trigger of
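For end-to-end encryption plugins, the optional hooks that the encryption handler looks up on the plugin instance are now start_encryption, stop_encryption and get_trust_ui (invoked with the client and the target entity, as the call sites above show), and registration goes through the register_plugin classmethod (old registerPlugin). A skeleton with the renamed hooks; the class, the namespace and the way registration is wired through the host at plugin load time are illustrative assumptions, not part of this changeset:

    NS_MY_ALGO = "urn:example:e2e:my-algo"  # illustrative namespace

    class MyEncryptionAlgorithm:
        """Illustrative e2e plugin; only the hook names matter here."""

        def start_encryption(self, client, entity):
            # old name: startEncryption; called when a session with entity starts
            pass

        def stop_encryption(self, client, entity):
            # old name: stopEncryption; called when the session with entity ends
            pass

        def get_trust_ui(self, client, entity):
            # old name: getTrustUI; must return an XMLUI form for trust management
            raise NotImplementedError

    # registration now uses the snake_case classmethod, reachable for instance
    # through any client's encryption handler (old registerPlugin):
    #     client.encryption.register_plugin(
    #         MyEncryptionAlgorithm(), "MyAlgo", NS_MY_ALGO, priority=100
    #     )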
--- a/sat/memory/memory.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/memory.py Sat Apr 08 13:54:42 2023 +0200 @@ -65,13 +65,13 @@ self.timeout = timeout or Sessions.DEFAULT_TIMEOUT self.resettable_timeout = resettable_timeout - def newSession(self, session_data=None, session_id=None, profile=None): + def new_session(self, session_data=None, session_id=None, profile=None): """Create a new session @param session_data: mutable data to use, default to a dict @param session_id (str): force the session_id to the given string @param profile: if set, the session is owned by the profile, - and profileGet must be used instead of __getitem__ + and profile_get must be used instead of __getitem__ @return: session_id, session_data """ if session_id is None: @@ -80,7 +80,7 @@ raise exceptions.ConflictError( "Session id {} is already used".format(session_id) ) - timer = reactor.callLater(self.timeout, self._purgeSession, session_id) + timer = reactor.callLater(self.timeout, self._purge_session, session_id) if session_data is None: session_data = {} self._sessions[session_id] = ( @@ -88,7 +88,7 @@ ) return session_id, session_data - def _purgeSession(self, session_id): + def _purge_session(self, session_id): try: timer, session_data, profile = self._sessions[session_id] except ValueError: @@ -113,7 +113,7 @@ def __contains__(self, session_id): return session_id in self._sessions - def profileGet(self, session_id, profile): + def profile_get(self, session_id, profile): try: timer, session_data, profile_set = self._sessions[session_id] except ValueError: @@ -133,7 +133,7 @@ timer, session_data = self._sessions[session_id] except ValueError: raise exceptions.InternalError( - "You need to use profileGet instead of __getitem__ when profile is set" + "You need to use profile_get instead of __getitem__ when profile is set" ) except KeyError: raise failure.Failure(KeyError(MSG_NO_SESSION)) @@ -142,11 +142,11 @@ return session_data def __setitem__(self, key, value): - raise NotImplementedError("You need do use newSession to create a session") + raise NotImplementedError("You need do use new_session to create a session") def __delitem__(self, session_id): """ delete the session data """ - self._purgeSession(session_id) + self._purge_session(session_id) def keys(self): return list(self._sessions.keys()) @@ -160,7 +160,7 @@ used as the key to retrieve data or delete a session (instead of session id). """ - def _profileGetAllIds(self, profile): + def _profile_get_all_ids(self, profile): """Return a list of the sessions ids that are associated to the given profile. @param profile: %(doc_profile)s @@ -176,7 +176,7 @@ ret.append(session_id) return ret - def profileGetUnique(self, profile): + def profile_get_unique(self, profile): """Return the data of the unique session that is associated to the given profile. @param profile: %(doc_profile)s @@ -185,25 +185,25 @@ - None if no session is associated to the profile - raise an error if more than one session are found """ - ids = self._profileGetAllIds(profile) + ids = self._profile_get_all_ids(profile) if len(ids) > 1: raise exceptions.InternalError( - "profileGetUnique has been used but more than one session has been found!" + "profile_get_unique has been used but more than one session has been found!" 
) return ( - self.profileGet(ids[0], profile) if len(ids) == 1 else None + self.profile_get(ids[0], profile) if len(ids) == 1 else None ) # XXX: timeout might be reset - def profileDelUnique(self, profile): + def profile_del_unique(self, profile): """Delete the unique session that is associated to the given profile. @param profile: %(doc_profile)s @return: None, but raise an error if more than one session are found """ - ids = self._profileGetAllIds(profile) + ids = self._profile_get_all_ids(profile) if len(ids) > 1: raise exceptions.InternalError( - "profileDelUnique has been used but more than one session has been found!" + "profile_del_unique has been used but more than one session has been found!" ) if len(ids) == 1: del self._sessions[ids[0]] @@ -217,7 +217,7 @@ def __init__(self, timeout=None): ProfileSessions.__init__(self, timeout, resettable_timeout=False) - def _purgeSession(self, session_id): + def _purge_session(self, session_id): log.debug( "FIXME: PasswordSessions should ask for the profile password after the session expired" ) @@ -237,9 +237,9 @@ self.subscriptions = {} self.auth_sessions = PasswordSessions() # remember the authenticated profiles self.disco = Discovery(host) - self.config = tools_config.parseMainConf(log_filenames=True) - self._cache_path = Path(self.getConfig("", "local_dir"), C.CACHE_DIR) - self.admins = self.getConfig("", "admins_list", []) + self.config = tools_config.parse_main_conf(log_filenames=True) + self._cache_path = Path(self.config_get("", "local_dir"), C.CACHE_DIR) + self.admins = self.config_get("", "admins_list", []) self.admin_jids = set() @@ -256,7 +256,7 @@ await self.disco.load() for admin in self.admins: try: - admin_jid_s = await self.asyncGetParamA( + admin_jid_s = await self.param_get_a_async( "JabberID", "Connection", profile_key=admin ) except Exception as e: @@ -273,7 +273,7 @@ ## Configuration ## - def getConfig(self, section, name, default=None): + def config_get(self, section, name, default=None): """Get the main configuration option @param section: section of the config file (None or '' for DEFAULT) @@ -281,7 +281,7 @@ @param default: value to use if not found @return: str, list or dict """ - return tools_config.getConfig(self.config, section, name, default) + return tools_config.config_get(self.config, section, name, default) def load_xml(self, filename): """Load parameters template from xml file @@ -322,16 +322,16 @@ def load(self): """Load parameters and all memory things from db""" # parameters data - return self.params.loadGenParams() + return self.params.load_gen_params() - def loadIndividualParams(self, profile): + def load_individual_params(self, profile): """Load individual parameters for a profile @param profile: %(doc_profile)s""" - return self.params.loadIndParams(profile) + return self.params.load_ind_params(profile) ## Profiles/Sessions management ## - def startSession(self, password, profile): + def start_session(self, password, profile): """"Iniatialise session for a profile @param password(unicode): profile session password @@ -340,59 +340,59 @@ @raise exceptions.ProfileUnknownError if profile doesn't exists @raise exceptions.PasswordError: the password does not match """ - profile = self.getProfileName(profile) + profile = self.get_profile_name(profile) - def createSession(__): + def create_session(__): """Called once params are loaded.""" self._entities_cache[profile] = {} log.info("[{}] Profile session started".format(profile)) return False - def backendInitialised(__): - def doStartSession(__=None): - if 
self.isSessionStarted(profile): + def backend_initialised(__): + def do_start_session(__=None): + if self.is_session_started(profile): log.info("Session already started!") return True try: # if there is a value at this point in self._entities_cache, - # it is the loadIndividualParams Deferred, the session is starting + # it is the load_individual_params Deferred, the session is starting session_d = self._entities_cache[profile] except KeyError: # else we do request the params - session_d = self._entities_cache[profile] = self.loadIndividualParams( + session_d = self._entities_cache[profile] = self.load_individual_params( profile ) - session_d.addCallback(createSession) + session_d.addCallback(create_session) finally: return session_d - auth_d = defer.ensureDeferred(self.profileAuthenticate(password, profile)) - auth_d.addCallback(doStartSession) + auth_d = defer.ensureDeferred(self.profile_authenticate(password, profile)) + auth_d.addCallback(do_start_session) return auth_d if self.host.initialised.called: - return defer.succeed(None).addCallback(backendInitialised) + return defer.succeed(None).addCallback(backend_initialised) else: - return self.host.initialised.addCallback(backendInitialised) + return self.host.initialised.addCallback(backend_initialised) - def stopSession(self, profile): + def stop_session(self, profile): """Delete a profile session @param profile: %(doc_profile)s """ - if self.host.isConnected(profile): + if self.host.is_connected(profile): log.debug("Disconnecting profile because of session stop") self.host.disconnect(profile) - self.auth_sessions.profileDelUnique(profile) + self.auth_sessions.profile_del_unique(profile) try: self._entities_cache[profile] except KeyError: log.warning("Profile was not in cache") - def _isSessionStarted(self, profile_key): - return self.isSessionStarted(self.getProfileName(profile_key)) + def _is_session_started(self, profile_key): + return self.is_session_started(self.get_profile_name(profile_key)) - def isSessionStarted(self, profile): + def is_session_started(self, profile): try: # XXX: if the value in self._entities_cache is a Deferred, # the session is starting but not started yet @@ -400,20 +400,20 @@ except KeyError: return False - async def profileAuthenticate(self, password, profile): + async def profile_authenticate(self, password, profile): """Authenticate the profile. @param password (unicode): the SàT profile password @return: None in case of success (an exception is raised otherwise) @raise exceptions.PasswordError: the password does not match """ - if not password and self.auth_sessions.profileGetUnique(profile): + if not password and self.auth_sessions.profile_get_unique(profile): # XXX: this allows any frontend to connect with the empty password as soon as # the profile has been authenticated at least once before. It is OK as long as # submitting a form with empty passwords is restricted to local frontends. 
return - sat_cipher = await self.asyncGetParamA( + sat_cipher = await self.param_get_a_async( C.PROFILE_PASS_PATH[1], C.PROFILE_PASS_PATH[0], profile_key=profile ) valid = PasswordHasher.verify(password, sat_cipher) @@ -421,9 +421,9 @@ log.warning(_("Authentication failure of profile {profile}").format( profile=profile)) raise exceptions.PasswordError("The provided profile password doesn't match.") - return await self.newAuthSession(password, profile) + return await self.new_auth_session(password, profile) - async def newAuthSession(self, key, profile): + async def new_auth_session(self, key, profile): """Start a new session for the authenticated profile. If there is already an existing session, no new one is created @@ -435,18 +435,18 @@ data = await PersistentDict(C.MEMORY_CRYPTO_NAMESPACE, profile).load() personal_key = BlockCipher.decrypt(key, data[C.MEMORY_CRYPTO_KEY]) # Create the session for this profile and store the personal key - session_data = self.auth_sessions.profileGetUnique(profile) + session_data = self.auth_sessions.profile_get_unique(profile) if not session_data: - self.auth_sessions.newSession( + self.auth_sessions.new_session( {C.MEMORY_CRYPTO_KEY: personal_key}, profile=profile ) log.debug("auth session created for profile %s" % profile) - def purgeProfileSession(self, profile): + def purge_profile_session(self, profile): """Delete cache of data of profile @param profile: %(doc_profile)s""" log.info(_("[%s] Profile session purge" % profile)) - self.params.purgeProfile(profile) + self.params.purge_profile(profile) try: del self._entities_cache[profile] except KeyError: @@ -457,7 +457,7 @@ % profile ) - def getProfilesList(self, clients=True, components=False): + def get_profiles_list(self, clients=True, components=False): """retrieve profiles list @param clients(bool): if True return clients profiles @@ -467,18 +467,18 @@ if not clients and not components: log.warning(_("requesting no profiles at all")) return [] - profiles = self.storage.getProfilesList() + profiles = self.storage.get_profiles_list() if clients and components: return sorted(profiles) - isComponent = self.storage.profileIsComponent + is_component = self.storage.profile_is_component if clients: - p_filter = lambda p: not isComponent(p) + p_filter = lambda p: not is_component(p) else: - p_filter = lambda p: isComponent(p) + p_filter = lambda p: is_component(p) return sorted(p for p in profiles if p_filter(p)) - def getProfileName(self, profile_key, return_profile_keys=False): + def get_profile_name(self, profile_key, return_profile_keys=False): """Return name of profile from keyword @param profile_key: can be the profile name or a keyword (like @DEFAULT@) @@ -486,19 +486,19 @@ @return: requested profile name @raise exceptions.ProfileUnknownError if profile doesn't exists """ - return self.params.getProfileName(profile_key, return_profile_keys) + return self.params.get_profile_name(profile_key, return_profile_keys) - def profileSetDefault(self, profile): + def profile_set_default(self, profile): """Set default profile @param profile: %(doc_profile)s """ # we want to be sure that the profile exists - profile = self.getProfileName(profile) + profile = self.get_profile_name(profile) self.memory_data["Profile_default"] = profile - def createProfile(self, name, password, component=None): + def create_profile(self, name, password, component=None): """Create a new profile @param name(unicode): profile name @@ -532,40 +532,40 @@ # raise ValueError(_(u"Plugin {component} is not an entry point !".format( # 
component = component))) - d = self.params.createProfile(name, component) + d = self.params.create_profile(name, component) - def initPersonalKey(__): + def init_personal_key(__): # be sure to call this after checking that the profile doesn't exist yet # generated once for all and saved in a PersistentDict - personal_key = BlockCipher.getRandomKey( + personal_key = BlockCipher.get_random_key( base64=True ).decode('utf-8') - self.auth_sessions.newSession( + self.auth_sessions.new_session( {C.MEMORY_CRYPTO_KEY: personal_key}, profile=name - ) # will be encrypted by setParam + ) # will be encrypted by param_set - def startFakeSession(__): - # avoid ProfileNotConnected exception in setParam + def start_fake_session(__): + # avoid ProfileNotConnected exception in param_set self._entities_cache[name] = None - self.params.loadIndParams(name) + self.params.load_ind_params(name) - def stopFakeSession(__): + def stop_fake_session(__): del self._entities_cache[name] - self.params.purgeProfile(name) + self.params.purge_profile(name) - d.addCallback(initPersonalKey) - d.addCallback(startFakeSession) + d.addCallback(init_personal_key) + d.addCallback(start_fake_session) d.addCallback( - lambda __: self.setParam( + lambda __: self.param_set( C.PROFILE_PASS_PATH[1], password, C.PROFILE_PASS_PATH[0], profile_key=name ) ) - d.addCallback(stopFakeSession) - d.addCallback(lambda __: self.auth_sessions.profileDelUnique(name)) + d.addCallback(stop_fake_session) + d.addCallback(lambda __: self.auth_sessions.profile_del_unique(name)) return d - def asyncDeleteProfile(self, name, force=False): + def profile_delete_async(self, name, force=False): """Delete an existing profile @param name: Name of the profile @@ -574,55 +574,55 @@ @return: a Deferred instance """ - def cleanMemory(__): - self.auth_sessions.profileDelUnique(name) + def clean_memory(__): + self.auth_sessions.profile_del_unique(name) try: del self._entities_cache[name] except KeyError: pass - d = self.params.asyncDeleteProfile(name, force) - d.addCallback(cleanMemory) + d = self.params.profile_delete_async(name, force) + d.addCallback(clean_memory) return d - def isComponent(self, profile_name): + def is_component(self, profile_name): """Tell if a profile is a component @param profile_name(unicode): name of the profile @return (bool): True if profile is a component @raise exceptions.NotFound: profile doesn't exist """ - return self.storage.profileIsComponent(profile_name) + return self.storage.profile_is_component(profile_name) - def getEntryPoint(self, profile_name): + def get_entry_point(self, profile_name): """Get a component entry point @param profile_name(unicode): name of the profile @return (bool): True if profile is a component @raise exceptions.NotFound: profile doesn't exist """ - return self.storage.getEntryPoint(profile_name) + return self.storage.get_entry_point(profile_name) ## History ## - def addToHistory(self, client, data): - return self.storage.addToHistory(data, client.profile) + def add_to_history(self, client, data): + return self.storage.add_to_history(data, client.profile) - def _historyGetSerialise(self, history_data): + def _history_get_serialise(self, history_data): return [ (uid, timestamp, from_jid, to_jid, message, subject, mess_type, data_format.serialise(extra)) for uid, timestamp, from_jid, to_jid, message, subject, mess_type, extra in history_data ] - def _historyGet(self, from_jid_s, to_jid_s, limit=C.HISTORY_LIMIT_NONE, between=True, + def _history_get(self, from_jid_s, to_jid_s, limit=C.HISTORY_LIMIT_NONE, 
between=True, filters=None, profile=C.PROF_KEY_NONE): - d = self.historyGet(jid.JID(from_jid_s), jid.JID(to_jid_s), limit, between, + d = self.history_get(jid.JID(from_jid_s), jid.JID(to_jid_s), limit, between, filters, profile) - d.addCallback(self._historyGetSerialise) + d.addCallback(self._history_get_serialise) return d - def historyGet(self, from_jid, to_jid, limit=C.HISTORY_LIMIT_NONE, between=True, + def history_get(self, from_jid, to_jid, limit=C.HISTORY_LIMIT_NONE, between=True, filters=None, profile=C.PROF_KEY_NONE): """Retrieve messages in history @@ -636,31 +636,31 @@ @param filters (dict[unicode, unicode]): pattern to filter the history results (see bridge API for details) @param profile (str): %(doc_profile)s - @return (D(list)): list of message data as in [messageNew] + @return (D(list)): list of message data as in [message_new] """ assert profile != C.PROF_KEY_NONE if limit == C.HISTORY_LIMIT_DEFAULT: - limit = int(self.getParamA(C.HISTORY_LIMIT, "General", profile_key=profile)) + limit = int(self.param_get_a(C.HISTORY_LIMIT, "General", profile_key=profile)) elif limit == C.HISTORY_LIMIT_NONE: limit = None if limit == 0: return defer.succeed([]) - return self.storage.historyGet(from_jid, to_jid, limit, between, filters, profile) + return self.storage.history_get(from_jid, to_jid, limit, between, filters, profile) ## Statuses ## - def _getPresenceStatuses(self, profile_key): - ret = self.getPresenceStatuses(profile_key) + def _get_presence_statuses(self, profile_key): + ret = self.presence_statuses_get(profile_key) return {entity.full(): data for entity, data in ret.items()} - def getPresenceStatuses(self, profile_key): + def presence_statuses_get(self, profile_key): """Get all the presence statuses of a profile @param profile_key: %(doc_profile_key)s @return: presence data: key=entity JID, value=presence data for this entity """ - client = self.host.getClient(profile_key) - profile_cache = self._getProfileCache(client) + client = self.host.get_client(profile_key) + profile_cache = self._get_profile_cache(client) entities_presence = {} for entity_jid, entity_data in profile_cache.items(): @@ -668,7 +668,7 @@ full_jid = copy.copy(entity_jid) full_jid.resource = resource try: - presence_data = self.getEntityDatum(client, full_jid, "presence") + presence_data = self.get_entity_datum(client, full_jid, "presence") except KeyError: continue entities_presence.setdefault(entity_jid, {})[ @@ -677,7 +677,7 @@ return entities_presence - def setPresenceStatus(self, entity_jid, show, priority, statuses, profile_key): + def set_presence_status(self, entity_jid, show, priority, statuses, profile_key): """Change the presence status of an entity @param entity_jid: jid.JID of the entity @@ -686,26 +686,26 @@ @param statuses: dictionary of statuses @param profile_key: %(doc_profile_key)s """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) presence_data = PresenceTuple(show, priority, statuses) - self.updateEntityData( + self.update_entity_data( client, entity_jid, "presence", presence_data ) if entity_jid.resource and show != C.PRESENCE_UNAVAILABLE: # If a resource is available, bare jid should not have presence information try: - self.delEntityDatum(client, entity_jid.userhostJID(), "presence") + self.del_entity_datum(client, entity_jid.userhostJID(), "presence") except (KeyError, exceptions.UnknownEntityError): pass ## Resources ## - def _getAllResource(self, jid_s, profile_key): - client = self.host.getClient(profile_key) + def 
_get_all_resource(self, jid_s, profile_key): + client = self.host.get_client(profile_key) jid_ = jid.JID(jid_s) - return self.getAllResources(client, jid_) + return self.get_all_resources(client, jid_) - def getAllResources(self, client, entity_jid): + def get_all_resources(self, client, entity_jid): """Return all resource from jid for which we have had data in this session @param entity_jid: bare jid of the entity @@ -717,9 +717,9 @@ # FIXME: is there a need to keep cache data for resources which are not connected anymore? if entity_jid.resource: raise ValueError( - "getAllResources must be used with a bare jid (got {})".format(entity_jid) + "get_all_resources must be used with a bare jid (got {})".format(entity_jid) ) - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) try: entity_data = profile_cache[entity_jid.userhostJID()] except KeyError: @@ -730,21 +730,21 @@ resources.discard(None) return resources - def getAvailableResources(self, client, entity_jid): + def get_available_resources(self, client, entity_jid): """Return available resource for entity_jid - This method differs from getAllResources by returning only available resources + This method differs from get_all_resources by returning only available resources @param entity_jid: bare jid of the entit return (list[unicode]): list of available resources @raise exceptions.UnknownEntityError: if entity is not in cache """ available = [] - for resource in self.getAllResources(client, entity_jid): + for resource in self.get_all_resources(client, entity_jid): full_jid = copy.copy(entity_jid) full_jid.resource = resource try: - presence_data = self.getEntityDatum(client, full_jid, "presence") + presence_data = self.get_entity_datum(client, full_jid, "presence") except KeyError: log.debug("Can't get presence data for {}".format(full_jid)) else: @@ -752,12 +752,12 @@ available.append(resource) return available - def _getMainResource(self, jid_s, profile_key): - client = self.host.getClient(profile_key) + def _get_main_resource(self, jid_s, profile_key): + client = self.host.get_client(profile_key) jid_ = jid.JID(jid_s) - return self.getMainResource(client, jid_) or "" + return self.main_resource_get(client, jid_) or "" - def getMainResource(self, client, entity_jid): + def main_resource_get(self, client, entity_jid): """Return the main resource used by an entity @param entity_jid: bare entity jid @@ -765,15 +765,15 @@ """ if entity_jid.resource: raise ValueError( - "getMainResource must be used with a bare jid (got {})".format(entity_jid) + "main_resource_get must be used with a bare jid (got {})".format(entity_jid) ) try: - if self.host.plugins["XEP-0045"].isJoinedRoom(client, entity_jid): + if self.host.plugins["XEP-0045"].is_joined_room(client, entity_jid): return None # MUC rooms have no main resource except KeyError: # plugin not found pass try: - resources = self.getAllResources(client, entity_jid) + resources = self.get_all_resources(client, entity_jid) except exceptions.UnknownEntityError: log.warning("Entity is not in cache, we can't find any resource") return None @@ -782,7 +782,7 @@ full_jid = copy.copy(entity_jid) full_jid.resource = resource try: - presence_data = self.getEntityDatum(client, full_jid, "presence") + presence_data = self.get_entity_datum(client, full_jid, "presence") except KeyError: log.debug("No presence information for {}".format(full_jid)) continue @@ -795,7 +795,7 @@ ## Entities data ## - def _getProfileCache(self, client): + def _get_profile_cache(self, 
client): """Check profile validity and return its cache @param client: SatXMPPClient @@ -803,7 +803,7 @@ """ return self._entities_cache[client.profile] - def setSignalOnUpdate(self, key, signal=True): + def set_signal_on_update(self, key, signal=True): """Set a signal flag on the key When the key will be updated, a signal will be sent to frontends @@ -815,13 +815,13 @@ else: self._key_signals.discard(key) - def getAllEntitiesIter(self, client, with_bare=False): + def get_all_entities_iter(self, client, with_bare=False): """Return an iterator of full jids of all entities in cache @param with_bare: if True, include bare jids @return (list[unicode]): list of jids """ - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) # we construct a list of all known full jids (bare jid of entities x resources) for bare_jid, entity_data in profile_cache.items(): for resource in entity_data.keys(): @@ -831,22 +831,22 @@ full_jid.resource = resource yield full_jid - def updateEntityData( + def update_entity_data( self, client, entity_jid, key, value, silent=False ): """Set a misc data for an entity - If key was registered with setSignalOnUpdate, a signal will be sent to frontends + If key was registered with set_signal_on_update, a signal will be sent to frontends @param entity_jid: JID of the entity, C.ENTITY_ALL_RESOURCES for all resources of all entities, C.ENTITY_ALL for all entities (all resources + bare jids) @param key: key to set (eg: C.ENTITY_TYPE) @param value: value for this key (eg: C.ENTITY_TYPE_MUC) @param silent(bool): if True, doesn't send signal to frontend, even if there is a - signal flag (see setSignalOnUpdate) + signal flag (see set_signal_on_update) """ - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) if entity_jid in (C.ENTITY_ALL_RESOURCES, C.ENTITY_ALL): - entities = self.getAllEntitiesIter(client, entity_jid == C.ENTITY_ALL) + entities = self.get_all_entities_iter(client, entity_jid == C.ENTITY_ALL) else: entities = (entity_jid,) @@ -857,14 +857,14 @@ entity_data[key] = value if key in self._key_signals and not silent: - self.host.bridge.entityDataUpdated( + self.host.bridge.entity_data_updated( jid_.full(), key, data_format.serialise(value), client.profile ) - def delEntityDatum(self, client, entity_jid, key): + def del_entity_datum(self, client, entity_jid, key): """Delete a data for an entity @param entity_jid: JID of the entity, C.ENTITY_ALL_RESOURCES for all resources of all entities, @@ -874,9 +874,9 @@ @raise exceptions.UnknownEntityError: if entity is not in cache @raise KeyError: key is not in cache """ - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) if entity_jid in (C.ENTITY_ALL_RESOURCES, C.ENTITY_ALL): - entities = self.getAllEntitiesIter(client, entity_jid == C.ENTITY_ALL) + entities = self.get_all_entities_iter(client, entity_jid == C.ENTITY_ALL) else: entities = (entity_jid,) @@ -895,9 +895,9 @@ else: raise e - def _getEntitiesData(self, entities_jids, keys_list, profile_key): - client = self.host.getClient(profile_key) - ret = self.getEntitiesData( + def _get_entities_data(self, entities_jids, keys_list, profile_key): + client = self.host.get_client(profile_key) + ret = self.entities_data_get( client, [jid.JID(jid_) for jid_ in entities_jids], keys_list ) return { @@ -905,7 +905,7 @@ for jid_, data in ret.items() } - def getEntitiesData(self, client, entities_jids, keys_list=None): + def entities_data_get(self, client, 
entities_jids, keys_list=None): """Get a list of cached values for several entities at once @param entities_jids: jids of the entities, or empty list for all entities in cache @@ -920,7 +920,7 @@ @raise exceptions.UnknownEntityError: if entity is not in cache """ - def fillEntityData(entity_cache_data): + def fill_entity_data(entity_cache_data): entity_data = {} if keys_list is None: entity_data = entity_cache_data @@ -932,7 +932,7 @@ continue return entity_data - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) ret_data = {} if entities_jids: for entity in entities_jids: @@ -942,21 +942,21 @@ ] except KeyError: continue - ret_data[entity.full()] = fillEntityData(entity_cache_data, keys_list) + ret_data[entity.full()] = fill_entity_data(entity_cache_data, keys_list) else: for bare_jid, data in profile_cache.items(): for resource, entity_cache_data in data.items(): full_jid = copy.copy(bare_jid) full_jid.resource = resource - ret_data[full_jid] = fillEntityData(entity_cache_data) + ret_data[full_jid] = fill_entity_data(entity_cache_data) return ret_data - def _getEntityData(self, entity_jid_s, keys_list=None, profile=C.PROF_KEY_NONE): - return self.getEntityData( - self.host.getClient(profile), jid.JID(entity_jid_s), keys_list) + def _get_entity_data(self, entity_jid_s, keys_list=None, profile=C.PROF_KEY_NONE): + return self.entity_data_get( + self.host.get_client(profile), jid.JID(entity_jid_s), keys_list) - def getEntityData(self, client, entity_jid, keys_list=None): + def entity_data_get(self, client, entity_jid, keys_list=None): """Get a list of cached values for entity @param entity_jid: JID of the entity @@ -968,7 +968,7 @@ @raise exceptions.UnknownEntityError: if entity is not in cache """ - profile_cache = self._getProfileCache(client) + profile_cache = self._get_profile_cache(client) try: entity_data = profile_cache[entity_jid.userhostJID()][entity_jid.resource] except KeyError: @@ -982,7 +982,7 @@ return {key: entity_data[key] for key in keys_list if key in entity_data} - def getEntityDatum(self, client, entity_jid, key): + def get_entity_datum(self, client, entity_jid, key): """Get a datum from entity @param entity_jid: JID of the entity @@ -992,9 +992,9 @@ @raise exceptions.UnknownEntityError: if entity is not in cache @raise KeyError: if there is no value for this key and this entity """ - return self.getEntityData(client, entity_jid, (key,))[key] + return self.entity_data_get(client, entity_jid, (key,))[key] - def delEntityCache( + def del_entity_cache( self, entity_jid, delete_all_resources=True, profile_key=C.PROF_KEY_NONE ): """Remove all cached data for entity @@ -1005,8 +1005,8 @@ @raise exceptions.UnknownEntityError: if entity is not in cache """ - client = self.host.getClient(profile_key) - profile_cache = self._getProfileCache(client) + client = self.host.get_client(profile_key) + profile_cache = self._get_profile_cache(client) if delete_all_resources: if entity_jid.resource: @@ -1027,7 +1027,7 @@ ## Encryption ## - def encryptValue(self, value, profile): + def encrypt_value(self, value, profile): """Encrypt a value for the given profile. The personal key must be loaded already in the profile session, that should be the case if the profile is already authenticated. 
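The hunks around this point rename the whole personal-key machinery (new_auth_session, encrypt_value/decrypt_value, encrypt_personal_data). Read together they describe a two-level scheme: the profile password only decrypts a per-profile personal key kept in the auth session, and that personal key is what actually encrypts stored values, so changing the password merely re-encrypts the key, not the data. A minimal self-contained sketch of that idea, using Fernet from the third-party cryptography package as a stand-in for Libervia's BlockCipher (key derivation and storage details differ in the real code):

    import base64
    import hashlib
    from cryptography.fernet import Fernet

    def key_from_password(password: str) -> bytes:
        # stand-in key derivation, for illustration only (not what BlockCipher does)
        return base64.urlsafe_b64encode(hashlib.sha256(password.encode()).digest())

    # profile creation: generate a personal key, store it encrypted with the password
    personal_key = Fernet.generate_key()
    stored_blob = Fernet(key_from_password("profile password")).encrypt(personal_key)

    # session opening (cf. new_auth_session): the password unlocks the personal key
    session_key = Fernet(key_from_password("profile password")).decrypt(stored_blob)

    # encrypt_value / decrypt_value equivalent: values are protected with the
    # personal key only, so a password change just re-encrypts stored_blob
    token = Fernet(session_key).encrypt(b"secret value")
    assert Fernet(session_key).decrypt(token) == b"secret value"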
@@ -1037,7 +1037,7 @@ @return: the deferred encrypted value """ try: - personal_key = self.auth_sessions.profileGetUnique(profile)[ + personal_key = self.auth_sessions.profile_get_unique(profile)[ C.MEMORY_CRYPTO_KEY ] except TypeError: @@ -1047,7 +1047,7 @@ ) return BlockCipher.encrypt(personal_key, value) - def decryptValue(self, value, profile): + def decrypt_value(self, value, profile): """Decrypt a value for the given profile. The personal key must be loaded already in the profile session, that should be the case if the profile is already authenticated. @@ -1057,7 +1057,7 @@ @return: the deferred decrypted value """ try: - personal_key = self.auth_sessions.profileGetUnique(profile)[ + personal_key = self.auth_sessions.profile_get_unique(profile)[ C.MEMORY_CRYPTO_KEY ] except TypeError: @@ -1067,7 +1067,7 @@ ) return BlockCipher.decrypt(personal_key, value) - def encryptPersonalData(self, data_key, data_value, crypto_key, profile): + def encrypt_personal_data(self, data_key, data_value, crypto_key, profile): """Re-encrypt a personal data (saved to a PersistentDict). @param data_key: key for the individual PersistentDict instance @@ -1077,7 +1077,7 @@ @return: a deferred None value """ - def gotIndMemory(data): + def got_ind_memory(data): data[data_key] = BlockCipher.encrypt(crypto_key, data_value) return data.force(data_key) @@ -1088,28 +1088,28 @@ ) d = PersistentDict(C.MEMORY_CRYPTO_NAMESPACE, profile).load() - return d.addCallback(gotIndMemory).addCallback(done) + return d.addCallback(got_ind_memory).addCallback(done) ## Subscription requests ## - def addWaitingSub(self, type_, entity_jid, profile_key): + def add_waiting_sub(self, type_, entity_jid, profile_key): """Called when a subcription request is received""" - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) assert profile if profile not in self.subscriptions: self.subscriptions[profile] = {} self.subscriptions[profile][entity_jid] = type_ - def delWaitingSub(self, entity_jid, profile_key): + def del_waiting_sub(self, entity_jid, profile_key): """Called when a subcription request is finished""" - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) assert profile if profile in self.subscriptions and entity_jid in self.subscriptions[profile]: del self.subscriptions[profile][entity_jid] - def getWaitingSub(self, profile_key): + def sub_waiting_get(self, profile_key): """Called to get a list of currently waiting subscription requests""" - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) if not profile: log.error(_("Asking waiting subscriptions for a non-existant profile")) return {} @@ -1120,13 +1120,13 @@ ## Parameters ## - def getStringParamA(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): - return self.params.getStringParamA(name, category, attr, profile_key) + def get_string_param_a(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): + return self.params.get_string_param_a(name, category, attr, profile_key) - def getParamA(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): - return self.params.getParamA(name, category, attr, profile_key=profile_key) + def param_get_a(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): + return self.params.param_get_a(name, category, attr, profile_key=profile_key) - def asyncGetParamA( + def param_get_a_async( self, name, category, @@ -1134,33 +1134,33 @@ security_limit=C.NO_SECURITY_LIMIT, 
profile_key=C.PROF_KEY_NONE, ): - return self.params.asyncGetParamA( + return self.params.param_get_a_async( name, category, attr, security_limit, profile_key ) - def _getParamsValuesFromCategory( + def _get_params_values_from_category( self, category, security_limit, app, extra_s, profile_key ): - return self.params._getParamsValuesFromCategory( + return self.params._get_params_values_from_category( category, security_limit, app, extra_s, profile_key ) - def asyncGetStringParamA( + def async_get_string_param_a( self, name, category, attribute="value", security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE): - profile = self.getProfileName(profile_key) - return defer.ensureDeferred(self.params.asyncGetStringParamA( + profile = self.get_profile_name(profile_key) + return defer.ensureDeferred(self.params.async_get_string_param_a( name, category, attribute, security_limit, profile )) - def _getParamsUI(self, security_limit, app, extra_s, profile_key): - return self.params._getParamsUI(security_limit, app, extra_s, profile_key) + def _get_params_ui(self, security_limit, app, extra_s, profile_key): + return self.params._get_params_ui(security_limit, app, extra_s, profile_key) - def getParamsCategories(self): - return self.params.getParamsCategories() + def params_categories_get(self): + return self.params.params_categories_get() - def setParam( + def param_set( self, name, value, @@ -1168,43 +1168,43 @@ security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE, ): - return self.params.setParam(name, value, category, security_limit, profile_key) + return self.params.param_set(name, value, category, security_limit, profile_key) - def updateParams(self, xml): - return self.params.updateParams(xml) + def update_params(self, xml): + return self.params.update_params(xml) - def paramsRegisterApp(self, xml, security_limit=C.NO_SECURITY_LIMIT, app=""): - return self.params.paramsRegisterApp(xml, security_limit, app) + def params_register_app(self, xml, security_limit=C.NO_SECURITY_LIMIT, app=""): + return self.params.params_register_app(xml, security_limit, app) - def setDefault(self, name, category, callback, errback=None): - return self.params.setDefault(name, category, callback, errback) + def set_default(self, name, category, callback, errback=None): + return self.params.set_default(name, category, callback, errback) ## Private Data ## - def _privateDataSet(self, namespace, key, data_s, profile_key): - client = self.host.getClient(profile_key) + def _private_data_set(self, namespace, key, data_s, profile_key): + client = self.host.get_client(profile_key) # we accept any type data = data_format.deserialise(data_s, type_check=None) - return defer.ensureDeferred(self.storage.setPrivateValue( + return defer.ensureDeferred(self.storage.set_private_value( namespace, key, data, binary=True, profile=client.profile)) - def _privateDataGet(self, namespace, key, profile_key): - client = self.host.getClient(profile_key) + def _private_data_get(self, namespace, key, profile_key): + client = self.host.get_client(profile_key) d = defer.ensureDeferred( - self.storage.getPrivates( + self.storage.get_privates( namespace, [key], binary=True, profile=client.profile) ) d.addCallback(lambda data_dict: data_format.serialise(data_dict.get(key))) return d - def _privateDataDelete(self, namespace, key, profile_key): - client = self.host.getClient(profile_key) - return defer.ensureDeferred(self.storage.delPrivateValue( + def _private_data_delete(self, namespace, key, profile_key): + client = 
self.host.get_client(profile_key) + return defer.ensureDeferred(self.storage.del_private_value( namespace, key, binary=True, profile=client.profile)) ## Files ## - def checkFilePermission( + def check_file_permission( self, file_data: dict, peer_jid: Optional[jid.JID], @@ -1213,7 +1213,7 @@ ) -> None: """Check that an entity has the right permission on a file - @param file_data: data of one file, as returned by getFiles + @param file_data: data of one file, as returned by get_files @param peer_jid: entity trying to access the file @param perms_to_check: permissions to check tuple of C.ACCESS_PERM_* @@ -1268,15 +1268,15 @@ _("unknown access type: {type}").format(type=perm_type) ) - async def checkPermissionToRoot(self, client, file_data, peer_jid, perms_to_check): - """do checkFilePermission on file_data and all its parents until root""" + async def check_permission_to_root(self, client, file_data, peer_jid, perms_to_check): + """do check_file_permission on file_data and all its parents until root""" current = file_data while True: - self.checkFilePermission(current, peer_jid, perms_to_check) + self.check_file_permission(current, peer_jid, perms_to_check) parent = current["parent"] if not parent: break - files_data = await self.getFiles( + files_data = await self.get_files( client, peer_jid=None, file_id=parent, perms_to_check=None ) try: @@ -1284,7 +1284,7 @@ except IndexError: raise exceptions.DataError("Missing parent") - async def _getParentDir( + async def _get_parent_dir( self, client, path, parent, namespace, owner, peer_jid, perms_to_check ): """Retrieve parent node from a path, or last existing directory @@ -1308,7 +1308,7 @@ # non existing directories will be created parent = "" for idx, path_elt in enumerate(path_elts): - directories = await self.storage.getFiles( + directories = await self.storage.get_files( client, parent=parent, type_=C.FILE_TYPE_DIRECTORY, @@ -1325,11 +1325,11 @@ ) else: directory = directories[0] - self.checkFilePermission(directory, peer_jid, perms_to_check) + self.check_file_permission(directory, peer_jid, perms_to_check) parent = directory["id"] return (parent, []) - def getFileAffiliations(self, file_data: dict) -> Dict[jid.JID, str]: + def get_file_affiliations(self, file_data: dict) -> Dict[jid.JID, str]: """Convert file access to pubsub like affiliations""" affiliations = {} access_data = file_data['access'] @@ -1352,7 +1352,7 @@ return affiliations - def _setFileAffiliationsUpdate( + def _set_file_affiliations_update( self, access: dict, file_data: dict, @@ -1401,7 +1401,7 @@ else: raise ValueError(f"unknown affiliation: {affiliation!r}") - async def setFileAffiliations( + async def set_file_affiliations( self, client, file_data: dict, @@ -1417,17 +1417,17 @@ - "none" removes both read and write permissions """ file_id = file_data['id'] - await self.fileUpdate( + await self.file_update( file_id, 'access', update_cb=partial( - self._setFileAffiliationsUpdate, + self._set_file_affiliations_update, file_data=file_data, affiliations=affiliations ), ) - def _setFileAccessModelUpdate( + def _set_file_access_model_update( self, access: dict, file_data: dict, @@ -1445,7 +1445,7 @@ if requested_type == C.ACCESS_TYPE_WHITELIST and 'jids' not in read_data: read_data['jids'] = [] - async def setFileAccessModel( + async def set_file_access_model( self, client, file_data: dict, @@ -1458,17 +1458,17 @@ - "whitelist": set whitelist to file/dir """ file_id = file_data['id'] - await self.fileUpdate( + await self.file_update( file_id, 'access', update_cb=partial( - 
self._setFileAccessModelUpdate, + self._set_file_access_model_update, file_data=file_data, access_model=access_model ), ) - def getFilesOwner( + def get_files_owner( self, client, owner: Optional[jid.JID], @@ -1499,7 +1499,7 @@ ) return peer_jid.userhostJID() - async def getFiles( + async def get_files( self, client, peer_jid, file_id=None, version=None, parent=None, path=None, type_=None, file_hash=None, hash_algo=None, name=None, namespace=None, mime_type=None, public_id=None, owner=None, access=None, projection=None, @@ -1526,7 +1526,7 @@ @param mime_type(unicode, None): filter on this mime type @param public_id(unicode, None): filter on this public id @param owner(jid.JID, None): if not None, only get files from this owner - @param access(dict, None): get file with given access (see [setFile]) + @param access(dict, None): get file with given access (see [set_file]) @param projection(list[unicode], None): name of columns to retrieve None to retrieve all @param unique(bool): if True will remove duplicates @@ -1534,7 +1534,7 @@ must be a tuple of C.ACCESS_PERM_* or None if None, permission will no be checked (peer_jid must be None too in this case) - other params are the same as for [setFile] + other params are the same as for [set_file] @return (list[dict]): files corresponding to filters @raise exceptions.NotFound: parent directory not found (when path is specified) @raise exceptions.PermissionError: peer_jid can't use perms_to_check for one of @@ -1546,11 +1546,11 @@ "if you want to disable permission check, both peer_jid and " "perms_to_check must be None" ) - owner = self.getFilesOwner(client, owner, peer_jid, file_id, parent) + owner = self.get_files_owner(client, owner, peer_jid, file_id, parent) if path is not None: path = str(path) - # permission are checked by _getParentDir - parent, remaining_path_elts = await self._getParentDir( + # permission are checked by _get_parent_dir + parent, remaining_path_elts = await self._get_parent_dir( client, path, parent, namespace, owner, peer_jid, perms_to_check ) if remaining_path_elts: @@ -1560,16 +1560,16 @@ if parent and peer_jid: # if parent is given directly and permission check is requested, # we need to check all the parents - parent_data = await self.storage.getFiles(client, file_id=parent) + parent_data = await self.storage.get_files(client, file_id=parent) try: parent_data = parent_data[0] except IndexError: raise exceptions.DataError("mising parent") - await self.checkPermissionToRoot( + await self.check_permission_to_root( client, parent_data, peer_jid, perms_to_check ) - files = await self.storage.getFiles( + files = await self.storage.get_files( client, file_id=file_id, version=version, @@ -1592,7 +1592,7 @@ to_remove = [] for file_data in files: try: - self.checkFilePermission( + self.check_file_permission( file_data, peer_jid, perms_to_check, set_affiliation=True ) except exceptions.PermissionError: @@ -1601,7 +1601,7 @@ files.remove(file_data) return files - async def setFile( + async def set_file( self, client, name, file_id=None, version="", parent=None, path=None, type_=C.FILE_TYPE_FILE, file_hash=None, hash_algo=None, size=None, namespace=None, mime_type=None, public_id=None, created=None, modified=None, @@ -1678,18 +1678,18 @@ raise ValueError( "version, file_hash, size and mime_type can't be set for a directory" ) - owner = self.getFilesOwner(client, owner, peer_jid, file_id, parent) + owner = self.get_files_owner(client, owner, peer_jid, file_id, parent) if path is not None: path = str(path) - # _getParentDir will 
check permissions if peer_jid is set, so we use owner - parent, remaining_path_elts = await self._getParentDir( + # _get_parent_dir will check permissions if peer_jid is set, so we use owner + parent, remaining_path_elts = await self._get_parent_dir( client, path, parent, namespace, owner, owner, perms_to_check ) # if remaining directories don't exist, we have to create them for new_dir in remaining_path_elts: new_dir_id = shortuuid.uuid() - await self.storage.setFile( + await self.storage.set_file( client, name=new_dir, file_id=new_dir_id, @@ -1706,7 +1706,7 @@ elif parent is None: parent = "" - await self.storage.setFile( + await self.storage.set_file( client, file_id=file_id, version=version, @@ -1726,7 +1726,7 @@ extra=extra, ) - async def fileGetUsedSpace( + async def file_get_used_space( self, client, peer_jid: jid.JID, @@ -1736,15 +1736,15 @@ @param peer_jid: entity requesting the size @param owner: entity owning the file to check. If None, will be determined by - getFilesOwner + get_files_owner @return: size of total space used by files of this owner """ - owner = self.getFilesOwner(client, owner, peer_jid) + owner = self.get_files_owner(client, owner, peer_jid) if peer_jid.userhostJID() != owner and client.profile not in self.admins: raise exceptions.PermissionError("You are not allowed to check this size") - return await self.storage.fileGetUsedSpace(client, owner) + return await self.storage.file_get_used_space(client, owner) - def fileUpdate(self, file_id, column, update_cb): + def file_update(self, file_id, column, update_cb): """Update a file column taking care of race condition access is NOT checked in this method, it must be checked beforehand @@ -1754,10 +1754,10 @@ the method will take older value as argument, and must update it in place Note that the callable must be thread-safe """ - return self.storage.fileUpdate(file_id, column, update_cb) + return self.storage.file_update(file_id, column, update_cb) @defer.inlineCallbacks - def _deleteFile( + def _delete_file( self, client, peer_jid: jid.JID, @@ -1778,7 +1778,7 @@ "file {file_name} can't be deleted, {peer_jid} is not the owner" .format(file_name=file_data['name'], peer_jid=peer_jid.full())) if file_data['type'] == C.FILE_TYPE_DIRECTORY: - sub_files = yield self.getFiles(client, peer_jid, parent=file_data['id']) + sub_files = yield self.get_files(client, peer_jid, parent=file_data['id']) if sub_files and not recursive: raise exceptions.DataError(_("Can't delete directory, it is not empty")) # we first delete the sub-files @@ -1787,15 +1787,15 @@ sub_file_path = files_path / sub_file_data['name'] else: sub_file_path = files_path - yield self._deleteFile( + yield self._delete_file( client, peer_jid, recursive, sub_file_path, sub_file_data) # then the directory itself - yield self.storage.fileDelete(file_data['id']) + yield self.storage.file_delete(file_data['id']) elif file_data['type'] == C.FILE_TYPE_FILE: log.info(_("deleting file {name} with hash {file_hash}").format( name=file_data['name'], file_hash=file_data['file_hash'])) - yield self.storage.fileDelete(file_data['id']) - references = yield self.getFiles( + yield self.storage.file_delete(file_data['id']) + references = yield self.get_files( client, peer_jid, file_hash=file_data['file_hash']) if references: log.debug("there are still references to the file, we keep it") @@ -1811,7 +1811,7 @@ raise exceptions.InternalError('Unexpected file type: {file_type}' .format(file_type=file_data['type'])) - async def fileDelete(self, client, peer_jid, file_id, 
recursive=False): + async def file_delete(self, client, peer_jid, file_id, recursive=False): """Delete a single file or a directory and all its sub-files @param file_id(unicode): id of the file to delete @@ -1821,7 +1821,7 @@ """ # FIXME: we only allow owner of file to delete files for now, but WRITE access # should be checked too - files_data = await self.getFiles(client, peer_jid, file_id) + files_data = await self.get_files(client, peer_jid, file_id) if not files_data: raise exceptions.NotFound("Can't find the file with id {file_id}".format( file_id=file_id)) @@ -1829,11 +1829,11 @@ if file_data["type"] != C.FILE_TYPE_DIRECTORY and recursive: raise ValueError("recursive can only be set for directories") files_path = self.host.get_local_path(None, C.FILES_DIR) - await self._deleteFile(client, peer_jid, recursive, files_path, file_data) + await self._delete_file(client, peer_jid, recursive, files_path, file_data) ## Cache ## - def getCachePath(self, namespace: str, *args: str) -> Path: + def get_cache_path(self, namespace: str, *args: str) -> Path: """Get path to use to get a common path for a namespace This can be used by plugins to manage permanent data. It's the responsability @@ -1844,13 +1844,13 @@ namespace = namespace.strip().lower() return Path( self._cache_path, - regex.pathEscape(namespace), - *(regex.pathEscape(a) for a in args) + regex.path_escape(namespace), + *(regex.path_escape(a) for a in args) ) ## Misc ## - def isEntityAvailable(self, client, entity_jid): + def is_entity_available(self, client, entity_jid): """Tell from the presence information if the given entity is available. @param entity_jid (JID): the entity to check (if bare jid is used, all resources are tested) @@ -1858,20 +1858,20 @@ """ if not entity_jid.resource: return bool( - self.getAvailableResources(client, entity_jid) + self.get_available_resources(client, entity_jid) ) # is any resource is available, entity is available try: - presence_data = self.getEntityDatum(client, entity_jid, "presence") + presence_data = self.get_entity_datum(client, entity_jid, "presence") except KeyError: log.debug("No presence information for {}".format(entity_jid)) return False return presence_data.show != C.PRESENCE_UNAVAILABLE - def isAdmin(self, profile: str) -> bool: + def is_admin(self, profile: str) -> bool: """Tell if given profile has administrator privileges""" return profile in self.admins - def isAdminJID(self, entity: jid.JID) -> bool: + def is_admin_jid(self, entity: jid.JID) -> bool: """Tells if an entity jid correspond to an admin one It is sometime not possible to use the profile alone to check if an entity is an
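As the memory.Memory hunks above show, many of the new names are not a mechanical re-casing: some also reorder words (getEntityData becomes entity_data_get, getMainResource becomes main_resource_get, asyncDeleteProfile becomes profile_delete_async), so code calling into memory cannot be migrated with a simple camelCase-to-snake_case converter and needs an explicit mapping. A small illustration of such a call-site rewrite, with a deliberately partial mapping taken from the hunks above (the real migration covers far more names):

    import re

    RENAMES = {
        "getParamA": "param_get_a",
        "asyncGetParamA": "param_get_a_async",
        "getProfileName": "get_profile_name",
        "getEntityDatum": "get_entity_datum",
        "getEntityData": "entity_data_get",      # words reordered, not just re-cased
        "getMainResource": "main_resource_get",  # idem
        "asyncDeleteProfile": "profile_delete_async",
        "getFiles": "get_files",
        "setFile": "set_file",
        "fileDelete": "file_delete",
        "isAdmin": "is_admin",
    }

    _OLD_NAMES = re.compile(r"\b(" + "|".join(map(re.escape, RENAMES)) + r")\b")

    def rewrite_call_sites(source: str) -> str:
        """Replace old camelCase names with their snake_case equivalents."""
        return _OLD_NAMES.sub(lambda m: RENAMES[m.group(1)], source)

    print(rewrite_call_sites("value = self.host.memory.getParamA(name, category)"))
    # value = self.host.memory.param_get_a(name, category)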
--- a/sat/memory/migration/env.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/migration/env.py Sat Apr 08 13:54:42 2023 +0200 @@ -38,7 +38,7 @@ script output. """ - db_config = sqla_config.getDbConfig() + db_config = sqla_config.get_db_config() context.configure( url=db_config["url"], target_metadata=target_metadata, @@ -76,7 +76,7 @@ and associate a connection with the context. """ - db_config = sqla_config.getDbConfig() + db_config = sqla_config.get_db_config() engine = create_async_engine( db_config["url"], poolclass=pool.NullPool,
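Both migration paths in env.py now go through sqla_config.get_db_config() and only consume its "url" entry: offline mode passes the URL straight to Alembic, while online mode builds a throw-away async engine from it (NullPool, as in the hunk above). A condensed view of the online branch, following Alembic's stock async template; it only runs under the alembic command, and target_metadata is omitted here for brevity:

    from alembic import context
    from sqlalchemy import pool
    from sqlalchemy.ext.asyncio import create_async_engine

    def do_run_migrations(connection) -> None:
        # configure Alembic with a live (sync-wrapped) connection, then migrate
        context.configure(connection=connection)
        with context.begin_transaction():
            context.run_migrations()

    async def run_migrations_online(db_config: dict) -> None:
        # NullPool mirrors the hunk above: one fresh connection per migration run
        engine = create_async_engine(db_config["url"], poolclass=pool.NullPool)
        async with engine.connect() as connection:
            await connection.run_sync(do_run_migrations)
        await engine.dispose()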
--- a/sat/memory/params.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/params.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,7 +29,7 @@ from twisted.python.failure import Failure from twisted.words.xish import domish from twisted.words.protocols.jabber import jid -from sat.tools.xml_tools import paramsXML2XMLUI, getText +from sat.tools.xml_tools import params_xml_2_xmlui, get_text from sat.tools.common import data_format from xml.sax.saxutils import quoteattr @@ -38,7 +38,7 @@ # this need an overall simplification to make maintenance easier -def createJidElts(jids): +def create_jid_elts(jids): """Generator which return <jid/> elements from jids @param jids(iterable[id.jID]): jids to use @@ -101,18 +101,18 @@ def load_default_params(self): self.dom = minidom.parseString(Params.default_xml.encode("utf-8")) - def _mergeParams(self, source_node, dest_node): + def _merge_params(self, source_node, dest_node): """Look for every node in source_node and recursively copy them to dest if they don't exists""" - def getNodesMap(children): + def get_nodes_map(children): ret = {} for child in children: if child.nodeType == child.ELEMENT_NODE: ret[(child.tagName, child.getAttribute("name"))] = child return ret - source_map = getNodesMap(source_node.childNodes) - dest_map = getNodesMap(dest_node.childNodes) + source_map = get_nodes_map(source_node.childNodes) + dest_map = get_nodes_map(dest_node.childNodes) source_set = set(source_map.keys()) dest_set = set(dest_map.keys()) to_add = source_set.difference(dest_set) @@ -122,22 +122,22 @@ to_recurse = source_set - to_add for node_key in to_recurse: - self._mergeParams(source_map[node_key], dest_map[node_key]) + self._merge_params(source_map[node_key], dest_map[node_key]) def load_xml(self, xml_file): """Load parameters template from xml file""" self.dom = minidom.parse(xml_file) default_dom = minidom.parseString(Params.default_xml.encode("utf-8")) - self._mergeParams(default_dom.documentElement, self.dom.documentElement) + self._merge_params(default_dom.documentElement, self.dom.documentElement) - def loadGenParams(self): + def load_gen_params(self): """Load general parameters data from storage @return: deferred triggered once params are loaded """ - return self.storage.loadGenParams(self.params_gen) + return self.storage.load_gen_params(self.params_gen) - def loadIndParams(self, profile, cache=None): + def load_ind_params(self, profile, cache=None): """Load individual parameters set self.params cache or a temporary cache @@ -147,11 +147,11 @@ """ if cache is None: self.params[profile] = {} - return self.storage.loadIndParams( + return self.storage.load_ind_params( self.params[profile] if cache is None else cache, profile ) - def purgeProfile(self, profile): + def purge_profile(self, profile): """Remove cache data of a profile @param profile: %(doc_profile)s @@ -176,7 +176,7 @@ self.params = {} self.params_gen = {} - def createProfile(self, profile, component): + def create_profile(self, profile, component): """Create a new profile @param profile(unicode): name of the profile @@ -184,14 +184,14 @@ @param callback: called when the profile actually exists in database and memory @return: a Deferred instance """ - if self.storage.hasProfile(profile): + if self.storage.has_profile(profile): log.info(_("The profile name already exists")) return defer.fail(exceptions.ConflictError()) if not self.host.trigger.point("ProfileCreation", profile): return defer.fail(exceptions.CancelError()) - return self.storage.createProfile(profile, component or None) + return 
self.storage.create_profile(profile, component or None) - def asyncDeleteProfile(self, profile, force=False): + def profile_delete_async(self, profile, force=False): """Delete an existing profile @param profile: name of the profile @@ -199,18 +199,18 @@ To be used for direct calls only (not through the bridge). @return: a Deferred instance """ - if not self.storage.hasProfile(profile): + if not self.storage.has_profile(profile): log.info(_("Trying to delete an unknown profile")) return defer.fail(Failure(exceptions.ProfileUnknownError(profile))) - if self.host.isConnected(profile): + if self.host.is_connected(profile): if force: self.host.disconnect(profile) else: log.info(_("Trying to delete a connected profile")) return defer.fail(Failure(exceptions.ProfileConnected)) - return self.storage.deleteProfile(profile) + return self.storage.delete_profile(profile) - def getProfileName(self, profile_key, return_profile_keys=False): + def get_profile_name(self, profile_key, return_profile_keys=False): """return profile according to profile_key @param profile_key: profile name or key which can be @@ -229,7 +229,7 @@ try: default = self.host.memory.memory_data[ "Profile_default" - ] = self.storage.getProfilesList()[0] + ] = self.storage.get_profiles_list()[0] except IndexError: log.info(_("No profile exist yet")) raise exceptions.ProfileUnknownError(profile_key) @@ -240,7 +240,7 @@ raise exceptions.ProfileNotSetError elif return_profile_keys and profile_key in [C.PROF_KEY_ALL]: return profile_key # this value must be managed by the caller - if not self.storage.hasProfile(profile_key): + if not self.storage.has_profile(profile_key): log.error(_("Trying to access an unknown profile (%s)") % profile_key) raise exceptions.ProfileUnknownError(profile_key) return profile_key @@ -260,7 +260,7 @@ # the node is new return None - def updateParams(self, xml, security_limit=C.NO_SECURITY_LIMIT, app=""): + def update_params(self, xml, security_limit=C.NO_SECURITY_LIMIT, app=""): """import xml in parameters, update if the param already exists If security_limit is specified and greater than -1, the parameters @@ -287,7 +287,7 @@ 0 ) # count the params to be removed from current category for node in cat_node.childNodes: - if node.nodeName != "param" or not self.checkSecurityLimit( + if node.nodeName != "param" or not self.check_security_limit( node, security_limit ): to_remove.append(node) @@ -324,7 +324,7 @@ pre_process_app_node(src_parent, security_limit, app) import_node(self.dom.documentElement, src_parent) - def paramsRegisterApp(self, xml, security_limit, app): + def params_register_app(self, xml, security_limit, app): """Register frontend's specific parameters If security_limit is specified and greater than -1, the parameters @@ -351,12 +351,12 @@ ) return self.frontends_cache.append(app) - self.updateParams(xml, security_limit, app) + self.update_params(xml, security_limit, app) log.debug("Frontends parameters registered for %(app)s" % {"app": app}) def __default_ok(self, value, name, category): # FIXME: will not work with individual parameters - self.setParam(name, value, category) + self.param_set(name, value, category) def __default_ko(self, failure, name, category): log.error( @@ -364,7 +364,7 @@ % {"category": category, "name": name, "reason": str(failure.value)} ) - def setDefault(self, name, category, callback, errback=None): + def set_default(self, name, category, callback, errback=None): """Set default value of parameter 'default_cb' attibute of parameter must be set to 'yes' @@ -376,10 +376,10 @@ 
# TODO: send signal param update if value changed # TODO: manage individual paramaters log.debug( - "setDefault called for %(category)s/%(name)s" + "set_default called for %(category)s/%(name)s" % {"category": category, "name": name} ) - node = self._getParamNode(name, category, "@ALL@") + node = self._get_param_node(name, category, "@ALL@") if not node: log.error( _( @@ -390,15 +390,15 @@ return if node[1].getAttribute("default_cb") == "yes": # del node[1].attributes['default_cb'] # default_cb is not used anymore as a flag to know if we have to set the default value, - # and we can still use it later e.g. to call a generic setDefault method - value = self._getParam(category, name, C.GENERAL) + # and we can still use it later e.g. to call a generic set_default method + value = self._get_param(category, name, C.GENERAL) if value is None: # no value set by the user: we have the default value log.debug("Default value to set, using callback") d = defer.maybeDeferred(callback) d.addCallback(self.__default_ok, name, category) d.addErrback(errback or self.__default_ko, name, category) - def _getAttr_internal(self, node, attr, value): + def _get_attr_internal(self, node, attr, value): """Get attribute value. /!\ This method would return encrypted password values. @@ -464,7 +464,7 @@ "\t" ) # FIXME: it's not good to use tabs as separator ! else: # no user defined value, take default value from the XML - jids = [getText(jid_) for jid_ in node.getElementsByTagName("jid")] + jids = [get_text(jid_) for jid_ in node.getElementsByTagName("jid")] to_delete = [] for idx, value in enumerate(jids): try: @@ -480,7 +480,7 @@ return value_to_use return node.getAttribute(attr) - def _getAttr(self, node, attr, value): + def _get_attr(self, node, attr, value): """Get attribute value (synchronous). /!\ This method can not be used to retrieve password values. @@ -491,11 +491,11 @@ """ if attr == "value" and node.getAttribute("type") == "password": raise exceptions.InternalError( - "To retrieve password values, use _asyncGetAttr instead of _getAttr" + "To retrieve password values, use _async_get_attr instead of _get_attr" ) - return self._getAttr_internal(node, attr, value) + return self._get_attr_internal(node, attr, value) - def _asyncGetAttr(self, node, attr, value, profile=None): + def _async_get_attr(self, node, attr, value, profile=None): """Get attribute value. 
Profile passwords are returned hashed (if not empty), @@ -506,7 +506,7 @@ @param profile: %(doc_profile)s @return (unicode, bool, int, list): Deferred value to retrieve """ - value = self._getAttr_internal(node, attr, value) + value = self._get_attr_internal(node, attr, value) if attr != "value" or node.getAttribute("type") != "password": return defer.succeed(value) param_cat = node.parentNode.getAttribute("name") @@ -519,7 +519,7 @@ raise exceptions.ProfileNotSetError( "The profile is needed to decrypt a password" ) - password = self.host.memory.decryptValue(value, profile) + password = self.host.memory.decrypt_value(value, profile) if password is None: raise exceptions.InternalError("password should never be None") @@ -528,25 +528,25 @@ def _type_to_str(self, result): """Convert result to string, according to its type """ if isinstance(result, bool): - return C.boolConst(result) + return C.bool_const(result) elif isinstance(result, (list, set, tuple)): return ', '.join(self._type_to_str(r) for r in result) else: return str(result) - def getStringParamA(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): - """ Same as getParamA but for bridge: convert non string value to string """ + def get_string_param_a(self, name, category, attr="value", profile_key=C.PROF_KEY_NONE): + """ Same as param_get_a but for bridge: convert non string value to string """ return self._type_to_str( - self.getParamA(name, category, attr, profile_key=profile_key) + self.param_get_a(name, category, attr, profile_key=profile_key) ) - def getParamA( + def param_get_a( self, name, category, attr="value", use_default=True, profile_key=C.PROF_KEY_NONE ): """Helper method to get a specific attribute. /!\ This method would return encrypted password values, - to get the plain values you have to use asyncGetParamA. + to get the plain values you have to use param_get_a_async. @param name: name of the parameter @param category: category of the parameter @param attr: name of the attribute (default: "value") @@ -557,7 +557,7 @@ """ # FIXME: looks really dirty and buggy, need to be reviewed/refactored # FIXME: security_limit is not managed here ! 
- node = self._getParamNode(name, category) + node = self._get_param_node(name, category) if not node: log.error( _( @@ -569,18 +569,18 @@ if attr == "value" and node[1].getAttribute("type") == "password": raise exceptions.InternalError( - "To retrieve password values, use asyncGetParamA instead of getParamA" + "To retrieve password values, use param_get_a_async instead of param_get_a" ) if node[0] == C.GENERAL: - value = self._getParam(category, name, C.GENERAL) + value = self._get_param(category, name, C.GENERAL) if value is None and attr == "value" and not use_default: return value - return self._getAttr(node[1], attr, value) + return self._get_attr(node[1], attr, value) assert node[0] == C.INDIVIDUAL - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) if not profile: log.error(_("Requesting a param for an non-existant profile")) raise exceptions.ProfileUnknownError(profile_key) @@ -590,19 +590,19 @@ raise exceptions.ProfileNotConnected(profile) if attr == "value": - value = self._getParam(category, name, profile=profile) + value = self._get_param(category, name, profile=profile) if value is None and attr == "value" and not use_default: return value - return self._getAttr(node[1], attr, value) + return self._get_attr(node[1], attr, value) - async def asyncGetStringParamA( + async def async_get_string_param_a( self, name, category, attr="value", security_limit=C.NO_SECURITY_LIMIT, profile=C.PROF_KEY_NONE): - value = await self.asyncGetParamA( + value = await self.param_get_a_async( name, category, attr, security_limit, profile_key=profile) return self._type_to_str(value) - def asyncGetParamA( + def param_get_a_async( self, name, category, @@ -618,7 +618,7 @@ @param profile: owner of the param (@ALL@ for everyone) @return (defer.Deferred): parameter value, with corresponding type (bool, int, list, etc) """ - node = self._getParamNode(name, category) + node = self._get_param_node(name, category) if not node: log.error( _( @@ -628,7 +628,7 @@ ) raise ValueError("Requested param doesn't exist") - if not self.checkSecurityLimit(node[1], security_limit): + if not self.check_security_limit(node[1], security_limit): log.warning( _( "Trying to get parameter '%(param)s' in category '%(cat)s' without authorization!!!" 
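The _type_to_str helper renamed a few hunks above is what makes the string bridge variants (get_string_param_a, async_get_string_param_a) safe to expose: whatever Python type a parameter has, frontends always receive a string. A stand-alone restatement of that conversion, where the "true"/"false" spelling is an assumption about what C.bool_const returns:

    def type_to_str(result) -> str:
        """Convert a parameter value to its bridge (string) representation."""
        if isinstance(result, bool):
            return "true" if result else "false"   # assumed C.bool_const output
        if isinstance(result, (list, set, tuple)):
            return ", ".join(type_to_str(r) for r in result)
        return str(result)

    assert type_to_str(True) == "true"
    assert type_to_str([1, 2, 3]) == "1, 2, 3"
    assert type_to_str(50) == "50"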
@@ -638,12 +638,12 @@ raise exceptions.PermissionError if node[0] == C.GENERAL: - value = self._getParam(category, name, C.GENERAL) - return self._asyncGetAttr(node[1], attr, value) + value = self._get_param(category, name, C.GENERAL) + return self._async_get_attr(node[1], attr, value) assert node[0] == C.INDIVIDUAL - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) if not profile: raise exceptions.InternalError( _("Requesting a param for a non-existant profile") @@ -652,23 +652,23 @@ if attr != "value": return defer.succeed(node[1].getAttribute(attr)) try: - value = self._getParam(category, name, profile=profile) - return self._asyncGetAttr(node[1], attr, value, profile) + value = self._get_param(category, name, profile=profile) + return self._async_get_attr(node[1], attr, value, profile) except exceptions.ProfileNotInCacheError: # We have to ask data to the storage manager - d = self.storage.getIndParam(category, name, profile) + d = self.storage.get_ind_param(category, name, profile) return d.addCallback( - lambda value: self._asyncGetAttr(node[1], attr, value, profile) + lambda value: self._async_get_attr(node[1], attr, value, profile) ) - def _getParamsValuesFromCategory( + def _get_params_values_from_category( self, category, security_limit, app, extra_s, profile_key): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) extra = data_format.deserialise(extra_s) - return defer.ensureDeferred(self.getParamsValuesFromCategory( + return defer.ensureDeferred(self.get_params_values_from_category( client, category, security_limit, app, extra)) - async def getParamsValuesFromCategory( + async def get_params_values_from_category( self, client, category, security_limit, app='', extra=None): """Get all parameters "attribute" for a category @@ -676,14 +676,14 @@ @param security_limit(int): NO_SECURITY_LIMIT (-1) to return all the params. Otherwise sole the params which have a security level defined *and* lower or equal to the specified value are returned. 
- @param app(str): see [getParams] - @param extra(dict): see [getParams] + @param app(str): see [get_params] + @param extra(dict): see [get_params] @return (dict): key: param name, value: param value (converted to string if needed) """ # TODO: manage category of general type (without existant profile) if extra is None: extra = {} - prof_xml = await self._constructProfileXml(client, security_limit, app, extra) + prof_xml = await self._construct_profile_xml(client, security_limit, app, extra) ret = {} for category_node in prof_xml.getElementsByTagName("category"): if category_node.getAttribute("name") == category: @@ -696,7 +696,7 @@ ) ) continue - value = await self.asyncGetStringParamA( + value = await self.async_get_string_param_a( name, category, security_limit=security_limit, profile=client.profile) @@ -706,7 +706,7 @@ prof_xml.unlink() return ret - def _getParam( + def _get_param( self, category, name, type_=C.INDIVIDUAL, cache=None, profile=C.PROF_KEY_NONE ): """Return the param, or None if it doesn't exist @@ -736,7 +736,7 @@ return None return cache[(category, name)] - async def _constructProfileXml(self, client, security_limit, app, extra): + async def _construct_profile_xml(self, client, security_limit, app, extra): """Construct xml for asked profile, filling values when needed /!\ as noticed in doc, don't forget to unlink the minidom.Document @@ -749,18 +749,18 @@ """ profile = client.profile - def checkNode(node): + def check_node(node): """Check the node against security_limit, app and extra""" - return (self.checkSecurityLimit(node, security_limit) - and self.checkApp(node, app) - and self.checkExtra(node, extra)) + return (self.check_security_limit(node, security_limit) + and self.check_app(node, app) + and self.check_extra(node, extra)) if profile in self.params: profile_cache = self.params[profile] else: # profile is not in cache, we load values in a short time cache profile_cache = {} - await self.loadIndParams(profile, profile_cache) + await self.load_ind_params(profile, profile_cache) # init the result document prof_xml = minidom.parseString("<params/>") @@ -782,7 +782,7 @@ for node in dest_cat.childNodes: if node.nodeName != "param": continue - if not checkNode(node): + if not check_node(node): to_remove.append(node) continue dest_params[node.getAttribute("name")] = node @@ -799,14 +799,14 @@ # we have to merge new params (we are parsing individual parameters, we have to add them # to the previously parsed general ones) name = param_node.getAttribute("name") - if not checkNode(param_node): + if not check_node(param_node): continue if name not in dest_params: # this is reached when a previous category exists dest_params[name] = param_node.cloneNode(True) dest_cat.appendChild(dest_params[name]) - profile_value = self._getParam( + profile_value = self._get_param( category, name, type_node.nodeName, @@ -867,12 +867,12 @@ return prof_xml - def _getParamsUI(self, security_limit, app, extra_s, profile_key): - client = self.host.getClient(profile_key) + def _get_params_ui(self, security_limit, app, extra_s, profile_key): + client = self.host.get_client(profile_key) extra = data_format.deserialise(extra_s) - return defer.ensureDeferred(self.getParamsUI(client, security_limit, app, extra)) + return defer.ensureDeferred(self.param_ui_get(client, security_limit, app, extra)) - async def getParamsUI(self, client, security_limit, app, extra=None): + async def param_ui_get(self, client, security_limit, app, extra=None): """Get XMLUI to handle parameters @param security_limit: 
NO_SECURITY_LIMIT (-1) to return all the params. @@ -883,10 +883,10 @@ - ignore: list of (category/name) values to remove from parameters @return(str): a SàT XMLUI for parameters """ - param_xml = await self.getParams(client, security_limit, app, extra) - return paramsXML2XMLUI(param_xml) + param_xml = await self.get_params(client, security_limit, app, extra) + return params_xml_2_xmlui(param_xml) - async def getParams(self, client, security_limit, app, extra=None): + async def get_params(self, client, security_limit, app, extra=None): """Construct xml for asked profile, take params xml as skeleton @param security_limit: NO_SECURITY_LIMIT (-1) to return all the params. @@ -900,12 +900,12 @@ """ if extra is None: extra = {} - prof_xml = await self._constructProfileXml(client, security_limit, app, extra) + prof_xml = await self._construct_profile_xml(client, security_limit, app, extra) return_xml = prof_xml.toxml() prof_xml.unlink() return "\n".join((line for line in return_xml.split("\n") if line)) - def _getParamNode(self, name, category, type_="@ALL@"): # FIXME: is type_ useful ? + def _get_param_node(self, name, category, type_="@ALL@"): # FIXME: is type_ useful ? """Return a node from the param_xml @param name: name of the node @param category: category of the node @@ -931,7 +931,7 @@ return (type_node.nodeName, param) return None - def getParamsCategories(self): + def params_categories_get(self): """return the categories availables""" categories = [] for cat in self.dom.getElementsByTagName("category"): @@ -940,7 +940,7 @@ categories.append(cat.getAttribute("name")) return categories - def setParam(self, name, value, category, security_limit=C.NO_SECURITY_LIMIT, + def param_set(self, name, value, category, security_limit=C.NO_SECURITY_LIMIT, profile_key=C.PROF_KEY_NONE): """Set a parameter, return None if the parameter is not in param xml. @@ -955,14 +955,14 @@ @param profile_key (str): %(doc_profile_key)s @return: a deferred None value when everything is done """ - # FIXME: setParam should accept the right type for value, not only str ! + # FIXME: param_set should accept the right type for value, not only str ! 
if profile_key != C.PROF_KEY_NONE: - profile = self.getProfileName(profile_key) + profile = self.get_profile_name(profile_key) if not profile: log.error(_("Trying to set parameter for an unknown profile")) raise exceptions.ProfileUnknownError(profile_key) - node = self._getParamNode(name, category, "@ALL@") + node = self._get_param_node(name, category, "@ALL@") if not node: log.error( _("Requesting an unknown parameter (%(category)s/%(name)s)") @@ -970,7 +970,7 @@ ) return defer.succeed(None) - if not self.checkSecurityLimit(node[1], security_limit): + if not self.check_security_limit(node[1], security_limit): msg = _( "{profile!r} is trying to set parameter {name!r} in category " "{category!r} without authorization!!!").format( @@ -1018,12 +1018,12 @@ if node[0] == C.GENERAL: self.params_gen[(category, name)] = value - self.storage.setGenParam(category, name, value) - for profile in self.storage.getProfilesList(): - if self.host.memory.isSessionStarted(profile): - self.host.bridge.paramUpdate(name, value, category, profile) + self.storage.set_gen_param(category, name, value) + for profile in self.storage.get_profiles_list(): + if self.host.memory.is_session_started(profile): + self.host.bridge.param_update(name, value, category, profile) self.host.trigger.point( - "paramUpdateTrigger", name, value, category, node[0], profile + "param_update_trigger", name, value, category, node[0], profile ) return defer.succeed(None) @@ -1035,7 +1035,7 @@ return defer.succeed(None) elif type_ == "password": try: - personal_key = self.host.memory.auth_sessions.profileGetUnique(profile)[ + personal_key = self.host.memory.auth_sessions.profile_get_unique(profile)[ C.MEMORY_CRYPTO_KEY ] except TypeError: @@ -1044,7 +1044,7 @@ ) if (category, name) == C.PROFILE_PASS_PATH: # using 'value' as the encryption key to encrypt another encryption key... could be confusing! - d = self.host.memory.encryptPersonalData( + d = self.host.memory.encrypt_personal_data( data_key=C.MEMORY_CRYPTO_KEY, data_value=personal_key, crypto_key=value, @@ -1060,21 +1060,21 @@ else: d = defer.succeed(value) - def gotFinalValue(value): - if self.host.memory.isSessionStarted(profile): + def got_final_value(value): + if self.host.memory.is_session_started(profile): self.params[profile][(category, name)] = value - self.host.bridge.paramUpdate(name, value, category, profile) + self.host.bridge.param_update(name, value, category, profile) self.host.trigger.point( - "paramUpdateTrigger", name, value, category, node[0], profile + "param_update_trigger", name, value, category, node[0], profile ) - return self.storage.setIndParam(category, name, value, profile) + return self.storage.set_ind_param(category, name, value, profile) else: raise exceptions.ProfileNotConnected - d.addCallback(gotFinalValue) + d.addCallback(got_final_value) return d - def _getNodesOfTypes(self, attr_type, node_type="@ALL@"): + def _get_nodes_of_types(self, attr_type, node_type="@ALL@"): """Return all the nodes matching the given types. TODO: using during the dev but not anymore... remove if not needed @@ -1105,7 +1105,7 @@ ret[(cat, param.getAttribute("name"))] = param return ret - def checkSecurityLimit(self, node, security_limit): + def check_security_limit(self, node, security_limit): """Check the given node against the given security limit. The value NO_SECURITY_LIMIT (-1) means that everything is allowed. @return: True if this node can be accessed with the given security limit. 
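check_security_limit, renamed just above, gates which parameters security-limited frontends may read or write. The rule spelled out in the surrounding docstrings is that NO_SECURITY_LIMIT (-1) allows everything, and otherwise a parameter must declare a security level lower than or equal to the requested limit. A stand-alone restatement of that rule (the real method reads the level from the parameter's XML node; the plain integer argument here is a simplification):

    from typing import Optional

    NO_SECURITY_LIMIT = -1  # same convention as C.NO_SECURITY_LIMIT

    def node_allowed(node_security: Optional[int], security_limit: int) -> bool:
        """True if a parameter with the given security level is accessible."""
        if security_limit == NO_SECURITY_LIMIT:
            return True
        if node_security is None:
            # no security level declared: hidden from security-limited access
            return False
        return node_security <= security_limit

    assert node_allowed(None, NO_SECURITY_LIMIT)
    assert node_allowed(0, 0)
    assert not node_allowed(5, 0)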
@@ -1117,7 +1117,7 @@ return True return False - def checkApp(self, node, app): + def check_app(self, node, app): """Check the given node against the given app. @param node: parameter node @@ -1128,7 +1128,7 @@ return True return node.getAttribute("app") == app - def checkExtra(self, node, extra): + def check_extra(self, node, extra): """Check the given node against the extra filters. @param node: parameter node @@ -1147,7 +1147,7 @@ return True -def makeOptions(options, selected=None): +def make_options(options, selected=None): """Create option XML form dictionary @param options(dict): option's name => option's label map
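The params.py hunks above mostly rename methods, but the security-limit rule they touch is worth spelling out. A minimal standalone sketch of the behaviour documented in check_security_limit's docstring (the integer ``param_security`` argument and the ``<=`` comparison are illustrative assumptions; only the ``NO_SECURITY_LIMIT == -1`` shortcut comes from the code shown, the real method inspects the parameter's DOM node)::

    NO_SECURITY_LIMIT = -1  # same sentinel value as C.NO_SECURITY_LIMIT

    def check_security_limit(param_security: int, security_limit: int) -> bool:
        """Return True if a parameter with level `param_security` is accessible."""
        if security_limit == NO_SECURITY_LIMIT:
            # -1 disables filtering: every parameter is returned
            return True
        # assumption: a parameter is accessible when its level does not
        # exceed the requested limit
        return param_security <= security_limit

    assert check_security_limit(5, NO_SECURITY_LIMIT)
    assert not check_security_limit(10, 0)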
--- a/sat/memory/persistent.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/persistent.py Sat Apr 08 13:54:42 2023 +0200 @@ -51,7 +51,7 @@ self.namespace = namespace self.profile = profile - def _setCache(self, data): + def _set_cache(self, data): self._cache = data def load(self): @@ -60,10 +60,10 @@ need to be called before any other operation @return: defers the PersistentDict instance itself """ - d = defer.ensureDeferred(self.storage.getPrivates( + d = defer.ensureDeferred(self.storage.get_privates( self.namespace, binary=self.binary, profile=self.profile )) - d.addCallback(self._setCache) + d.addCallback(self._set_cache) d.addCallback(lambda __: self) return d @@ -117,20 +117,20 @@ def __setitem__(self, key, value): defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, key, value, self.binary, self.profile ) ) return self._cache.__setitem__(key, value) def __delitem__(self, key): - self.storage.delPrivateValue(self.namespace, key, self.binary, self.profile) + self.storage.del_private_value(self.namespace, key, self.binary, self.profile) return self._cache.__delitem__(key) def clear(self): """Delete all values from this namespace""" self._cache.clear() - return self.storage.delPrivateNamespace(self.namespace, self.binary, self.profile) + return self.storage.del_private_namespace(self.namespace, self.binary, self.profile) def get(self, key, default=None): return self._cache.get(key, default) @@ -139,7 +139,7 @@ """Async set, return a Deferred fired when value is actually stored""" self._cache.__setitem__(key, value) return defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, key, value, self.binary, self.profile ) ) @@ -147,7 +147,7 @@ def adel(self, key): """Async del, return a Deferred fired when value is actually deleted""" self._cache.__delitem__(key) - return self.storage.delPrivateValue( + return self.storage.del_private_value( self.namespace, key, self.binary, self.profile) def setdefault(self, key, default): @@ -163,7 +163,7 @@ @return: deferred fired when data is actually saved """ return defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, name, self._cache[name], self.binary, self.profile ) ) @@ -192,14 +192,14 @@ raise NotImplementedError def items(self): - d = defer.ensureDeferred(self.storage.getPrivates( + d = defer.ensureDeferred(self.storage.get_privates( self.namespace, binary=self.binary, profile=self.profile )) d.addCallback(lambda data_dict: data_dict.items()) return d def all(self): - return defer.ensureDeferred(self.storage.getPrivates( + return defer.ensureDeferred(self.storage.get_privates( self.namespace, binary=self.binary, profile=self.profile )) @@ -252,7 +252,7 @@ def __getitem__(self, key): """get the value as a Deferred""" - d = defer.ensureDeferred(self.storage.getPrivates( + d = defer.ensureDeferred(self.storage.get_privates( self.namespace, keys=[key], binary=self.binary, profile=self.profile )) d.addCallback(self._data2value, key) @@ -260,21 +260,21 @@ def __setitem__(self, key, value): defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, key, value, self.binary, self.profile ) ) def __delitem__(self, key): - self.storage.delPrivateValue(self.namespace, key, self.binary, self.profile) + self.storage.del_private_value(self.namespace, key, self.binary, self.profile) - def _defaultOrException(self, failure_, default): + def _default_or_exception(self, 
failure_, default): failure_.trap(KeyError) return default def get(self, key, default=None): d = self.__getitem__(key) - d.addErrback(self._defaultOrException, default=default) + d.addErrback(self._default_or_exception, default=default) return d def aset(self, key, value): @@ -282,7 +282,7 @@ # FIXME: redundant with force, force must be removed # XXX: similar as PersistentDict.aset, but doesn't use cache return defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, key, value, self.binary, self.profile ) ) @@ -290,7 +290,7 @@ def adel(self, key): """Async del, return a Deferred fired when value is actually deleted""" # XXX: similar as PersistentDict.adel, but doesn't use cache - return self.storage.delPrivateValue( + return self.storage.del_private_value( self.namespace, key, self.binary, self.profile) def setdefault(self, key, default): @@ -303,7 +303,7 @@ @return: deferred fired when data is actually saved """ return defer.ensureDeferred( - self.storage.setPrivateValue( + self.storage.set_private_value( self.namespace, name, value, self.binary, self.profile ) ) @@ -314,4 +314,4 @@ @param key(unicode): key to delete @return (D): A deferred fired when delete is done """ - return self.storage.delPrivateValue(self.namespace, key, self.binary, self.profile) + return self.storage.del_private_value(self.namespace, key, self.binary, self.profile)
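The persistent.py changes keep the existing pattern intact: an in-memory cache answers reads synchronously, while writes are pushed to storage as Deferreds, either fire-and-forget (``__setitem__``) or awaitable (``aset``/``adel``). A reduced sketch of that pattern, with a stand-in storage class instead of the real SQL-backed private-values storage::

    from twisted.internet import defer


    class DemoStorage:
        """Stand-in for the SQL-backed private-values storage."""

        def __init__(self):
            self.data = {}

        async def set_private_value(self, namespace, key, value):
            self.data[(namespace, key)] = value


    class CachedDict:
        """Tiny illustration of the cache + deferred-persistence pattern."""

        def __init__(self, storage, namespace):
            self.storage = storage
            self.namespace = namespace
            self._cache = {}

        def __getitem__(self, key):
            # reads never touch the storage, only the cache
            return self._cache[key]

        def __setitem__(self, key, value):
            # fire-and-forget persistence, the cache is updated immediately
            defer.ensureDeferred(
                self.storage.set_private_value(self.namespace, key, value)
            )
            self._cache[key] = value

        def aset(self, key, value):
            # async set: the returned Deferred fires once the value is stored
            self._cache[key] = value
            return defer.ensureDeferred(
                self.storage.set_private_value(self.namespace, key, value)
            )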
--- a/sat/memory/sqla.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/sqla.py Sat Apr 08 13:54:42 2023 +0200 @@ -122,10 +122,10 @@ # profile id to component entry point self.components: Dict[int, str] = {} - def getProfileById(self, profile_id): + def get_profile_by_id(self, profile_id): return self.profiles.get(profile_id) - async def migrateApply(self, *args: str, log_output: bool = False) -> None: + async def migrate_apply(self, *args: str, log_output: bool = False) -> None: """Do a migration command Commands are applied by running Alembic in a subprocess. @@ -167,7 +167,7 @@ await conn.run_sync(Base.metadata.create_all) log.debug("stamping the database") - await self.migrateApply("stamp", "head") + await self.migrate_apply("stamp", "head") log.debug("stamping done") def _check_db_is_up_to_date(self, conn: Connection) -> bool: @@ -193,14 +193,14 @@ else: log.info("Database needs to be updated") log.info("updating…") - await self.migrateApply("upgrade", "head", log_output=True) + await self.migrate_apply("upgrade", "head", log_output=True) log.info("Database is now up-to-date") @aio async def initialise(self) -> None: log.info(_("Connecting database")) - db_config = sqla_config.getDbConfig() + db_config = sqla_config.get_db_config() engine = create_async_engine( db_config["url"], future=True, @@ -288,31 +288,31 @@ ## Profiles - def getProfilesList(self) -> List[str]: + def get_profiles_list(self) -> List[str]: """"Return list of all registered profiles""" return list(self.profiles.keys()) - def hasProfile(self, profile_name: str) -> bool: + def has_profile(self, profile_name: str) -> bool: """return True if profile_name exists @param profile_name: name of the profile to check """ return profile_name in self.profiles - def profileIsComponent(self, profile_name: str) -> bool: + def profile_is_component(self, profile_name: str) -> bool: try: return self.profiles[profile_name] in self.components except KeyError: raise exceptions.NotFound("the requested profile doesn't exists") - def getEntryPoint(self, profile_name: str) -> str: + def get_entry_point(self, profile_name: str) -> str: try: return self.components[self.profiles[profile_name]] except KeyError: raise exceptions.NotFound("the requested profile doesn't exists or is not a component") @aio - async def createProfile(self, name: str, component_ep: Optional[str] = None) -> None: + async def create_profile(self, name: str, component_ep: Optional[str] = None) -> None: """Create a new profile @param name: name of the profile @@ -331,7 +331,7 @@ return profile @aio - async def deleteProfile(self, name: str) -> None: + async def delete_profile(self, name: str) -> None: """Delete profile @param name: name of the profile @@ -349,7 +349,7 @@ ## Params @aio - async def loadGenParams(self, params_gen: dict) -> None: + async def load_gen_params(self, params_gen: dict) -> None: """Load general parameters @param params_gen: dictionary to fill @@ -361,7 +361,7 @@ params_gen[(p.category, p.name)] = p.value @aio - async def loadIndParams(self, params_ind: dict, profile: str) -> None: + async def load_ind_params(self, params_ind: dict, profile: str) -> None: """Load individual parameters @param params_ind: dictionary to fill @@ -376,7 +376,7 @@ params_ind[(p.category, p.name)] = p.value @aio - async def getIndParam(self, category: str, name: str, profile: str) -> Optional[str]: + async def get_ind_param(self, category: str, name: str, profile: str) -> Optional[str]: """Ask database for the value of one specific individual parameter @param category: 
category of the parameter @@ -395,7 +395,7 @@ return result.scalar_one_or_none() @aio - async def getIndParamValues(self, category: str, name: str) -> Dict[str, str]: + async def get_ind_param_values(self, category: str, name: str) -> Dict[str, str]: """Ask database for the individual values of a parameter for all profiles @param category: category of the parameter @@ -414,7 +414,7 @@ return {param.profile.name: param.value for param in result.scalars()} @aio - async def setGenParam(self, category: str, name: str, value: Optional[str]) -> None: + async def set_gen_param(self, category: str, name: str, value: Optional[str]) -> None: """Save the general parameters in database @param category: category of the parameter @@ -436,7 +436,7 @@ await session.commit() @aio - async def setIndParam( + async def set_ind_param( self, category:str, name: str, @@ -489,7 +489,7 @@ return History.source == jid_.userhost() @aio - async def historyGet( + async def history_get( self, from_jid: Optional[jid.JID], to_jid: Optional[jid.JID], @@ -509,7 +509,7 @@ - None for unlimited @param between: confound source and dest (ignore the direction) @param filters: pattern to filter the history results - @return: list of messages as in [messageNew], minus the profile which is already + @return: list of messages as in [message_new], minus the profile which is already known. """ # we have to set a default value to profile because it's last argument @@ -634,7 +634,7 @@ return [h.as_tuple() for h in result] @aio - async def addToHistory(self, data: dict, profile: str) -> None: + async def add_to_history(self, data: dict, profile: str) -> None: """Store a new message in history @param data: message data as build by SatMessageProtocol.onMessage @@ -682,7 +682,7 @@ ## Private values - def _getPrivateClass(self, binary, profile): + def _get_private_class(self, binary, profile): """Get ORM class to use for private values""" if profile is None: return PrivateGenBin if binary else PrivateGen @@ -691,7 +691,7 @@ @aio - async def getPrivates( + async def get_privates( self, namespace:str, keys: Optional[Iterable[str]] = None, @@ -714,7 +714,7 @@ f"{' binary' if binary else ''} private values from database for namespace " f"{namespace}{f' with keys {keys!r}' if keys is not None else ''}" ) - cls = self._getPrivateClass(binary, profile) + cls = self._get_private_class(binary, profile) stmt = select(cls).filter_by(namespace=namespace) if keys: stmt = stmt.where(cls.key.in_(list(keys))) @@ -725,7 +725,7 @@ return {p.key: p.value for p in result.scalars()} @aio - async def setPrivateValue( + async def set_private_value( self, namespace: str, key:str, @@ -743,7 +743,7 @@ @param profile: profile to use for individual value if None, it's a general value """ - cls = self._getPrivateClass(binary, profile) + cls = self._get_private_class(binary, profile) values = { "namespace": namespace, @@ -768,7 +768,7 @@ await session.commit() @aio - async def delPrivateValue( + async def del_private_value( self, namespace: str, key: str, @@ -783,7 +783,7 @@ @param profile: profile to use for individual value if None, it's a general value """ - cls = self._getPrivateClass(binary, profile) + cls = self._get_private_class(binary, profile) stmt = delete(cls).filter_by(namespace=namespace, key=key) @@ -795,7 +795,7 @@ await session.commit() @aio - async def delPrivateNamespace( + async def del_private_namespace( self, namespace: str, binary: bool = False, @@ -805,9 +805,9 @@ Be really cautious when you use this method, as all data with given namespace are 
removed. - Params are the same as for delPrivateValue + Params are the same as for del_private_value """ - cls = self._getPrivateClass(binary, profile) + cls = self._get_private_class(binary, profile) stmt = delete(cls).filter_by(namespace=namespace) @@ -821,7 +821,7 @@ ## Files @aio - async def getFiles( + async def get_files( self, client: Optional[SatXMPPEntity], file_id: Optional[str] = None, @@ -852,7 +852,7 @@ @param projection: name of columns to retrieve None to retrieve all @param unique: if True will remove duplicates - other params are the same as for [setFile] + other params are the same as for [set_file] @return: files corresponding to filters """ if projection is None: @@ -910,7 +910,7 @@ return [dict(r) for r in result] @aio - async def setFile( + async def set_file( self, client: SatXMPPEntity, name: str, @@ -987,7 +987,7 @@ )) @aio - async def fileGetUsedSpace(self, client: SatXMPPEntity, owner: jid.JID) -> int: + async def file_get_used_space(self, client: SatXMPPEntity, owner: jid.JID) -> int: async with self.session() as session: result = await session.execute( select(sum_(File.size)).filter_by( @@ -998,7 +998,7 @@ return result.scalar_one_or_none() or 0 @aio - async def fileDelete(self, file_id: str) -> None: + async def file_delete(self, file_id: str) -> None: """Delete file metadata from the database @param file_id: id of the file to delete @@ -1010,7 +1010,7 @@ await session.commit() @aio - async def fileUpdate( + async def file_update( self, file_id: str, column: str, @@ -1068,7 +1068,7 @@ ) @aio - async def getPubsubNode( + async def get_pubsub_node( self, client: SatXMPPEntity, service: jid.JID, @@ -1085,7 +1085,7 @@ @param with_items: retrieve items in the same query @param with_subscriptions: retrieve subscriptions in the same query @param create: if the node doesn't exist in DB, create it - @param create_kwargs: keyword arguments to use with ``setPubsubNode`` if the node + @param create_kwargs: keyword arguments to use with ``set_pubsub_node`` if the node needs to be created. 
""" async with self.session() as session: @@ -1112,15 +1112,15 @@ if create_kwargs is None: create_kwargs = {} try: - return await as_future(self.setPubsubNode( + return await as_future(self.set_pubsub_node( client, service, name, **create_kwargs )) except IntegrityError as e: if "unique" in str(e.orig).lower(): # the node may already exist, if it has been created just after - # getPubsubNode above + # get_pubsub_node above log.debug("ignoring UNIQUE constraint error") - cached_node = await as_future(self.getPubsubNode( + cached_node = await as_future(self.get_pubsub_node( client, service, name, @@ -1133,7 +1133,7 @@ return ret @aio - async def setPubsubNode( + async def set_pubsub_node( self, client: SatXMPPEntity, service: jid.JID, @@ -1159,7 +1159,7 @@ return node @aio - async def updatePubsubNodeSyncState( + async def update_pubsub_node_sync_state( self, node: PubsubNode, state: SyncState @@ -1176,7 +1176,7 @@ ) @aio - async def deletePubsubNode( + async def delete_pubsub_node( self, profiles: Optional[List[str]], services: Optional[List[jid.JID]], @@ -1207,7 +1207,7 @@ await session.commit() @aio - async def cachePubsubItems( + async def cache_pubsub_items( self, client: SatXMPPEntity, node: PubsubNode, @@ -1240,7 +1240,7 @@ await session.commit() @aio - async def deletePubsubItems( + async def delete_pubsub_items( self, node: PubsubNode, items_names: Optional[List[str]] = None @@ -1264,7 +1264,7 @@ await session.commit() @aio - async def purgePubsubItems( + async def purge_pubsub_items( self, services: Optional[List[jid.JID]] = None, names: Optional[List[str]] = None, @@ -1313,7 +1313,7 @@ await session.commit() @aio - async def getItems( + async def get_items( self, node: PubsubNode, max_items: Optional[int] = None, @@ -1352,7 +1352,7 @@ metadata = { "service": node.service, "node": node.name, - "uri": uri.buildXMPPUri( + "uri": uri.build_xmpp_uri( "pubsub", path=node.service.full(), node=node.name, @@ -1487,7 +1487,7 @@ result.reverse() return result, metadata - def _getSqlitePath( + def _get_sqlite_path( self, path: List[Union[str, int]] ) -> str: @@ -1495,7 +1495,7 @@ return f"${''.join(f'[{p}]' if isinstance(p, int) else f'.{p}' for p in path)}" @aio - async def searchPubsubItems( + async def search_pubsub_items( self, query: dict, ) -> Tuple[List[PubsubItem]]: @@ -1626,7 +1626,7 @@ op_attr = OP_MAP[operator] except KeyError: raise ValueError(f"invalid operator: {operator!r}") - sqlite_path = self._getSqlitePath(path) + sqlite_path = self._get_sqlite_path(path) if operator in ("overlap", "ioverlap", "disjoint", "idisjoint"): col = literal_column("json_each.value") if operator[0] == "i": @@ -1683,7 +1683,7 @@ raise NotImplementedError(f"Unknown {order!r} order") else: # we have a JSON path - # sqlite_path = self._getSqlitePath(path) + # sqlite_path = self._get_sqlite_path(path) col = PubsubItem.parsed[path] direction = order_data.get("direction", "ASC").lower() if not direction in ("asc", "desc"):
--- a/sat/memory/sqla_config.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/memory/sqla_config.py Sat Apr 08 13:54:42 2023 +0200 @@ -22,15 +22,15 @@ from sat.tools import config -def getDbConfig() -> dict: +def get_db_config() -> dict: """Get configuration for database @return: dict with following keys: - type: only "sqlite" for now - path: path to the sqlite DB """ - main_conf = config.parseMainConf() - local_dir = Path(config.getConfig(main_conf, "", "local_dir")) + main_conf = config.parse_main_conf() + local_dir = Path(config.config_get(main_conf, "", "local_dir")) database_path = (local_dir / C.SAVEFILE_DATABASE).expanduser() url = f"sqlite+aiosqlite:///{quote(str(database_path))}?timeout=30" return {
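For reference, the URL produced by get_db_config() looks like the following; the local_dir value and database filename below are made up, only the path quoting and the ``sqlite+aiosqlite`` scheme come from the code above::

    from pathlib import Path
    from urllib.parse import quote

    local_dir = Path("~/.local/share/libervia")           # hypothetical value
    database_path = (local_dir / "libervia.db").expanduser()  # hypothetical filename
    url = f"sqlite+aiosqlite:///{quote(str(database_path))}?timeout=30"
    print(url)
    # e.g. sqlite+aiosqlite:////home/user/.local/share/libervia/libervia.db?timeout=30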
--- a/sat/plugins/plugin_adhoc_dbus.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_adhoc_dbus.py Sat Apr 08 13:54:42 2023 +0200 @@ -98,39 +98,39 @@ log.info(_("plugin Ad-Hoc D-Bus initialization")) self.host = host if etree is not None: - host.bridge.addMethod( - "adHocDBusAddAuto", + host.bridge.add_method( + "ad_hoc_dbus_add_auto", ".plugin", in_sign="sasasasasasass", out_sign="(sa(sss))", - method=self._adHocDBusAddAuto, + method=self._ad_hoc_dbus_add_auto, async_=True, ) - host.bridge.addMethod( - "adHocRemotesGet", + host.bridge.add_method( + "ad_hoc_remotes_get", ".plugin", in_sign="s", out_sign="a(sss)", - method=self._adHocRemotesGet, + method=self._ad_hoc_remotes_get, async_=True, ) self._c = host.plugins["XEP-0050"] - host.registerNamespace("mediaplayer", NS_MEDIA_PLAYER) + host.register_namespace("mediaplayer", NS_MEDIA_PLAYER) if dbus is not None: self.session_bus = dbus.SessionBus() self.fd_object = self.session_bus.get_object( FD_NAME, FD_PATH, introspect=False) - def profileConnected(self, client): + def profile_connected(self, client): if dbus is not None: - self._c.addAdHocCommand( - client, self.localMediaCb, D_("Media Players"), + self._c.add_ad_hoc_command( + client, self.local_media_cb, D_("Media Players"), node=NS_MEDIA_PLAYER, timeout=60*60*6 # 6 hours timeout, to avoid breaking remote # in the middle of a movie ) - def _DBusAsyncCall(self, proxy, method, *args, **kwargs): + def _dbus_async_call(self, proxy, method, *args, **kwargs): """ Call a DBus method asynchronously and return a deferred @param proxy: DBus object proxy, as returner by get_object @@ -149,18 +149,18 @@ proxy.get_dbus_method(method, dbus_interface=interface)(*args, **kwargs) return d - def _DBusGetProperty(self, proxy, interface, name): - return self._DBusAsyncCall( + def _dbus_get_property(self, proxy, interface, name): + return self._dbus_async_call( proxy, "Get", interface, name, interface="org.freedesktop.DBus.Properties") - def _DBusListNames(self): - return self._DBusAsyncCall(self.fd_object, "ListNames") + def _dbus_list_names(self): + return self._dbus_async_call(self.fd_object, "ListNames") - def _DBusIntrospect(self, proxy): - return self._DBusAsyncCall(proxy, INTROSPECT_METHOD, interface=INTROSPECT_IFACE) + def _dbus_introspect(self, proxy): + return self._dbus_async_call(proxy, INTROSPECT_METHOD, interface=INTROSPECT_IFACE) - def _acceptMethod(self, method): + def _accept_method(self, method): """ Return True if we accept the method for a command @param method: etree.Element @return: True if the method is acceptable @@ -175,7 +175,7 @@ @defer.inlineCallbacks def _introspect(self, methods, bus_name, proxy): log.debug("introspecting path [%s]" % proxy.object_path) - introspect_xml = yield self._DBusIntrospect(proxy) + introspect_xml = yield self._dbus_introspect(proxy) el = etree.fromstring(introspect_xml) for node in el.iterchildren("node", "interface"): if node.tag == "node": @@ -191,23 +191,23 @@ continue log.debug("introspecting interface [%s]" % name) for method in node.iterchildren("method"): - if self._acceptMethod(method): + if self._accept_method(method): method_name = method.get("name") log.debug("method accepted: [%s]" % method_name) methods.add((proxy.object_path, name, method_name)) - def _adHocDBusAddAuto(self, prog_name, allowed_jids, allowed_groups, allowed_magics, + def _ad_hoc_dbus_add_auto(self, prog_name, allowed_jids, allowed_groups, allowed_magics, forbidden_jids, forbidden_groups, flags, profile_key): - client = self.host.getClient(profile_key) - return 
self.adHocDBusAddAuto( + client = self.host.get_client(profile_key) + return self.ad_hoc_dbus_add_auto( client, prog_name, allowed_jids, allowed_groups, allowed_magics, forbidden_jids, forbidden_groups, flags) @defer.inlineCallbacks - def adHocDBusAddAuto(self, client, prog_name, allowed_jids=None, allowed_groups=None, + def ad_hoc_dbus_add_auto(self, client, prog_name, allowed_jids=None, allowed_groups=None, allowed_magics=None, forbidden_jids=None, forbidden_groups=None, flags=None): - bus_names = yield self._DBusListNames() + bus_names = yield self._dbus_list_names() bus_names = [bus_name for bus_name in bus_names if "." + prog_name in bus_name] if not bus_names: log.info("Can't find any bus for [%s]" % prog_name) @@ -223,7 +223,7 @@ yield self._introspect(methods, bus_name, proxy) if methods: - self._addCommand( + self._add_command( client, prog_name, bus_name, @@ -238,13 +238,13 @@ defer.returnValue((str(bus_name), methods)) - def _addCommand(self, client, adhoc_name, bus_name, methods, allowed_jids=None, + def _add_command(self, client, adhoc_name, bus_name, methods, allowed_jids=None, allowed_groups=None, allowed_magics=None, forbidden_jids=None, forbidden_groups=None, flags=None): if flags is None: flags = set() - def DBusCallback(client, command_elt, session_data, action, node): + def d_bus_callback(client, command_elt, session_data, action, node): actions = session_data.setdefault("actions", []) names_map = session_data.setdefault("names_map", {}) actions.append(action) @@ -283,7 +283,7 @@ path, iface, command = names_map[command] proxy = self.session_bus.get_object(bus_name, path) - self._DBusAsyncCall(proxy, command, interface=iface) + self._dbus_async_call(proxy, command, interface=iface) # job done, we can end the session, except if we have FLAG_LOOP if FLAG_LOOP in flags: @@ -292,7 +292,7 @@ # is OK) del actions[:] names_map.clear() - return DBusCallback( + return d_bus_callback( client, None, session_data, self._c.ACTION.EXECUTE, node ) form = data_form.Form("form", title=_("Updated")) @@ -305,9 +305,9 @@ return (payload, status, None, note) - self._c.addAdHocCommand( + self._c.add_ad_hoc_command( client, - DBusCallback, + d_bus_callback, adhoc_name, allowed_jids=allowed_jids, allowed_groups=allowed_groups, @@ -318,18 +318,18 @@ ## Local media ## - def _adHocRemotesGet(self, profile): - return self.adHocRemotesGet(self.host.getClient(profile)) + def _ad_hoc_remotes_get(self, profile): + return self.ad_hoc_remotes_get(self.host.get_client(profile)) @defer.inlineCallbacks - def adHocRemotesGet(self, client): + def ad_hoc_remotes_get(self, client): """Retrieve available remote media controlers in our devices @return (list[tuple[unicode, unicode, unicode]]): list of devices with: - entity full jid - device name - device label """ - found_data = yield defer.ensureDeferred(self.host.findByFeatures( + found_data = yield defer.ensureDeferred(self.host.find_by_features( client, [self.host.ns_map['commands']], service=False, roster=False, own_jid=True, local_device=True)) @@ -344,7 +344,7 @@ try: result_elt = yield self._c.do(client, device_jid, NS_MEDIA_PLAYER, timeout=5) - command_elt = self._c.getCommandElt(result_elt) + command_elt = self._c.get_command_elt(result_elt) form = data_form.findForm(command_elt, NS_MEDIA_PLAYER) if form is None: continue @@ -368,7 +368,7 @@ break defer.returnValue(remotes) - def doMPRISCommand(self, proxy, command): + def do_mpris_command(self, proxy, command): iface, command = command.rsplit(".", 1) if command == CMD_GO_BACK: command = 'Seek' @@ 
-378,9 +378,9 @@ args = [SEEK_OFFSET] else: args = [] - return self._DBusAsyncCall(proxy, command, *args, interface=iface) + return self._dbus_async_call(proxy, command, *args, interface=iface) - def addMPRISMetadata(self, form, metadata): + def add_mpris_metadata(self, form, metadata): """Serialise MRPIS Metadata according to MPRIS_METADATA_MAP""" for mpris_key, name in MPRIS_METADATA_MAP.items(): if mpris_key in metadata: @@ -390,7 +390,7 @@ value=value)) @defer.inlineCallbacks - def localMediaCb(self, client, command_elt, session_data, action, node): + def local_media_cb(self, client, command_elt, session_data, action, node): try: x_elt = next(command_elt.elements(data_form.NS_X_DATA, "x")) command_form = data_form.Form.fromElement(x_elt) @@ -399,7 +399,7 @@ if command_form is None or len(command_form.fields) == 0: # root request, we looks for media players - bus_names = yield self._DBusListNames() + bus_names = yield self._dbus_list_names() bus_names = [b for b in bus_names if b.startswith(MPRIS_PREFIX)] if len(bus_names) == 0: note = (self._c.NOTE.INFO, D_("No media player found.")) @@ -445,7 +445,7 @@ except KeyError: pass else: - yield self.doMPRISCommand(proxy, command) + yield self.do_mpris_command(proxy, command) # we construct the remote control form form = data_form.Form("form", title=D_("Media Player Selection")) @@ -455,13 +455,13 @@ for iface, properties_names in MPRIS_PROPERTIES.items(): for name in properties_names: try: - value = yield self._DBusGetProperty(proxy, iface, name) + value = yield self._dbus_get_property(proxy, iface, name) except Exception as e: log.warning(_("Can't retrieve attribute {name}: {reason}") .format(name=name, reason=e)) continue if name == MPRIS_METADATA_KEY: - self.addMPRISMetadata(form, value) + self.add_mpris_metadata(form, value) else: form.addField(data_form.Field(fieldType="fixed", var=name,
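_dbus_async_call above bridges dbus-python's callback style into a Twisted Deferred. A library-agnostic sketch of that bridging pattern, where ``fake_dbus_call`` stands in for the proxy method returned by get_dbus_method and the ``reply_handler``/``error_handler`` names follow dbus-python's asynchronous calling convention::

    from twisted.internet import defer


    def fake_dbus_call(*args, reply_handler, error_handler):
        # stand-in for an asynchronous D-Bus method call
        try:
            reply_handler(sum(args))
        except Exception as e:
            error_handler(e)


    def async_call(call, *args):
        """Return a Deferred fired with the callback-style call's result."""
        d = defer.Deferred()
        call(*args, reply_handler=d.callback, error_handler=d.errback)
        return d


    d = async_call(fake_dbus_call, 1, 2, 3)
    d.addCallback(print)  # prints 6 once the "reply" arrives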
--- a/sat/plugins/plugin_blog_import.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_blog_import.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,7 +61,7 @@ OPT_DEFAULTS = {OPT_UPLOAD_IMAGES: True, OPT_IGNORE_TLS: False} def __init__(self, host): - log.info(_("plugin Blog Import initialization")) + log.info(_("plugin Blog import initialization")) self.host = host self._u = host.plugins["UPLOAD"] self._p = host.plugins["XEP-0060"] @@ -69,10 +69,10 @@ self._s = self.host.plugins["TEXT_SYNTAXES"] host.plugins["IMPORT"].initialize(self, "blog") - def importItem( + def import_item( self, client, item_import_data, session, options, return_data, service, node ): - """importItem specialized for blog import + """import_item specialized for blog import @param item_import_data(dict): * mandatory keys: @@ -116,7 +116,7 @@ except KeyError: pass else: - new_uri = return_data[URL_REDIRECT_PREFIX + old_uri] = self._p.getNodeURI( + new_uri = return_data[URL_REDIRECT_PREFIX + old_uri] = self._p.get_node_uri( service if service is not None else client.jid.userhostJID(), node or self._m.namespace, item_id, @@ -126,14 +126,14 @@ return mb_data @defer.inlineCallbacks - def importSubItems(self, client, item_import_data, mb_data, session, options): + def import_sub_items(self, client, item_import_data, mb_data, session, options): # comments data if len(item_import_data["comments"]) != 1: raise NotImplementedError("can't manage multiple comment links") allow_comments = C.bool(mb_data.get("allow_comments", C.BOOL_FALSE)) if allow_comments: - comments_service = yield self._m.getCommentsService(client) - comments_node = self._m.getCommentsNode(mb_data["id"]) + comments_service = yield self._m.get_comments_service(client) + comments_node = self._m.get_comments_node(mb_data["id"]) mb_data["comments_service"] = comments_service.full() mb_data["comments_node"] = comments_node recurse_kwargs = { @@ -149,7 +149,7 @@ ) defer.returnValue(None) - def publishItem(self, client, mb_data, service, node, session): + def publish_item(self, client, mb_data, service, node, session): log.debug( "uploading item [{id}]: {title}".format( id=mb_data["id"], title=mb_data.get("title", "") @@ -158,7 +158,7 @@ return self._m.send(client, mb_data, service, node) @defer.inlineCallbacks - def itemFilters(self, client, mb_data, session, options): + def item_filters(self, client, mb_data, session, options): """Apply filters according to options modify mb_data in place @@ -188,7 +188,7 @@ ) # we convert rich syntax to XHTML here, so we can handle filters easily converted = yield self._s.convert( - rich, self._s.getCurrentSyntax(client.profile), safe=False + rich, self._s.get_current_syntax(client.profile), safe=False ) mb_data["{}_xhtml".format(prefix)] = converted del mb_data["{}_rich".format(prefix)] @@ -220,7 +220,7 @@ ) except domish.ParserError: # we clean the xml and try again our luck - cleaned = yield self._s.cleanXHTML(mb_data["content_xhtml"]) + cleaned = yield self._s.clean_xhtml(mb_data["content_xhtml"]) top_elt = xml_tools.ElementParser()(cleaned, namespace=C.NS_XHTML) opt_host = options.get(OPT_HOST) if opt_host: @@ -239,8 +239,8 @@ tmp_dir = tempfile.mkdtemp() try: # TODO: would be nice to also update the hyperlinks to these images, e.g. 
when you have <a href="{url}"><img src="{url}"></a> - for img_elt in xml_tools.findAll(top_elt, names=["img"]): - yield self.imgFilters(client, img_elt, options, opt_host, tmp_dir) + for img_elt in xml_tools.find_all(top_elt, names=["img"]): + yield self.img_filters(client, img_elt, options, opt_host, tmp_dir) finally: os.rmdir(tmp_dir) # XXX: tmp_dir should be empty, or something went wrong @@ -248,7 +248,7 @@ mb_data["content_xhtml"] = top_elt.toXml() @defer.inlineCallbacks - def imgFilters(self, client, img_elt, options, opt_host, tmp_dir): + def img_filters(self, client, img_elt, options, opt_host, tmp_dir): """Filters handling images url without host are fixed (if possible)
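img_filters above fixes ``<img>`` URLs that have no host part. A rough standard-library equivalent of that single step, using a made-up ``opt_host``; the real filter works on domish elements and can additionally upload the images::

    from urllib.parse import urljoin, urlparse
    from xml.etree import ElementTree as ET


    def fix_img_hosts(xhtml: str, opt_host: str) -> str:
        root = ET.fromstring(xhtml)
        for img in root.iter("img"):
            src = img.get("src", "")
            if src and not urlparse(src).netloc:
                # relative URL: prefix it with the configured host
                img.set("src", urljoin(opt_host, src))
        return ET.tostring(root, encoding="unicode")


    print(fix_img_hosts('<div><img src="/media/cat.png"/></div>',
                        "https://blog.example.net/"))
    # the img src now points at https://blog.example.net/media/cat.png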
--- a/sat/plugins/plugin_blog_import_dokuwiki.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_blog_import_dokuwiki.py Sat Apr 08 13:54:42 2023 +0200 @@ -123,7 +123,7 @@ self.limit = limit self.posts_data = OrderedDict() - def getPostId(self, post): + def get_post_id(self, post): """Return a unique and constant post id @param post(dict): parsed post data @@ -131,7 +131,7 @@ """ return str(post["id"]) - def getPostUpdated(self, post): + def get_post_updated(self, post): """Return the update date. @param post(dict): parsed post data @@ -139,7 +139,7 @@ """ return str(post["mtime"]) - def getPostPublished(self, post): + def get_post_published(self, post): """Try to parse the date from the message ID, else use "mtime". The date can be extracted if the message ID looks like one of: @@ -162,16 +162,16 @@ return default return str(calendar.timegm(time_struct)) - def processPost(self, post, profile_jid): + def process_post(self, post, profile_jid): """Process a single page. @param post (dict): parsed post data @param profile_jid """ # get main information - id_ = self.getPostId(post) - updated = self.getPostUpdated(post) - published = self.getPostPublished(post) + id_ = self.get_post_id(post) + updated = self.get_post_updated(post) + published = self.get_post_published(post) # manage links backlinks = self.pages.backlinks(id_) @@ -182,7 +182,7 @@ backlinks.append(page[1:] if page.startswith(":") else page) self.pages.get(id_) - content_xhtml = self.processContent(self.pages.html(id_), backlinks, profile_jid) + content_xhtml = self.process_content(self.pages.html(id_), backlinks, profile_jid) # XXX: title is already in content_xhtml and difficult to remove, so leave it # title = content.split("\n")[0].strip(u"\ufeff= ") @@ -230,14 +230,14 @@ count = 0 for page in pages_list: - self.processPost(page, profile_jid) + self.process_post(page, profile_jid) count += 1 if count >= self.limit: break return (iter(self.posts_data.values()), len(self.posts_data)) - def processContent(self, text, backlinks, profile_jid): + def process_content(self, text, backlinks, profile_jid): """Do text substitutions and file copy. @param text (unicode): message content @@ -259,7 +259,7 @@ if re.match(r"^\w*://", link): # absolute URL to link directly continue if self.media_repo: - self.moveMedia(link, subs) + self.move_media(link, subs) elif link not in subs: subs[link] = urllib.parse.urljoin(self.url, link) @@ -267,7 +267,7 @@ text = text.replace(url, new_url) return text - def moveMedia(self, link, subs): + def move_media(self, link, subs): """Move a media from the DokuWiki host to the new repository. This also updates the hyperlinks to internal media files. @@ -304,17 +304,17 @@ return filepath = os.path.join(self.temp_dir, filename) - self.downloadMedia(url, filepath) + self.download_media(url, filepath) if thumb_width: filename = os.path.join("thumbs", thumb_width, filename) thumbnail = os.path.join(self.temp_dir, filename) - self.createThumbnail(filepath, thumbnail, thumb_width) + self.create_thumbnail(filepath, thumbnail, thumb_width) new_url = os.path.join(self.media_repo, filename) subs[link] = new_url - def downloadMedia(self, source, dest): + def download_media(self, source, dest): """Copy media to localhost. @param source (unicode): source url @@ -327,7 +327,7 @@ urllib.request.urlretrieve(source, dest) log.debug("DokuWiki media file copied to %s" % dest) - def createThumbnail(self, source, dest, width): + def create_thumbnail(self, source, dest, width): """Create a thumbnail. 
@param source (unicode): source file path @@ -348,13 +348,13 @@ class DokuwikiImport(object): def __init__(self, host): - log.info(_("plugin Dokuwiki Import initialization")) + log.info(_("plugin Dokuwiki import initialization")) self.host = host self._blog_import = host.plugins["BLOG_IMPORT"] - self._blog_import.register("dokuwiki", self.DkImport, SHORT_DESC, LONG_DESC) + self._blog_import.register("dokuwiki", self.dk_import, SHORT_DESC, LONG_DESC) - def DkImport(self, client, location, options=None): - """Import from DokuWiki to PubSub + def dk_import(self, client, location, options=None): + """import from DokuWiki to PubSub @param location (unicode): DokuWiki site URL @param options (dict, None): DokuWiki import parameters @@ -407,7 +407,7 @@ info_msg = info_msg.format( temp_dir=dk_importer.temp_dir, media_repo=media_repo, location=location ) - self.host.actionNew( + self.host.action_new( {"xmlui": xml_tools.note(info_msg).toXml()}, profile=client.profile ) d = threads.deferToThread(dk_importer.process, client, namespace)
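process_content above fills a substitution map (old link or media URL to new URL) and then applies plain text replacement, as the last hunk of the method shows. A reduced sketch of that mechanism with a made-up wiki URL; the real code also routes media files through move_media() when a media repository is configured::

    from urllib.parse import urljoin

    wiki_url = "https://wiki.example.org"  # hypothetical DokuWiki site


    def rewrite_links(text: str, links: list) -> str:
        subs = {}
        for link in links:
            if not link.startswith(("http://", "https://")):
                # relative internal link: keep pointing at the original wiki
                subs[link] = urljoin(wiki_url, link)
        for url, new_url in subs.items():
            text = text.replace(url, new_url)
        return text


    print(rewrite_links('<a href="/doku.php?id=start">start</a>',
                        ["/doku.php?id=start"]))
    # <a href="https://wiki.example.org/doku.php?id=start">start</a>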
--- a/sat/plugins/plugin_blog_import_dotclear.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_blog_import_dotclear.py Sat Apr 08 13:54:42 2023 +0200 @@ -50,7 +50,7 @@ To use it, you'll need to export your blog to a flat file. You must go in your admin interface and select Plugins/Maintenance then Backup. Export only one blog if you have many, i.e. select "Download database of current blog" -Depending on your configuration, your may need to use Import/Export plugin and export as a flat file. +Depending on your configuration, your may need to use import/Export plugin and export as a flat file. location: you must use the absolute path to your backup for the location parameter """ @@ -77,7 +77,7 @@ self.posts_data = OrderedDict() self.tags = {} - def getPostId(self, post): + def get_post_id(self, post): """Return a unique and constant post id @param post(dict): parsed post data @@ -91,7 +91,7 @@ post["post_url"], ) - def getCommentId(self, comment): + def get_comment_id(self, comment): """Return a unique and constant comment id @param comment(dict): parsed comment @@ -110,7 +110,7 @@ """ return time.mktime(time.strptime(data[key], "%Y-%m-%d %H:%M:%S")) - def readFields(self, fields_data): + def read_fields(self, fields_data): buf = [] idx = 0 while True: @@ -148,13 +148,13 @@ buf.append(char) def parseFields(self, headers, data): - return dict(zip(headers, self.readFields(data))) + return dict(zip(headers, self.read_fields(data))) - def postHandler(self, headers, data, index): + def post_handler(self, headers, data, index): post = self.parseFields(headers, data) log.debug("({}) post found: {}".format(index, post["post_title"])) mb_data = { - "id": self.getPostId(post), + "id": self.get_post_id(post), "published": self.getTime(post, "post_creadt"), "updated": self.getTime(post, "post_upddt"), "author": post["user_id"], # there use info are not in the archive @@ -163,7 +163,7 @@ post["post_content_xhtml"], post["post_excerpt_xhtml"] ), "title": post["post_title"], - "allow_comments": C.boolConst(bool(int(post["post_open_comment"]))), + "allow_comments": C.bool_const(bool(int(post["post_open_comment"]))), } self.posts_data[post["post_id"]] = { "blog": mb_data, @@ -171,18 +171,18 @@ "url": "/post/{}".format(post["post_url"]), } - def metaHandler(self, headers, data, index): + def meta_handler(self, headers, data, index): meta = self.parseFields(headers, data) if meta["meta_type"] == "tag": tags = self.tags.setdefault(meta["post_id"], set()) tags.add(meta["meta_id"]) - def metaFinishedHandler(self): + def meta_finished_handler(self): for post_id, tags in self.tags.items(): data_format.iter2dict("tag", tags, self.posts_data[post_id]["blog"]) del self.tags - def commentHandler(self, headers, data, index): + def comment_handler(self, headers, data, index): comment = self.parseFields(headers, data) if comment["comment_site"]: # we don't use atom:uri because it's used for jid in XMPP @@ -193,7 +193,7 @@ else: content = comment["comment_content"] mb_data = { - "id": self.getCommentId(comment), + "id": self.get_comment_id(comment), "published": self.getTime(comment, "comment_dt"), "updated": self.getTime(comment, "comment_upddt"), "author": comment["comment_author"], @@ -263,13 +263,13 @@ class DotclearImport(object): def __init__(self, host): - log.info(_("plugin Dotclear Import initialization")) + log.info(_("plugin Dotclear import initialization")) self.host = host host.plugins["BLOG_IMPORT"].register( - "dotclear", self.DcImport, SHORT_DESC, LONG_DESC + "dotclear", self.dc_import, 
SHORT_DESC, LONG_DESC ) - def DcImport(self, client, location, options=None): + def dc_import(self, client, location, options=None): if not os.path.isabs(location): raise exceptions.DataError( "An absolute path to backup data need to be given as location"
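The Dotclear importer above maps each backup record's headers onto its field values (parseFields) and converts "%Y-%m-%d %H:%M:%S" dates with time.strptime/time.mktime (getTime). A reduced sketch with made-up headers and values::

    import time

    headers = ["post_id", "post_title", "post_creadt"]
    fields = ["42", "Hello from XMPP", "2023-04-08 13:54:42"]

    post = dict(zip(headers, fields))          # what parseFields() builds
    published = time.mktime(
        time.strptime(post["post_creadt"], "%Y-%m-%d %H:%M:%S")
    )                                          # what getTime() returns
    print(post["post_title"], published)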
--- a/sat/plugins/plugin_comp_ap_gateway/__init__.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_ap_gateway/__init__.py Sat Apr 08 13:54:42 2023 +0200 @@ -151,9 +151,9 @@ self._t = host.plugins["TEXT_SYNTAXES"] self._i = host.plugins["IDENTITY"] self._events = host.plugins["XEP-0471"] - self._p.addManagedNode( + self._p.add_managed_node( "", - items_cb=self._itemsReceived, + items_cb=self._items_received, # we want to be sure that the callbacks are launched before pubsub cache's # one, as we need to inspect items before they are actually removed from cache # or updated @@ -162,20 +162,20 @@ self.pubsub_service = APPubsubService(self) self.ad_hoc = APAdHocService(self) self.ap_events = APEvents(self) - host.trigger.add("messageReceived", self._messageReceivedTrigger, priority=-1000) - host.trigger.add("XEP-0424_retractReceived", self._onMessageRetract) - host.trigger.add("XEP-0372_ref_received", self._onReferenceReceived) + host.trigger.add("messageReceived", self._message_received_trigger, priority=-1000) + host.trigger.add("XEP-0424_retractReceived", self._on_message_retract) + host.trigger.add("XEP-0372_ref_received", self._on_reference_received) - host.bridge.addMethod( - "APSend", + host.bridge.add_method( + "ap_send", ".plugin", in_sign="sss", out_sign="", - method=self._publishMessage, + method=self._publish_message, async_=True, ) - def getHandler(self, __): + def get_handler(self, __): return self.pubsub_service async def init(self, client): @@ -186,7 +186,7 @@ log.info(_("ActivityPub Gateway initialization")) # RSA keys - stored_data = await self.host.memory.storage.getPrivates( + stored_data = await self.host.memory.storage.get_privates( IMPORT_NAME, ["rsa_key"], profile=client.profile ) private_key_pem = stored_data.get("rsa_key") @@ -201,7 +201,7 @@ format=serialization.PrivateFormat.PKCS8, encryption_algorithm=serialization.NoEncryption() ).decode() - await self.host.memory.storage.setPrivateValue( + await self.host.memory.storage.set_private_value( IMPORT_NAME, "rsa_key", private_key_pem, profile=client.profile ) else: @@ -217,9 +217,9 @@ # params # URL and port - self.public_url = self.host.memory.getConfig( + self.public_url = self.host.memory.config_get( CONF_SECTION, "public_url" - ) or self.host.memory.getConfig( + ) or self.host.memory.config_get( CONF_SECTION, "xmpp_domain" ) if self.public_url is None: @@ -235,37 +235,37 @@ "\"public_url\" configuration option. ActivityPub Gateway won't be run." 
) return - self.http_port = int(self.host.memory.getConfig( + self.http_port = int(self.host.memory.config_get( CONF_SECTION, 'http_port', 8123)) - connection_type = self.host.memory.getConfig( + connection_type = self.host.memory.config_get( CONF_SECTION, 'http_connection_type', 'https') if connection_type not in ('http', 'https'): raise exceptions.ConfigError( 'bad ap-gateay http_connection_type, you must use one of "http" or ' '"https"' ) - self.max_items = int(self.host.memory.getConfig( + self.max_items = int(self.host.memory.config_get( CONF_SECTION, 'new_node_max_items', 50 )) - self.comments_max_depth = int(self.host.memory.getConfig( + self.comments_max_depth = int(self.host.memory.config_get( CONF_SECTION, 'comments_max_depth', 0 )) - self.ap_path = self.host.memory.getConfig(CONF_SECTION, 'ap_path', '_ap') + self.ap_path = self.host.memory.config_get(CONF_SECTION, 'ap_path', '_ap') self.base_ap_url = parse.urljoin(f"https://{self.public_url}", f"{self.ap_path}/") # True (default) if we provide gateway only to entities/services from our server self.local_only = C.bool( - self.host.memory.getConfig(CONF_SECTION, 'local_only', C.BOOL_TRUE) + self.host.memory.config_get(CONF_SECTION, 'local_only', C.BOOL_TRUE) ) # if True (default), mention will be parsed in non-private content coming from # XMPP. This is necessary as XEP-0372 are coming separately from item where the # mention is done, which is hard to impossible to translate to ActivityPub (where # mention specified inside the item directly). See documentation for details. self.auto_mentions = C.bool( - self.host.memory.getConfig(CONF_SECTION, "auto_mentions", C.BOOL_TRUE) + self.host.memory.config_get(CONF_SECTION, "auto_mentions", C.BOOL_TRUE) ) - html_redirect: Dict[str, Union[str, dict]] = self.host.memory.getConfig( + html_redirect: Dict[str, Union[str, dict]] = self.host.memory.config_get( CONF_SECTION, 'html_redirect_dict', {} ) self.html_redirect: Dict[str, List[dict]] = {} @@ -291,13 +291,13 @@ if connection_type == 'http': reactor.listenTCP(self.http_port, self.server) else: - options = tls.getOptionsFromConfig( + options = tls.get_options_from_config( self.host.memory.config, CONF_SECTION) - tls.TLSOptionsCheck(options) - context_factory = tls.getTLSContextFactory(options) + tls.tls_options_check(options) + context_factory = tls.get_tls_context_factory(options) reactor.listenSSL(self.http_port, self.server, context_factory) - async def profileConnecting(self, client): + async def profile_connecting(self, client): self.client = client client.sendHistory = True client._ap_storage = persistent.LazyPersistentBinaryDict( @@ -306,10 +306,10 @@ ) await self.init(client) - def profileConnected(self, client): + def profile_connected(self, client): self.ad_hoc.init(client) - async def _itemsReceived( + async def _items_received( self, client: SatXMPPEntity, itemsEvent: pubsub.ItemsEvent @@ -326,7 +326,7 @@ return # we need recipient as JID and not gateway own JID to be able to use methods such # as "subscribe" - client = self.client.getVirtualClient(itemsEvent.sender) + client = self.client.get_virtual_client(itemsEvent.sender) recipient = itemsEvent.recipient if not recipient.user: log.debug("ignoring items event without local part specified") @@ -334,18 +334,18 @@ ap_account = self._e.unescape(recipient.user) - if self._pa.isAttachmentNode(itemsEvent.nodeIdentifier): - await self.convertAndPostAttachments( + if self._pa.is_attachment_node(itemsEvent.nodeIdentifier): + await self.convert_and_post_attachments( client, ap_account, 
itemsEvent.sender, itemsEvent.nodeIdentifier, itemsEvent.items ) else: - await self.convertAndPostItems( + await self.convert_and_post_items( client, ap_account, itemsEvent.sender, itemsEvent.nodeIdentifier, itemsEvent.items ) - async def getVirtualClient(self, actor_id: str) -> SatXMPPEntity: + async def get_virtual_client(self, actor_id: str) -> SatXMPPEntity: """Get client for this component with a specified jid This is needed to perform operations with the virtual JID corresponding to the AP @@ -353,8 +353,8 @@ @param actor_id: ID of the actor @return: virtual client """ - local_jid = await self.getJIDFromId(actor_id) - return self.client.getVirtualClient(local_jid) + local_jid = await self.get_jid_from_id(actor_id) + return self.client.get_virtual_client(local_jid) def is_activity(self, data: dict) -> bool: """Return True if the data has an activity type""" @@ -363,7 +363,7 @@ except (KeyError, TypeError): return False - async def apGet(self, url: str) -> dict: + async def ap_get(self, url: str) -> dict: """Retrieve AP JSON from given URL @raise error.StanzaError: "service-unavailable" is sent when something went wrong @@ -392,16 +392,16 @@ ) @overload - async def apGetObject(self, data: dict, key: str) -> Optional[dict]: + async def ap_get_object(self, data: dict, key: str) -> Optional[dict]: ... @overload - async def apGetObject( + async def ap_get_object( self, data: Union[str, dict], key: None = None ) -> dict: ... - async def apGetObject(self, data, key = None): + async def ap_get_object(self, data, key = None): """Retrieve an AP object, dereferencing when necessary This method is to be used with attributes marked as "Functional" in @@ -416,21 +416,21 @@ value = data if value is None: if key is None: - raise ValueError("None can't be used with apGetObject is key is None") + raise ValueError("None can't be used with ap_get_object is key is None") return None elif isinstance(value, dict): return value elif isinstance(value, str): - if self.isLocalURL(value): - return await self.apGetLocalObject(value) + if self.is_local_url(value): + return await self.ap_get_local_object(value) else: - return await self.apGet(value) + return await self.ap_get(value) else: raise NotImplementedError( "was expecting a string or a dict, got {type(value)}: {value!r}}" ) - async def apGetLocalObject( + async def ap_get_local_object( self, url: str ) -> dict: @@ -438,23 +438,23 @@ for now, only handle XMPP items to convert to AP """ - url_type, url_args = self.parseAPURL(url) + url_type, url_args = self.parse_apurl(url) if url_type == TYPE_ITEM: try: account, item_id = url_args except ValueError: raise ValueError(f"invalid URL: {url}") - author_jid, node = await self.getJIDAndNode(account) + author_jid, node = await self.get_jid_and_node(account) if node is None: node = self._m.namespace - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( self.client, author_jid, node ) if not cached_node: log.debug(f"node {node!r} at {author_jid} is not found in cache") found_item = None else: - cached_items, __ = await self.host.memory.storage.getItems( + cached_items, __ = await self.host.memory.storage.get_items( cached_node, item_ids=[item_id] ) if not cached_items: @@ -468,8 +468,8 @@ if found_item is None: # the node is not in cache, we have to make a request to retrieve the item - # If the node doesn't exist, getItems will raise a NotFound exception - found_items, __ = await self._p.getItems( + # If the node doesn't exist, get_items will 
raise a NotFound exception + found_items, __ = await self._p.get_items( self.client, author_jid, node, item_ids=[item_id] ) try: @@ -499,7 +499,7 @@ 'only object from "item" URLs can be retrieved for now' ) - async def apGetList( + async def ap_get_list( self, data: dict, key: str, @@ -507,7 +507,7 @@ ) -> Optional[List[Dict[str, Any]]]: """Retrieve a list of objects from AP data, dereferencing when necessary - This method is to be used with non functional vocabularies. Use ``apGetObject`` + This method is to be used with non functional vocabularies. Use ``ap_get_object`` otherwise. If the value is a dictionary, it will be wrapped in a list @param data: AP object where a list of objects is looked for @@ -519,10 +519,10 @@ if value is None: return None elif isinstance(value, str): - if self.isLocalURL(value): - value = await self.apGetLocalObject(value) + if self.is_local_url(value): + value = await self.ap_get_local_object(value) else: - value = await self.apGet(value) + value = await self.ap_get(value) if isinstance(value, dict): return [value] if not isinstance(value, list): @@ -533,9 +533,9 @@ for v in value ] else: - return [await self.apGetObject(i) for i in value] + return [await self.ap_get_object(i) for i in value] - async def apGetActors( + async def ap_get_actors( self, data: dict, key: str, @@ -575,11 +575,11 @@ f"list of actors is empty" ) if as_account: - return [await self.getAPAccountFromId(actor_id) for actor_id in value] + return [await self.get_ap_account_from_id(actor_id) for actor_id in value] else: return value - async def apGetSenderActor( + async def ap_get_sender_actor( self, data: dict, ) -> str: @@ -592,12 +592,12 @@ @raise exceptions.NotFound: no actor has been found in data """ try: - actors = await self.apGetActors(data, "actor", as_account=False) + actors = await self.ap_get_actors(data, "actor", as_account=False) except exceptions.DataError: actors = None if not actors: try: - actors = await self.apGetActors(data, "attributedTo", as_account=False) + actors = await self.ap_get_actors(data, "attributedTo", as_account=False) except exceptions.DataError: raise exceptions.NotFound( 'actor not specified in "actor" or "attributedTo"' @@ -607,7 +607,7 @@ except IndexError: raise exceptions.NotFound("list of actors is empty") - def mustEncode(self, text: str) -> bool: + def must_encode(self, text: str) -> bool: """Indicate if a text must be period encoded""" return ( not RE_ALLOWED_UNQUOTED.match(text) @@ -615,10 +615,10 @@ or "---" in text ) - def periodEncode(self, text: str) -> str: + def period_encode(self, text: str) -> str: """Period encode a text - see [getJIDAndNode] for reasons of period encoding + see [get_jid_and_node] for reasons of period encoding """ return ( parse.quote(text, safe="") @@ -629,7 +629,7 @@ .replace("%", ".") ) - async def getAPAccountFromJidAndNode( + async def get_ap_account_from_jid_and_node( self, jid_: jid.JID, node: Optional[str] @@ -644,28 +644,28 @@ if self.client is None: raise exceptions.InternalError("Client is not set yet") - if self.isVirtualJID(jid_): + if self.is_virtual_jid(jid_): # this is an proxy JID to an AP Actor return self._e.unescape(jid_.user) - if node and not jid_.user and not self.mustEncode(node): - is_pubsub = await self.isPubsub(jid_) + if node and not jid_.user and not self.must_encode(node): + is_pubsub = await self.is_pubsub(jid_) # when we have a pubsub service, the user part can be used to set the node # this produces more user-friendly AP accounts if is_pubsub: jid_.user = node node = None - is_local = 
self.isLocal(jid_) + is_local = self.is_local(jid_) user = jid_.user if is_local else jid_.userhost() if user is None: user = "" account_elts = [] - if node and self.mustEncode(node) or self.mustEncode(user): + if node and self.must_encode(node) or self.must_encode(user): account_elts = ["___"] if node: - node = self.periodEncode(node) - user = self.periodEncode(user) + node = self.period_encode(node) + user = self.period_encode(user) if not user: raise exceptions.InternalError("there should be a user part") @@ -678,21 +678,21 @@ )) return "".join(account_elts) - def isLocal(self, jid_: jid.JID) -> bool: + def is_local(self, jid_: jid.JID) -> bool: """Returns True if jid_ use a domain or subdomain of gateway's host""" local_host = self.client.host.split(".") assert local_host return jid_.host.split(".")[-len(local_host):] == local_host - async def isPubsub(self, jid_: jid.JID) -> bool: + async def is_pubsub(self, jid_: jid.JID) -> bool: """Indicate if a JID is a Pubsub service""" - host_disco = await self.host.getDiscoInfos(self.client, jid_) + host_disco = await self.host.get_disco_infos(self.client, jid_) return ( ("pubsub", "service") in host_disco.identities and not ("pubsub", "pep") in host_disco.identities ) - async def getJIDAndNode(self, ap_account: str) -> Tuple[jid.JID, Optional[str]]: + async def get_jid_and_node(self, ap_account: str) -> Tuple[jid.JID, Optional[str]]: """Decode raw AP account handle to get XMPP JID and Pubsub Node Username are case insensitive. @@ -767,7 +767,7 @@ # we need to check host disco, because disco request to user may be # blocked for privacy reason (see # https://xmpp.org/extensions/xep-0030.html#security) - is_pubsub = await self.isPubsub(jid.JID(domain)) + is_pubsub = await self.is_pubsub(jid.JID(domain)) if is_pubsub: # if the host is a pubsub service and not a PEP, we consider that username @@ -781,14 +781,14 @@ except RuntimeError: raise ValueError(f"Invalid jid: {jid_s!r}") - if self.local_only and not self.isLocal(jid_): + if self.local_only and not self.is_local(jid_): raise exceptions.PermissionError( "This gateway is configured to map only local entities and services" ) return jid_, node - def getLocalJIDFromAccount(self, account: str) -> jid.JID: + def get_local_jid_from_account(self, account: str) -> jid.JID: """Compute JID linking to an AP account The local jid is computer by escaping AP actor handle and using it as local part @@ -803,7 +803,7 @@ ) ) - async def getJIDFromId(self, actor_id: str) -> jid.JID: + async def get_jid_from_id(self, actor_id: str) -> jid.JID: """Compute JID linking to an AP Actor ID The local jid is computer by escaping AP actor handle and using it as local part @@ -811,17 +811,17 @@ If the actor_id comes from local server (checked with self.public_url), it means that we have an XMPP entity, and the original JID is returned """ - if self.isLocalURL(actor_id): - request_type, extra_args = self.parseAPURL(actor_id) + if self.is_local_url(actor_id): + request_type, extra_args = self.parse_apurl(actor_id) if request_type != TYPE_ACTOR or len(extra_args) != 1: raise ValueError(f"invalid actor id: {actor_id!r}") - actor_jid, __ = await self.getJIDAndNode(extra_args[0]) + actor_jid, __ = await self.get_jid_and_node(extra_args[0]) return actor_jid - account = await self.getAPAccountFromId(actor_id) - return self.getLocalJIDFromAccount(account) + account = await self.get_ap_account_from_id(actor_id) + return self.get_local_jid_from_account(account) - def parseAPURL(self, url: str) -> Tuple[str, List[str]]: + def 
parse_apurl(self, url: str) -> Tuple[str, List[str]]: """Parse an URL leading to an AP endpoint @param url: URL to parse (schema is not mandatory) @@ -831,7 +831,7 @@ type_, *extra_args = path[len(self.ap_path):].lstrip("/").split("/") return type_, [parse.unquote(a) for a in extra_args] - def buildAPURL(self, type_:str , *args: str) -> str: + def build_apurl(self, type_:str , *args: str) -> str: """Build an AP endpoint URL @param type_: type of AP endpoing @@ -842,18 +842,18 @@ str(Path(type_).joinpath(*(parse.quote_plus(a, safe="@") for a in args))) ) - def isLocalURL(self, url: str) -> bool: + def is_local_url(self, url: str) -> bool: """Tells if an URL link to this component ``public_url`` and ``ap_path`` are used to check the URL """ return url.startswith(self.base_ap_url) - def isVirtualJID(self, jid_: jid.JID) -> bool: + def is_virtual_jid(self, jid_: jid.JID) -> bool: """Tell if a JID is an AP actor mapped through this gateway""" return jid_.host == self.client.jid.userhost() - def buildSignatureHeader(self, values: Dict[str, str]) -> str: + def build_signature_header(self, values: Dict[str, str]) -> str: """Build key="<value>" signature header from signature data""" fields = [] for key, value in values.items(): @@ -868,7 +868,7 @@ return ",".join(fields) - def getDigest(self, body: bytes, algo="SHA-256") -> Tuple[str, str]: + def get_digest(self, body: bytes, algo="SHA-256") -> Tuple[str, str]: """Get digest data to use in header and signature @param body: body of the request @@ -879,12 +879,12 @@ return algo, base64.b64encode(hashlib.sha256(body).digest()).decode() @async_lru(maxsize=LRU_MAX_SIZE) - async def getActorData(self, actor_id) -> dict: + async def get_actor_data(self, actor_id) -> dict: """Retrieve actor data with LRU cache""" - return await self.apGet(actor_id) + return await self.ap_get(actor_id) @async_lru(maxsize=LRU_MAX_SIZE) - async def getActorPubKeyData( + async def get_actor_pub_key_data( self, actor_id: str ) -> Tuple[str, str, rsa.RSAPublicKey]: @@ -894,7 +894,7 @@ @return: key_id, owner and public_key @raise KeyError: publicKey is missing from actor data """ - actor_data = await self.getActorData(actor_id) + actor_data = await self.get_actor_data(actor_id) pub_key_data = actor_data["publicKey"] key_id = pub_key_data["id"] owner = pub_key_data["owner"] @@ -947,11 +947,11 @@ return data - def getKeyId(self, actor_id: str) -> str: + def get_key_id(self, actor_id: str) -> str: """Get local key ID from actor ID""" return f"{actor_id}#main-key" - async def checkSignature( + async def check_signature( self, signature: str, key_id: str, @@ -971,11 +971,11 @@ to_sign = "\n".join(f"{k.lower()}: {v}" for k,v in headers.items()) if key_id.startswith("acct:"): actor = key_id[5:] - actor_id = await self.getAPActorIdFromAccount(actor) + actor_id = await self.get_ap_actor_id_from_account(actor) else: actor_id = key_id.split("#", 1)[0] - pub_key_id, pub_key_owner, pub_key = await self.getActorPubKeyData(actor_id) + pub_key_id, pub_key_owner, pub_key = await self.get_actor_pub_key_data(actor_id) if pub_key_id != key_id or pub_key_owner != actor_id: raise exceptions.EncryptionError("Public Key mismatch") @@ -994,7 +994,7 @@ return actor_id - def getSignatureData( + def get_signature_data( self, key_id: str, headers: Dict[str, str] @@ -1028,10 +1028,10 @@ "signature": signature } new_headers = {k: v for k,v in headers.items() if not k.startswith("(")} - new_headers["Signature"] = self.buildSignatureHeader(sign_data) + new_headers["Signature"] = 
self.build_signature_header(sign_data) return new_headers, sign_data - async def convertAndPostItems( + async def convert_and_post_items( self, client: SatXMPPEntity, ap_account: str, @@ -1049,11 +1049,11 @@ @param subscribe_extra_nodes: if True, extra data nodes will be automatically subscribed, that is comment nodes if present and attachments nodes. """ - actor_id = await self.getAPActorIdFromAccount(ap_account) - inbox = await self.getAPInboxFromId(actor_id) + actor_id = await self.get_ap_actor_id_from_account(ap_account) + inbox = await self.get_ap_inbox_from_id(actor_id) for item in items: if item.name == "item": - cached_item = await self.host.memory.storage.searchPubsubItems({ + cached_item = await self.host.memory.storage.search_pubsub_items({ "profiles": [self.client.profile], "services": [service], "nodes": [node], @@ -1070,10 +1070,10 @@ while root_elt.parent is not None: root_elt = root_elt.parent author_jid = jid.JID(root_elt["from"]).userhostJID() - if subscribe_extra_nodes and not self.isVirtualJID(author_jid): + if subscribe_extra_nodes and not self.is_virtual_jid(author_jid): # we subscribe automatically to comment nodes if any - recipient_jid = self.getLocalJIDFromAccount(ap_account) - recipient_client = self.client.getVirtualClient(recipient_jid) + recipient_jid = self.get_local_jid_from_account(ap_account) + recipient_client = self.client.get_virtual_client(recipient_jid) comments_data = event_data.get("comments") if comments_data: comment_service = jid.JID(comments_data["jid"]) @@ -1097,13 +1097,13 @@ # blog item mb_data = await self._m.item_2_mb_data(client, item, service, node) author_jid = jid.JID(mb_data["author_jid"]) - if subscribe_extra_nodes and not self.isVirtualJID(author_jid): + if subscribe_extra_nodes and not self.is_virtual_jid(author_jid): # we subscribe automatically to comment nodes if any - recipient_jid = self.getLocalJIDFromAccount(ap_account) - recipient_client = self.client.getVirtualClient(recipient_jid) + recipient_jid = self.get_local_jid_from_account(ap_account) + recipient_client = self.client.get_virtual_client(recipient_jid) for comment_data in mb_data.get("comments", []): comment_service = jid.JID(comment_data["service"]) - if self.isVirtualJID(comment_service): + if self.is_virtual_jid(comment_service): log.debug( f"ignoring virtual comment service: {comment_data}" ) @@ -1125,14 +1125,14 @@ url_actor = ap_item["actor"] elif item.name == "retract": - url_actor, ap_item = await self.apDeleteItem( + url_actor, ap_item = await self.ap_delete_item( client.jid, node, item["id"] ) else: raise exceptions.InternalError(f"unexpected element: {item.toXml()}") - await self.signAndPost(inbox, url_actor, ap_item) + await self.sign_and_post(inbox, url_actor, ap_item) - async def convertAndPostAttachments( + async def convert_and_post_attachments( self, client: SatXMPPEntity, ap_account: str, @@ -1162,8 +1162,8 @@ f"{len(items)})" ) - actor_id = await self.getAPActorIdFromAccount(ap_account) - inbox = await self.getAPInboxFromId(actor_id) + actor_id = await self.get_ap_actor_id_from_account(ap_account) + inbox = await self.get_ap_inbox_from_id(actor_id) item_elt = items[0] item_id = item_elt["id"] @@ -1179,16 +1179,16 @@ ) return - if self.isVirtualJID(publisher): + if self.is_virtual_jid(publisher): log.debug(f"ignoring item coming from local virtual JID {publisher}") return if publisher is not None: item_elt["publisher"] = publisher.userhost() - item_service, item_node, item_id = self._pa.attachmentNode2Item(node) - item_account = await 
self.getAPAccountFromJidAndNode(item_service, item_node) - if self.isVirtualJID(item_service): + item_service, item_node, item_id = self._pa.attachment_node_2_item(node) + item_account = await self.get_ap_account_from_jid_and_node(item_service, item_node) + if self.is_virtual_jid(item_service): # it's a virtual JID mapping to an external AP actor, we can use the # item_id directly item_url = item_id @@ -1199,9 +1199,9 @@ ) return else: - item_url = self.buildAPURL(TYPE_ITEM, item_account, item_id) + item_url = self.build_apurl(TYPE_ITEM, item_account, item_id) - old_attachment_pubsub_items = await self.host.memory.storage.searchPubsubItems({ + old_attachment_pubsub_items = await self.host.memory.storage.search_pubsub_items({ "profiles": [self.client.profile], "services": [service], "nodes": [node], @@ -1211,19 +1211,19 @@ old_attachment = {} else: old_attachment_items = [i.data for i in old_attachment_pubsub_items] - old_attachments = self._pa.items2attachmentData(client, old_attachment_items) + old_attachments = self._pa.items_2_attachment_data(client, old_attachment_items) try: old_attachment = old_attachments[0] except IndexError: # no known element was present in attachments old_attachment = {} - publisher_account = await self.getAPAccountFromJidAndNode( + publisher_account = await self.get_ap_account_from_jid_and_node( publisher, None ) - publisher_actor_id = self.buildAPURL(TYPE_ACTOR, publisher_account) + publisher_actor_id = self.build_apurl(TYPE_ACTOR, publisher_account) try: - attachments = self._pa.items2attachmentData(client, [item_elt])[0] + attachments = self._pa.items_2_attachment_data(client, [item_elt])[0] except IndexError: # no known element was present in attachments attachments = {} @@ -1232,24 +1232,24 @@ if "noticed" in attachments: if not "noticed" in old_attachment: # new "noticed" attachment, we translate to "Like" activity - activity_id = self.buildAPURL("like", item_account, item_id) + activity_id = self.build_apurl("like", item_account, item_id) activity = self.create_activity( TYPE_LIKE, publisher_actor_id, item_url, activity_id=activity_id ) activity["to"] = [ap_account] activity["cc"] = [NS_AP_PUBLIC] - await self.signAndPost(inbox, publisher_actor_id, activity) + await self.sign_and_post(inbox, publisher_actor_id, activity) else: if "noticed" in old_attachment: # "noticed" attachment has been removed, we undo the "Like" activity - activity_id = self.buildAPURL("like", item_account, item_id) + activity_id = self.build_apurl("like", item_account, item_id) activity = self.create_activity( TYPE_LIKE, publisher_actor_id, item_url, activity_id=activity_id ) activity["to"] = [ap_account] activity["cc"] = [NS_AP_PUBLIC] undo = self.create_activity("Undo", publisher_actor_id, activity) - await self.signAndPost(inbox, publisher_actor_id, undo) + await self.sign_and_post(inbox, publisher_actor_id, undo) # reactions new_reactions = set(attachments.get("reactions", {}).get("reactions", [])) @@ -1258,7 +1258,7 @@ reactions_add = new_reactions - old_reactions for reactions, undo in ((reactions_remove, True), (reactions_add, False)): for reaction in reactions: - activity_id = self.buildAPURL( + activity_id = self.build_apurl( "reaction", item_account, item_id, reaction.encode().hex() ) reaction_activity = self.create_activity( @@ -1274,7 +1274,7 @@ ) else: activy = reaction_activity - await self.signAndPost(inbox, publisher_actor_id, activy) + await self.sign_and_post(inbox, publisher_actor_id, activy) # RSVP if "rsvp" in attachments: @@ -1282,39 +1282,39 @@ old_attending 
= old_attachment.get("rsvp", {}).get("attending", "no") if attending != old_attending: activity_type = TYPE_JOIN if attending == "yes" else TYPE_LEAVE - activity_id = self.buildAPURL(activity_type.lower(), item_account, item_id) + activity_id = self.build_apurl(activity_type.lower(), item_account, item_id) activity = self.create_activity( activity_type, publisher_actor_id, item_url, activity_id=activity_id ) activity["to"] = [ap_account] activity["cc"] = [NS_AP_PUBLIC] - await self.signAndPost(inbox, publisher_actor_id, activity) + await self.sign_and_post(inbox, publisher_actor_id, activity) else: if "rsvp" in old_attachment: old_attending = old_attachment.get("rsvp", {}).get("attending", "no") if old_attending == "yes": - activity_id = self.buildAPURL(TYPE_LEAVE.lower(), item_account, item_id) + activity_id = self.build_apurl(TYPE_LEAVE.lower(), item_account, item_id) activity = self.create_activity( TYPE_LEAVE, publisher_actor_id, item_url, activity_id=activity_id ) activity["to"] = [ap_account] activity["cc"] = [NS_AP_PUBLIC] - await self.signAndPost(inbox, publisher_actor_id, activity) + await self.sign_and_post(inbox, publisher_actor_id, activity) - if service.user and self.isVirtualJID(service): + if service.user and self.is_virtual_jid(service): # the item is on a virtual service, we need to store it in cache log.debug("storing attachments item in cache") - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, node, with_subscriptions=True, create=True ) - await self.host.memory.storage.cachePubsubItems( + await self.host.memory.storage.cache_pubsub_items( self.client, cached_node, [item_elt], [attachments] ) - async def signAndPost(self, url: str, actor_id: str, doc: dict) -> TReqResponse: + async def sign_and_post(self, url: str, actor_id: str, doc: dict) -> TReqResponse: """Sign a documentent and post it to AP server @param url: AP server endpoint @@ -1322,7 +1322,7 @@ @param doc: document to send """ if self.verbose: - __, actor_args = self.parseAPURL(actor_id) + __, actor_args = self.parse_apurl(actor_id) actor_account = actor_args[0] to_log = [ "", @@ -1331,7 +1331,7 @@ p_url = parse.urlparse(url) body = json.dumps(doc).encode() - digest_algo, digest_hash = self.getDigest(body) + digest_algo, digest_hash = self.get_digest(body) digest = f"{digest_algo}={digest_hash}" headers = { @@ -1343,7 +1343,7 @@ headers["Content-Type"] = ( 'application/activity+json' ) - headers, __ = self.getSignatureData(self.getKeyId(actor_id), headers) + headers, __ = self.get_signature_data(self.get_key_id(actor_id), headers) if self.verbose: if self.verbose>=3: @@ -1364,19 +1364,19 @@ log.info(f"==> response code: {resp.code}") return resp - def _publishMessage(self, mess_data_s: str, service_s: str, profile: str): + def _publish_message(self, mess_data_s: str, service_s: str, profile: str): mess_data: dict = data_format.deserialise(mess_data_s) # type: ignore service = jid.JID(service_s) - client = self.host.getClient(profile) - return defer.ensureDeferred(self.publishMessage(client, mess_data, service)) + client = self.host.get_client(profile) + return defer.ensureDeferred(self.publish_message(client, mess_data, service)) @async_lru(maxsize=LRU_MAX_SIZE) - async def getAPActorIdFromAccount(self, account: str) -> str: + async def get_ap_actor_id_from_account(self, account: str) -> str: """Retrieve account ID from it's handle using WebFinger Don't use this method to get local actor id from a local account 
derivated for JID: in this case, the actor ID is retrieve with - ``self.buildAPURL(TYPE_ACTOR, ap_account)`` + ``self.build_apurl(TYPE_ACTOR, ap_account)`` @param account: AP handle (user@domain.tld) @return: Actor ID (which is an URL) @@ -1408,21 +1408,21 @@ ) return href - async def getAPActorDataFromAccount(self, account: str) -> dict: + async def get_ap_actor_data_from_account(self, account: str) -> dict: """Retrieve ActivityPub Actor data @param account: ActivityPub Actor identifier """ - href = await self.getAPActorIdFromAccount(account) - return await self.apGet(href) + href = await self.get_ap_actor_id_from_account(account) + return await self.ap_get(href) - async def getAPInboxFromId(self, actor_id: str, use_shared: bool = True) -> str: + async def get_ap_inbox_from_id(self, actor_id: str, use_shared: bool = True) -> str: """Retrieve inbox of an actor_id @param use_shared: if True, and a shared inbox exists, it will be used instead of the user inbox """ - data = await self.getActorData(actor_id) + data = await self.get_actor_data(actor_id) if use_shared: try: return data["endpoints"]["sharedInbox"] @@ -1431,15 +1431,15 @@ return data["inbox"] @async_lru(maxsize=LRU_MAX_SIZE) - async def getAPAccountFromId(self, actor_id: str) -> str: + async def get_ap_account_from_id(self, actor_id: str) -> str: """Retrieve AP account from the ID URL Works with external or local actor IDs. @param actor_id: AP ID of the actor (URL to the actor data) @return: AP handle """ - if self.isLocalURL(actor_id): - url_type, url_args = self.parseAPURL(actor_id) + if self.is_local_url(actor_id): + url_type, url_args = self.parse_apurl(actor_id) if url_type != "actor" or not url_args: raise exceptions.DataError( f"invalid local actor ID: {actor_id}" @@ -1458,7 +1458,7 @@ return account url_parsed = parse.urlparse(actor_id) - actor_data = await self.getActorData(actor_id) + actor_data = await self.get_actor_data(actor_id) username = actor_data.get("preferredUsername") if not username: raise exceptions.DataError( @@ -1466,7 +1466,7 @@ ) account = f"{username}@{url_parsed.hostname}" # we try to retrieve the actor ID from the account to check it - found_id = await self.getAPActorIdFromAccount(account) + found_id = await self.get_ap_actor_id_from_account(account) if found_id != actor_id: # cf. 
https://socialhub.activitypub.rocks/t/how-to-retrieve-user-server-tld-handle-from-actors-url/2196 msg = ( @@ -1478,7 +1478,7 @@ raise exceptions.DataError(msg) return account - async def getAPItems( + async def get_ap_items( self, collection: dict, max_items: Optional[int] = None, @@ -1552,7 +1552,7 @@ retrieved_items = 0 current_page = collection["last"] while retrieved_items < count: - page_data, items = await self.parseAPPage( + page_data, items = await self.parse_ap_page( current_page, parser, only_ids ) if not items: @@ -1588,7 +1588,7 @@ found_after_id = False while retrieved_items < count: - __, page_items = await self.parseAPPage(page, parser, only_ids) + __, page_items = await self.parse_ap_page(page, parser, only_ids) if not page_items: break retrieved_items += len(page_items) @@ -1661,7 +1661,7 @@ __, item_elt = await self.ap_item_2_mb_data_and_elt(ap_item) return item_elt - async def parseAPPage( + async def parse_ap_page( self, page: Union[str, dict], parser: Callable[[dict], Awaitable[domish.Element]], @@ -1674,13 +1674,13 @@ @param only_ids: if True, only retrieve items IDs @return: page data, pubsub items """ - page_data = await self.apGetObject(page) + page_data = await self.ap_get_object(page) if page_data is None: log.warning('No data found in collection') return {}, [] - ap_items = await self.apGetList(page_data, "orderedItems", only_ids=only_ids) + ap_items = await self.ap_get_list(page_data, "orderedItems", only_ids=only_ids) if ap_items is None: - ap_items = await self.apGetList(page_data, "items", only_ids=only_ids) + ap_items = await self.ap_get_list(page_data, "items", only_ids=only_ids) if not ap_items: log.warning(f'No item field found in collection: {page_data!r}') return page_data, [] @@ -1699,7 +1699,7 @@ return page_data, items - async def getCommentsNodes( + async def get_comments_nodes( self, item_id: str, parent_id: Optional[str] @@ -1719,13 +1719,13 @@ """ if parent_id is None or not self.comments_max_depth: return ( - self._m.getCommentsNode(parent_id) if parent_id is not None else None, - self._m.getCommentsNode(item_id) + self._m.get_comments_node(parent_id) if parent_id is not None else None, + self._m.get_comments_node(item_id) ) parent_url = parent_id parents = [] for __ in range(COMMENTS_MAX_PARENTS): - parent_item = await self.apGet(parent_url) + parent_item = await self.ap_get(parent_url) parents.insert(0, parent_item) parent_url = parent_item.get("inReplyTo") if parent_url is None: @@ -1733,13 +1733,13 @@ parent_limit = self.comments_max_depth-1 if len(parents) <= parent_limit: return ( - self._m.getCommentsNode(parents[-1]["id"]), - self._m.getCommentsNode(item_id) + self._m.get_comments_node(parents[-1]["id"]), + self._m.get_comments_node(item_id) ) else: last_level_item = parents[parent_limit] return ( - self._m.getCommentsNode(last_level_item["id"]), + self._m.get_comments_node(last_level_item["id"]), None ) @@ -1755,7 +1755,7 @@ """ is_activity = self.is_activity(ap_item) if is_activity: - ap_object = await self.apGetObject(ap_item, "object") + ap_object = await self.ap_get_object(ap_item, "object") if not ap_object: log.warning(f'No "object" found in AP item {ap_item!r}') raise exceptions.DataError @@ -1815,16 +1815,16 @@ # author if is_activity: - authors = await self.apGetActors(ap_item, "actor") + authors = await self.ap_get_actors(ap_item, "actor") else: - authors = await self.apGetActors(ap_object, "attributedTo") + authors = await self.ap_get_actors(ap_object, "attributedTo") if len(authors) > 1: # we only keep first item as author 
# TODO: handle multiple actors log.warning("multiple actors are not managed") account = authors[0] - author_jid = self.getLocalJIDFromAccount(account).full() + author_jid = self.get_local_jid_from_account(account).full() mb_data["author"] = account.split("@", 1)[0] mb_data["author_jid"] = author_jid @@ -1848,12 +1848,12 @@ # comments in_reply_to = ap_object.get("inReplyTo") - __, comments_node = await self.getCommentsNodes(item_id, in_reply_to) + __, comments_node = await self.get_comments_nodes(item_id, in_reply_to) if comments_node is not None: comments_data = { "service": author_jid, "node": comments_node, - "uri": uri.buildXMPPUri( + "uri": uri.build_xmpp_uri( "pubsub", path=author_jid, node=comments_node @@ -1863,7 +1863,7 @@ return mb_data - async def getReplyToIdFromXMPPNode( + async def get_reply_to_id_from_xmpp_node( self, client: SatXMPPEntity, ap_account: str, @@ -1885,7 +1885,7 @@ """ # FIXME: propose a protoXEP to properly get parent item, node and service - found_items = await self.host.memory.storage.searchPubsubItems({ + found_items = await self.host.memory.storage.search_pubsub_items({ "profiles": [client.profile], "names": [parent_item] }) @@ -1894,7 +1894,7 @@ parent_ap_account = ap_account elif len(found_items) == 1: cached_node = found_items[0].node - parent_ap_account = await self.getAPAccountFromJidAndNode( + parent_ap_account = await self.get_ap_account_from_jid_and_node( cached_node.service, cached_node.name ) @@ -1917,12 +1917,12 @@ parent_ap_account = ap_account else: cached_node = cached_item.node - parent_ap_account = await self.getAPAccountFromJidAndNode( + parent_ap_account = await self.get_ap_account_from_jid_and_node( cached_node.service, cached_node.name ) - return self.buildAPURL( + return self.build_apurl( TYPE_ITEM, parent_ap_account, parent_item ) @@ -1937,11 +1937,11 @@ """ repeated = mb_data["extra"]["repeated"] repeater = jid.JID(repeated["by"]) - repeater_account = await self.getAPAccountFromJidAndNode( + repeater_account = await self.get_ap_account_from_jid_and_node( repeater, None ) - repeater_id = self.buildAPURL(TYPE_ACTOR, repeater_account) + repeater_id = self.build_apurl(TYPE_ACTOR, repeater_account) repeated_uri = repeated["uri"] if not repeated_uri.startswith("xmpp:"): @@ -1950,7 +1950,7 @@ f"item {mb_data}" ) raise NotImplementedError - parsed_url = uri.parseXMPPUri(repeated_uri) + parsed_url = uri.parse_xmpp_uri(repeated_uri) if parsed_url["type"] != "pubsub": log.warning( "Only pubsub URL are handled for repeated item at the moment, ignoring " @@ -1959,9 +1959,9 @@ raise NotImplementedError rep_service = jid.JID(parsed_url["path"]) rep_item = parsed_url["item"] - activity_id = self.buildAPURL("item", repeater.userhost(), mb_data["id"]) + activity_id = self.build_apurl("item", repeater.userhost(), mb_data["id"]) - if self.isVirtualJID(rep_service): + if self.is_virtual_jid(rep_service): # it's an AP actor linked through this gateway # in this case we can simply use the item ID if not rep_item.startswith("https:"): @@ -1974,18 +1974,18 @@ else: # the repeated item is an XMPP publication, we build the corresponding ID rep_node = parsed_url["node"] - repeated_account = await self.getAPAccountFromJidAndNode( + repeated_account = await self.get_ap_account_from_jid_and_node( rep_service, rep_node ) - announced_uri = self.buildAPURL("item", repeated_account, rep_item) + announced_uri = self.build_apurl("item", repeated_account, rep_item) announce = self.create_activity( "Announce", repeater_id, announced_uri, activity_id=activity_id ) 
announce["to"] = [NS_AP_PUBLIC] announce["cc"] = [ - self.buildAPURL(TYPE_FOLLOWERS, repeater_account), - await self.getAPActorIdFromAccount(repeated_account) + self.build_apurl(TYPE_FOLLOWERS, repeater_account), + await self.get_ap_actor_id_from_account(repeated_account) ] return announce @@ -2020,12 +2020,12 @@ mb_data["id"] = shortuuid.uuid() if not mb_data.get("author_jid"): mb_data["author_jid"] = client.jid.userhost() - ap_account = await self.getAPAccountFromJidAndNode( + ap_account = await self.get_ap_account_from_jid_and_node( jid.JID(mb_data["author_jid"]), None ) - url_actor = self.buildAPURL(TYPE_ACTOR, ap_account) - url_item = self.buildAPURL(TYPE_ITEM, ap_account, mb_data["id"]) + url_actor = self.build_apurl(TYPE_ACTOR, ap_account) + url_item = self.build_apurl(TYPE_ITEM, ap_account, mb_data["id"]) ap_object = { "id": url_item, "type": "Note", @@ -2076,7 +2076,7 @@ # references continue try: - mentioned_id = await self.getAPActorIdFromAccount(mentioned) + mentioned_id = await self.get_ap_actor_id_from_account(mentioned) except Exception as e: log.warning(f"Can't add mention to {mentioned!r}: {e}") else: @@ -2094,27 +2094,27 @@ raise exceptions.InternalError( "node or service is missing in mb_data" ) - target_ap_account = await self.getAPAccountFromJidAndNode( + target_ap_account = await self.get_ap_account_from_jid_and_node( service, node ) - if self.isVirtualJID(service): + if self.is_virtual_jid(service): # service is a proxy JID for AP account - actor_data = await self.getAPActorDataFromAccount(target_ap_account) + actor_data = await self.get_ap_actor_data_from_account(target_ap_account) followers = actor_data.get("followers") else: # service is a real XMPP entity - followers = self.buildAPURL(TYPE_FOLLOWERS, target_ap_account) + followers = self.build_apurl(TYPE_FOLLOWERS, target_ap_account) if followers: ap_object["cc"] = [followers] - if self._m.isCommentNode(node): - parent_item = self._m.getParentItem(node) - if self.isVirtualJID(service): + if self._m.is_comment_node(node): + parent_item = self._m.get_parent_item(node) + if self.is_virtual_jid(service): # the publication is on a virtual node (i.e. 
an XMPP node managed by # this gateway and linking to an ActivityPub actor) ap_object["inReplyTo"] = parent_item else: # the publication is from a followed real XMPP node - ap_object["inReplyTo"] = await self.getReplyToIdFromXMPPNode( + ap_object["inReplyTo"] = await self.get_reply_to_id_from_xmpp_node( client, ap_account, parent_item, @@ -2125,7 +2125,7 @@ "Create" if is_new else "Update", url_actor, ap_object, activity_id=url_item ) - async def publishMessage( + async def publish_message( self, client: SatXMPPEntity, mess_data: dict, @@ -2151,7 +2151,7 @@ if not service.user: raise ValueError("service must have a local part") account = self._e.unescape(service.user) - ap_actor_data = await self.getAPActorDataFromAccount(account) + ap_actor_data = await self.get_ap_actor_data_from_account(account) try: inbox_url = ap_actor_data["endpoints"]["sharedInbox"] @@ -2160,9 +2160,9 @@ item_data = await self.mb_data_2_ap_item(client, mess_data) url_actor = item_data["actor"] - resp = await self.signAndPost(inbox_url, url_actor, item_data) + resp = await self.sign_and_post(inbox_url, url_actor, item_data) - async def apDeleteItem( + async def ap_delete_item( self, jid_: jid.JID, node: Optional[str], @@ -2182,10 +2182,10 @@ if node is None: node = self._m.namespace - author_account = await self.getAPAccountFromJidAndNode(jid_, node) - author_actor_id = self.buildAPURL(TYPE_ACTOR, author_account) + author_account = await self.get_ap_account_from_jid_and_node(jid_, node) + author_actor_id = self.build_apurl(TYPE_ACTOR, author_account) - items = await self.host.memory.storage.searchPubsubItems({ + items = await self.host.memory.storage.search_pubsub_items({ "profiles": [self.client.profile], "services": [jid_], "names": [item_id] @@ -2210,7 +2210,7 @@ f"{items[0].toXml()}" ) - url_item = self.buildAPURL(TYPE_ITEM, author_account, item_id) + url_item = self.build_apurl(TYPE_ITEM, author_account, item_id) ap_item = self.create_activity( "Delete", author_actor_id, @@ -2223,7 +2223,7 @@ ap_item["to"] = [NS_AP_PUBLIC] return author_actor_id, ap_item - def _messageReceivedTrigger( + def _message_received_trigger( self, client: SatXMPPEntity, message_elt: domish.Element, @@ -2248,7 +2248,7 @@ if mess_data["type"] not in ("chat", "normal"): log.warning(f"ignoring message with unexpected type: {mess_data}") return mess_data - if not self.isLocal(mess_data["from"]): + if not self.is_local(mess_data["from"]): log.warning(f"ignoring non local message: {mess_data}") return mess_data if not mess_data["to"].user: @@ -2258,8 +2258,8 @@ return mess_data actor_account = self._e.unescape(mess_data["to"].user) - actor_id = await self.getAPActorIdFromAccount(actor_account) - inbox = await self.getAPInboxFromId(actor_id, use_shared=False) + actor_id = await self.get_ap_actor_id_from_account(actor_account) + inbox = await self.get_ap_inbox_from_id(actor_id, use_shared=False) try: language, message = next(iter(mess_data["message"].items())) @@ -2282,7 +2282,7 @@ C.KEY_ATTACHMENTS: attachments } - client = self.client.getVirtualClient(mess_data["from"]) + client = self.client.get_virtual_client(mess_data["from"]) ap_item = await self.mb_data_2_ap_item(client, mb_data, public=False) ap_object = ap_item["object"] ap_object["to"] = ap_item["to"] = [actor_id] @@ -2294,10 +2294,10 @@ "name": f"@{actor_account}", }) - await self.signAndPost(inbox, ap_item["actor"], ap_item) + await self.sign_and_post(inbox, ap_item["actor"], ap_item) return mess_data - async def _onMessageRetract( + async def _on_message_retract( self, client: 
SatXMPPEntity, message_elt: domish.Element, @@ -2307,7 +2307,7 @@ if client != self.client: return True from_jid = jid.JID(message_elt["from"]) - if not self.isLocal(from_jid): + if not self.is_local(from_jid): log.debug( f"ignoring retract request from non local jid {from_jid}" ) @@ -2319,15 +2319,15 @@ f"Invalid destinee's JID: {to_jid.full()}" ) ap_account = self._e.unescape(to_jid.user) - actor_id = await self.getAPActorIdFromAccount(ap_account) - inbox = await self.getAPInboxFromId(actor_id, use_shared=False) - url_actor, ap_item = await self.apDeleteItem( + actor_id = await self.get_ap_actor_id_from_account(ap_account) + inbox = await self.get_ap_inbox_from_id(actor_id, use_shared=False) + url_actor, ap_item = await self.ap_delete_item( from_jid.userhostJID(), None, fastened_elts.id, public=False ) - resp = await self.signAndPost(inbox, url_actor, ap_item) + resp = await self.sign_and_post(inbox, url_actor, ap_item) return False - async def _onReferenceReceived( + async def _on_reference_received( self, client: SatXMPPEntity, message_elt: domish.Element, @@ -2352,7 +2352,7 @@ return False ap_account = self._e.unescape(mentioned.user) - actor_id = await self.getAPActorIdFromAccount(ap_account) + actor_id = await self.get_ap_actor_id_from_account(ap_account) parsed_anchor: dict = reference_data.get("parsed_anchor") if not parsed_anchor: @@ -2380,14 +2380,14 @@ log.warning(f"missing pubsub item in anchor: {reference_data['anchor']}") return False - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, pubsub_service, pubsub_node ) if not cached_node: log.warning(f"Anchored node not found in cache: {reference_data['anchor']}") return False - cached_items, __ = await self.host.memory.storage.getItems( + cached_items, __ = await self.host.memory.storage.get_items( cached_node, item_ids=[pubsub_item] ) if not cached_items: @@ -2410,13 +2410,13 @@ "name": ap_account, }) - inbox = await self.getAPInboxFromId(actor_id, use_shared=False) + inbox = await self.get_ap_inbox_from_id(actor_id, use_shared=False) - resp = await self.signAndPost(inbox, ap_item["actor"], ap_item) + resp = await self.sign_and_post(inbox, ap_item["actor"], ap_item) return False - async def newReplyToXMPPItem( + async def new_reply_to_xmpp_item( self, client: SatXMPPEntity, ap_item: dict, @@ -2425,7 +2425,7 @@ ) -> None: """We got an AP item which is a reply to an XMPP item""" in_reply_to = ap_item["inReplyTo"] - url_type, url_args = self.parseAPURL(in_reply_to) + url_type, url_args = self.parse_apurl(in_reply_to) if url_type != "item": log.warning( "Ignoring AP item replying to an XMPP item with an unexpected URL " @@ -2440,12 +2440,12 @@ f"({in_reply_to!r}):\n{pformat(ap_item)}" ) return - parent_item_service, parent_item_node = await self.getJIDAndNode( + parent_item_service, parent_item_node = await self.get_jid_and_node( parent_item_account ) if parent_item_node is None: parent_item_node = self._m.namespace - items, __ = await self._p.getItems( + items, __ = await self._p.get_items( client, parent_item_service, parent_item_node, item_ids=[parent_item_id] ) try: @@ -2463,17 +2463,17 @@ comment_node = parent_item_parsed["comments"][0]["node"] except (KeyError, IndexError): # we don't have a comment node set for this item - from sat.tools.xml_tools import ppElt - log.info(f"{ppElt(parent_item_elt.toXml())}") + from sat.tools.xml_tools import pp_elt + log.info(f"{pp_elt(parent_item_elt.toXml())}") raise NotImplementedError() else: __, item_elt = 
await self.ap_item_2_mb_data_and_elt(ap_item) await self._p.publish(client, comment_service, comment_node, [item_elt]) - await self.notifyMentions( + await self.notify_mentions( targets, mentions, comment_service, comment_node, item_elt["id"] ) - def getAPItemTargets( + def get_ap_item_targets( self, item: Dict[str, Any] ) -> Tuple[bool, Dict[str, Set[str]], List[Dict[str, str]]]: @@ -2499,9 +2499,9 @@ continue if not value: continue - if not self.isLocalURL(value): + if not self.is_local_url(value): continue - target_type = self.parseAPURL(value)[0] + target_type = self.parse_apurl(value)[0] if target_type != TYPE_ACTOR: log.debug(f"ignoring non actor type as a target: {href}") else: @@ -2517,9 +2517,9 @@ if not href: log.warning('Missing "href" field from mention object: {tag!r}') continue - if not self.isLocalURL(href): + if not self.is_local_url(href): continue - uri_type = self.parseAPURL(href)[0] + uri_type = self.parse_apurl(href)[0] if uri_type != TYPE_ACTOR: log.debug(f"ignoring non actor URI as a target: {href}") continue @@ -2531,7 +2531,7 @@ return is_public, targets, mentions - async def newAPItem( + async def new_ap_item( self, client: SatXMPPEntity, destinee: Optional[jid.JID], @@ -2544,14 +2544,14 @@ @param node: XMPP pubsub node @param item: AP object payload """ - is_public, targets, mentions = self.getAPItemTargets(item) + is_public, targets, mentions = self.get_ap_item_targets(item) if not is_public and targets.keys() == {TYPE_ACTOR}: # this is a direct message await self.handle_message_ap_item( client, targets, mentions, destinee, item ) else: - await self.handlePubsubAPItem( + await self.handle_pubsub_ap_item( client, targets, mentions, destinee, node, item, is_public ) @@ -2570,7 +2570,7 @@ @param item: AP object payload """ targets_jids = { - await self.getJIDFromId(t) + await self.get_jid_from_id(t) for t_set in targets.values() for t in t_set } @@ -2596,7 +2596,7 @@ ) await defer.DeferredList(defer_l) - async def notifyMentions( + async def notify_mentions( self, targets: Dict[str, Set[str]], mentions: List[Dict[str, str]], @@ -2612,14 +2612,14 @@ https://www.w3.org/TR/activitystreams-vocabulary/#microsyntaxes). """ - anchor = uri.buildXMPPUri("pubsub", path=service.full(), node=node, item=item_id) + anchor = uri.build_xmpp_uri("pubsub", path=service.full(), node=node, item=item_id) seen = set() # we start with explicit mentions because mentions' content will be used in the # future to fill "begin" and "end" reference attributes (we can't do it at the # moment as there is no way to specify the XML element to use in the blog item). 
for mention in mentions: - mentioned_jid = await self.getJIDFromId(mention["uri"]) - self._refs.sendReference( + mentioned_jid = await self.get_jid_from_id(mention["uri"]) + self._refs.send_reference( self.client, to_jid=mentioned_jid, anchor=anchor @@ -2627,18 +2627,18 @@ seen.add(mentioned_jid) remaining = { - await self.getJIDFromId(t) + await self.get_jid_from_id(t) for t_set in targets.values() for t in t_set } - seen for target in remaining: - self._refs.sendReference( + self._refs.send_reference( self.client, to_jid=target, anchor=anchor ) - async def handlePubsubAPItem( + async def handle_pubsub_ap_item( self, client: SatXMPPEntity, targets: Dict[str, Set[str]], @@ -2663,23 +2663,23 @@ if in_reply_to and isinstance(in_reply_to, list): in_reply_to = in_reply_to[0] if in_reply_to and isinstance(in_reply_to, str): - if self.isLocalURL(in_reply_to): + if self.is_local_url(in_reply_to): # this is a reply to an XMPP item - await self.newReplyToXMPPItem(client, item, targets, mentions) + await self.new_reply_to_xmpp_item(client, item, targets, mentions) return # this item is a reply to an AP item, we use or create a corresponding node # for comments - parent_node, __ = await self.getCommentsNodes(item["id"], in_reply_to) + parent_node, __ = await self.get_comments_nodes(item["id"], in_reply_to) node = parent_node or node - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, node, with_subscriptions=True, create=True, create_kwargs={"subscribed": True} ) else: # it is a root item (i.e. not a reply to an other item) create = node == self._events.namespace - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, node, with_subscriptions=True, create=create ) if cached_node is None: @@ -2693,7 +2693,7 @@ data, item_elt = await self.ap_events.ap_item_2_event_data_and_elt(item) else: data, item_elt = await self.ap_item_2_mb_data_and_elt(item) - await self.host.memory.storage.cachePubsubItems( + await self.host.memory.storage.cache_pubsub_items( client, cached_node, [item_elt], @@ -2709,9 +2709,9 @@ [(subscription.subscriber, None, [item_elt])] ) - await self.notifyMentions(targets, mentions, service, node, item_elt["id"]) + await self.notify_mentions(targets, mentions, service, node, item_elt["id"]) - async def newAPDeleteItem( + async def new_ap_delete_item( self, client: SatXMPPEntity, destinee: Optional[jid.JID], @@ -2731,7 +2731,7 @@ raise exceptions.DataError('"id" attribute is missing in item') if not item_id.startswith("http"): raise exceptions.DataError(f"invalid id: {item_id!r}") - if self.isLocalURL(item_id): + if self.is_local_url(item_id): raise ValueError("Local IDs should not be used") # we have no way to know if a deleted item is a direct one (thus a message) or one @@ -2755,10 +2755,10 @@ ) raise exceptions.PermissionError("forbidden") - await self._r.retractByHistory(client, history) + await self._r.retract_by_history(client, history) else: # no history in cache with this ID, it's probably a pubsub item - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, client.jid, node, with_subscriptions=True ) if cached_node is None: @@ -2767,7 +2767,7 @@ "which is not cached" ) raise exceptions.NotFound - await self.host.memory.storage.deletePubsubItems(cached_node, [item_id]) + await 
self.host.memory.storage.delete_pubsub_items(cached_node, [item_id]) # notifyRetract is expecting domish.Element instances item_elt = domish.Element((None, "item")) item_elt["id"] = item_id
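
The renamed ``get_digest`` helper above keeps the old ``getDigest`` behaviour: it hashes the request body with SHA-256 and returns the algorithm label together with the base64-encoded digest, which ``sign_and_post`` then joins into the ``algo=hash`` digest value sent with the signed request. A minimal standalone sketch of that logic (a plain function instead of the gateway method, with a made-up request body)::

    import base64
    import hashlib
    from typing import Tuple

    def get_digest(body: bytes, algo: str = "SHA-256") -> Tuple[str, str]:
        """Return (algorithm label, base64-encoded digest) of a request body."""
        # as in the hunk above, only SHA-256 is actually computed
        return algo, base64.b64encode(hashlib.sha256(body).digest()).decode()

    body = b'{"type": "Create"}'  # hypothetical AP payload
    digest_algo, digest_hash = get_digest(body)
    print(f"digest value: {digest_algo}={digest_hash}")
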
--- a/sat/plugins/plugin_comp_ap_gateway/ad_hoc.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_comp_ap_gateway/ad_hoc.py Sat Apr 08 13:54:42 2023 +0200
@@ -38,7 +38,7 @@
         self._c = self.host.plugins["XEP-0050"]
 
     def init(self, client: SatXMPPEntity) -> None:
-        self._c.addAdHocCommand(
+        self._c.add_ad_hoc_command(
            client,
            self.xmpp_jid_node_2_ap_actor,
            "Convert XMPP JID/Node to AP actor",
@@ -82,7 +82,7 @@
         else:
             xmpp_jid = jid.JID(command_form["jid"])
             xmpp_node = command_form.get("node")
-            actor = await self.apg.getAPAccountFromJidAndNode(xmpp_jid, xmpp_node)
+            actor = await self.apg.get_ap_account_from_jid_and_node(xmpp_jid, xmpp_node)
             note = (self._c.NOTE.INFO, actor)
             status = self._c.STATUS.COMPLETED
             payload = None
--- a/sat/plugins/plugin_comp_ap_gateway/events.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_ap_gateway/events.py Sat Apr 08 13:54:42 2023 +0200 @@ -113,12 +113,12 @@ """ if not event_data.get("id"): event_data["id"] = shortuuid.uuid() - ap_account = await self.apg.getAPAccountFromJidAndNode( + ap_account = await self.apg.get_ap_account_from_jid_and_node( author_jid, self._events.namespace ) - url_actor = self.apg.buildAPURL(TYPE_ACTOR, ap_account) - url_item = self.apg.buildAPURL(TYPE_ITEM, ap_account, event_data["id"]) + url_actor = self.apg.build_apurl(TYPE_ACTOR, ap_account) + url_item = self.apg.build_apurl(TYPE_ITEM, ap_account, event_data["id"]) ap_object = { "actor": url_actor, "attributedTo": url_actor, @@ -246,7 +246,7 @@ """ is_activity = self.apg.is_activity(ap_item) if is_activity: - ap_object = await self.apg.apGetObject(ap_item, "object") + ap_object = await self.apg.ap_get_object(ap_item, "object") if not ap_object: log.warning(f'No "object" found in AP item {ap_item!r}') raise exceptions.DataError @@ -257,7 +257,7 @@ if "_repeated" in ap_item: # if the event is repeated, we use the original one ID repeated_uri = ap_item["_repeated"]["uri"] - parsed_uri = uri.parseXMPPUri(repeated_uri) + parsed_uri = uri.parse_xmpp_uri(repeated_uri) object_id = parsed_uri["item"] else: object_id = ap_object.get("id") @@ -268,10 +268,10 @@ raise exceptions.DataError("AP Object is not an event") # author - actor = await self.apg.apGetSenderActor(ap_object) + actor = await self.apg.ap_get_sender_actor(ap_object) - account = await self.apg.getAPAccountFromId(actor) - author_jid = self.apg.getLocalJIDFromAccount(account).full() + account = await self.apg.get_ap_account_from_id(actor) + author_jid = self.apg.get_local_jid_from_account(account).full() # name, start, end event_data = { @@ -370,7 +370,7 @@ # comments if ap_object.get("commentsEnabled"): - __, comments_node = await self.apg.getCommentsNodes(object_id, None) + __, comments_node = await self.apg.get_comments_nodes(object_id, None) event_data["comments"] = { "service": author_jid, "node": comments_node,
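
Most renames in these hunks are a mechanical camelCase to snake_case conversion (``getJIDFromId`` -> ``get_jid_from_id``, ``parseXMPPUri`` -> ``parse_xmpp_uri``, ``buildAPURL`` -> ``build_apurl``). A hypothetical two-regex converter, shown only to illustrate the pattern and not part of the changeset, reproduces those cases; names mixing digits, such as ``items2attachmentData`` -> ``items_2_attachment_data``, clearly needed manual adjustment::

    import re

    def camel_to_snake(name: str) -> str:
        # insert "_" before a capitalised word: "JIDFrom" -> "JID_From"
        name = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", name)
        # insert "_" between a lowercase letter or digit and an uppercase letter: "getJ" -> "get_J"
        name = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", name)
        return name.lower()

    assert camel_to_snake("getJIDFromId") == "get_jid_from_id"
    assert camel_to_snake("parseXMPPUri") == "parse_xmpp_uri"
    assert camel_to_snake("buildAPURL") == "build_apurl"
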
--- a/sat/plugins/plugin_comp_ap_gateway/http_server.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_ap_gateway/http_server.py Sat Apr 08 13:54:42 2023 +0200 @@ -66,7 +66,7 @@ self._seen_digest = deque(maxlen=50) super().__init__() - def responseCode( + def response_code( self, request: "HTTPRequest", http_code: int, @@ -77,17 +77,17 @@ log.warning(msg) request.setResponseCode(http_code, None if msg is None else msg.encode()) - def _onRequestError(self, failure_: failure.Failure, request: "HTTPRequest") -> None: + def _on_request_error(self, failure_: failure.Failure, request: "HTTPRequest") -> None: exc = failure_.value if isinstance(exc, exceptions.NotFound): - self.responseCode( + self.response_code( request, http.NOT_FOUND, str(exc) ) else: log.exception(f"Internal error: {failure_.value}") - self.responseCode( + self.response_code( request, http.INTERNAL_SERVER_ERROR, f"internal error: {failure_.value}" @@ -107,7 +107,7 @@ http.BAD_REQUEST, "Bad Request" , "Invalid webfinger resource" ).render(request) - actor_url = self.apg.buildAPURL(TYPE_ACTOR, account) + actor_url = self.apg.build_apurl(TYPE_ACTOR, account) resp = { "aliases": [actor_url], @@ -124,7 +124,7 @@ request.write(json.dumps(resp).encode()) request.finish() - async def handleUndoActivity( + async def handle_undo_activity( self, request: "HTTPRequest", data: dict, @@ -136,7 +136,7 @@ ) -> None: if node is None: node = self.apg._m.namespace - client = await self.apg.getVirtualClient(signing_actor) + client = await self.apg.get_virtual_client(signing_actor) object_ = data.get("object") if isinstance(object_, str): # we check first if it's not a cached object @@ -149,10 +149,10 @@ # because we'll undo the activity, we can remove it from cache await self.apg.client._ap_storage.remove(ap_cache_key) else: - objects = await self.apg.apGetList(data, "object") + objects = await self.apg.ap_get_list(data, "object") for obj in objects: type_ = obj.get("type") - actor = await self.apg.apGetSenderActor(obj) + actor = await self.apg.ap_get_sender_actor(obj) if actor != signing_actor: log.warning(f"ignoring object not attributed to signing actor: {data}") continue @@ -163,7 +163,7 @@ except KeyError: log.warning(f'ignoring invalid object, missing "object" key: {data}') continue - if not self.apg.isLocalURL(target_account): + if not self.apg.is_local_url(target_account): log.warning(f"ignoring unfollow request to non local actor: {data}") continue await self.apg._p.unsubscribe( @@ -175,17 +175,17 @@ elif type_ == "Announce": # we can use directly the Announce object, as only the "id" field is # needed - await self.apg.newAPDeleteItem(client, None, node, obj) + await self.apg.new_ap_delete_item(client, None, node, obj) elif type_ == TYPE_LIKE: - await self.handleAttachmentItem(client, obj, {"noticed": False}) + await self.handle_attachment_item(client, obj, {"noticed": False}) elif type_ == TYPE_REACTION: - await self.handleAttachmentItem(client, obj, { + await self.handle_attachment_item(client, obj, { "reactions": {"operation": "update", "remove": [obj["content"]]} }) else: log.warning(f"Unmanaged undo type: {type_!r}") - async def handleFollowActivity( + async def handle_follow_activity( self, request: "HTTPRequest", data: dict, @@ -197,14 +197,14 @@ ) -> None: if node is None: node = self.apg._m.namespace - client = await self.apg.getVirtualClient(signing_actor) + client = await self.apg.get_virtual_client(signing_actor) try: subscription = await self.apg._p.subscribe( client, account_jid, node, # subscriptions from AP 
are always public - options=self.apg._pps.setPublicOpt() + options=self.apg._pps.set_public_opt() ) except pubsub.SubscriptionPending: log.info(f"subscription to node {node!r} of {account_jid} is pending") @@ -213,15 +213,15 @@ if subscription.state != "subscribed": # other states should raise an Exception raise exceptions.InternalError('"subscribed" state was expected') - inbox = await self.apg.getAPInboxFromId(signing_actor, use_shared=False) - actor_id = self.apg.buildAPURL(TYPE_ACTOR, ap_account) + inbox = await self.apg.get_ap_inbox_from_id(signing_actor, use_shared=False) + actor_id = self.apg.build_apurl(TYPE_ACTOR, ap_account) accept_data = self.apg.create_activity( "Accept", actor_id, object_=data ) - await self.apg.signAndPost(inbox, actor_id, accept_data) + await self.apg.sign_and_post(inbox, actor_id, accept_data) await self.apg._c.synchronise(client, account_jid, node, resync=False) - async def handleAcceptActivity( + async def handle_accept_activity( self, request: "HTTPRequest", data: dict, @@ -233,12 +233,12 @@ ) -> None: if node is None: node = self.apg._m.namespace - client = await self.apg.getVirtualClient(signing_actor) - objects = await self.apg.apGetList(data, "object") + client = await self.apg.get_virtual_client(signing_actor) + objects = await self.apg.ap_get_list(data, "object") for obj in objects: type_ = obj.get("type") if type_ == "Follow": - follow_node = await self.apg.host.memory.storage.getPubsubNode( + follow_node = await self.apg.host.memory.storage.get_pubsub_node( client, client.jid, node, with_subscriptions=True ) if follow_node is None: @@ -270,7 +270,7 @@ else: log.warning(f"Unmanaged accept type: {type_!r}") - async def handleDeleteActivity( + async def handle_delete_activity( self, request: "HTTPRequest", data: dict, @@ -282,12 +282,12 @@ ): if node is None: node = self.apg._m.namespace - client = await self.apg.getVirtualClient(signing_actor) - objects = await self.apg.apGetList(data, "object") + client = await self.apg.get_virtual_client(signing_actor) + objects = await self.apg.ap_get_list(data, "object") for obj in objects: - await self.apg.newAPDeleteItem(client, account_jid, node, obj) + await self.apg.new_ap_delete_item(client, account_jid, node, obj) - async def handleNewAPItems( + async def handle_new_ap_items( self, request: "HTTPRequest", data: dict, @@ -298,7 +298,7 @@ ): """Helper method to handle workflow for new AP items - accept globally the same parameter as for handleCreateActivity + accept globally the same parameter as for handle_create_activity @param repeated: if True, the item is an item republished from somewhere else """ if "_repeated" in data: @@ -307,25 +307,25 @@ f"happen. 
Ignoring object from {signing_actor}\n{data}" ) raise exceptions.DataError("unexpected field in item") - client = await self.apg.getVirtualClient(signing_actor) - objects = await self.apg.apGetList(data, "object") + client = await self.apg.get_virtual_client(signing_actor) + objects = await self.apg.ap_get_list(data, "object") for obj in objects: if node is None: if obj.get("type") == TYPE_EVENT: node = self.apg._events.namespace else: node = self.apg._m.namespace - sender = await self.apg.apGetSenderActor(obj) + sender = await self.apg.ap_get_sender_actor(obj) if repeated: # we don't check sender when item is repeated, as it should be different # from post author in this case - sender_jid = await self.apg.getJIDFromId(sender) - repeater_jid = await self.apg.getJIDFromId(signing_actor) + sender_jid = await self.apg.get_jid_from_id(sender) + repeater_jid = await self.apg.get_jid_from_id(signing_actor) repeated_item_id = obj["id"] - if self.apg.isLocalURL(repeated_item_id): + if self.apg.is_local_url(repeated_item_id): # the repeated object is from XMPP, we need to parse the URL to find # the right ID - url_type, url_args = self.apg.parseAPURL(repeated_item_id) + url_type, url_args = self.apg.parse_apurl(repeated_item_id) if url_type != "item": raise exceptions.DataError( "local URI is not an item: {repeated_id}" @@ -339,7 +339,7 @@ "local URI is invalid: {repeated_id}" ) else: - url_jid, url_node = await self.apg.getJIDAndNode(url_account) + url_jid, url_node = await self.apg.get_jid_and_node(url_account) if ((url_jid != sender_jid or url_node and url_node != self.apg._m.namespace)): raise exceptions.DataError( @@ -352,7 +352,7 @@ obj["_repeated"] = { "by": repeater_jid.full(), "at": data.get("published"), - "uri": uri.buildXMPPUri( + "uri": uri.build_xmpp_uri( "pubsub", path=sender_jid.full(), node=self.apg._m.namespace, @@ -369,9 +369,9 @@ ) continue - await self.apg.newAPItem(client, account_jid, node, obj) + await self.apg.new_ap_item(client, account_jid, node, obj) - async def handleCreateActivity( + async def handle_create_activity( self, request: "HTTPRequest", data: dict, @@ -381,9 +381,9 @@ ap_url: str, signing_actor: str ): - await self.handleNewAPItems(request, data, account_jid, node, signing_actor) + await self.handle_new_ap_items(request, data, account_jid, node, signing_actor) - async def handleUpdateActivity( + async def handle_update_activity( self, request: "HTTPRequest", data: dict, @@ -395,9 +395,9 @@ ): # Update is the same as create: the item ID stays the same, thus the item will be # overwritten - await self.handleNewAPItems(request, data, account_jid, node, signing_actor) + await self.handle_new_ap_items(request, data, account_jid, node, signing_actor) - async def handleAnnounceActivity( + async def handle_announce_activity( self, request: "HTTPRequest", data: dict, @@ -408,7 +408,7 @@ signing_actor: str ): # we create a new item - await self.handleNewAPItems( + await self.handle_new_ap_items( request, data, account_jid, @@ -417,7 +417,7 @@ repeated=True ) - async def handleAttachmentItem( + async def handle_attachment_item( self, client: SatXMPPEntity, data: dict, @@ -447,10 +447,10 @@ await client._ap_storage.aset(f"{ST_AP_CACHE}{data['id']}", data) for target_id in target_ids: - if not self.apg.isLocalURL(target_id): + if not self.apg.is_local_url(target_id): log.debug(f"ignoring non local target ID: {target_id}") continue - url_type, url_args = self.apg.parseAPURL(target_id) + url_type, url_args = self.apg.parse_apurl(target_id) if url_type != TYPE_ITEM: 
log.warning(f"unexpected local URL for attachment on item {target_id}") continue @@ -458,20 +458,20 @@ account, item_id = url_args except ValueError: raise ValueError(f"invalid URL: {target_id}") - author_jid, item_node = await self.apg.getJIDAndNode(account) + author_jid, item_node = await self.apg.get_jid_and_node(account) if item_node is None: item_node = self.apg._m.namespace - attachment_node = self.apg._pa.getAttachmentNodeName( + attachment_node = self.apg._pa.get_attachment_node_name( author_jid, item_node, item_id ) - cached_node = await self.apg.host.memory.storage.getPubsubNode( + cached_node = await self.apg.host.memory.storage.get_pubsub_node( client, author_jid, attachment_node, with_subscriptions=True, create=True ) - found_items, __ = await self.apg.host.memory.storage.getItems( + found_items, __ = await self.apg.host.memory.storage.get_items( cached_node, item_ids=[client.jid.userhost()] ) if not found_items: @@ -487,16 +487,16 @@ None ) # we reparse the element, as there can be other attachments - attachments_data = self.apg._pa.items2attachmentData(client, [item_elt]) + attachments_data = self.apg._pa.items_2_attachment_data(client, [item_elt]) # and we update the cache - await self.apg.host.memory.storage.cachePubsubItems( + await self.apg.host.memory.storage.cache_pubsub_items( client, cached_node, [item_elt], attachments_data or [{}] ) - if self.apg.isVirtualJID(author_jid): + if self.apg.is_virtual_jid(author_jid): # the attachment is on t a virtual pubsub service (linking to an AP item), # we notify all subscribers for subscription in cached_node.subscriptions: @@ -509,11 +509,11 @@ ) else: # the attachment is on an XMPP item, we publish it to the attachment node - await self.apg._p.sendItems( + await self.apg._p.send_items( client, author_jid, attachment_node, [item_elt] ) - async def handleLikeActivity( + async def handle_like_activity( self, request: "HTTPRequest", data: dict, @@ -523,10 +523,10 @@ ap_url: str, signing_actor: str ) -> None: - client = await self.apg.getVirtualClient(signing_actor) - await self.handleAttachmentItem(client, data, {"noticed": True}) + client = await self.apg.get_virtual_client(signing_actor) + await self.handle_attachment_item(client, data, {"noticed": True}) - async def handleEmojireactActivity( + async def handle_emojireact_activity( self, request: "HTTPRequest", data: dict, @@ -536,12 +536,12 @@ ap_url: str, signing_actor: str ) -> None: - client = await self.apg.getVirtualClient(signing_actor) - await self.handleAttachmentItem(client, data, { + client = await self.apg.get_virtual_client(signing_actor) + await self.handle_attachment_item(client, data, { "reactions": {"operation": "update", "add": [data["content"]]} }) - async def handleJoinActivity( + async def handle_join_activity( self, request: "HTTPRequest", data: dict, @@ -551,10 +551,10 @@ ap_url: str, signing_actor: str ) -> None: - client = await self.apg.getVirtualClient(signing_actor) - await self.handleAttachmentItem(client, data, {"rsvp": {"attending": "yes"}}) + client = await self.apg.get_virtual_client(signing_actor) + await self.handle_attachment_item(client, data, {"rsvp": {"attending": "yes"}}) - async def handleLeaveActivity( + async def handle_leave_activity( self, request: "HTTPRequest", data: dict, @@ -564,10 +564,10 @@ ap_url: str, signing_actor: str ) -> None: - client = await self.apg.getVirtualClient(signing_actor) - await self.handleAttachmentItem(client, data, {"rsvp": {"attending": "no"}}) + client = await self.apg.get_virtual_client(signing_actor) + 
await self.handle_attachment_item(client, data, {"rsvp": {"attending": "no"}}) - async def APActorRequest( + async def ap_actor_request( self, request: "HTTPRequest", data: Optional[dict], @@ -577,24 +577,24 @@ ap_url: str, signing_actor: Optional[str] ) -> dict: - inbox = self.apg.buildAPURL(TYPE_INBOX, ap_account) - shared_inbox = self.apg.buildAPURL(TYPE_SHARED_INBOX) - outbox = self.apg.buildAPURL(TYPE_OUTBOX, ap_account) - followers = self.apg.buildAPURL(TYPE_FOLLOWERS, ap_account) - following = self.apg.buildAPURL(TYPE_FOLLOWING, ap_account) + inbox = self.apg.build_apurl(TYPE_INBOX, ap_account) + shared_inbox = self.apg.build_apurl(TYPE_SHARED_INBOX) + outbox = self.apg.build_apurl(TYPE_OUTBOX, ap_account) + followers = self.apg.build_apurl(TYPE_FOLLOWERS, ap_account) + following = self.apg.build_apurl(TYPE_FOLLOWING, ap_account) # we have to use AP account as preferredUsername because it is used to retrieve # actor handle (see https://socialhub.activitypub.rocks/t/how-to-retrieve-user-server-tld-handle-from-actors-url/2196) preferred_username = ap_account.split("@", 1)[0] - identity_data = await self.apg._i.getIdentity(self.apg.client, account_jid) + identity_data = await self.apg._i.get_identity(self.apg.client, account_jid) if node and node.startswith(self.apg._events.namespace): events = outbox else: - events_account = await self.apg.getAPAccountFromJidAndNode( + events_account = await self.apg.get_ap_account_from_jid_and_node( account_jid, self.apg._events.namespace ) - events = self.apg.buildAPURL(TYPE_OUTBOX, events_account) + events = self.apg.build_apurl(TYPE_OUTBOX, events_account) actor_data = { "@context": [ @@ -636,7 +636,7 @@ except KeyError: log.error(f"incomplete avatar data: {identity_data!r}") else: - avatar_url = self.apg.buildAPURL("avatar", filename) + avatar_url = self.apg.build_apurl("avatar", filename) actor_data["icon"] = { "type": "Image", "url": avatar_url, @@ -645,14 +645,14 @@ return actor_data - def getCanonicalURL(self, request: "HTTPRequest") -> str: + def get_canonical_url(self, request: "HTTPRequest") -> str: return parse.urljoin( f"https://{self.apg.public_url}", request.path.decode().rstrip("/") - # we unescape "@" for the same reason as in [APActorRequest] + # we unescape "@" for the same reason as in [ap_actor_request] ).replace("%40", "@") - def queryData2RSMRequest( + def query_data_2_rsm_request( self, query_data: Dict[str, List[str]] ) -> rsm.RSMRequest: @@ -673,7 +673,7 @@ return rsm.RSMRequest(**kwargs) raise ValueError(f"Invalid query data: {query_data!r}") - async def APOutboxPageRequest( + async def ap_outbox_page_request( self, request: "HTTPRequest", data: Optional[dict], @@ -690,18 +690,18 @@ url_keys = sorted(set(query_data) & {"page", "index", "before", "after"}) query_data = {k: query_data[k] for k in url_keys} try: - items, metadata = await self.apg._p.getItems( + items, metadata = await self.apg._p.get_items( client=self.apg.client, service=account_jid, node=node, - rsm_request=self.queryData2RSMRequest(query_data), + rsm_request=self.query_data_2_rsm_request(query_data), extra = {C.KEY_USE_CACHE: False} ) except error.StanzaError as e: log.warning(f"Can't get data from pubsub node {node} at {account_jid}: {e}") return {} - base_url = self.getCanonicalURL(request) + base_url = self.get_canonical_url(request) url = f"{base_url}?{parse.urlencode(query_data, True)}" if node and node.startswith(self.apg._events.namespace): ordered_items = [ @@ -753,7 +753,7 @@ return ret_data - async def APOutboxRequest( + async def 
ap_outbox_request( self, request: "HTTPRequest", data: Optional[dict], @@ -769,7 +769,7 @@ parsed_url = parse.urlparse(request.uri.decode()) query_data = parse.parse_qs(parsed_url.query) if query_data: - return await self.APOutboxPageRequest( + return await self.ap_outbox_page_request( request, data, account_jid, node, ap_account, ap_url, query_data ) @@ -779,7 +779,7 @@ # The current workaround is to do a request as if RSM was available, and actually # check its availability according to result. try: - __, metadata = await self.apg._p.getItems( + __, metadata = await self.apg._p.get_items( client=self.apg.client, service=account_jid, node=node, @@ -799,7 +799,7 @@ ) items_count = 20 - url = self.getCanonicalURL(request) + url = self.get_canonical_url(request) url_first_page = f"{url}?{parse.urlencode({'page': 'first'})}" url_last_page = f"{url}?{parse.urlencode({'page': 'last'})}" return { @@ -811,7 +811,7 @@ "last": url_last_page, } - async def APInboxRequest( + async def ap_inbox_request( self, request: "HTTPRequest", data: Optional[dict], @@ -824,26 +824,26 @@ assert data is not None if signing_actor is None: raise exceptions.InternalError("signing_actor must be set for inbox requests") - await self.checkSigningActor(data, signing_actor) + await self.check_signing_actor(data, signing_actor) activity_type = (data.get("type") or "").lower() if not activity_type in ACTIVITY_TYPES_LOWER: - return self.responseCode( + return self.response_code( request, http.UNSUPPORTED_MEDIA_TYPE, f"request is not an activity, ignoring" ) if account_jid is None and activity_type not in ACTIVIY_NO_ACCOUNT_ALLOWED: - return self.responseCode( + return self.response_code( request, http.UNSUPPORTED_MEDIA_TYPE, f"{activity_type.title()!r} activity must target an account" ) try: - method = getattr(self, f"handle{activity_type.title()}Activity") + method = getattr(self, f"handle_{activity_type}_activity") except AttributeError: - return self.responseCode( + return self.response_code( request, http.UNSUPPORTED_MEDIA_TYPE, f"{activity_type.title()} activity is not yet supported" @@ -853,7 +853,7 @@ request, data, account_jid, node, ap_account, ap_url, signing_actor ) - async def APFollowersRequest( + async def ap_followers_request( self, request: "HTTPRequest", data: Optional[dict], @@ -866,20 +866,20 @@ if node is None: node = self.apg._m.namespace client = self.apg.client - subscribers = await self.apg._pps.getPublicNodeSubscriptions( + subscribers = await self.apg._pps.get_public_node_subscriptions( client, account_jid, node ) followers = [] for subscriber in subscribers.keys(): - if self.apg.isVirtualJID(subscriber): + if self.apg.is_virtual_jid(subscriber): # the subscriber is an AP user subscribed with this gateway ap_account = self.apg._e.unescape(subscriber.user) else: # regular XMPP user - ap_account = await self.apg.getAPAccountFromJidAndNode(subscriber, node) + ap_account = await self.apg.get_ap_account_from_jid_and_node(subscriber, node) followers.append(ap_account) - url = self.getCanonicalURL(request) + url = self.get_canonical_url(request) return { "@context": ["https://www.w3.org/ns/activitystreams"], "type": "OrderedCollection", @@ -892,7 +892,7 @@ } } - async def APFollowingRequest( + async def ap_following_request( self, request: "HTTPRequest", data: Optional[dict], @@ -909,17 +909,17 @@ following = [] for sub_dict in subscriptions: service = jid.JID(sub_dict["service"]) - if self.apg.isVirtualJID(service): + if self.apg.is_virtual_jid(service): # the subscription is to an AP actor with this 
gateway ap_account = self.apg._e.unescape(service.user) else: # regular XMPP user - ap_account = await self.apg.getAPAccountFromJidAndNode( + ap_account = await self.apg.get_ap_account_from_jid_and_node( service, sub_dict["node"] ) following.append(ap_account) - url = self.getCanonicalURL(request) + url = self.get_canonical_url(request) return { "@context": ["https://www.w3.org/ns/activitystreams"], "type": "OrderedCollection", @@ -953,7 +953,7 @@ to_log.append(f" headers:\n{headers}") return to_log - async def APRequest( + async def ap_request( self, request: "HTTPRequest", data: Optional[dict] = None, @@ -967,13 +967,13 @@ f"https://{self.apg.public_url}", path ) - request_type, extra_args = self.apg.parseAPURL(ap_url) + request_type, extra_args = self.apg.parse_apurl(ap_url) if ((MEDIA_TYPE_AP not in (request.getHeader("accept") or "") and request_type in self.apg.html_redirect)): # this is not a AP request, and we have a redirections for it kw = {} if extra_args: - kw["jid"], kw["node"] = await self.apg.getJIDAndNode(extra_args[0]) + kw["jid"], kw["node"] = await self.apg.get_jid_and_node(extra_args[0]) kw["jid_user"] = kw["jid"].user if kw["node"] is None: kw["node"] = self.apg._m.namespace @@ -1007,7 +1007,7 @@ if len(extra_args) == 0: if request_type != "shared_inbox": raise exceptions.DataError(f"Invalid request type: {request_type!r}") - ret_data = await self.APInboxRequest( + ret_data = await self.ap_inbox_request( request, data, None, None, None, ap_url, signing_actor ) elif request_type == "avatar": @@ -1017,14 +1017,14 @@ avatar_path = self.apg.host.common_cache.getPath(avatar_filename) return static.File(str(avatar_path)).render(request) elif request_type == "item": - ret_data = await self.apg.apGetLocalObject(ap_url) + ret_data = await self.apg.ap_get_local_object(ap_url) if "@context" not in ret_data: ret_data["@context"] = [NS_AP] else: if len(extra_args) > 1: log.warning(f"unexpected extra arguments: {extra_args!r}") ap_account = extra_args[0] - account_jid, node = await self.apg.getJIDAndNode(ap_account) + account_jid, node = await self.apg.get_jid_and_node(ap_account) if request_type not in AP_REQUEST_TYPES.get( request.method.decode().upper(), [] ): @@ -1046,12 +1046,12 @@ log.info("\n".join(to_log)) request.finish() - async def APPostRequest(self, request: "HTTPRequest") -> None: + async def ap_post_request(self, request: "HTTPRequest") -> None: try: data = json.load(request.content) if not isinstance(data, dict): log.warning(f"JSON data should be an object (uri={request.uri.decode()})") - self.responseCode( + self.response_code( request, http.BAD_REQUEST, f"invalid body, was expecting a JSON object" @@ -1059,7 +1059,7 @@ request.finish() return except (json.JSONDecodeError, ValueError) as e: - self.responseCode( + self.response_code( request, http.BAD_REQUEST, f"invalid json in inbox request: {e}" @@ -1081,14 +1081,14 @@ pass try: - signing_actor = await self.checkSignature(request) + signing_actor = await self.check_signature(request) except exceptions.EncryptionError as e: if self.apg.verbose: to_log = self._get_to_log(request) to_log.append(f" body: {request.content.read()!r}") request.content.seek(0) log.info("\n".join(to_log)) - self.responseCode( + self.response_code( request, http.FORBIDDEN, f"invalid signature: {e}" @@ -1096,7 +1096,7 @@ request.finish() return except Exception as e: - self.responseCode( + self.response_code( request, http.INTERNAL_SERVER_ERROR, f"Can't check signature: {e}" @@ -1115,27 +1115,27 @@ # default response code, may be changed, 
e.g. in case of exception try: - return await self.APRequest(request, data, signing_actor) + return await self.ap_request(request, data, signing_actor) except Exception as e: - self._onRequestError(failure.Failure(e), request) + self._on_request_error(failure.Failure(e), request) - async def checkSigningActor(self, data: dict, signing_actor: str) -> None: + async def check_signing_actor(self, data: dict, signing_actor: str) -> None: """That that signing actor correspond to actor declared in data @param data: request payload @param signing_actor: actor ID of the signing entity, as returned by - checkSignature + check_signature @raise exceptions.NotFound: no actor found in data @raise exceptions.EncryptionError: signing actor doesn't match actor in data """ - actor = await self.apg.apGetSenderActor(data) + actor = await self.apg.ap_get_sender_actor(data) if signing_actor != actor: raise exceptions.EncryptionError( f"signing actor ({signing_actor}) doesn't match actor in data ({actor})" ) - async def checkSignature(self, request: "HTTPRequest") -> str: + async def check_signature(self, request: "HTTPRequest") -> str: """Check and validate HTTP signature @return: id of the signing actor @@ -1242,7 +1242,7 @@ raise exceptions.EncryptionError( "Only SHA-256 algorithm is currently supported for digest" ) - __, computed_digest = self.apg.getDigest(body) + __, computed_digest = self.apg.get_digest(body) if given_digest != computed_digest: raise exceptions.EncryptionError( f"SHA-256 given and computed digest differ:\n" @@ -1275,7 +1275,7 @@ raise exceptions.EncryptionError("Signature has expired") try: - return await self.apg.checkSignature( + return await self.apg.check_signature( sign_data["signature"], key_id, headers @@ -1287,7 +1287,7 @@ "Using workaround for (request-target) encoding bug in signature, " "see https://github.com/mastodon/mastodon/issues/18871" ) - return await self.apg.checkSignature( + return await self.apg.check_signature( sign_data["signature"], key_id, headers @@ -1303,8 +1303,8 @@ defer.ensureDeferred(self.webfinger(request)) return server.NOT_DONE_YET elif path.startswith(self.apg.ap_path): - d = defer.ensureDeferred(self.APRequest(request)) - d.addErrback(self._onRequestError, request) + d = defer.ensureDeferred(self.ap_request(request)) + d.addErrback(self._on_request_error, request) return server.NOT_DONE_YET return web_resource.NoResource().render(request) @@ -1313,7 +1313,7 @@ path = request.path.decode().lstrip("/") if not path.startswith(self.apg.ap_path): return web_resource.NoResource().render(request) - defer.ensureDeferred(self.APPostRequest(request)) + defer.ensureDeferred(self.ap_post_request(request)) return server.NOT_DONE_YET
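A side effect of the rename worth noting in the hunk above: ap_inbox_request now resolves its per-activity handlers dynamically with getattr(self, f"handle_{activity_type}_activity") instead of the former handle{Type}Activity spelling, so every handler method has to follow the new snake_case convention or the lookup falls back to the "not yet supported" response. A minimal stand-alone sketch of that dispatch pattern (the class and handler below are illustrative, not the gateway's actual code):

    import asyncio


    class InboxDispatchSketch:
        """Illustration of the handle_<activity>_activity lookup used above."""

        async def handle_follow_activity(self, data: dict) -> str:
            # a real handler would translate the AP activity to XMPP here
            return f"follow from {data.get('actor')}"

        async def dispatch(self, data: dict) -> str:
            activity_type = (data.get("type") or "").lower()
            try:
                method = getattr(self, f"handle_{activity_type}_activity")
            except AttributeError:
                # the gateway answers with HTTP "unsupported media type" in this case
                return f"{activity_type.title()} activity is not yet supported"
            return await method(data)


    print(asyncio.run(InboxDispatchSketch().dispatch({"type": "Follow", "actor": "pierre"})))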
--- a/sat/plugins/plugin_comp_ap_gateway/pubsub_service.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_ap_gateway/pubsub_service.py Sat Apr 08 13:54:42 2023 +0200 @@ -34,7 +34,7 @@ from sat.core.constants import Const as C from sat.tools import image from sat.tools.utils import ensure_deferred -from sat.tools.web import downloadFile +from sat.tools.web import download_file from sat.memory.sqla_mapping import PubsubSub, SubscriptionState from .constants import ( @@ -74,7 +74,7 @@ "name": "Libervia ActivityPub Gateway", } - async def getAPActorIdsAndInbox( + async def get_ap_actor_ids_and_inbox( self, requestor: jid.JID, recipient: jid.JID, @@ -92,16 +92,16 @@ "item-not-found", text="No user part specified" ) - requestor_actor_id = self.apg.buildAPURL(TYPE_ACTOR, requestor.userhost()) + requestor_actor_id = self.apg.build_apurl(TYPE_ACTOR, requestor.userhost()) recipient_account = self.apg._e.unescape(recipient.user) - recipient_actor_id = await self.apg.getAPActorIdFromAccount(recipient_account) - inbox = await self.apg.getAPInboxFromId(recipient_actor_id, use_shared=False) + recipient_actor_id = await self.apg.get_ap_actor_id_from_account(recipient_account) + inbox = await self.apg.get_ap_inbox_from_id(recipient_actor_id, use_shared=False) return requestor_actor_id, recipient_actor_id, inbox @ensure_deferred async def publish(self, requestor, service, nodeIdentifier, items): - if self.apg.local_only and not self.apg.isLocal(requestor): + if self.apg.local_only and not self.apg.is_local(requestor): raise error.StanzaError( "forbidden", "Only local users can publish on this gateway." @@ -118,19 +118,19 @@ f"{ap_account!r} is not a valid ActivityPub actor account." ) - client = self.apg.client.getVirtualClient(requestor) - if self.apg._pa.isAttachmentNode(nodeIdentifier): - await self.apg.convertAndPostAttachments( + client = self.apg.client.get_virtual_client(requestor) + if self.apg._pa.is_attachment_node(nodeIdentifier): + await self.apg.convert_and_post_attachments( client, ap_account, service, nodeIdentifier, items, publisher=requestor ) else: - await self.apg.convertAndPostItems( + await self.apg.convert_and_post_items( client, ap_account, service, nodeIdentifier, items ) - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, nodeIdentifier, with_subscriptions=True, create=True ) - await self.host.memory.storage.cachePubsubItems( + await self.host.memory.storage.cache_pubsub_items( client, cached_node, items @@ -144,27 +144,27 @@ [(subscription.subscriber, None, items)] ) - async def apFollowing2Elt(self, ap_item: dict) -> domish.Element: + async def ap_following_2_elt(self, ap_item: dict) -> domish.Element: """Convert actor ID from following collection to XMPP item""" actor_id = ap_item["id"] - actor_jid = await self.apg.getJIDFromId(actor_id) - subscription_elt = self.apg._pps.buildSubscriptionElt( + actor_jid = await self.apg.get_jid_from_id(actor_id) + subscription_elt = self.apg._pps.build_subscription_elt( self.apg._m.namespace, actor_jid ) item_elt = pubsub.Item(id=actor_id, payload=subscription_elt) return item_elt - async def apFollower2Elt(self, ap_item: dict) -> domish.Element: + async def ap_follower_2_elt(self, ap_item: dict) -> domish.Element: """Convert actor ID from followers collection to XMPP item""" actor_id = ap_item["id"] - actor_jid = await self.apg.getJIDFromId(actor_id) - subscriber_elt = self.apg._pps.buildSubscriberElt(actor_jid) + actor_jid = await 
self.apg.get_jid_from_id(actor_id) + subscriber_elt = self.apg._pps.build_subscriber_elt(actor_jid) item_elt = pubsub.Item(id=actor_id, payload=subscriber_elt) return item_elt - async def generateVCard(self, ap_account: str) -> domish.Element: + async def generate_v_card(self, ap_account: str) -> domish.Element: """Generate vCard4 (XEP-0292) item element from ap_account's metadata""" - actor_data = await self.apg.getAPActorDataFromAccount(ap_account) + actor_data = await self.apg.get_ap_actor_data_from_account(ap_account) identity_data = {} summary = actor_data.get("summary") @@ -181,13 +181,13 @@ value = actor_data.get(field) if value: identity_data.setdefault("nicknames", []).append(value) - vcard_elt = self.apg._v.dict2VCard(identity_data) + vcard_elt = self.apg._v.dict_2_v_card(identity_data) item_elt = domish.Element((pubsub.NS_PUBSUB, "item")) item_elt.addChild(vcard_elt) item_elt["id"] = self.apg._p.ID_SINGLETON return item_elt - async def getAvatarData( + async def get_avatar_data( self, client: SatXMPPEntity, ap_account: str @@ -197,9 +197,9 @@ ``cache_uid``, `path``` and ``media_type`` keys are always files ``base64`` key is only filled if the file was not already in cache """ - actor_data = await self.apg.getAPActorDataFromAccount(ap_account) + actor_data = await self.apg.get_ap_actor_data_from_account(ap_account) - for icon in await self.apg.apGetList(actor_data, "icon"): + for icon in await self.apg.ap_get_list(actor_data, "icon"): url = icon.get("url") if icon["type"] != "Image" or not url: continue @@ -221,19 +221,19 @@ if cache_uid is None: cache = None else: - cache = self.apg.host.common_cache.getMetadata(cache_uid) + cache = self.apg.host.common_cache.get_metadata(cache_uid) if cache is None: with tempfile.TemporaryDirectory() as dir_name: dest_path = Path(dir_name, filename) - await downloadFile(url, dest_path, max_size=MAX_AVATAR_SIZE) + await download_file(url, dest_path, max_size=MAX_AVATAR_SIZE) avatar_data = { "path": dest_path, "filename": filename, 'media_type': image.guess_type(dest_path), } - await self.apg._i.cacheAvatar( + await self.apg._i.cache_avatar( self.apg.IMPORT_NAME, avatar_data ) @@ -246,7 +246,7 @@ return avatar_data - async def generateAvatarMetadata( + async def generate_avatar_metadata( self, client: SatXMPPEntity, ap_account: str @@ -256,14 +256,14 @@ @raise StanzaError("item-not-found"): no avatar is present in actor data (in ``icon`` field) """ - avatar_data = await self.getAvatarData(client, ap_account) - return self.apg._a.buildItemMetadataElt(avatar_data) + avatar_data = await self.get_avatar_data(client, ap_account) + return self.apg._a.build_item_metadata_elt(avatar_data) - def _blockingB64EncodeAvatar(self, avatar_data: Dict[str, Any]) -> None: + def _blocking_b_6_4_encode_avatar(self, avatar_data: Dict[str, Any]) -> None: with avatar_data["path"].open("rb") as f: avatar_data["base64"] = b64encode(f.read()).decode() - async def generateAvatarData( + async def generate_avatar_data( self, client: SatXMPPEntity, ap_account: str, @@ -274,9 +274,9 @@ @raise StanzaError("item-not-found"): no avatar cached with requested ID """ if not itemIdentifiers: - avatar_data = await self.getAvatarData(client, ap_account) + avatar_data = await self.get_avatar_data(client, ap_account) if "base64" not in avatar_data: - await threads.deferToThread(self._blockingB64EncodeAvatar, avatar_data) + await threads.deferToThread(self._blocking_b_6_4_encode_avatar, avatar_data) else: if len(itemIdentifiers) > 1: # only a single item ID is supported @@ -284,16 +284,16 
@@ item_id = itemIdentifiers[0] # just to be sure that that we don't have an empty string assert item_id - cache_data = self.apg.host.common_cache.getMetadata(item_id) + cache_data = self.apg.host.common_cache.get_metadata(item_id) if cache_data is None: raise error.StanzaError("item-not-found") avatar_data = { "cache_uid": item_id, "path": cache_data["path"] } - await threads.deferToThread(self._blockingB64EncodeAvatar, avatar_data) + await threads.deferToThread(self._blocking_b_6_4_encode_avatar, avatar_data) - return self.apg._a.buildItemDataElt(avatar_data) + return self.apg._a.build_item_data_elt(avatar_data) @ensure_deferred async def items( @@ -320,31 +320,31 @@ if node == self.apg._pps.subscriptions_node: collection_name = "following" - parser = self.apFollowing2Elt + parser = self.ap_following_2_elt kwargs["only_ids"] = True use_cache = False elif node.startswith(self.apg._pps.subscribers_node_prefix): collection_name = "followers" - parser = self.apFollower2Elt + parser = self.ap_follower_2_elt kwargs["only_ids"] = True use_cache = False elif node == self.apg._v.node: # vCard4 request - item_elt = await self.generateVCard(ap_account) + item_elt = await self.generate_v_card(ap_account) return [item_elt], None elif node == self.apg._a.namespace_metadata: - item_elt = await self.generateAvatarMetadata(self.apg.client, ap_account) + item_elt = await self.generate_avatar_metadata(self.apg.client, ap_account) return [item_elt], None elif node == self.apg._a.namespace_data: - item_elt = await self.generateAvatarData( + item_elt = await self.generate_avatar_data( self.apg.client, ap_account, itemIdentifiers ) return [item_elt], None - elif self.apg._pa.isAttachmentNode(node): + elif self.apg._pa.is_attachment_node(node): use_cache = True # we check cache here because we emit an item-not-found error if the node is # not in cache, as we are not dealing with real AP items - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, node ) if cached_node is None: @@ -365,14 +365,14 @@ if use_cache: if cached_node is None: - cached_node = await self.host.memory.storage.getPubsubNode( + cached_node = await self.host.memory.storage.get_pubsub_node( client, service, node ) # TODO: check if node is synchronised if cached_node is not None: # the node is cached, we return items from cache log.debug(f"node {node!r} from {service} is in cache") - pubsub_items, metadata = await self.apg._c.getItemsFromCache( + pubsub_items, metadata = await self.apg._c.get_items_from_cache( client, cached_node, maxItems, itemIdentifiers, rsm_request=rsm_req ) try: @@ -384,7 +384,7 @@ if itemIdentifiers: items = [] for item_id in itemIdentifiers: - item_data = await self.apg.apGet(item_id) + item_data = await self.apg.ap_get(item_id) item_elt = await parser(item_data) items.append(item_elt) return items, None @@ -419,11 +419,11 @@ f"No cache found for node {node} at {service} (AP account {ap_account}), " "using Collection Paging to RSM translation" ) - if self.apg._m.isCommentNode(node): - parent_item = self.apg._m.getParentItem(node) + if self.apg._m.is_comment_node(node): + parent_item = self.apg._m.get_parent_item(node) try: - parent_data = await self.apg.apGet(parent_item) - collection = await self.apg.apGetObject( + parent_data = await self.apg.ap_get(parent_item) + collection = await self.apg.ap_get_object( parent_data.get("object", {}), "replies" ) @@ -433,8 +433,8 @@ text=e ) else: - actor_data = await 
self.apg.getAPActorDataFromAccount(ap_account) - collection = await self.apg.apGetObject(actor_data, collection_name) + actor_data = await self.apg.get_ap_actor_data_from_account(ap_account) + collection = await self.apg.ap_get_object(actor_data, collection_name) if not collection: raise error.StanzaError( "item-not-found", @@ -442,7 +442,7 @@ ) kwargs["parser"] = parser - return await self.apg.getAPItems(collection, **kwargs) + return await self.apg.get_ap_items(collection, **kwargs) @ensure_deferred async def retract(self, requestor, service, nodeIdentifier, itemIdentifiers): @@ -459,11 +459,11 @@ sub_state = SubscriptionState.PENDING else: sub_state = SubscriptionState.SUBSCRIBED - node = await self.host.memory.storage.getPubsubNode( + node = await self.host.memory.storage.get_pubsub_node( client, service, nodeIdentifier, with_subscriptions=True ) if node is None: - node = await self.host.memory.storage.setPubsubNode( + node = await self.host.memory.storage.set_pubsub_node( client, service, nodeIdentifier, @@ -510,13 +510,13 @@ if nodeIdentifier in (self.apg._m.namespace, self.apg._events.namespace): # if we subscribe to microblog or events node, we follow the corresponding # account - req_actor_id, recip_actor_id, inbox = await self.getAPActorIdsAndInbox( + req_actor_id, recip_actor_id, inbox = await self.get_ap_actor_ids_and_inbox( requestor, service ) data = self.apg.create_activity("Follow", req_actor_id, recip_actor_id) - resp = await self.apg.signAndPost(inbox, req_actor_id, data) + resp = await self.apg.sign_and_post(inbox, req_actor_id, data) if resp.code >= 300: text = await resp.text() raise error.StanzaError("service-unavailable", text=text) @@ -524,7 +524,7 @@ @ensure_deferred async def unsubscribe(self, requestor, service, nodeIdentifier, subscriber): - req_actor_id, recip_actor_id, inbox = await self.getAPActorIdsAndInbox( + req_actor_id, recip_actor_id, inbox = await self.get_ap_actor_ids_and_inbox( requestor, service ) data = self.apg.create_activity( @@ -537,7 +537,7 @@ ) ) - resp = await self.apg.signAndPost(inbox, req_actor_id, data) + resp = await self.apg.sign_and_post(inbox, req_actor_id, data) if resp.code >= 300: text = await resp.text() raise error.StanzaError("service-unavailable", text=text)
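In the pubsub service above, get_avatar_data downloads the actor's icon with download_file and caches it, while the base64 payload is only produced on demand; since reading and encoding the file blocks, _blocking_b_6_4_encode_avatar is pushed to a thread with threads.deferToThread (note that the mechanical rename split "B64" into "b_6_4" here). A stand-alone sketch of that off-loading pattern, assuming a plain dict with a "path" entry as in the plugin:

    from base64 import b64encode
    from pathlib import Path
    from typing import Any, Dict

    from twisted.internet import threads


    def blocking_b64_encode(avatar_data: Dict[str, Any]) -> None:
        # blocking: reads the whole file and encodes it, hence the thread below
        with Path(avatar_data["path"]).open("rb") as f:
            avatar_data["base64"] = b64encode(f.read()).decode()


    async def ensure_base64(avatar_data: Dict[str, Any]) -> Dict[str, Any]:
        # deferToThread returns a Deferred, which is awaitable from a coroutine
        if "base64" not in avatar_data:
            await threads.deferToThread(blocking_b64_encode, avatar_data)
        return avatar_data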
--- a/sat/plugins/plugin_comp_file_sharing.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_file_sharing.py Sat Apr 08 13:54:42 2023 +0200 @@ -113,7 +113,7 @@ return resource.ErrorPage(code, brief, details).render(request) - def getDispositionType(self, media_type, media_subtype): + def get_disposition_type(self, media_type, media_subtype): if media_type in ('image', 'video'): return 'inline' elif media_type == 'application' and media_subtype == 'pdf': @@ -136,7 +136,7 @@ "Date, Content-Length, Content-Range") return super().render(request) - def render_OPTIONS(self, request): + def render_options(self, request): request.setResponseCode(http.OK) return b"" @@ -146,17 +146,17 @@ except exceptions.DataError: return self.errorPage(request, http.NOT_FOUND) - defer.ensureDeferred(self.renderGet(request)) + defer.ensureDeferred(self.render_get(request)) return server.NOT_DONE_YET - async def renderGet(self, request): + async def render_get(self, request): try: upload_id, filename = request.upload_data except exceptions.DataError: request.write(self.errorPage(request, http.FORBIDDEN)) request.finish() return - found_files = await request.file_sharing.host.memory.getFiles( + found_files = await request.file_sharing.host.memory.get_files( client=None, peer_jid=None, perms_to_check=None, public_id=upload_id) if not found_files: request.write(self.errorPage(request, http.NOT_FOUND)) @@ -170,7 +170,7 @@ file_res = static.File(file_path) file_res.type = f'{found_file["media_type"]}/{found_file["media_subtype"]}' file_res.encoding = file_res.contentEncodings.get(Path(found_file['name']).suffix) - disp_type = self.getDispositionType( + disp_type = self.get_disposition_type( found_file['media_type'], found_file['media_subtype']) # the URL is percent encoded, and not all browsers/tools unquote the file name, # thus we add a content disposition header @@ -190,10 +190,10 @@ request.finish() def render_PUT(self, request): - defer.ensureDeferred(self.renderPut(request)) + defer.ensureDeferred(self.render_put(request)) return server.NOT_DONE_YET - async def renderPut(self, request): + async def render_put(self, request): try: client, upload_request = request.upload_request_data upload_id, filename = request.upload_data @@ -228,7 +228,7 @@ "path": path } - await request.file_sharing.registerReceivedFile( + await request.file_sharing.register_received_file( client, upload_request.from_, file_data, tmp_file_path, public_id=public_id, ) @@ -273,7 +273,7 @@ def file_tmp_dir(self): return self.channel.site.file_tmp_dir - def refuseRequest(self): + def refuse_request(self): if self.content is not None: self.content.close() self.content = open(os.devnull, 'w+b') @@ -287,16 +287,16 @@ upload_id, filename = self.upload_data except exceptions.DataError as e: log.warning(f"Invalid PUT request, we stop here: {e}") - return self.refuseRequest() + return self.refuse_request() try: client, upload_request, timer = self.file_sharing.expected_uploads.pop(upload_id) except KeyError: log.warning(f"unknown (expired?) upload ID received for a PUT: {upload_id!r}") - return self.refuseRequest() + return self.refuse_request() if not timer.active: log.warning(f"upload id {upload_id!r} used for a PUT, but it is expired") - return self.refuseRequest() + return self.refuse_request() timer.cancel() @@ -305,7 +305,7 @@ f"invalid filename for PUT (upload id: {upload_id!r}, URL: {self.channel._path.decode()}). 
Original " f"{upload_request.filename!r} doesn't match {filename!r}" ) - return self.refuseRequest() + return self.refuse_request() self.upload_request_data = (client, upload_request) @@ -355,24 +355,24 @@ self._h = self.host.plugins["XEP-0300"] self._t = self.host.plugins["XEP-0264"] self._hu = self.host.plugins["XEP-0363"] - self._hu.registerHandler(self._on_http_upload) - self.host.trigger.add("FILE_getDestDir", self._getDestDirTrigger) + self._hu.register_handler(self._on_http_upload) + self.host.trigger.add("FILE_getDestDir", self._get_dest_dir_trigger) self.host.trigger.add( - "XEP-0234_fileSendingRequest", self._fileSendingRequestTrigger, priority=1000 + "XEP-0234_fileSendingRequest", self._file_sending_request_trigger, priority=1000 ) - self.host.trigger.add("XEP-0234_buildFileElement", self._addFileMetadataElts) - self.host.trigger.add("XEP-0234_parseFileElement", self._getFileMetadataElts) - self.host.trigger.add("XEP-0329_compGetFilesFromNode", self._addFileMetadata) + self.host.trigger.add("XEP-0234_buildFileElement", self._add_file_metadata_elts) + self.host.trigger.add("XEP-0234_parseFileElement", self._get_file_metadata_elts) + self.host.trigger.add("XEP-0329_compGetFilesFromNode", self._add_file_metadata) self.host.trigger.add( "XEP-0329_compGetFilesFromNode_build_directory", - self._addDirectoryMetadataElts) + self._add_directory_metadata_elts) self.host.trigger.add( "XEP-0329_parseResult_directory", - self._getDirectoryMetadataElts) + self._get_directory_metadata_elts) self.files_path = self.host.get_local_path(None, C.FILES_DIR) - self.http_port = int(self.host.memory.getConfig( + self.http_port = int(self.host.memory.config_get( 'component file-sharing', 'http_upload_port', 8888)) - connection_type = self.host.memory.getConfig( + connection_type = self.host.memory.config_get( 'component file-sharing', 'http_upload_connection_type', 'https') if connection_type not in ('http', 'https'): raise exceptions.ConfigError( @@ -383,51 +383,51 @@ if connection_type == 'http': reactor.listenTCP(self.http_port, self.server) else: - options = tls.getOptionsFromConfig( + options = tls.get_options_from_config( self.host.memory.config, "component file-sharing") - tls.TLSOptionsCheck(options) - context_factory = tls.getTLSContextFactory(options) + tls.tls_options_check(options) + context_factory = tls.get_tls_context_factory(options) reactor.listenSSL(self.http_port, self.server, context_factory) - def getHandler(self, client): + def get_handler(self, client): return Comments_handler(self) - def profileConnecting(self, client): + def profile_connecting(self, client): # we activate HTTP upload client.enabled_features.add("XEP-0363") self.init() - public_base_url = self.host.memory.getConfig( + public_base_url = self.host.memory.config_get( 'component file-sharing', 'http_upload_public_facing_url') if public_base_url is None: client._file_sharing_base_url = f"https://{client.host}:{self.http_port}" else: client._file_sharing_base_url = public_base_url path = client.file_tmp_dir = os.path.join( - self.host.memory.getConfig("", "local_dir"), + self.host.memory.config_get("", "local_dir"), C.FILES_TMP_DIR, - regex.pathEscape(client.profile), + regex.path_escape(client.profile), ) if not os.path.exists(path): os.makedirs(path) - def getQuota(self, client, entity): + def get_quota(self, client, entity): """Return maximum size allowed for all files for entity""" - quotas = self.host.memory.getConfig("component file-sharing", "quotas_json", {}) - if self.host.memory.isAdminJID(entity): + quotas = 
self.host.memory.config_get("component file-sharing", "quotas_json", {}) + if self.host.memory.is_admin_jid(entity): quota = quotas.get("admins") else: try: quota = quotas["jids"][entity.userhost()] except KeyError: quota = quotas.get("users") - return None if quota is None else utils.parseSize(quota) + return None if quota is None else utils.parse_size(quota) async def generate_thumbnails(self, extra: dict, image_path: Path): thumbnails = extra.setdefault(C.KEY_THUMBNAILS, []) for max_thumb_size in self._t.SIZES: try: - thumb_size, thumb_id = await self._t.generateThumbnail( + thumb_size, thumb_id = await self._t.generate_thumbnail( image_path, max_thumb_size, # we keep thumbnails for 6 months @@ -438,7 +438,7 @@ break thumbnails.append({"id": thumb_id, "size": thumb_size}) - async def registerReceivedFile( + async def register_received_file( self, client, peer_jid, file_data, file_path, public_id=None, extra=None): """Post file reception tasks @@ -460,9 +460,9 @@ log.debug(_("Reusing already generated hash")) file_hash = file_data["hash_hasher"].hexdigest() else: - hasher = self._h.getHasher(HASH_ALGO) + hasher = self._h.get_hasher(HASH_ALGO) with file_path.open('rb') as f: - file_hash = await self._h.calculateHash(f, hasher) + file_hash = await self._h.calculate_hash(f, hasher) final_path = self.files_path/file_hash if final_path.is_file(): @@ -493,7 +493,7 @@ else: await self.generate_thumbnails(extra, thumb_path) - await self.host.memory.setFile( + await self.host.memory.set_file( client, name=name, version="", @@ -508,7 +508,7 @@ extra=extra, ) - async def _getDestDirTrigger( + async def _get_dest_dir_trigger( self, client, peer_jid, transfer_data, file_data, stream_object ): """This trigger accept file sending request, and store file locally""" @@ -522,17 +522,17 @@ assert C.KEY_PROGRESS_ID in file_data filename = file_data["name"] assert filename and not "/" in filename - quota = self.getQuota(client, peer_jid) + quota = self.get_quota(client, peer_jid) if quota is not None: - used_space = await self.host.memory.fileGetUsedSpace(client, peer_jid) + used_space = await self.host.memory.file_get_used_space(client, peer_jid) if (used_space + file_data["size"]) > quota: raise error.StanzaError( "not-acceptable", text=OVER_QUOTA_TXT.format( - quota=utils.getHumanSize(quota), - used_space=utils.getHumanSize(used_space), - file_size=utils.getHumanSize(file_data['size']) + quota=utils.get_human_size(quota), + used_space=utils.get_human_size(used_space), + file_size=utils.get_human_size(file_data['size']) ) ) file_tmp_dir = self.host.get_local_path( @@ -543,26 +543,26 @@ transfer_data["finished_d"].addCallback( lambda __: defer.ensureDeferred( - self.registerReceivedFile(client, peer_jid, file_data, file_tmp_path) + self.register_received_file(client, peer_jid, file_data, file_tmp_path) ) ) - self._f.openFileWrite( + self._f.open_file_write( client, file_tmp_path, transfer_data, file_data, stream_object ) return False, True - async def _retrieveFiles( + async def _retrieve_files( self, client, session, content_data, content_name, file_data, file_elt ): """This method retrieve a file on request, and send if after checking permissions""" peer_jid = session["peer_jid"] if session['local_jid'].user: - owner = client.getOwnerFromJid(session['local_jid']) + owner = client.get_owner_from_jid(session['local_jid']) else: owner = peer_jid try: - found_files = await self.host.memory.getFiles( + found_files = await self.host.memory.get_files( client, peer_jid=peer_jid, name=file_data.get("name"), @@ 
-595,7 +595,7 @@ type_=found_file['type'])) file_hash = found_file["file_hash"] file_path = self.files_path / file_hash - file_data["hash_hasher"] = hasher = self._h.getHasher(found_file["hash_algo"]) + file_data["hash_hasher"] = hasher = self._h.get_hasher(found_file["hash_algo"]) size = file_data["size"] = found_file["size"] file_data["file_hash"] = file_hash file_data["hash_algo"] = found_file["hash_algo"] @@ -608,13 +608,13 @@ self.host, client, file_path, - uid=self._jf.getProgressId(session, content_name), + uid=self._jf.get_progress_id(session, content_name), size=size, data_cb=lambda data: hasher.update(data), ) return True - def _fileSendingRequestTrigger( + def _file_sending_request_trigger( self, client, session, content_data, content_name, file_data, file_elt ): if not client.is_component: @@ -622,7 +622,7 @@ else: return ( False, - defer.ensureDeferred(self._retrieveFiles( + defer.ensureDeferred(self._retrieve_files( client, session, content_data, content_name, file_data, file_elt )), ) @@ -642,19 +642,19 @@ if request.from_.host not in client._file_sharing_allowed_hosts: raise error.StanzaError("forbidden") - quota = self.getQuota(client, request.from_) + quota = self.get_quota(client, request.from_) if quota is not None: - used_space = await self.host.memory.fileGetUsedSpace(client, request.from_) + used_space = await self.host.memory.file_get_used_space(client, request.from_) if (used_space + request.size) > quota: raise error.StanzaError( "not-acceptable", text=OVER_QUOTA_TXT.format( - quota=utils.getHumanSize(quota), - used_space=utils.getHumanSize(used_space), - file_size=utils.getHumanSize(request.size) + quota=utils.get_human_size(quota), + used_space=utils.get_human_size(used_space), + file_size=utils.get_human_size(request.size) ), - appCondition = self._hu.getFileTooLargeElt(max(quota - used_space, 0)) + appCondition = self._hu.get_file_too_large_elt(max(quota - used_space, 0)) ) upload_id = shortuuid.ShortUUID().random(length=30) @@ -671,7 +671,7 @@ ## metadata triggers ## - def _addFileMetadataElts(self, client, file_elt, extra_args): + def _add_file_metadata_elts(self, client, file_elt, extra_args): # affiliation affiliation = extra_args.get('affiliation') if affiliation is not None: @@ -693,7 +693,7 @@ comment_elt["count"] = str(count) return True - def _getFileMetadataElts(self, client, file_elt, file_data): + def _get_file_metadata_elts(self, client, file_elt, file_data): # affiliation try: affiliation_elt = next(file_elt.elements(NS_FS_AFFILIATION, "affiliation")) @@ -712,17 +712,17 @@ file_data["comments_count"] = comments_elt["count"] return True - def _addFileMetadata( + def _add_file_metadata( self, client, iq_elt, iq_result_elt, owner, node_path, files_data): for file_data in files_data: - file_data["comments_url"] = uri.buildXMPPUri( + file_data["comments_url"] = uri.build_xmpp_uri( "pubsub", path=client.jid.full(), node=COMMENT_NODE_PREFIX + file_data["id"], ) return True - def _addDirectoryMetadataElts( + def _add_directory_metadata_elts( self, client, file_data, directory_elt, owner, node_path): affiliation = file_data.get('affiliation') if affiliation is not None: @@ -731,7 +731,7 @@ content=affiliation ) - def _getDirectoryMetadataElts( + def _get_directory_metadata_elts( self, client, elt, file_data): try: affiliation_elt = next(elt.elements(NS_FS_AFFILIATION, "affiliation")) @@ -754,7 +754,7 @@ "name": "files commenting service", } - def _getFileId(self, nodeIdentifier): + def _get_file_id(self, nodeIdentifier): if not 
nodeIdentifier.startswith(COMMENT_NODE_PREFIX): raise error.StanzaError("item-not-found") file_id = nodeIdentifier[len(COMMENT_NODE_PREFIX) :] @@ -762,10 +762,10 @@ raise error.StanzaError("item-not-found") return file_id - async def getFileData(self, requestor, nodeIdentifier): - file_id = self._getFileId(nodeIdentifier) + async def get_file_data(self, requestor, nodeIdentifier): + file_id = self._get_file_id(nodeIdentifier) try: - files = await self.host.memory.getFiles(self.parent, requestor, file_id) + files = await self.host.memory.get_files(self.parent, requestor, file_id) except (exceptions.NotFound, exceptions.PermissionError): # we don't differenciate between NotFound and PermissionError # to avoid leaking information on existing files @@ -776,7 +776,7 @@ raise error.InternalError("there should be only one file") return files[0] - def commentsUpdate(self, extra, new_comments, peer_jid): + def comments_update(self, extra, new_comments, peer_jid): """update comments (replace or insert new_comments) @param extra(dict): extra data to update @@ -807,7 +807,7 @@ current_comments.extend(new_comments) - def commentsDelete(self, extra, comments): + def comments_delete(self, extra, comments): try: comments_dict = extra["comments"] except KeyError: @@ -818,7 +818,7 @@ except ValueError: continue - def _getFrom(self, item_elt): + def _get_from(self, item_elt): """retrieve publisher of an item @param item_elt(domish.element): <item> element @@ -832,22 +832,22 @@ @ensure_deferred async def publish(self, requestor, service, nodeIdentifier, items): # we retrieve file a first time to check authorisations - file_data = await self.getFileData(requestor, nodeIdentifier) + file_data = await self.get_file_data(requestor, nodeIdentifier) file_id = file_data["id"] - comments = [(item["id"], self._getFrom(item), item.toXml()) for item in items] + comments = [(item["id"], self._get_from(item), item.toXml()) for item in items] if requestor.userhostJID() == file_data["owner"]: peer_jid = None else: peer_jid = requestor.userhost() - update_cb = partial(self.commentsUpdate, new_comments=comments, peer_jid=peer_jid) + update_cb = partial(self.comments_update, new_comments=comments, peer_jid=peer_jid) try: - await self.host.memory.fileUpdate(file_id, "extra", update_cb) + await self.host.memory.file_update(file_id, "extra", update_cb) except exceptions.PermissionError: raise error.StanzaError("not-authorized") @ensure_deferred async def items(self, requestor, service, nodeIdentifier, maxItems, itemIdentifiers): - file_data = await self.getFileData(requestor, nodeIdentifier) + file_data = await self.get_file_data(requestor, nodeIdentifier) comments = file_data["extra"].get("comments", []) if itemIdentifiers: return [generic.parseXml(c[2]) for c in comments if c[0] in itemIdentifiers] @@ -856,7 +856,7 @@ @ensure_deferred async def retract(self, requestor, service, nodeIdentifier, itemIdentifiers): - file_data = await self.getFileData(requestor, nodeIdentifier) + file_data = await self.get_file_data(requestor, nodeIdentifier) file_id = file_data["id"] try: comments = file_data["extra"]["comments"] @@ -880,5 +880,5 @@ if not all([c[1] == requestor.userhost() for c in to_remove]): raise error.StanzaError("not-authorized") - remove_cb = partial(self.commentsDelete, comments=to_remove) - await self.host.memory.fileUpdate(file_id, "extra", remove_cb) + remove_cb = partial(self.comments_delete, comments=to_remove) + await self.host.memory.file_update(file_id, "extra", remove_cb)
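get_quota (ex getQuota) resolves the limit from the "quotas_json" option of the "component file-sharing" section in a fixed order: admin JIDs first, then a per-JID entry, then the generic "users" default, with None meaning unlimited. A sketch of that lookup order with hypothetical values (the JID and size strings below are invented):

    from typing import Optional

    # hypothetical contents of the quotas_json option
    quotas = {
        "admins": None,                          # no limit for admin JIDs
        "users": "100M",                         # default for everybody else
        "jids": {"louise@example.org": "1G"},    # per-entity override
    }


    def resolve_quota(entity_jid: str, is_admin: bool) -> Optional[str]:
        if is_admin:
            return quotas.get("admins")
        try:
            return quotas["jids"][entity_jid]
        except KeyError:
            return quotas.get("users")


    assert resolve_quota("louise@example.org", is_admin=False) == "1G"
    assert resolve_quota("pierre@example.net", is_admin=False) == "100M"
    assert resolve_quota("admin@example.org", is_admin=True) is None

The real method then returns utils.parse_size(quota) in bytes, which the upload triggers compare against file_get_used_space before accepting a transfer.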
--- a/sat/plugins/plugin_comp_file_sharing_management.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_comp_file_sharing_management.py Sat Apr 08 13:54:42 2023 +0200 @@ -74,8 +74,8 @@ self._c = host.plugins["XEP-0050"] self._t = host.plugins["XEP-0264"] self.files_path = host.get_local_path(None, C.FILES_DIR) - host.bridge.addMethod( - "fileSharingDelete", + host.bridge.add_method( + "file_sharing_delete", ".plugin", in_sign="ssss", out_sign="", @@ -83,30 +83,30 @@ async_=True, ) - def profileConnected(self, client): - self._c.addAdHocCommand( - client, self._onChangeFile, "Change Permissions of File(s)", + def profile_connected(self, client): + self._c.add_ad_hoc_command( + client, self._on_change_file, "Change Permissions of File(s)", node=NS_FILE_MANAGEMENT_PERM, allowed_magics=C.ENTITY_ALL, ) - self._c.addAdHocCommand( - client, self._onDeleteFile, "Delete File(s)", + self._c.add_ad_hoc_command( + client, self._on_delete_file, "Delete File(s)", node=NS_FILE_MANAGEMENT_DELETE, allowed_magics=C.ENTITY_ALL, ) - self._c.addAdHocCommand( - client, self._onGenThumbnails, "Generate Thumbnails", + self._c.add_ad_hoc_command( + client, self._on_gen_thumbnails, "Generate Thumbnails", node=NS_FILE_MANAGEMENT_THUMB, allowed_magics=C.ENTITY_ALL, ) - self._c.addAdHocCommand( - client, self._onQuota, "Get Quota", + self._c.add_ad_hoc_command( + client, self._on_quota, "Get Quota", node=NS_FILE_MANAGEMENT_QUOTA, allowed_magics=C.ENTITY_ALL, ) def _delete(self, service_jid_s, path, namespace, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) service_jid = jid.JID(service_jid_s) if service_jid_s else None return defer.ensureDeferred(self._c.sequence( client, @@ -127,7 +127,7 @@ note = (self._c.NOTE.ERROR, reason) return payload, status, None, note - def _getRootArgs(self): + def _get_root_args(self): """Create the form to select the file to use @return (tuple): arguments to use in defer.returnValue @@ -149,7 +149,7 @@ payload = form.toElement() return payload, status, None, None - async def _getFileData(self, client, session_data, command_form): + async def _get_file_data(self, client, session_data, command_form): """Retrieve field requested in root form "found_file" will also be set in session_data @@ -162,10 +162,10 @@ path = fields['path'].value.strip() namespace = fields['namespace'].value or None except KeyError: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) if not path: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) requestor = session_data['requestor'] requestor_bare = requestor.userhostJID() @@ -176,7 +176,7 @@ # this must be managed try: - found_files = await self.host.memory.getFiles( + found_files = await self.host.memory.get_files( client, requestor_bare, path=parent_path, name=basename, namespace=namespace) found_file = found_files[0] @@ -194,7 +194,7 @@ session_data['namespace'] = namespace return found_file - def _updateReadPermission(self, access, allowed_jids): + def _update_read_permission(self, access, allowed_jids): if not allowed_jids: if C.ACCESS_PERM_READ in access: del access[C.ACCESS_PERM_READ] @@ -208,27 +208,27 @@ "jids": [j.full() for j in allowed_jids] } - async def _updateDir(self, client, requestor, namespace, file_data, allowed_jids): + async def _update_dir(self, client, requestor, namespace, file_data, allowed_jids): """Recursively update permission of a directory and all subdirectories @param file_data(dict): metadata 
of the file @param allowed_jids(list[jid.JID]): list of entities allowed to read the file """ assert file_data['type'] == C.FILE_TYPE_DIRECTORY - files_data = await self.host.memory.getFiles( + files_data = await self.host.memory.get_files( client, requestor, parent=file_data['id'], namespace=namespace) for file_data in files_data: if not file_data['access'].get(C.ACCESS_PERM_READ, {}): log.debug("setting {perm} read permission for {name}".format( perm=allowed_jids, name=file_data['name'])) - await self.host.memory.fileUpdate( + await self.host.memory.file_update( file_data['id'], 'access', - partial(self._updateReadPermission, allowed_jids=allowed_jids)) + partial(self._update_read_permission, allowed_jids=allowed_jids)) if file_data['type'] == C.FILE_TYPE_DIRECTORY: - await self._updateDir(client, requestor, namespace, file_data, 'PUBLIC') + await self._update_dir(client, requestor, namespace, file_data, 'PUBLIC') - async def _onChangeFile(self, client, command_elt, session_data, action, node): + async def _on_change_file(self, client, command_elt, session_data, action, node): try: x_elt = next(command_elt.elements(data_form.NS_X_DATA, "x")) command_form = data_form.Form.fromElement(x_elt) @@ -241,12 +241,12 @@ if command_form is None or len(command_form.fields) == 0: # root request - return self._getRootArgs() + return self._get_root_args() elif found_file is None: # file selected, we retrieve it and ask for permissions try: - found_file = await self._getFileData(client, session_data, command_form) + found_file = await self._get_file_data(client, session_data, command_form) except WorkflowError as e: return e.err_args @@ -288,7 +288,7 @@ try: read_allowed = command_form.fields['read_allowed'] except KeyError: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) if read_allowed.value == 'PUBLIC': allowed_jids = 'PUBLIC' @@ -301,26 +301,26 @@ except RuntimeError as e: log.warning(_("Can't use read_allowed values: {reason}").format( reason=e)) - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) if found_file['type'] == C.FILE_TYPE_FILE: - await self.host.memory.fileUpdate( + await self.host.memory.file_update( found_file['id'], 'access', - partial(self._updateReadPermission, allowed_jids=allowed_jids)) + partial(self._update_read_permission, allowed_jids=allowed_jids)) else: try: recursive = command_form.fields['recursive'] except KeyError: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) - await self.host.memory.fileUpdate( + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) + await self.host.memory.file_update( found_file['id'], 'access', - partial(self._updateReadPermission, allowed_jids=allowed_jids)) + partial(self._update_read_permission, allowed_jids=allowed_jids)) if recursive: # we set all file under the directory as public (if they haven't # already a permission set), so allowed entities of root directory # can read them. 
namespace = session_data['namespace'] - await self._updateDir( + await self._update_dir( client, requestor_bare, namespace, found_file, 'PUBLIC') # job done, we can end the session @@ -329,7 +329,7 @@ note = (self._c.NOTE.INFO, _("management session done")) return (payload, status, None, note) - async def _onDeleteFile(self, client, command_elt, session_data, action, node): + async def _on_delete_file(self, client, command_elt, session_data, action, node): try: x_elt = next(command_elt.elements(data_form.NS_X_DATA, "x")) command_form = data_form.Form.fromElement(x_elt) @@ -342,12 +342,12 @@ if command_form is None or len(command_form.fields) == 0: # root request - return self._getRootArgs() + return self._get_root_args() elif found_file is None: # file selected, we need confirmation before actually deleting try: - found_file = await self._getFileData(client, session_data, command_form) + found_file = await self._get_file_data(client, session_data, command_form) except WorkflowError as e: return e.err_args if found_file['type'] == C.FILE_TYPE_DIRECTORY: @@ -373,31 +373,31 @@ try: confirmed = C.bool(command_form.fields['confirm'].value) except KeyError: - self._c.adHocError(self._c.ERROR.BAD_PAYLOAD) + self._c.ad_hoc_error(self._c.ERROR.BAD_PAYLOAD) if not confirmed: note = None else: recursive = found_file['type'] == C.FILE_TYPE_DIRECTORY - await self.host.memory.fileDelete( + await self.host.memory.file_delete( client, requestor_bare, found_file['id'], recursive) note = (self._c.NOTE.INFO, _("file deleted")) status = self._c.STATUS.COMPLETED payload = None return (payload, status, None, note) - def _updateThumbs(self, extra, thumbnails): + def _update_thumbs(self, extra, thumbnails): extra[C.KEY_THUMBNAILS] = thumbnails - async def _genThumbs(self, client, requestor, namespace, file_data): + async def _gen_thumbs(self, client, requestor, namespace, file_data): """Recursively generate thumbnails @param file_data(dict): metadata of the file """ if file_data['type'] == C.FILE_TYPE_DIRECTORY: - sub_files_data = await self.host.memory.getFiles( + sub_files_data = await self.host.memory.get_files( client, requestor, parent=file_data['id'], namespace=namespace) for sub_file_data in sub_files_data: - await self._genThumbs(client, requestor, namespace, sub_file_data) + await self._gen_thumbs(client, requestor, namespace, sub_file_data) elif file_data['type'] == C.FILE_TYPE_FILE: media_type = file_data['media_type'] @@ -407,7 +407,7 @@ for max_thumb_size in self._t.SIZES: try: - thumb_size, thumb_id = await self._t.generateThumbnail( + thumb_size, thumb_id = await self._t.generate_thumbnail( file_path, max_thumb_size, # we keep thumbnails for 6 months @@ -419,9 +419,9 @@ break thumbnails.append({"id": thumb_id, "size": thumb_size}) - await self.host.memory.fileUpdate( + await self.host.memory.file_update( file_data['id'], 'extra', - partial(self._updateThumbs, thumbnails=thumbnails)) + partial(self._update_thumbs, thumbnails=thumbnails)) log.info("thumbnails for [{file_name}] generated" .format(file_name=file_data['name'])) @@ -429,7 +429,7 @@ else: log.warning("unmanaged file type: {type_}".format(type_=file_data['type'])) - async def _onGenThumbnails(self, client, command_elt, session_data, action, node): + async def _on_gen_thumbnails(self, client, command_elt, session_data, action, node): try: x_elt = next(command_elt.elements(data_form.NS_X_DATA, "x")) command_form = data_form.Form.fromElement(x_elt) @@ -441,17 +441,17 @@ if command_form is None or len(command_form.fields) == 0: # root request 
- return self._getRootArgs() + return self._get_root_args() elif found_file is None: # file selected, we retrieve it and ask for permissions try: - found_file = await self._getFileData(client, session_data, command_form) + found_file = await self._get_file_data(client, session_data, command_form) except WorkflowError as e: return e.err_args log.info("Generating thumbnails as requested") - await self._genThumbs(client, requestor, found_file['namespace'], found_file) + await self._gen_thumbs(client, requestor, found_file['namespace'], found_file) # job done, we can end the session status = self._c.STATUS.COMPLETED @@ -459,11 +459,11 @@ note = (self._c.NOTE.INFO, _("thumbnails generated")) return (payload, status, None, note) - async def _onQuota(self, client, command_elt, session_data, action, node): + async def _on_quota(self, client, command_elt, session_data, action, node): requestor = session_data['requestor'] - quota = self.host.plugins["file_sharing"].getQuota(client, requestor) + quota = self.host.plugins["file_sharing"].get_quota(client, requestor) try: - size_used = await self.host.memory.fileGetUsedSpace(client, requestor) + size_used = await self.host.memory.file_get_used_space(client, requestor) except exceptions.PermissionError: raise WorkflowError(self._err(_("forbidden"))) status = self._c.STATUS.COMPLETED @@ -473,10 +473,10 @@ note = ( self._c.NOTE.INFO, _("You are currently using {size_used} on {size_quota}").format( - size_used = utils.getHumanSize(size_used), + size_used = utils.get_human_size(size_used), size_quota = ( _("unlimited quota") if quota is None - else utils.getHumanSize(quota) + else utils.get_human_size(quota) ) ) )
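The four ad-hoc commands registered in profile_connected all follow the same contract: each step returns a (payload, status, <unused>, note) tuple (the third element is None in every branch shown above), with the first step answering with the form built by _get_root_args and the last one with a COMPLETED status and an informational note. A condensed, stand-alone sketch of the final step of a quota command (the constants are stand-ins for self._c.STATUS / self._c.NOTE, and sizes are shown raw instead of through get_human_size):

    from typing import Optional, Tuple

    STATUS_COMPLETED = "completed"   # stand-in for self._c.STATUS.COMPLETED
    NOTE_INFO = "info"               # stand-in for self._c.NOTE.INFO


    def quota_result(size_used: int, quota: Optional[int]) -> Tuple:
        if quota is None:
            text = f"You are currently using {size_used} bytes (unlimited quota)"
        else:
            text = f"You are currently using {size_used} bytes on {quota} bytes"
        # (payload, status, <unused here>, note): no payload, session completed
        return None, STATUS_COMPLETED, None, (NOTE_INFO, text)


    print(quota_result(12_000_000, None))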
--- a/sat/plugins/plugin_dbg_manhole.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_dbg_manhole.py Sat Apr 08 13:54:42 2023 +0200 @@ -45,11 +45,11 @@ def __init__(self, host): self.host = host - port = int(host.memory.getConfig(None, "manhole_debug_dangerous_port_int", 0)) + port = int(host.memory.config_get(None, "manhole_debug_dangerous_port_int", 0)) if port: - self.startManhole(port) + self.start_manhole(port) - def startManhole(self, port): + def start_manhole(self, port): log.warning(_("/!\\ Manhole debug server activated, be sure to not use it in " "production, this is dangerous /!\\")) log.info(_("You can connect to manhole server using telnet on port {port}")
--- a/sat/plugins/plugin_exp_command_export.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_command_export.py Sat Apr 08 13:54:42 2023 +0200 @@ -72,7 +72,7 @@ def write(self, message): self.transport.write(message.encode("utf-8")) - def boolOption(self, key): + def bool_option(self, key): """ Get boolean value from options @param key: name of the option @return: True if key exists and set to "true" (case insensitive), @@ -92,13 +92,13 @@ log.info(_("Plugin command export initialization")) self.host = host self.spawned = {} # key = entity - host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=10000) - host.bridge.addMethod( - "exportCommand", + host.trigger.add("messageReceived", self.message_received_trigger, priority=10000) + host.bridge.add_method( + "command_export", ".plugin", in_sign="sasasa{ss}s", out_sign="", - method=self._exportCommand, + method=self._export_command, ) def removeProcess(self, entity, process): @@ -113,7 +113,7 @@ except ValueError: pass - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): """ Check if source is linked and repeat message, else do nothing """ from_jid = jid.JID(message_elt["from"]) spawned_key = (from_jid.userhostJID(), client.profile) @@ -131,15 +131,15 @@ exclusive = False for process in processes_set: process.write(mess_data) - _continue &= process.boolOption("continue") - exclusive |= process.boolOption("exclusive") + _continue &= process.bool_option("continue") + exclusive |= process.bool_option("exclusive") if exclusive: raise trigger.SkipOtherTriggers return _continue return True - def _exportCommand(self, command, args, targets, options, profile_key): + def _export_command(self, command, args, targets, options, profile_key): """ Export a commands to authorised targets @param command: full path of the command to execute @param args: list of arguments, with command name as first one @@ -150,7 +150,7 @@ - pty: if set, launch in a pseudo terminal - continue: continue normal messageReceived handling """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) for target in targets: try: _jid = jid.JID(target) @@ -163,5 +163,5 @@ process_prot = ExportCommandProtocol(self, client, _jid, options) self.spawned.setdefault((_jid, client.profile), set()).add(process_prot) reactor.spawnProcess( - process_prot, command, args, usePTY=process_prot.boolOption("pty") + process_prot, command, args, usePTY=process_prot.bool_option("pty") )
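bool_option (ex boolOption) is the small helper the exported-process protocol uses for its "continue", "exclusive" and "pty" options: per its docstring, an option counts as true only when the key exists and is set to "true", case-insensitively. A stand-alone restatement of that contract:

    from typing import Mapping


    def bool_option(options: Mapping[str, str], key: str) -> bool:
        # true only if the key is present and equals "true" (case insensitive)
        value = options.get(key)
        return value is not None and value.lower() == "true"


    assert bool_option({"continue": "TRUE"}, "continue") is True
    assert bool_option({"pty": "yes"}, "pty") is False   # anything but "true" is false
    assert bool_option({}, "exclusive") is False         # missing key is false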
--- a/sat/plugins/plugin_exp_invitation.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_invitation.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,10 +61,10 @@ # map from namespace of the invitation to callback handling it self._ns_cb = {} - def getHandler(self, client): + def get_handler(self, client): return PubsubInvitationHandler(self) - def registerNamespace(self, namespace, callback): + def register_namespace(self, namespace, callback): """Register a callback for a namespace @param namespace(unicode): namespace handled @@ -95,7 +95,7 @@ .format(namespace=namespace, callback=self._ns_cb[namespace])) self._ns_cb[namespace] = callback - def _generateBaseInvitation(self, client, invitee_jid, name, extra): + def _generate_base_invitation(self, client, invitee_jid, name, extra): """Generate common mess_data end invitation_elt @param invitee_jid(jid.JID): entitee to send invitation to @@ -113,8 +113,8 @@ "subject": {}, "extra": {}, } - client.generateMessageXML(mess_data) - self._h.addHintElements(mess_data["xml"], [self._h.HINT_STORE]) + client.generate_message_xml(mess_data) + self._h.add_hint_elements(mess_data["xml"], [self._h.HINT_STORE]) invitation_elt = mess_data["xml"].addElement("invitation", NS_INVITATION) if name is not None: invitation_elt["name"] = name @@ -128,7 +128,7 @@ invitation_elt['thumb_url'] = thumb_url return mess_data, invitation_elt - def sendPubsubInvitation( + def send_pubsub_invitation( self, client: SatXMPPEntity, invitee_jid: jid.JID, @@ -145,12 +145,12 @@ @param node: pubsub node @param item_id: pubsub id None when the invitation is for a whole node - @param name: see [_generateBaseInvitation] - @param extra: see [_generateBaseInvitation] + @param name: see [_generate_base_invitation] + @param extra: see [_generate_base_invitation] """ if extra is None: extra = {} - mess_data, invitation_elt = self._generateBaseInvitation( + mess_data, invitation_elt = self._generate_base_invitation( client, invitee_jid, name, extra) pubsub_elt = invitation_elt.addElement("pubsub") pubsub_elt["service"] = service.full() @@ -172,7 +172,7 @@ invitation_elt.addChild(extra.pop("element")) client.send(mess_data["xml"]) - async def sendFileSharingInvitation( + async def send_file_sharing_invitation( self, client, invitee_jid, service, repos_type=None, namespace=None, path=None, name=None, extra=None ): @@ -185,13 +185,13 @@ - "photos": photos album @param namespace(unicode, None): namespace of the shared repository @param path(unicode, None): path of the shared repository - @param name(unicode, None): see [_generateBaseInvitation] - @param extra(dict, None): see [_generateBaseInvitation] + @param name(unicode, None): see [_generate_base_invitation] + @param extra(dict, None): see [_generate_base_invitation] """ if extra is None: extra = {} li_plg = self.host.plugins["LIST_INTEREST"] - li_plg.normaliseFileSharingService(client, service) + li_plg.normalise_file_sharing_service(client, service) # FIXME: not the best place to adapt permission, but it's necessary to check them # for UX @@ -205,7 +205,7 @@ if "thumb_url" not in extra: # we have no thumbnail, we check in our own list of interests if there is one try: - item_id = li_plg.getFileSharingId(service, namespace, path) + item_id = li_plg.get_file_sharing_id(service, namespace, path) own_interest = await li_plg.get(client, item_id) except exceptions.NotFound: log.debug( @@ -218,7 +218,7 @@ except KeyError: pass - mess_data, invitation_elt = self._generateBaseInvitation( + mess_data, invitation_elt = 
self._generate_base_invitation( client, invitee_jid, name, extra) file_sharing_elt = invitation_elt.addElement("file_sharing") file_sharing_elt["service"] = service.full() @@ -235,7 +235,7 @@ file_sharing_elt["path"] = path client.send(mess_data["xml"]) - async def _parsePubsubElt(self, client, pubsub_elt): + async def _parse_pubsub_elt(self, client, pubsub_elt): try: service = jid.JID(pubsub_elt["service"]) node = pubsub_elt["node"] @@ -246,7 +246,7 @@ if item_id is not None: try: - items, metadata = await self._p.getItems( + items, metadata = await self._p.get_items( client, service, node, item_ids=[item_id] ) except Exception as e: @@ -276,7 +276,7 @@ return namespace, args - async def _parseFileSharingElt(self, client, file_sharing_elt): + async def _parse_file_sharing_elt(self, client, file_sharing_elt): try: service = jid.JID(file_sharing_elt["service"]) except (RuntimeError, KeyError): @@ -286,10 +286,10 @@ sharing_ns = file_sharing_elt.getAttribute("namespace") path = file_sharing_elt.getAttribute("path") args = [service, repos_type, sharing_ns, path] - ns_fis = self.host.getNamespace("fis") + ns_fis = self.host.get_namespace("fis") return ns_fis, args - async def onInvitation(self, message_elt, client): + async def on_invitation(self, message_elt, client): log.debug("invitation received [{profile}]".format(profile=client.profile)) invitation_elt = message_elt.invitation @@ -303,9 +303,9 @@ log.warning("unexpected element: {xml}".format(xml=elt.toXml())) continue if elt.name == "pubsub": - method = self._parsePubsubElt + method = self._parse_pubsub_elt elif elt.name == "file_sharing": - method = self._parseFileSharingElt + method = self._parse_file_sharing_elt else: log.warning("not implemented invitation element: {xml}".format( xml = elt.toXml())) @@ -324,7 +324,7 @@ 'No handler for namespace "{namespace}", invitation ignored') .format(namespace=namespace)) else: - await utils.asDeferred(cb, client, namespace, name, extra, *args) + await utils.as_deferred(cb, client, namespace, name, extra, *args) @implementer(iwokkel.IDisco) @@ -337,7 +337,7 @@ self.xmlstream.addObserver( INVITATION, lambda message_elt: defer.ensureDeferred( - self.plugin_parent.onInvitation(message_elt, client=self.parent) + self.plugin_parent.on_invitation(message_elt, client=self.parent) ), )
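After the rename, a plugin hooks into invitations with register_namespace(namespace, callback); when a matching <invitation> element arrives, on_invitation awaits the callback through utils.as_deferred with (client, namespace, name, extra, *args), the extra positional arguments depending on the payload (for file_sharing they are service, repos_type, sharing_ns and path, as parsed above). A hypothetical registration showing the expected callback shape (namespace and plugin name are invented):

    class MyFeaturePlugin:
        """Hypothetical plugin showing the callback shape expected by INVITATION."""

        NAMESPACE = "urn:example:my-feature:0"  # invented namespace

        def __init__(self, host):
            self.host = host
            host.plugins["INVITATION"].register_namespace(
                self.NAMESPACE, self.on_my_invitation
            )

        async def on_my_invitation(self, client, namespace, name, extra, *args):
            # args carries the parsed payload (e.g. service, repos_type,
            # sharing_ns, path for a file_sharing invitation)
            print(f"invitation {name!r} received for {namespace}")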
--- a/sat/plugins/plugin_exp_invitation_file.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_invitation_file.py Sat Apr 08 13:54:42 2023 +0200 @@ -45,32 +45,32 @@ def __init__(self, host): log.info(_("File Sharing Invitation plugin initialization")) self.host = host - ns_fis = host.getNamespace("fis") - host.plugins["INVITATION"].registerNamespace(ns_fis, self.onInvitation) - host.bridge.addMethod( - "FISInvite", + ns_fis = host.get_namespace("fis") + host.plugins["INVITATION"].register_namespace(ns_fis, self.on_invitation) + host.bridge.add_method( + "fis_invite", ".plugin", in_sign="ssssssss", out_sign="", - method=self._sendFileSharingInvitation, + method=self._send_file_sharing_invitation, async_=True ) - def _sendFileSharingInvitation( + def _send_file_sharing_invitation( self, invitee_jid_s, service_s, repos_type=None, namespace=None, path=None, name=None, extra_s='', profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) invitee_jid = jid.JID(invitee_jid_s) service = jid.JID(service_s) extra = data_format.deserialise(extra_s) return defer.ensureDeferred( - self.host.plugins["INVITATION"].sendFileSharingInvitation( + self.host.plugins["INVITATION"].send_file_sharing_invitation( client, invitee_jid, service, repos_type=repos_type or None, namespace=namespace or None, path=path or None, name=name or None, extra=extra) ) - def onInvitation( + def on_invitation( self, client: SatXMPPEntity, namespace: str, @@ -97,7 +97,7 @@ path=path) ) return defer.ensureDeferred( - self.host.plugins['LIST_INTEREST'].registerFileSharing( + self.host.plugins['LIST_INTEREST'].register_file_sharing( client, service, repos_type, sharing_ns, path, name, extra ) )
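With the bridge method renamed to ``fis_invite``, backend code keeps calling the underlying coroutine directly; a minimal sketch, assuming ``self`` is another plugin holding the host reference (the JIDs and values are examples)::

    from twisted.words.protocols.jabber import jid

    async def invite_louise(self, client):
        await self.host.plugins["INVITATION"].send_file_sharing_invitation(
            client,
            jid.JID("louise@example.net"),     # invitee
            jid.JID("files.example.org"),      # file sharing service
            repos_type="photos",
            path="/albums/holidays",
            name="Holiday album",
        )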
--- a/sat/plugins/plugin_exp_invitation_pubsub.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_invitation_pubsub.py Sat Apr 08 13:54:42 2023 +0200 @@ -51,12 +51,12 @@ self._p = host.plugins["XEP-0060"] # namespace to handler map self._ns_handler = {} - host.bridge.addMethod( - "psInvite", + host.bridge.add_method( + "ps_invite", ".plugin", in_sign="sssssss", out_sign="", - method=self._sendPubsubInvitation, + method=self._send_pubsub_invitation, async_=True ) @@ -66,12 +66,12 @@ handler ) -> None: self._ns_handler[namespace] = handler - self.host.plugins["INVITATION"].registerNamespace(namespace, self.onInvitation) + self.host.plugins["INVITATION"].register_namespace(namespace, self.on_invitation) - def _sendPubsubInvitation( + def _send_pubsub_invitation( self, invitee_jid_s, service_s, node, item_id=None, name=None, extra_s='', profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) invitee_jid = jid.JID(invitee_jid_s) service = jid.JID(service_s) extra = data_format.deserialise(extra_s) @@ -104,13 +104,13 @@ if namespace: try: handler = self._ns_handler[namespace] - preflight = handler.invitePreflight + preflight = handler.invite_preflight except KeyError: pass except AttributeError: - log.debug(f"no invitePreflight method found for {namespace!r}") + log.debug(f"no invite_preflight method found for {namespace!r}") else: - await utils.asDeferred( + await utils.as_deferred( preflight, client, invitee_jid, service, node, item_id, name, extra ) @@ -118,11 +118,11 @@ item_id = extra.pop("default_item_id", None) # we authorize our invitee to see the nodes of interest - await self._p.setNodeAffiliations(client, service, node, {invitee_jid: "member"}) + await self._p.set_node_affiliations(client, service, node, {invitee_jid: "member"}) log.debug(f"affiliation set on {service}'s {node!r} node") # now we send the invitation - self.host.plugins["INVITATION"].sendPubsubInvitation( + self.host.plugins["INVITATION"].send_pubsub_invitation( client, invitee_jid, service, @@ -132,7 +132,7 @@ extra=extra ) - async def onInvitation( + async def on_invitation( self, client: SatXMPPEntity, namespace: str, @@ -153,7 +153,7 @@ except AttributeError: log.debug(f"no on_invitation_preflight method found for {namespace!r}") else: - await utils.asDeferred( + await utils.as_deferred( preflight, client, namespace, name, extra, service, node, item_id, item_elt ) @@ -164,6 +164,6 @@ if not name: name = extra.pop("name", "") - return await self.host.plugins['LIST_INTEREST'].registerPubsub( + return await self.host.plugins['LIST_INTEREST'].register_pubsub( client, namespace, service, node, item_id, creator, name, element, extra)
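The handler hook is renamed too: the pubsub invitation plugin now looks up ``invite_preflight`` (instead of ``invitePreflight``) on the registered handler, while ``on_invitation_preflight`` keeps its name; both are awaited through ``utils.as_deferred``. A sketch of such a handler, with an illustrative class name::

    class MyInterestHandler:

        async def invite_preflight(self, client, invitee_jid, service, node,
                                   item_id, name, extra):
            # adjust extra or permissions before the invitation is sent
            pass

        async def on_invitation_preflight(self, client, namespace, name, extra,
                                          service, node, item_id, item_elt):
            # called when an invitation for this namespace is received
            pass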
--- a/sat/plugins/plugin_exp_jingle_stream.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_jingle_stream.py Sat Apr 08 13:54:42 2023 +0200 @@ -60,7 +60,7 @@ def __init__(self): self.pause = False - def setPause(self, paused): + def set_pause(self, paused): # in Python 2.x, Twisted classes are old style # so we can use property and setter if paused: @@ -78,10 +78,10 @@ def connectionMade(self): if self.factory.client_conn is not None: self.transport.loseConnection() - self.factory.setClientConn(self) + self.factory.set_client_conn(self) def dataReceived(self, data): - self.factory.writeToConsumer(data) + self.factory.write_to_consumer(data) def sendData(self, data): self.transport.write(data) @@ -92,9 +92,9 @@ return if reason.type == error.ConnectionDone: - self.factory.streamFinished() + self.factory.stream_finished() else: - self.factory.streamFailed(reason) + self.factory.stream_failed(reason) @interface.implementer(stream.IStreamProducer) @@ -109,15 +109,15 @@ def __init__(self): self.client_conn = None - def setClientConn(self, stream_protocol): + def set_client_conn(self, stream_protocol): # in Python 2.x, Twisted classes are old style # so we can use property and setter assert self.client_conn is None self.client_conn = stream_protocol if self.consumer is None: - self.client_conn.setPause(True) + self.client_conn.set_pause(True) - def startStream(self, consumer): + def start_stream(self, consumer): if self.consumer is not None: raise exceptions.InternalError( _("stream can't be used with multiple consumers") @@ -127,17 +127,17 @@ consumer.registerProducer(self, True) self.deferred = defer.Deferred() if self.client_conn is not None: - self.client_conn.setPause(False) + self.client_conn.set_pause(False) return self.deferred - def streamFinished(self): + def stream_finished(self): self.client_conn = None if self.consumer: self.consumer.unregisterProducer() self.port_listening.stopListening() self.deferred.callback(None) - def streamFailed(self, failure_): + def stream_failed(self, failure_): self.client_conn = None if self.consumer: self.consumer.unregisterProducer() @@ -146,7 +146,7 @@ elif self.producer: self.producer.stopProducing() - def stopStream(self): + def stop_stream(self): if self.client_conn is not None: self.client_conn.disconnect() @@ -154,10 +154,10 @@ self.producer = producer def pauseProducing(self): - self.client_conn.setPause(True) + self.client_conn.set_pause(True) def resumeProducing(self): - self.client_conn.setPause(False) + self.client_conn.set_pause(False) def stopProducing(self): if self.client_conn: @@ -169,7 +169,7 @@ except AttributeError: log.warning(_("No client connected, can't send data")) - def writeToConsumer(self, data): + def write_to_consumer(self, data): self.consumer.write(data) @@ -180,23 +180,23 @@ log.info(_("Plugin Stream initialization")) self.host = host self._j = host.plugins["XEP-0166"] # shortcut to access jingle - self._j.registerApplication(NS_STREAM, self) - host.bridge.addMethod( - "streamOut", + self._j.register_application(NS_STREAM, self) + host.bridge.add_method( + "stream_out", ".plugin", in_sign="ss", out_sign="s", - method=self._streamOut, + method=self._stream_out, async_=True, ) # jingle callbacks - def _streamOut(self, to_jid_s, profile_key): - client = self.host.getClient(profile_key) - return defer.ensureDeferred(self.streamOut(client, jid.JID(to_jid_s))) + def _stream_out(self, to_jid_s, profile_key): + client = self.host.get_client(profile_key) + return defer.ensureDeferred(self.stream_out(client, 
jid.JID(to_jid_s))) - async def streamOut(self, client, to_jid): + async def stream_out(self, client, to_jid): """send a stream @param peer_jid(jid.JID): recipient @@ -230,7 +230,7 @@ )) return str(port) - def jingleSessionInit(self, client, session, content_name, stream_object): + def jingle_session_init(self, client, session, content_name, stream_object): content_data = session["contents"][content_name] application_data = content_data["application_data"] assert "stream_object" not in application_data @@ -239,7 +239,7 @@ return desc_elt @defer.inlineCallbacks - def jingleRequestConfirmation(self, client, action, session, content_name, desc_elt): + def jingle_request_confirmation(self, client, action, session, content_name, desc_elt): """This method request confirmation for a jingle session""" content_data = session["contents"][content_name] if content_data["senders"] not in ( @@ -249,7 +249,7 @@ log.warning("Bad sender, assuming initiator") content_data["senders"] = self._j.ROLE_INITIATOR - confirm_data = yield xml_tools.deferDialog( + confirm_data = yield xml_tools.defer_dialog( self.host, _(CONFIRM).format(peer=session["peer_jid"].full()), _(CONFIRM_TITLE), @@ -274,10 +274,10 @@ content_data["stream_object"] = factory finished_d = content_data["finished_d"] = defer.Deferred() args = [client, session, content_name, content_data] - finished_d.addCallbacks(self._finishedCb, self._finishedEb, args, None, args) + finished_d.addCallbacks(self._finished_cb, self._finished_eb, args, None, args) defer.returnValue(True) - def jingleHandler(self, client, action, session, content_name, desc_elt): + def jingle_handler(self, client, action, session, content_name, desc_elt): content_data = session["contents"][content_name] application_data = content_data["application_data"] if action in (self._j.A_ACCEPTED_ACK, self._j.A_SESSION_INITIATE): @@ -287,19 +287,19 @@ content_data["stream_object"] = application_data["stream_object"] finished_d = content_data["finished_d"] = defer.Deferred() args = [client, session, content_name, content_data] - finished_d.addCallbacks(self._finishedCb, self._finishedEb, args, None, args) + finished_d.addCallbacks(self._finished_cb, self._finished_eb, args, None, args) else: log.warning("FIXME: unmanaged action {}".format(action)) return desc_elt - def _finishedCb(self, __, client, session, content_name, content_data): + def _finished_cb(self, __, client, session, content_name, content_data): log.info("Pipe transfer completed") - self._j.contentTerminate(client, session, content_name) - content_data["stream_object"].stopStream() + self._j.content_terminate(client, session, content_name) + content_data["stream_object"].stop_stream() - def _finishedEb(self, failure, client, session, content_name, content_data): + def _finished_eb(self, failure, client, session, content_name, content_data): log.warning("Error while streaming pipe: {}".format(failure)) - self._j.contentTerminate( + self._j.content_terminate( client, session, content_name, reason=self._j.REASON_FAILED_TRANSPORT ) - content_data["stream_object"].stopStream() + content_data["stream_object"].stop_stream()
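After the rename, starting a stream from another plugin goes through ``stream_out`` (previously ``streamOut``), which returns the local port to connect to, as a string. A hedged sketch; the ``"EXP-JINGLE-STREAM"`` plugin key is an assumption used for illustration::

    from twisted.words.protocols.jabber import jid

    async def start_pipe(self, client):
        port = await self.host.plugins["EXP-JINGLE-STREAM"].stream_out(
            client, jid.JID("louise@example.net/desktop")
        )
        # a local TCP client can now connect to this port to feed the stream
        return port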
--- a/sat/plugins/plugin_exp_lang_detect.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_lang_detect.py Sat Apr 08 13:54:42 2023 +0200 @@ -65,9 +65,9 @@ def __init__(self, host): log.info(_("Language detection plugin initialization")) self.host = host - host.memory.updateParams(PARAMS) - host.trigger.add("messageReceived", self.messageReceivedTrigger) - host.trigger.add("sendMessage", self.MessageSendTrigger) + host.memory.update_params(PARAMS) + host.trigger.add("messageReceived", self.message_received_trigger) + host.trigger.add("sendMessage", self.message_send_trigger) def add_language(self, mess_data): message = mess_data["message"] @@ -78,18 +78,18 @@ mess_data["message"] = {lang: msg} return mess_data - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): """ Check if source is linked and repeat message, else do nothing """ - lang_detect = self.host.memory.getParamA( + lang_detect = self.host.memory.param_get_a( NAME, CATEGORY, profile_key=client.profile ) if lang_detect: post_treat.addCallback(self.add_language) return True - def MessageSendTrigger(self, client, data, pre_xml_treatments, post_xml_treatments): - lang_detect = self.host.memory.getParamA( + def message_send_trigger(self, client, data, pre_xml_treatments, post_xml_treatments): + lang_detect = self.host.memory.param_get_a( NAME, CATEGORY, profile_key=client.profile ) if lang_detect:
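Note the pattern visible in this hunk: the trigger identifiers registered on ``host.trigger`` keep their historical camelCase names (``"messageReceived"``, ``"sendMessage"``), only the Python callbacks move to snake_case. An illustrative plugin following the new convention::

    class MyTriggerPlugin:

        def __init__(self, host):
            self.host = host
            host.trigger.add("messageReceived", self.message_received_trigger)

        def message_received_trigger(self, client, message_elt, post_treat):
            # returning True lets other triggers and the normal workflow continue
            return True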
--- a/sat/plugins/plugin_exp_list_of_interest.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_list_of_interest.py Sat Apr 08 13:54:42 2023 +0200 @@ -56,32 +56,32 @@ log.info(_("List of Interest plugin initialization")) self.host = host self._p = self.host.plugins["XEP-0060"] - host.bridge.addMethod( - "interestsList", + host.bridge.add_method( + "interests_list", ".plugin", in_sign="ssss", out_sign="aa{ss}", - method=self._listInterests, + method=self._list_interests, async_=True, ) - host.bridge.addMethod( - "interestsRegisterFileSharing", + host.bridge.add_method( + "interests_file_sharing_register", ".plugin", in_sign="sssssss", out_sign="", - method=self._registerFileSharing, + method=self._register_file_sharing, async_=True, ) - host.bridge.addMethod( - "interestRetract", + host.bridge.add_method( + "interest_retract", ".plugin", in_sign="sss", out_sign="", - method=self._interestRetract, + method=self._interest_retract, async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return ListInterestHandler(self) @defer.inlineCallbacks @@ -99,7 +99,7 @@ if e.condition == "conflict": log.debug(_("requested node already exists")) - async def registerPubsub(self, client, namespace, service, node, item_id=None, + async def register_pubsub(self, client, namespace, service, node, item_id=None, creator=False, name=None, element=None, extra=None): """Register an interesting element in personal list @@ -142,26 +142,26 @@ } if item_id: uri_kwargs['id'] = item_id - interest_uri = uri.buildXMPPUri("pubsub", **uri_kwargs) + interest_uri = uri.build_xmpp_uri("pubsub", **uri_kwargs) # we use URI of the interest as item id to avoid duplicates item_elt = pubsub.Item(interest_uri, payload=interest_elt) await self._p.publish( client, client.jid.userhostJID(), NS_LIST_INTEREST, items=[item_elt] ) - def _registerFileSharing( + def _register_file_sharing( self, service, repos_type, namespace, path, name, extra_raw, profile ): - client = self.host.getClient(profile) + client = self.host.get_client(profile) extra = data_format.deserialise(extra_raw) - return defer.ensureDeferred(self.registerFileSharing( + return defer.ensureDeferred(self.register_file_sharing( client, jid.JID(service), repos_type or None, namespace or None, path or None, name or None, extra )) - def normaliseFileSharingService(self, client, service): + def normalise_file_sharing_service(self, client, service): # FIXME: Q&D fix as the bare file sharing service JID will lead to user own # repository, which thus would not be the same for the host and the guest. # By specifying the user part, we for the use of the host repository. 
@@ -169,10 +169,10 @@ if service.user is None: service.user = self.host.plugins['XEP-0106'].escape(client.jid.user) - def getFileSharingId(self, service, namespace, path): + def get_file_sharing_id(self, service, namespace, path): return f"{service}_{namespace or ''}_{path or ''}" - async def registerFileSharing( + async def register_file_sharing( self, client, service, repos_type=None, namespace=None, path=None, name=None, extra=None): """Register an interesting file repository in personal list @@ -182,15 +182,15 @@ @param namespace(unicode, None): namespace of the repository @param path(unicode, None): path of the repository @param name(unicode, None): name of the repository - @param extra(dict, None): same as [registerPubsub] + @param extra(dict, None): same as [register_pubsub] """ if extra is None: extra = {} - self.normaliseFileSharingService(client, service) + self.normalise_file_sharing_service(client, service) await self.createNode(client) - item_id = self.getFileSharingId(service, namespace, path) + item_id = self.get_file_sharing_id(service, namespace, path) interest_elt = domish.Element((NS_LIST_INTEREST, "interest")) - interest_elt["namespace"] = self.host.getNamespace("fis") + interest_elt["namespace"] = self.host.get_namespace("fis") if name is not None: interest_elt['name'] = name thumb_url = extra.get('thumb_url') @@ -210,7 +210,7 @@ client, client.jid.userhostJID(), NS_LIST_INTEREST, items=[item_elt] ) - def _listInterestsSerialise(self, interests_data): + def _list_interests_serialise(self, interests_data): interests = [] for item_elt in interests_data[0]: interest_data = {"id": item_elt['id']} @@ -252,16 +252,16 @@ return interests - def _listInterests(self, service, node, namespace, profile): + def _list_interests(self, service, node, namespace, profile): service = jid.JID(service) if service else None node = node or None namespace = namespace or None - client = self.host.getClient(profile) - d = defer.ensureDeferred(self.listInterests(client, service, node, namespace)) - d.addCallback(self._listInterestsSerialise) + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.list_interests(client, service, node, namespace)) + d.addCallback(self._list_interests_serialise) return d - async def listInterests(self, client, service=None, node=None, namespace=None): + async def list_interests(self, client, service=None, node=None, namespace=None): """Retrieve list of interests @param service(jid.JID, None): service to use @@ -270,12 +270,12 @@ None to use default node @param namespace(unicode, None): filter interests of this namespace None to retrieve all interests - @return: same as [XEP_0060.getItems] + @return: same as [XEP_0060.get_items] """ # TODO: if a MAM filter were available, it would improve performances if not node: node = NS_LIST_INTEREST - items, metadata = await self._p.getItems(client, service, node) + items, metadata = await self._p.get_items(client, service, node) if namespace is not None: filtered_items = [] for item in items: @@ -291,17 +291,17 @@ return (items, metadata) - def _interestRetract(self, service_s, item_id, profile_key): - d = self._p._retractItem( + def _interest_retract(self, service_s, item_id, profile_key): + d = self._p._retract_item( service_s, NS_LIST_INTEREST, item_id, True, profile_key) d.addCallback(lambda __: None) return d async def get(self, client: SatXMPPEntity, item_id: str) -> dict: """Retrieve a specific interest in profile's list""" - items_data = await self._p.getItems(client, None, NS_LIST_INTEREST, 
item_ids=[item_id]) + items_data = await self._p.get_items(client, None, NS_LIST_INTEREST, item_ids=[item_id]) try: - return self._listInterestsSerialise(items_data)[0] + return self._list_interests_serialise(items_data)[0] except IndexError: raise exceptions.NotFound
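A short sketch of the renamed retrieval API: ``list_interests`` (previously ``listInterests``) returns the same ``(items, metadata)`` tuple as ``XEP_0060.get_items``, optionally filtered by namespace. The surrounding method is illustrative; the ``"LIST_INTEREST"`` key and the ``"fis"`` namespace both appear in the changeset::

    async def get_file_sharing_interests(self, client):
        li_plg = self.host.plugins["LIST_INTEREST"]
        items, metadata = await li_plg.list_interests(
            client, namespace=self.host.get_namespace("fis")
        )
        return items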
--- a/sat/plugins/plugin_exp_parrot.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_parrot.py Sat Apr 08 13:54:42 2023 +0200 @@ -48,21 +48,21 @@ # XXX: This plugin can be potentially dangerous if we don't trust entities linked # this is specially true if we have other triggers. - # sendMessageTrigger avoid other triggers execution, it's deactivated to allow + # send_message_trigger avoid other triggers execution, it's deactivated to allow # /unparrot command in text commands plugin. # FIXME: potentially unsecure, specially with e2e encryption def __init__(self, host): log.info(_("Plugin Parrot initialization")) self.host = host - host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=100) - # host.trigger.add("sendMessage", self.sendMessageTrigger, priority=100) + host.trigger.add("messageReceived", self.message_received_trigger, priority=100) + # host.trigger.add("sendMessage", self.send_message_trigger, priority=100) try: - self.host.plugins[C.TEXT_CMDS].registerTextCommands(self) + self.host.plugins[C.TEXT_CMDS].register_text_commands(self) except KeyError: log.info(_("Text commands not available")) - # def sendMessageTrigger(self, client, mess_data, treatments): + # def send_message_trigger(self, client, mess_data, treatments): # """ Deactivate other triggers if recipient is in parrot links """ # try: # _links = client.parrot_links @@ -73,7 +73,7 @@ # log.debug("Parrot link detected, skipping other triggers") # raise trigger.SkipOtherTriggers - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): """ Check if source is linked and repeat message, else do nothing """ # TODO: many things are not repeated (subject, thread, etc) from_jid = message_elt["from"] @@ -92,13 +92,13 @@ lang = e.getAttribute("lang") or "" try: - entity_type = self.host.memory.getEntityData( + entity_type = self.host.memory.entity_data_get( client, from_jid, [C.ENTITY_TYPE])[C.ENTITY_TYPE] except (UnknownEntityError, KeyError): entity_type = "contact" if entity_type == C.ENTITY_TYPE_MUC: src_txt = from_jid.resource - if src_txt == self.host.plugins["XEP-0045"].getRoomNick( + if src_txt == self.host.plugins["XEP-0045"].get_room_nick( client, from_jid.userhostJID() ): # we won't repeat our own messages @@ -115,7 +115,7 @@ return True - def addParrot(self, client, source_jid, dest_jid): + def add_parrot(self, client, source_jid, dest_jid): """Add a parrot link from one entity to another one @param source_jid: entity from who messages will be repeated @@ -132,7 +132,7 @@ % (source_jid.userhost(), str(dest_jid)) ) - def removeParrot(self, client, source_jid): + def remove_parrot(self, client, source_jid): """Remove parrot link @param source_jid: this entity will no more be repeated @@ -152,17 +152,17 @@ if not link_left_jid.user or not link_left_jid.host: raise jid.InvalidFormat except (RuntimeError, jid.InvalidFormat, AttributeError): - txt_cmd.feedBack( + txt_cmd.feed_back( client, "Can't activate Parrot mode for invalid jid", mess_data ) return False link_right_jid = mess_data["to"] - self.addParrot(client, link_left_jid, link_right_jid) - self.addParrot(client, link_right_jid, link_left_jid) + self.add_parrot(client, link_left_jid, link_right_jid) + self.add_parrot(client, link_right_jid, link_left_jid) - txt_cmd.feedBack( + txt_cmd.feed_back( client, "Parrot mode activated for {}".format(str(link_left_jid)), mess_data, @@ -180,17 +180,17 @@ if not link_left_jid.user or not link_left_jid.host: raise 
jid.InvalidFormat except jid.InvalidFormat: - txt_cmd.feedBack( + txt_cmd.feed_back( client, "Can't deactivate Parrot mode for invalid jid", mess_data ) return False link_right_jid = mess_data["to"] - self.removeParrot(client, link_left_jid) - self.removeParrot(client, link_right_jid) + self.remove_parrot(client, link_left_jid) + self.remove_parrot(client, link_right_jid) - txt_cmd.feedBack( + txt_cmd.feed_back( client, "Parrot mode deactivated for {} and {}".format( str(link_left_jid), str(link_right_jid)
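The parrot helpers are renamed to ``add_parrot``/``remove_parrot``; as in the text command handlers above, a bidirectional link is made with two symmetric calls. A sketch; the ``"EXP-PARROT"`` plugin key and the JIDs are illustrative::

    from twisted.words.protocols.jabber import jid

    def link_entities(self, client):
        parrot_plg = self.host.plugins["EXP-PARROT"]
        alice = jid.JID("alice@example.org")
        room = jid.JID("room@chat.example.org")
        # messages from alice are repeated to the room, and conversely
        parrot_plg.add_parrot(client, alice, room)
        parrot_plg.add_parrot(client, room, alice)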
--- a/sat/plugins/plugin_exp_pubsub_admin.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_pubsub_admin.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,8 +49,8 @@ def __init__(self, host): self.host = host - host.bridge.addMethod( - "psAdminItemsSend", + host.bridge.add_method( + "ps_admin_items_send", ".plugin", in_sign="ssasss", out_sign="as", @@ -60,7 +60,7 @@ def _publish(self, service, nodeIdentifier, items, extra=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) extra = data_format.deserialise(extra) items = [generic.parseXml(i.encode('utf-8')) for i in items] @@ -68,7 +68,7 @@ client, service, nodeIdentifier, items, extra ) - def _sendCb(self, iq_result): + def _send_cb(self, iq_result): publish_elt = iq_result.admin.pubsub.publish ids = [] for item_elt in publish_elt.elements(pubsub.NS_PUBSUB, 'item'): @@ -90,5 +90,5 @@ for item in items: publish_elt.addChild(item) d = iq_elt.send() - d.addCallback(self._sendCb) + d.addCallback(self._send_cb) return d
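For reference, the renamed ``ps_admin_items_send`` bridge method keeps its ``ssasss`` signature: service, node, items as raw XML strings, serialised extra and profile, and it returns the published item ids. A hypothetical frontend-side call, assuming an asynchronous bridge proxy named ``bridge``::

    async def send_admin_items(bridge):
        item_xml = (
            '<item id="item-1"><entry xmlns="http://www.w3.org/2005/Atom"/></item>'
        )
        return await bridge.ps_admin_items_send(
            "pubsub.example.org",
            "some_node",
            [item_xml],
            "",          # extra (serialised dict, may be empty)
            "pierre",    # profile
        )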
--- a/sat/plugins/plugin_exp_pubsub_hook.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_exp_pubsub_hook.py Sat Apr 08 13:54:42 2023 +0200 @@ -56,51 +56,51 @@ log.info(_("PubSub Hook initialization")) self.host = host self.node_hooks = {} # keep track of the number of hooks per node (for all profiles) - host.bridge.addMethod( - "psHookAdd", ".plugin", in_sign="ssssbs", out_sign="", method=self._addHook + host.bridge.add_method( + "ps_hook_add", ".plugin", in_sign="ssssbs", out_sign="", method=self._addHook ) - host.bridge.addMethod( - "psHookRemove", + host.bridge.add_method( + "ps_hook_remove", ".plugin", in_sign="sssss", out_sign="i", method=self._removeHook, ) - host.bridge.addMethod( - "psHookList", + host.bridge.add_method( + "ps_hook_list", ".plugin", in_sign="s", out_sign="aa{ss}", - method=self._listHooks, + method=self._list_hooks, ) @defer.inlineCallbacks - def profileConnected(self, client): + def profile_connected(self, client): hooks = client._hooks = persistent.PersistentBinaryDict( NS_PUBSUB_HOOK, client.profile ) client._hooks_temporary = {} yield hooks.load() for node in hooks: - self._installNodeManager(client, node) + self._install_node_manager(client, node) - def profileDisconnected(self, client): + def profile_disconnected(self, client): for node in client._hooks: - self._removeNodeManager(client, node) + self._remove_node_manager(client, node) - def _installNodeManager(self, client, node): + def _install_node_manager(self, client, node): if node in self.node_hooks: log.debug(_("node manager already set for {node}").format(node=node)) self.node_hooks[node] += 1 else: # first hook on this node - self.host.plugins["XEP-0060"].addManagedNode( - node, items_cb=self._itemsReceived + self.host.plugins["XEP-0060"].add_managed_node( + node, items_cb=self._items_received ) self.node_hooks[node] = 0 log.info(_("node manager installed on {node}").format(node=node)) - def _removeNodeManager(self, client, node): + def _remove_node_manager(self, client, node): try: self.node_hooks[node] -= 1 except KeyError: @@ -108,12 +108,12 @@ else: if self.node_hooks[node] == 0: del self.node_hooks[node] - self.host.plugins["XEP-0060"].removeManagedNode(node, self._itemsReceived) + self.host.plugins["XEP-0060"].remove_managed_node(node, self._items_received) log.debug(_("hook removed")) else: log.debug(_("node still needed for an other hook")) - def installHook(self, client, service, node, hook_type, hook_arg, persistent): + def install_hook(self, client, service, node, hook_type, hook_arg, persistent): if hook_type not in HOOK_TYPES: raise exceptions.DataError( _("{hook_type} is not handled").format(hook_type=hook_type) @@ -124,7 +124,7 @@ hook_type=hook_type ) ) - self._installNodeManager(client, node) + self._install_node_manager(client, node) hook_data = {"service": service, "type": hook_type, "arg": hook_arg} if persistent: @@ -143,7 +143,7 @@ ) ) - def _itemsReceived(self, client, itemsEvent): + def _items_received(self, client, itemsEvent): node = itemsEvent.nodeIdentifier for hooks in (client._hooks, client._hooks_temporary): if node not in hooks: @@ -188,9 +188,9 @@ ) def _addHook(self, service, node, hook_type, hook_arg, persistent, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) service = jid.JID(service) if service else client.jid.userhostJID() - return self.addHook( + return self.add_hook( client, service, str(node), @@ -199,7 +199,7 @@ persistent, ) - def addHook(self, client, service, node, hook_type, hook_arg, persistent): + 
def add_hook(self, client, service, node, hook_type, hook_arg, persistent): r"""Add a hook which will be triggered on a pubsub notification @param service(jid.JID): service of the node @@ -219,21 +219,21 @@ can be a module path, file path, python code """ assert service is not None - return self.installHook(client, service, node, hook_type, hook_arg, persistent) + return self.install_hook(client, service, node, hook_type, hook_arg, persistent) def _removeHook(self, service, node, hook_type, hook_arg, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) service = jid.JID(service) if service else client.jid.userhostJID() - return self.removeHook(client, service, node, hook_type or None, hook_arg or None) + return self.remove_hook(client, service, node, hook_type or None, hook_arg or None) - def removeHook(self, client, service, node, hook_type=None, hook_arg=None): + def remove_hook(self, client, service, node, hook_type=None, hook_arg=None): """Remove a persistent or temporaty root @param service(jid.JID): service of the node @param node(unicode): Pubsub node - @param hook_type(unicode, None): same as for [addHook] + @param hook_type(unicode, None): same as for [add_hook] match all if None - @param hook_arg(unicode, None): same as for [addHook] + @param hook_arg(unicode, None): same as for [add_hook] match all if None @return(int): number of hooks removed """ @@ -254,20 +254,20 @@ if not hooks[node]: # no more hooks, we can remove the node del hooks[node] - self._removeNodeManager(client, node) + self._remove_node_manager(client, node) else: if hooks == client._hooks: hooks.force(node) return removed - def _listHooks(self, profile): - hooks_list = self.listHooks(self.host.getClient(profile)) + def _list_hooks(self, profile): + hooks_list = self.list_hooks(self.host.get_client(profile)) for hook in hooks_list: hook["service"] = hook["service"].full() - hook["persistent"] = C.boolConst(hook["persistent"]) + hook["persistent"] = C.bool_const(hook["persistent"]) return hooks_list - def listHooks(self, client): + def list_hooks(self, client): """return list of registered hooks""" hooks_list = [] for hooks in (client._hooks, client._hooks_temporary):
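A sketch of the renamed hook API (``add_hook``, previously ``addHook``). The ``"EXP-PUBSUB-HOOK"`` plugin key and the ``"python_file"`` hook type value are assumptions used for illustration, see ``HOOK_TYPES`` in the plugin for the actual constants::

    from twisted.words.protocols.jabber import jid

    def install_hook(self, client):
        hook_plg = self.host.plugins["EXP-PUBSUB-HOOK"]
        hook_plg.add_hook(
            client,
            service=jid.JID("pubsub.example.org"),
            node="some_node",
            hook_type="python_file",
            hook_arg="/path/to/hook_script.py",
            persistent=True,
        )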
--- a/sat/plugins/plugin_import.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_import.py Sat Apr 08 13:54:42 2023 +0200 @@ -46,7 +46,7 @@ class ImportPlugin(object): def __init__(self, host): - log.info(_("plugin Import initialization")) + log.info(_("plugin import initialization")) self.host = host def initialize(self, import_handler, name): @@ -54,13 +54,13 @@ @param import_handler(object): specialized import handler instance must have the following methods: - - importItem: import a single main item (i.e. prepare data for publishing) + - import_item: import a single main item (i.e. prepare data for publishing) - importSubitems: import sub items (i.e. items linked to main item, e.g. comments). - Must return a dict with kwargs for recursiveImport if items are to be imported recursively. + Must return a dict with kwargs for recursive_import if items are to be imported recursively. At least "items_import_data", "service" and "node" keys must be provided. if None is returned, no recursion will be done to import subitems, but import can still be done directly by the method. - - publishItem: actualy publish an item - - itemFilters: modify item according to options + - publish_item: actualy publish an item + - item_filters: modify item according to options @param name(unicode): import handler name """ assert name == name.lower().strip() @@ -71,7 +71,7 @@ import_handler.importers = {} def _import(name, location, options, pubsub_service, pubsub_node, profile): - return self._doImport( + return self._do_import( import_handler, name, location, @@ -81,40 +81,40 @@ profile, ) - def _importList(): - return self.listImporters(import_handler) + def _import_list(): + return self.list_importers(import_handler) - def _importDesc(name): + def _import_desc(name): return self.getDescription(import_handler, name) - self.host.bridge.addMethod( - name + "Import", + self.host.bridge.add_method( + name + "import", ".plugin", in_sign="ssa{ss}sss", out_sign="s", method=_import, async_=True, ) - self.host.bridge.addMethod( + self.host.bridge.add_method( name + "ImportList", ".plugin", in_sign="", out_sign="a(ss)", - method=_importList, + method=_import_list, ) - self.host.bridge.addMethod( + self.host.bridge.add_method( name + "ImportDesc", ".plugin", in_sign="s", out_sign="(ss)", - method=_importDesc, + method=_import_desc, ) - def getProgress(self, import_handler, progress_id, profile): - client = self.host.getClient(profile) + def get_progress(self, import_handler, progress_id, profile): + client = self.host.get_client(profile) return client._import[import_handler.name][progress_id] - def listImporters(self, import_handler): + def list_importers(self, import_handler): importers = list(import_handler.importers.keys()) importers.sort() return [ @@ -139,9 +139,9 @@ else: return importer.short_desc, importer.long_desc - def _doImport(self, import_handler, name, location, options, pubsub_service="", + def _do_import(self, import_handler, name, location, options, pubsub_service="", pubsub_node="", profile=C.PROF_KEY_NONE): - client = self.host.getClient(profile) + client = self.host.get_client(profile) options = {key: str(value) for key, value in options.items()} for option in import_handler.BOOL_OPTIONS: try: @@ -158,7 +158,7 @@ _("invalid json option: {option}").format(option=option) ) pubsub_service = jid.JID(pubsub_service) if pubsub_service else None - return self.doImport( + return self.do_import( client, import_handler, str(name), @@ -169,9 +169,9 @@ ) @defer.inlineCallbacks - def doImport(self, 
client, import_handler, name, location, options=None, + def do_import(self, client, import_handler, name, location, options=None, pubsub_service=None, pubsub_node=None,): - """Import data + """import data @param import_handler(object): instance of the import handler @param name(unicode): name of the importer @@ -221,18 +221,18 @@ "direction": "out", "type": import_handler.name.upper() + "_IMPORT", } - self.host.registerProgressCb( + self.host.register_progress_cb( progress_id, - partial(self.getProgress, import_handler), + partial(self.get_progress, import_handler), metadata, profile=client.profile, ) - self.host.bridge.progressStarted(progress_id, metadata, client.profile) + self.host.bridge.progress_started(progress_id, metadata, client.profile) session = { # session data, can be used by importers "root_service": pubsub_service, "root_node": pubsub_node, } - self.recursiveImport( + self.recursive_import( client, import_handler, items_import_data, @@ -246,7 +246,7 @@ defer.returnValue(progress_id) @defer.inlineCallbacks - def recursiveImport( + def recursive_import( self, client, import_handler, @@ -268,7 +268,7 @@ can be used by importer so store any useful data "root_service" and "root_node" are set to the main pubsub service and node of the import @param options(dict): import options - @param return_data(dict): data to return on progressFinished + @param return_data(dict): data to return on progress_finished @param service(jid.JID, None): PubSub service to use @param node(unicode, None): PubSub node to use @param depth(int): level of recursion @@ -276,14 +276,14 @@ if return_data is None: return_data = {} for idx, item_import_data in enumerate(items_import_data): - item_data = yield import_handler.importItem( + item_data = yield import_handler.import_item( client, item_import_data, session, options, return_data, service, node ) - yield import_handler.itemFilters(client, item_data, session, options) - recurse_kwargs = yield import_handler.importSubItems( + yield import_handler.item_filters(client, item_data, session, options) + recurse_kwargs = yield import_handler.import_sub_items( client, item_import_data, item_data, session, options ) - yield import_handler.publishItem(client, item_data, service, node, session) + yield import_handler.publish_item(client, item_data, service, node, session) if recurse_kwargs is not None: recurse_kwargs["client"] = client @@ -294,7 +294,7 @@ recurse_kwargs["return_data"] = return_data recurse_kwargs["depth"] = depth + 1 log.debug(_("uploading subitems")) - yield self.recursiveImport(**recurse_kwargs) + yield self.recursive_import(**recurse_kwargs) if depth == 0: client._import[import_handler.name][progress_id]["position"] = str( @@ -302,8 +302,8 @@ ) if depth == 0: - self.host.bridge.progressFinished(progress_id, return_data, client.profile) - self.host.removeProgressCb(progress_id, client.profile) + self.host.bridge.progress_finished(progress_id, return_data, client.profile) + self.host.remove_progress_cb(progress_id, client.profile) del client._import[import_handler.name][progress_id] def register(self, import_handler, name, callback, short_desc="", long_desc=""): @@ -311,10 +311,10 @@ @param name(unicode): unique importer name, should indicate the software it can import and always lowercase @param callback(callable): method to call: - the signature must be (client, location, options) (cf. [doImport]) + the signature must be (client, location, options) (cf. 
[do_import]) the importer must return a tuple with (items_import_data, items_count) items_import_data(iterable[dict]) data specific to specialized importer - cf. importItem docstring of specialized importer for details + cf. import_item docstring of specialized importer for details items_count (int, None) indicate the total number of items (without subitems) useful to display a progress indicator when the iterator is a generator use None if you can't guess the total number of items
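Following the docstring above, an importer callback now receives ``(client, location, options)`` and returns ``(items_import_data, items_count)``; it is registered with the renamed generic plugin through ``register()``. The names ``import_plugin``, ``blog_import_handler`` and ``dotclear_import`` below are illustrative placeholders, not part of the changeset::

    def dotclear_import(client, location, options):
        # items_import_data: iterable of dicts understood by the specialized
        # handler's import_item; items_count may be None if unknown
        items_import_data = iter([{"title": "First post"}])
        return items_import_data, 1

    import_plugin.register(
        blog_import_handler, "dotclear", dotclear_import,
        short_desc="import Dotclear blogs",
    )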
--- a/sat/plugins/plugin_misc_account.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_account.py Sat Apr 08 13:54:42 2023 +0200 @@ -108,72 +108,72 @@ def __init__(self, host): log.info(_("Plugin Account initialization")) self.host = host - host.bridge.addMethod( - "registerSatAccount", + host.bridge.add_method( + "libervia_account_register", ".plugin", in_sign="sss", out_sign="", - method=self._registerAccount, + method=self._register_account, async_=True, ) - host.bridge.addMethod( - "getNewAccountDomain", + host.bridge.add_method( + "account_domain_new_get", ".plugin", in_sign="", out_sign="s", - method=self.getNewAccountDomain, + method=self.account_domain_new_get, async_=False, ) - host.bridge.addMethod( - "getAccountDialogUI", + host.bridge.add_method( + "account_dialog_ui_get", ".plugin", in_sign="s", out_sign="s", - method=self._getAccountDialogUI, + method=self._get_account_dialog_ui, async_=False, ) - host.bridge.addMethod( - "asyncConnectWithXMPPCredentials", + host.bridge.add_method( + "credentials_xmpp_connect", ".plugin", in_sign="ss", out_sign="b", - method=self.asyncConnectWithXMPPCredentials, + method=self.credentials_xmpp_connect, async_=True, ) - self.fixEmailAdmins() + self.fix_email_admins() self._sessions = Sessions() - self.__account_cb_id = host.registerCallback( - self._accountDialogCb, with_data=True + self.__account_cb_id = host.register_callback( + self._account_dialog_cb, with_data=True ) - self.__change_password_id = host.registerCallback( - self.__changePasswordCb, with_data=True + self.__change_password_id = host.register_callback( + self.__change_password_cb, with_data=True ) - def deleteBlogCallback(posts, comments): - return lambda data, profile: self.__deleteBlogPostsCb( + def delete_blog_callback(posts, comments): + return lambda data, profile: self.__delete_blog_posts_cb( posts, comments, data, profile ) - self.__delete_posts_id = host.registerCallback( - deleteBlogCallback(True, False), with_data=True + self.__delete_posts_id = host.register_callback( + delete_blog_callback(True, False), with_data=True ) - self.__delete_comments_id = host.registerCallback( - deleteBlogCallback(False, True), with_data=True + self.__delete_comments_id = host.register_callback( + delete_blog_callback(False, True), with_data=True ) - self.__delete_posts_comments_id = host.registerCallback( - deleteBlogCallback(True, True), with_data=True + self.__delete_posts_comments_id = host.register_callback( + delete_blog_callback(True, True), with_data=True ) - self.__delete_account_id = host.registerCallback( - self.__deleteAccountCb, with_data=True + self.__delete_account_id = host.register_callback( + self.__delete_account_cb, with_data=True ) # FIXME: remove this after some time, when the deprecated parameter is really abandoned - def fixEmailAdmins(self): + def fix_email_admins(self): """Handle deprecated config option "admin_email" to fix the admin emails list""" - admin_email = self.getConfig("admin_email") + admin_email = self.config_get("admin_email") if not admin_email: return log.warning( @@ -182,10 +182,10 @@ param_name = "email_admins_list" try: section = "" - value = self.host.memory.getConfig(section, param_name, Exception) + value = self.host.memory.config_get(section, param_name, Exception) except (configparser.NoOptionError, configparser.NoSectionError): section = CONFIG_SECTION - value = self.host.memory.getConfig( + value = self.host.memory.config_get( section, param_name, default_conf[param_name] ) @@ -193,13 +193,13 @@ value.add(admin_email) 
self.host.memory.config.set(section, param_name, ",".join(value)) - def getConfig(self, name, section=CONFIG_SECTION): + def config_get(self, name, section=CONFIG_SECTION): if name.startswith("email_"): # XXX: email_ parameters were first in [plugin account] section # but as it make more sense to have them in common with other plugins, # they can now be in [DEFAULT] section try: - value = self.host.memory.getConfig(None, name, Exception) + value = self.host.memory.config_get(None, name, Exception) except (configparser.NoOptionError, configparser.NoSectionError): pass else: @@ -209,9 +209,9 @@ default = default_conf[name] else: default = None - return self.host.memory.getConfig(section, name, default) + return self.host.memory.config_get(section, name, default) - def _registerAccount(self, email, password, profile): + def _register_account(self, email, password, profile): return self.registerAccount(email, password, None, profile) def registerAccount(self, email, password, jid_s, profile): @@ -226,11 +226,11 @@ @param profile @return Deferred """ - d = self.createProfile(password, jid_s, profile) - d.addCallback(lambda __: self.sendEmails(email, profile)) + d = self.create_profile(password, jid_s, profile) + d.addCallback(lambda __: self.send_emails(email, profile)) return d - def createProfile(self, password, jid_s, profile): + def create_profile(self, password, jid_s, profile): """Register a new profile and its associated XMPP account. @param password (unicode): password chosen by the user @@ -244,14 +244,14 @@ if not password or not profile: raise exceptions.DataError - if profile.lower() in self.getConfig("reserved_list"): + if profile.lower() in self.config_get("reserved_list"): return defer.fail(Failure(exceptions.ConflictError)) - d = self.host.memory.createProfile(profile, password) - d.addCallback(lambda __: self.profileCreated(password, jid_s, profile)) + d = self.host.memory.create_profile(profile, password) + d.addCallback(lambda __: self.profile_created(password, jid_s, profile)) return d - def profileCreated(self, password, jid_s, profile): + def profile_created(self, password, jid_s, profile): """Create the XMPP account and set the profile connection parameters. 
@param password (unicode): password chosen by the user @@ -265,30 +265,30 @@ d = defer.succeed(None) jid_ = jid.JID(jid_s) else: - jid_s = profile + "@" + self.getNewAccountDomain() + jid_s = profile + "@" + self.account_domain_new_get() jid_ = jid.JID(jid_s) - d = self.host.plugins["XEP-0077"].registerNewAccount(jid_, password) + d = self.host.plugins["XEP-0077"].register_new_account(jid_, password) def setParams(__): - self.host.memory.setParam( + self.host.memory.param_set( "JabberID", jid_s, "Connection", profile_key=profile ) - d = self.host.memory.setParam( + d = self.host.memory.param_set( "Password", password, "Connection", profile_key=profile ) return d - def removeProfile(failure): - self.host.memory.asyncDeleteProfile(profile) + def remove_profile(failure): + self.host.memory.profile_delete_async(profile) return failure - d.addCallback(lambda __: self.host.memory.startSession(password, profile)) + d.addCallback(lambda __: self.host.memory.start_session(password, profile)) d.addCallback(setParams) - d.addCallback(lambda __: self.host.memory.stopSession(profile)) - d.addErrback(removeProfile) + d.addCallback(lambda __: self.host.memory.stop_session(profile)) + d.addErrback(remove_profile) return d - def _sendEmailEb(self, failure_, email): + def _send_email_eb(self, failure_, email): # TODO: return error code to user log.error( _("Failed to send account creation confirmation to {email}: {msg}").format( @@ -296,13 +296,13 @@ ) ) - def sendEmails(self, email, profile): + def send_emails(self, email, profile): # time to send the email - domain = self.getNewAccountDomain() + domain = self.account_domain_new_get() # email to the administrators - admins_emails = self.getConfig("email_admins_list") + admins_emails = self.config_get("email_admins_list") if not admins_emails: log.warning( "No known admin email, we can't send email to administrator(s).\n" @@ -313,7 +313,7 @@ subject = _("New Libervia account created") # there is no email when an existing XMPP account is used body = f"New account created on {domain}: {profile} [{email or '<no email>'}]" - d_admin = sat_email.sendEmail( + d_admin = sat_email.send_email( self.host.memory.config, admins_emails, subject, body) admins_emails_txt = ", ".join(["<" + addr + ">" for addr in admins_emails]) @@ -333,7 +333,7 @@ # TODO: if use register with an existing account, an XMPP message should be sent return d_admin - jid_s = self.host.memory.getParamA( + jid_s = self.host.memory.param_get_a( "JabberID", "Connection", profile_key=profile ) subject = _("Your Libervia account has been created") @@ -342,20 +342,20 @@ # XXX: this will not fail when the email address doesn't exist # FIXME: check email reception to validate email given by the user # FIXME: delete the profile if the email could not been sent? 
- d_user = sat_email.sendEmail(self.host.memory.config, [email], subject, body) + d_user = sat_email.send_email(self.host.memory.config, [email], subject, body) d_user.addCallbacks( lambda __: log.debug( "Account creation confirmation sent to <{}>".format(email) ), - self._sendEmailEb, + self._send_email_eb, errbackArgs=[email] ) return defer.DeferredList([d_user, d_admin]) - def getNewAccountDomain(self): + def account_domain_new_get(self): """get the domain that will be set to new account""" - domain = self.getConfig("new_account_domain") or self.getConfig( + domain = self.config_get("new_account_domain") or self.config_get( "xmpp_domain", None ) if not domain: @@ -367,7 +367,7 @@ return DEFAULT_DOMAIN return domain - def _getAccountDialogUI(self, profile): + def _get_account_dialog_ui(self, profile): """Get the main dialog to manage your account @param menu_data @param profile: %(doc_profile)s @@ -381,7 +381,7 @@ ) tab_container = form_ui.current_container - tab_container.addTab( + tab_container.add_tab( "update", D_("Change your password"), container=xml_tools.PairsContainer ) form_ui.addLabel(D_("Current profile password")) @@ -394,7 +394,7 @@ # FIXME: uncomment and fix these features """ if 'GROUPBLOG' in self.host.plugins: - tab_container.addTab("delete_posts", D_("Delete your posts"), container=xml_tools.PairsContainer) + tab_container.add_tab("delete_posts", D_("Delete your posts"), container=xml_tools.PairsContainer) form_ui.addLabel(D_("Current profile password")) form_ui.addPassword("delete_posts_passwd", value="") form_ui.addLabel(D_("Delete all your posts and their comments")) @@ -402,7 +402,7 @@ form_ui.addLabel(D_("Delete all your comments on other's posts")) form_ui.addBool("delete_comments_checkbox", "false") - tab_container.addTab("delete", D_("Delete your account"), container=xml_tools.PairsContainer) + tab_container.add_tab("delete", D_("Delete your account"), container=xml_tools.PairsContainer) form_ui.addLabel(D_("Current profile password")) form_ui.addPassword("delete_passwd", value="") form_ui.addLabel(D_("Delete your account")) @@ -412,12 +412,12 @@ return form_ui.toXml() @defer.inlineCallbacks - def _accountDialogCb(self, data, profile): + def _account_dialog_cb(self, data, profile): """Called when the user submits the main account dialog @param data @param profile """ - sat_cipher = yield self.host.memory.asyncGetParamA( + sat_cipher = yield self.host.memory.param_get_a_async( C.PROFILE_PASS_PATH[1], C.PROFILE_PASS_PATH[0], profile_key=profile ) @@ -442,7 +442,7 @@ verified = yield verify(delete_passwd) assert isinstance(verified, bool) if verified: - defer.returnValue(self.__deleteAccount(profile)) + defer.returnValue(self.__delete_account(profile)) defer.returnValue(error_ui()) # check for blog posts deletion @@ -456,7 +456,7 @@ verified = yield verify(delete_posts_passwd) assert isinstance(verified, bool) if verified: - defer.returnValue(self.__deleteBlogPosts(posts, comments, profile)) + defer.returnValue(self.__delete_blog_posts(posts, comments, profile)) defer.returnValue(error_ui()) """ @@ -469,7 +469,7 @@ assert isinstance(verified, bool) if verified: if new_passwd1 == new_passwd2: - data = yield self.__changePassword(new_passwd1, profile=profile) + data = yield self.__change_password(new_passwd1, profile=profile) defer.returnValue(data) else: defer.returnValue( @@ -481,13 +481,13 @@ defer.returnValue({}) - def __changePassword(self, password, profile): + def __change_password(self, password, profile): """Ask for a confirmation before changing the XMPP 
account and SàT profile passwords. @param password (str): the new password @param profile (str): %(doc_profile)s """ - session_id, __ = self._sessions.newSession( + session_id, __ = self._sessions.new_session( {"new_password": password}, profile=profile ) form_ui = xml_tools.XMLUI( @@ -504,24 +504,24 @@ form_ui.addText(D_("Continue with changing the password?")) return {"xmlui": form_ui.toXml()} - def __changePasswordCb(self, data, profile): + def __change_password_cb(self, data, profile): """Actually change the user XMPP account and SàT profile password @param data (dict) @profile (str): %(doc_profile)s """ - client = self.host.getClient(profile) - password = self._sessions.profileGet(data["session_id"], profile)["new_password"] + client = self.host.get_client(profile) + password = self._sessions.profile_get(data["session_id"], profile)["new_password"] del self._sessions[data["session_id"]] - def passwordChanged(__): - d = self.host.memory.setParam( + def password_changed(__): + d = self.host.memory.param_set( C.PROFILE_PASS_PATH[1], password, C.PROFILE_PASS_PATH[0], profile_key=profile, ) d.addCallback( - lambda __: self.host.memory.setParam( + lambda __: self.host.memory.param_set( "Password", password, "Connection", profile_key=profile ) ) @@ -536,11 +536,11 @@ ) return defer.succeed({"xmlui": error_ui.toXml()}) - d = self.host.plugins["XEP-0077"].changePassword(client, password) - d.addCallbacks(passwordChanged, errback) + d = self.host.plugins["XEP-0077"].change_password(client, password) + d.addCallbacks(password_changed, errback) return d - def __deleteAccount(self, profile): + def __delete_account(self, profile): """Ask for a confirmation before deleting the XMPP account and SàT profile @param profile """ @@ -561,7 +561,7 @@ D_( "All your data stored on %(server)s, including your %(target)s will be erased." 
) - % {"server": self.getNewAccountDomain(), "target": target} + % {"server": self.account_domain_new_get(), "target": target} ) form_ui.addText( D_( @@ -570,26 +570,26 @@ ) return {"xmlui": form_ui.toXml()} - def __deleteAccountCb(self, data, profile): + def __delete_account_cb(self, data, profile): """Actually delete the XMPP account and SàT profile @param data @param profile """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) - def userDeleted(__): + def user_deleted(__): # FIXME: client should be disconnected at this point, so 2 next loop should be removed (to be confirmed) for jid_ in client.roster._jids: # empty roster client.presence.unsubscribe(jid_) - for jid_ in self.host.memory.getWaitingSub( + for jid_ in self.host.memory.sub_waiting_get( profile ): # delete waiting subscriptions - self.host.memory.delWaitingSub(jid_) + self.host.memory.del_waiting_sub(jid_) - delete_profile = lambda: self.host.memory.asyncDeleteProfile( + delete_profile = lambda: self.host.memory.profile_delete_async( profile, force=True ) if "GROUPBLOG" in self.host.plugins: @@ -611,10 +611,10 @@ return defer.succeed({"xmlui": error_ui.toXml()}) d = self.host.plugins["XEP-0077"].unregister(client, jid.JID(client.jid.host)) - d.addCallbacks(userDeleted, errback) + d.addCallbacks(user_deleted, errback) return d - def __deleteBlogPosts(self, posts, comments, profile): + def __delete_blog_posts(self, posts, comments, profile): """Ask for a confirmation before deleting the blog posts @param posts: delete all posts of the user (and their comments) @param comments: delete all the comments of the user on other's posts @@ -678,7 +678,7 @@ return {"xmlui": form_ui.toXml()} - def __deleteBlogPostsCb(self, posts, comments, data, profile): + def __delete_blog_posts_cb(self, posts, comments, data, profile): """Actually delete the XMPP account and SàT profile @param posts: delete all posts of the user (and their comments) @param comments: delete all the comments of the user on other's posts @@ -723,7 +723,7 @@ d.addCallbacks(deleted, errback) return d - def asyncConnectWithXMPPCredentials(self, jid_s, password): + def credentials_xmpp_connect(self, jid_s, password): """Create and connect a new SàT profile using the given XMPP credentials. Re-use given JID and XMPP password for the profile name and profile password. @@ -733,34 +733,34 @@ @raise exceptions.PasswordError, exceptions.ConflictError """ try: # be sure that the profile doesn't exist yet - self.host.memory.getProfileName(jid_s) + self.host.memory.get_profile_name(jid_s) except exceptions.ProfileUnknownError: pass else: raise exceptions.ConflictError - d = self.createProfile(password, jid_s, jid_s) + d = self.create_profile(password, jid_s, jid_s) d.addCallback( - lambda __: self.host.memory.getProfileName(jid_s) + lambda __: self.host.memory.get_profile_name(jid_s) ) # checks if the profile has been successfuly created d.addCallback(lambda profile: defer.ensureDeferred( self.host.connect(profile, password, {}, 0))) def connected(result): - self.sendEmails(None, profile=jid_s) + self.send_emails(None, profile=jid_s) return result - def removeProfile( + def remove_profile( failure ): # profile has been successfully created but the XMPP credentials are wrong! 
log.debug( "Removing previously auto-created profile: %s" % failure.getErrorMessage() ) - self.host.memory.asyncDeleteProfile(jid_s) + self.host.memory.profile_delete_async(jid_s) raise failure # FIXME: we don't catch the case where the JID host is not an XMPP server, and the user # has to wait until the DBUS timeout ; as a consequence, emails are sent to the admins - # and the profile is not deleted. When the host exists, removeProfile is well called. - d.addCallbacks(connected, removeProfile) + # and the profile is not deleted. When the host exists, remove_profile is well called. + d.addCallbacks(connected, remove_profile) return d
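The account bridge methods get fully explicit names (``libervia_account_register``, ``account_domain_new_get``, ``credentials_xmpp_connect``…). A hypothetical frontend-side call of the first one, assuming an asynchronous bridge proxy named ``bridge`` and example values; per its ``sss`` signature the arguments are email, password and profile name::

    async def create_account(bridge):
        await bridge.libervia_account_register(
            "pierre@example.net",   # email used for confirmation messages
            "some_password",        # password for the profile and XMPP account
            "pierre",               # new profile name
        )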
--- a/sat/plugins/plugin_misc_android.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_android.py Sat Apr 08 13:54:42 2023 +0200 @@ -143,7 +143,7 @@ notification_intent.setFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP) notification_intent.setAction(Intent.ACTION_MAIN) - notification_intent.addCategory(Intent.CATEGORY_LAUNCHER) + notification_intent.add_category(Intent.CATEGORY_LAUNCHER) if sat_action is not None: action_data = AndroidString(json.dumps(sat_action).encode()) log.debug(f"adding extra {INTENT_EXTRA_ACTION} ==> {action_data}") @@ -233,10 +233,10 @@ category_label=D_(PARAM_VIBRATE_CATEGORY), vibrate_param_name=PARAM_VIBRATE_NAME, vibrate_param_label=PARAM_VIBRATE_LABEL, - vibrate_options=params.makeOptions(VIBRATION_OPTS, "always"), + vibrate_options=params.make_options(VIBRATION_OPTS, "always"), ring_param_name=PARAM_RING_NAME, ring_param_label=PARAM_RING_LABEL, - ring_options=params.makeOptions(RING_OPTS, "normal"), + ring_options=params.make_options(RING_OPTS, "normal"), ) def __init__(self, host): @@ -245,7 +245,7 @@ self.host = host self._csi = host.plugins.get('XEP-0352') self._csi_timer = None - host.memory.updateParams(self.params) + host.memory.update_params(self.params) try: os.mkdir(SOCKET_DIR, 0o700) except OSError as e: @@ -268,15 +268,15 @@ raise e # we set a low priority because we want the notification to be sent after all # plugins have done their job - host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=-1000) + host.trigger.add("messageReceived", self.message_received_trigger, priority=-1000) # profiles autoconnection - host.bridge.addMethod( - "profileAutoconnectGet", + host.bridge.add_method( + "profile_autoconnect_get", ".plugin", in_sign="", out_sign="s", - method=self._profileAutoconnectGet, + method=self._profile_autoconnect_get, async_=True, ) @@ -284,7 +284,7 @@ self.am = activity.getSystemService(Context.AUDIO_SERVICE) # sound notification - media_dir = Path(host.memory.getConfig("", "media_dir")) + media_dir = Path(host.memory.config_get("", "media_dir")) assert media_dir is not None notif_path = media_dir / "sounds" / "notifications" / "music-box.mp3" self.notif_player = MediaPlayer() @@ -297,20 +297,20 @@ log.info("SSL Android patch applied") # DNS fix - defer.ensureDeferred(self.updateResolver()) + defer.ensureDeferred(self.update_resolver()) # Connectivity handling self.cm = activity.getSystemService(Context.CONNECTIVITY_SERVICE) self._net_type = None - d = defer.ensureDeferred(self._checkConnectivity()) - d.addErrback(host.logErrback) + d = defer.ensureDeferred(self._check_connectivity()) + d.addErrback(host.log_errback) # XXX: we need to keep a reference to BroadcastReceiver to avoid # "XXX has no attribute 'invoke'" error (looks like the same issue as # https://github.com/kivy/pyjnius/issues/59) self.br = BroadcastReceiver( callback=lambda *args, **kwargs: reactor.callFromThread( - self.onConnectivityChange + self.on_connectivity_change ), actions=["android.net.conn.CONNECTIVITY_CHANGE"] ) @@ -326,29 +326,29 @@ previous_state = self._state self._state = new_state if new_state == STATE_RUNNING: - self._onRunning(previous_state) + self._on_running(previous_state) elif new_state == STATE_PAUSED: - self._onPaused(previous_state) + self._on_paused(previous_state) elif new_state == STATE_STOPPED: - self._onStopped(previous_state) + self._on_stopped(previous_state) @property def cagou_active(self): return self._state == STATE_RUNNING - def _onRunning(self, previous_state): + def _on_running(self, previous_state): if 
previous_state is not None: - self.host.bridge.bridgeReactivateSignals() - self.setActive() + self.host.bridge.bridge_reactivate_signals() + self.set_active() - def _onPaused(self, previous_state): - self.host.bridge.bridgeDeactivateSignals() - self.setInactive() + def _on_paused(self, previous_state): + self.host.bridge.bridge_deactivate_signals() + self.set_inactive() - def _onStopped(self, previous_state): - self.setInactive() + def _on_stopped(self, previous_state): + self.set_inactive() - def _notifyMessage(self, mess_data, client): + def _notify_message(self, mess_data, client): """Send notification when suitable notification is sent if: @@ -378,7 +378,7 @@ ringer_mode = self.am.getRingerMode() vibrate_mode = ringer_mode == AudioManager.RINGER_MODE_VIBRATE - ring_setting = self.host.memory.getParamA( + ring_setting = self.host.memory.param_get_a( PARAM_RING_NAME, PARAM_RING_CATEGORY, profile_key=client.profile @@ -387,7 +387,7 @@ if ring_setting != 'never' and ringer_mode == AudioManager.RINGER_MODE_NORMAL: self.notif_player.start() - vibration_setting = self.host.memory.getParamA( + vibration_setting = self.host.memory.param_get_a( PARAM_VIBRATE_NAME, PARAM_VIBRATE_CATEGORY, profile_key=client.profile @@ -400,27 +400,27 @@ log.warning("Can't use vibrator: {e}".format(e=e)) return mess_data - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): if not self.cagou_active: # we only send notification is the frontend is not displayed - post_treat.addCallback(self._notifyMessage, client) + post_treat.addCallback(self._notify_message, client) return True # Profile autoconnection - def _profileAutoconnectGet(self): - return defer.ensureDeferred(self.profileAutoconnectGet()) + def _profile_autoconnect_get(self): + return defer.ensureDeferred(self.profile_autoconnect_get()) - async def _getProfilesAutoconnect(self): - autoconnect_dict = await self.host.memory.storage.getIndParamValues( + async def _get_profiles_autoconnect(self): + autoconnect_dict = await self.host.memory.storage.get_ind_param_values( category='Connection', name='autoconnect_backend', ) return [p for p, v in autoconnect_dict.items() if C.bool(v)] - async def profileAutoconnectGet(self): + async def profile_autoconnect_get(self): """Return profile to connect automatically by frontend, if any""" - profiles_autoconnect = await self._getProfilesAutoconnect() + profiles_autoconnect = await self._get_profiles_autoconnect() if not profiles_autoconnect: return None if len(profiles_autoconnect) > 1: @@ -431,56 +431,56 @@ # CSI - def _setInactive(self): + def _set_inactive(self): self._csi_timer = None - for client in self.host.getClients(C.PROF_KEY_ALL): - self._csi.setInactive(client) + for client in self.host.get_clients(C.PROF_KEY_ALL): + self._csi.set_inactive(client) - def setInactive(self): + def set_inactive(self): if self._csi is None or self._csi_timer is not None: return - self._csi_timer = reactor.callLater(CSI_DELAY, self._setInactive) + self._csi_timer = reactor.callLater(CSI_DELAY, self._set_inactive) - def setActive(self): + def set_active(self): if self._csi is None: return if self._csi_timer is not None: self._csi_timer.cancel() self._csi_timer = None - for client in self.host.getClients(C.PROF_KEY_ALL): - self._csi.setActive(client) + for client in self.host.get_clients(C.PROF_KEY_ALL): + self._csi.set_active(client) # Connectivity - async def _handleNetworkChange(self, net_type): + async def _handle_network_change(self, net_type): 
"""Notify the clients about network changes. This way the client can disconnect/reconnect transport, or change delays """ log.debug(f"handling network change ({net_type})") if net_type == NET_TYPE_NONE: - for client in self.host.getClients(C.PROF_KEY_ALL): - client.networkDisabled() + for client in self.host.get_clients(C.PROF_KEY_ALL): + client.network_disabled() else: # DNS servers may have changed - await self.updateResolver() + await self.update_resolver() # client may be there but disabled (e.g. with stream management) - for client in self.host.getClients(C.PROF_KEY_ALL): + for client in self.host.get_clients(C.PROF_KEY_ALL): log.debug(f"enabling network for {client.profile}") - client.networkEnabled() + client.network_enabled() # profiles may have been disconnected and then purged, we try # to reconnect them in case - profiles_autoconnect = await self._getProfilesAutoconnect() + profiles_autoconnect = await self._get_profiles_autoconnect() for profile in profiles_autoconnect: - if not self.host.isConnected(profile): + if not self.host.is_connected(profile): log.info(f"{profile} is not connected, reconnecting it") try: await self.host.connect(profile) except Exception as e: log.error(f"Can't connect profile {profile}: {e}") - async def _checkConnectivity(self): + async def _check_connectivity(self): active_network = self.cm.getActiveNetworkInfo() if active_network is None: net_type = NET_TYPE_NONE @@ -506,24 +506,24 @@ log.info("network activated (type={net_type_android})" .format(net_type_android=net_type_android)) else: - log.debug("_checkConnectivity called without network change ({net_type})" + log.debug("_check_connectivity called without network change ({net_type})" .format(net_type = net_type)) - # we always call _handleNetworkChange even if there is not connectivity change + # we always call _handle_network_change even if there is not connectivity change # to be sure to reconnect when necessary - await self._handleNetworkChange(net_type) + await self._handle_network_change(net_type) - def onConnectivityChange(self): - log.debug("onConnectivityChange called") - d = defer.ensureDeferred(self._checkConnectivity()) - d.addErrback(self.host.logErrback) + def on_connectivity_change(self): + log.debug("on_connectivity_change called") + d = defer.ensureDeferred(self._check_connectivity()) + d.addErrback(self.host.log_errback) - async def updateResolver(self): + async def update_resolver(self): # There is no "/etc/resolv.conf" on Android, which confuse Twisted and makes # SRV record checking unusable. We fixe that by checking DNS server used, and # updating Twisted's resolver accordingly - dns_servers = await self.getDNSServers() + dns_servers = await self.get_dns_servers() log.info( "Patching Twisted to use Android DNS resolver ({dns_servers})".format( @@ -531,7 +531,7 @@ ) dns_client.theResolver = dns_client.createResolver(servers=dns_servers) - async def getDNSServers(self): + async def get_dns_servers(self): servers = [] if api_version < 26:
--- a/sat/plugins/plugin_misc_app_manager.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_app_manager.py Sat Apr 08 13:54:42 2023 +0200 @@ -84,31 +84,31 @@ self._started = {} # instance id to app data map self._instances = {} - host.bridge.addMethod( - "applicationsList", + host.bridge.add_method( + "applications_list", ".plugin", in_sign="as", out_sign="as", method=self.list_applications, ) - host.bridge.addMethod( - "applicationStart", + host.bridge.add_method( + "application_start", ".plugin", in_sign="ss", out_sign="s", method=self._start, async_=True, ) - host.bridge.addMethod( - "applicationStop", + host.bridge.add_method( + "application_stop", ".plugin", in_sign="sss", out_sign="", method=self._stop, async_=True, ) - host.bridge.addMethod( - "applicationExposedGet", + host.bridge.add_method( + "application_exposed_get", ".plugin", in_sign="sss", out_sign="s", @@ -117,12 +117,12 @@ ) # application has been started succeesfully, # args: name, instance_id, extra - host.bridge.addSignal( + host.bridge.add_signal( "application_started", ".plugin", signature="sss" ) # application went wrong with the application # args: name, instance_id, extra - host.bridge.addSignal( + host.bridge.add_signal( "application_error", ".plugin", signature="sss" ) yaml.add_constructor( @@ -169,7 +169,7 @@ "expected" ) - value = self.host.memory.getConfig(section, name, default) + value = self.host.memory.config_get(section, name, default) # FIXME: "public_url" is used only here and doesn't take multi-sites into account if name == "public_url" and (not value or value.startswith('http')): if not value: @@ -408,7 +408,7 @@ log.info(f"{app_name!r} is already started or being started") return ret_data else: - cache_path = self.host.memory.getCachePath( + cache_path = self.host.memory.get_cache_path( PLUGIN_INFO[C.PI_IMPORT_NAME], app_name ) cache_path.mkdir(0o700, parents=True, exist_ok=True)
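Besides methods, bridge signals are registered with the renamed add_signal. A hedged sketch of how the application_started signal declared above could be fired later; the emission as an attribute call is an assumption inferred from its "sss" signature and from the way the debug plugin dispatches signals via getattr on the bridge, and the data_format import path is assumed:

    from sat.tools.common import data_format   # serialiser used across this changeset

    def notify_started(host, app_name: str, instance_id: str, extra: dict) -> None:
        # matches the "sss" signature registered above: name, instance id, serialised extra
        host.bridge.application_started(app_name, instance_id, data_format.serialise(extra))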
--- a/sat/plugins/plugin_misc_attach.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_attach.py Sat Apr 08 13:54:42 2023 +0200 @@ -58,10 +58,10 @@ log.info(_("plugin Attach initialization")) self.host = host self._u = host.plugins["UPLOAD"] - host.trigger.add("sendMessage", self._sendMessageTrigger) - host.trigger.add("sendMessageComponent", self._sendMessageTrigger) + host.trigger.add("sendMessage", self._send_message_trigger) + host.trigger.add("sendMessageComponent", self._send_message_trigger) self._attachments_handlers = {'clear': [], 'encrypted': []} - self.register(self.defaultCanHandle, self.defaultAttach, False, -1000) + self.register(self.default_can_handle, self.default_attach, False, -1000) def register(self, can_handle, attach, encrypted=False, priority=0): """Register an attachments handler @@ -94,7 +94,7 @@ handlers.sort(key=lambda h: h.priority, reverse=True) log.debug(f"new attachments handler: {handler}") - async def attachFiles(self, client, data): + async def attach_files(self, client, data): """Main method to attach file It will do generic pre-treatment, and call the suitable attachments handler @@ -135,13 +135,13 @@ _("Can't resize attachment of type {main_type!r}: {attachment}") .format(main_type=main_type, attachment=attachment)) - if client.encryption.isEncryptionRequested(data): + if client.encryption.is_encryption_requested(data): handlers = self._attachments_handlers['encrypted'] else: handlers = self._attachments_handlers['clear'] for handler in handlers: - can_handle = await utils.asDeferred(handler.can_handle, client, data) + can_handle = await utils.as_deferred(handler.can_handle, client, data) if can_handle: break else: @@ -150,7 +150,7 @@ destinee = data['to'] )) - await utils.asDeferred(handler.attach, client, data) + await utils.as_deferred(handler.attach, client, data) for dir_path in tmp_dirs_to_clean: log.debug(f"Cleaning temporary directory at {dir_path}") @@ -220,7 +220,7 @@ progress_id = attachment.pop("progress_id", None) if progress_id: extra["progress_id"] = progress_id - check_certificate = self.host.memory.getParamA( + check_certificate = self.host.memory.param_get_a( "check_certificate", "Connection", profile_key=client.profile) if not check_certificate: extra['ignore_tls_errors'] = True @@ -251,19 +251,19 @@ return data - def _attachFiles(self, data, client): - return defer.ensureDeferred(self.attachFiles(client, data)) + def _attach_files(self, data, client): + return defer.ensureDeferred(self.attach_files(client, data)) - def _sendMessageTrigger( + def _send_message_trigger( self, client, mess_data, pre_xml_treatments, post_xml_treatments): if mess_data['extra'].get(C.KEY_ATTACHMENTS): - post_xml_treatments.addCallback(self._attachFiles, client=client) + post_xml_treatments.addCallback(self._attach_files, client=client) return True - async def defaultCanHandle(self, client, data): + async def default_can_handle(self, client, data): return True - async def defaultAttach(self, client, data): + async def default_attach(self, client, data): await self.upload_files(client, data) # TODO: handle xhtml-im body_elt = data["xml"].body
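A minimal sketch of a third-party handler plugging into the renamed attachments API above; the plugin lookup key "ATTACH" and the handler bodies are assumptions, only register() and the (client, data) callback shape come from this hunk:

    from sat.core.constants import Const as C

    async def can_handle_my_files(client, data):
        # same duck-typed (client, data) signature as default_can_handle above
        return bool(data["extra"].get(C.KEY_ATTACHMENTS))

    async def attach_my_files(client, data):
        ...  # upload the attachments and reference them in data["xml"]

    def register_my_handler(host):
        attach_plugin = host.plugins["ATTACH"]   # import name assumed
        attach_plugin.register(
            can_handle_my_files, attach_my_files, encrypted=False, priority=100
        )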
--- a/sat/plugins/plugin_misc_debug.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_misc_debug.py Sat Apr 08 13:54:42 2023 +0200
@@ -40,15 +40,15 @@
     def __init__(self, host):
         log.info(_("Plugin Debug initialization"))
         self.host = host
-        host.bridge.addMethod(
-            "debugFakeSignal",
+        host.bridge.add_method(
+            "debug_signal_fake",
             ".plugin",
             in_sign="sss",
             out_sign="",
-            method=self._fakeSignal,
+            method=self._fake_signal,
         )

-    def _fakeSignal(self, signal, arguments, profile_key):
+    def _fake_signal(self, signal, arguments, profile_key):
         """send a signal from backend

         @param signal(str): name of the signal
@@ -58,6 +58,6 @@
         args = json.loads(arguments)
         method = getattr(self.host.bridge, signal)
         if profile_key != C.PROF_KEY_NONE:
-            profile = self.host.memory.getProfileName(profile_key)
+            profile = self.host.memory.get_profile_name(profile_key)
             args.append(profile)
         method(*args)
--- a/sat/plugins/plugin_misc_download.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_download.py Sat Apr 08 13:54:42 2023 +0200 @@ -54,20 +54,20 @@ def __init__(self, host): log.info(_("plugin Download initialization")) self.host = host - host.bridge.addMethod( - "fileDownload", + host.bridge.add_method( + "file_download", ".plugin", in_sign="ssss", out_sign="s", - method=self._fileDownload, + method=self._file_download, async_=True, ) - host.bridge.addMethod( - "fileDownloadComplete", + host.bridge.add_method( + "file_download_complete", ".plugin", in_sign="ssss", out_sign="s", - method=self._fileDownloadComplete, + method=self._file_download_complete, async_=True, ) self._download_callbacks = {} @@ -75,11 +75,11 @@ self.register_scheme('http', self.download_http) self.register_scheme('https', self.download_http) - def _fileDownload( + def _file_download( self, attachment_s: str, dest_path: str, extra_s: str, profile: str ) -> defer.Deferred: d = defer.ensureDeferred(self.file_download( - self.host.getClient(profile), + self.host.get_client(profile), data_format.deserialise(attachment_s), Path(dest_path), data_format.deserialise(extra_s) @@ -118,11 +118,11 @@ else: return {"progress": progress_id} - def _fileDownloadComplete( + def _file_download_complete( self, attachment_s: str, dest_path: str, extra_s: str, profile: str ) -> defer.Deferred: d = defer.ensureDeferred(self.file_download_complete( - self.host.getClient(profile), + self.host.get_client(profile), data_format.deserialise(attachment_s), Path(dest_path), data_format.deserialise(extra_s) @@ -168,7 +168,7 @@ # we hash the URL to have an unique identifier, and avoid double download url_hash = hashlib.sha256(uri_parsed.geturl().encode()).hexdigest() cache_uid = f"{stem}_{url_hash}" - cache_data = client.cache.getMetadata(cache_uid) + cache_data = client.cache.get_metadata(cache_uid) if cache_data is not None: # file is already in cache, we return it download_d = defer.succeed(cache_data['path']) @@ -176,14 +176,14 @@ else: # the file is not in cache unique_name = '.'.join([cache_uid] + suffixes) - with client.cache.cacheData( + with client.cache.cache_data( "DOWNLOAD", cache_uid, filename=unique_name) as f: # we close the file and only use its name, the file will be opened # by the registered callback dest_path = Path(f.name) # should we check certificates? - check_certificate = self.host.memory.getParamA( + check_certificate = self.host.memory.param_get_a( "check_certificate", "Connection", profile_key=client.profile) if not check_certificate: extra['ignore_tls_errors'] = True @@ -203,7 +203,7 @@ "Can't download URI {uri}: {reason}").format( uri=uri, reason=e)) if cache_uid is not None: - client.cache.removeFromCache(cache_uid) + client.cache.remove_from_cache(cache_uid) elif dest_path.exists(): dest_path.unlink() raise e
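A small sketch of registering an extra URI scheme with the renamed download plugin; register_scheme() comes from this hunk, while the plugin lookup key, the scheme and the callback signature are assumptions:

    async def download_ftp(client, attachment, source_url, dest_path, extra):
        ...  # fetch source_url and write the result to dest_path (signature assumed)

    def register_ftp_scheme(host):
        download_plugin = host.plugins["DOWNLOAD"]   # import name assumed
        download_plugin.register_scheme("ftp", download_ftp)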
--- a/sat/plugins/plugin_misc_email_invitation.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_email_invitation.py Sat Apr 08 13:54:42 2023 +0200 @@ -77,30 +77,30 @@ log.info(_("plugin Invitations initialization")) self.host = host self.invitations = persistent.LazyPersistentBinaryDict('invitations') - host.bridge.addMethod("invitationCreate", ".plugin", in_sign='sasssssssssa{ss}s', + host.bridge.add_method("invitation_create", ".plugin", in_sign='sasssssssssa{ss}s', out_sign='a{ss}', method=self._create, async_=True) - host.bridge.addMethod("invitationGet", ".plugin", in_sign='s', out_sign='a{ss}', + host.bridge.add_method("invitation_get", ".plugin", in_sign='s', out_sign='a{ss}', method=self.get, async_=True) - host.bridge.addMethod("invitationDelete", ".plugin", in_sign='s', out_sign='', + host.bridge.add_method("invitation_delete", ".plugin", in_sign='s', out_sign='', method=self._delete, async_=True) - host.bridge.addMethod("invitationModify", ".plugin", in_sign='sa{ss}b', + host.bridge.add_method("invitation_modify", ".plugin", in_sign='sa{ss}b', out_sign='', method=self._modify, async_=True) - host.bridge.addMethod("invitationList", ".plugin", in_sign='s', + host.bridge.add_method("invitation_list", ".plugin", in_sign='s', out_sign='a{sa{ss}}', method=self._list, async_=True) - host.bridge.addMethod("invitationSimpleCreate", ".plugin", in_sign='sssss', + host.bridge.add_method("invitation_simple_create", ".plugin", in_sign='sssss', out_sign='a{ss}', - method=self._simpleCreate, + method=self._simple_create, async_=True) - def checkExtra(self, extra): + def check_extra(self, extra): if EXTRA_RESERVED.intersection(extra): raise ValueError( _("You can't use following key(s) in extra, they are reserved: {}") @@ -132,7 +132,7 @@ kwargs[key] = str(value) return defer.ensureDeferred(self.create(**kwargs)) - async def getExistingInvitation(self, email: Optional[str]) -> Optional[dict]: + async def get_existing_invitation(self, email: Optional[str]) -> Optional[dict]: """Retrieve existing invitation with given email @param email: check if any invitation exist with this email @@ -151,7 +151,7 @@ invitation[KEY_ID] = id_ return invitation - async def _createAccountAndProfile( + async def _create_account_and_profile( self, id_: str, kwargs: dict, @@ -161,7 +161,7 @@ ## XMPP account creation password = kwargs.pop('password', None) if password is None: - password = utils.generatePassword() + password = utils.generate_password() assert password # XXX: password is here saved in clear in database # it is needed for invitation as the same password is used for profile @@ -173,7 +173,7 @@ jid_ = kwargs.pop('jid_', None) if not jid_: - domain = self.host.memory.getConfig(None, 'xmpp_domain') + domain = self.host.memory.config_get(None, 'xmpp_domain') if not domain: # TODO: fallback to profile's domain raise ValueError(_("You need to specify xmpp_domain in sat.conf")) @@ -186,7 +186,7 @@ # we don't register account if there is no user as anonymous login is then # used try: - await self.host.plugins['XEP-0077'].registerNewAccount(jid_, password) + await self.host.plugins['XEP-0077'].register_new_account(jid_, password) except error.StanzaError as e: prefix = jid_.user idx = 0 @@ -197,7 +197,7 @@ log.info(_("requested jid already exists, trying with {}".format( jid_.full()))) try: - await self.host.plugins['XEP-0077'].registerNewAccount( + await self.host.plugins['XEP-0077'].register_new_account( jid_, password ) @@ -216,11 +216,11 @@ uuid=id_ ) # profile creation should not fail as we 
generate unique name ourselves - await self.host.memory.createProfile(guest_profile, password) - await self.host.memory.startSession(password, guest_profile) - await self.host.memory.setParam("JabberID", jid_.full(), "Connection", + await self.host.memory.create_profile(guest_profile, password) + await self.host.memory.start_session(password, guest_profile) + await self.host.memory.param_set("JabberID", jid_.full(), "Connection", profile_key=guest_profile) - await self.host.memory.setParam("Password", password, "Connection", + await self.host.memory.param_set("Password", password, "Connection", profile_key=guest_profile) async def create(self, **kwargs): @@ -290,11 +290,11 @@ _("You can't use following key(s) in both args and extra: {}").format( ', '.join(set(kwargs).intersection(extra)))) - self.checkExtra(extra) + self.check_extra(extra) email = kwargs.pop('email', None) - existing = await self.getExistingInvitation(email) + existing = await self.get_existing_invitation(email) if existing is not None: log.info(f"There is already an invitation for {email!r}") extra.update(existing) @@ -316,7 +316,7 @@ id_ = existing[KEY_ID] if existing else str(shortuuid.uuid()) if existing is None: - await self._createAccountAndProfile(id_, kwargs, extra) + await self._create_account_and_profile(id_, kwargs, extra) profile = kwargs.pop('profile', None) guest_profile = extra[KEY_GUEST_PROFILE] @@ -333,8 +333,8 @@ pass else: await self.host.connect(guest_profile, password) - guest_client = self.host.getClient(guest_profile) - await id_plugin.setIdentity(guest_client, {'nicknames': [name]}) + guest_client = self.host.get_client(guest_profile) + await id_plugin.set_identity(guest_client, {'nicknames': [name]}) await self.host.disconnect(guest_profile) ## email @@ -370,7 +370,7 @@ invite_url = url_template.format(**format_args) format_args['url'] = invite_url - await sat_email.sendEmail( + await sat_email.send_email( self.host.memory.config, [email] + emails_extra, (kwargs.pop('message_subject', None) or DEFAULT_SUBJECT).format( @@ -384,11 +384,11 @@ # FIXME: a parameter to disable auto roster adding would be nice if profile is not None: try: - client = self.host.getClient(profile) + client = self.host.get_client(profile) except Exception as e: log.error(f"Can't get host profile: {profile}: {e}") else: - await self.host.updateContact(client, jid_, name, ['guests']) + await self.host.contact_update(client, jid_, name, ['guests']) if kwargs: log.warning(_("Not all arguments have been consumed: {}").format(kwargs)) @@ -400,20 +400,20 @@ return extra - def _simpleCreate(self, invitee_email, invitee_name, url_template, extra_s, profile): - client = self.host.getClient(profile) + def _simple_create(self, invitee_email, invitee_name, url_template, extra_s, profile): + client = self.host.get_client(profile) # FIXME: needed because python-dbus use a specific string class invitee_email = str(invitee_email) invitee_name = str(invitee_name) url_template = str(url_template) extra = data_format.deserialise(extra_s) d = defer.ensureDeferred( - self.simpleCreate(client, invitee_email, invitee_name, url_template, extra) + self.simple_create(client, invitee_email, invitee_name, url_template, extra) ) d.addCallback(lambda data: {k: str(v) for k,v in data.items()}) return d - async def simpleCreate( + async def simple_create( self, client, invitee_email, invitee_name, url_template, extra): """Simplified method to invite somebody by email""" return await self.create( @@ -443,7 +443,7 @@ password = data['password'] try: await 
self.host.connect(guest_profile, password) - guest_client = self.host.getClient(guest_profile) + guest_client = self.host.get_client(guest_profile) # XXX: be extra careful to use guest_client and not client below, as this will # delete the associated XMPP account log.debug("deleting XMPP account") @@ -453,7 +453,7 @@ f"Can't delete {guest_profile}'s XMPP account, maybe it as already been " f"deleted: {e}") try: - await self.host.memory.asyncDeleteProfile(guest_profile, True) + await self.host.memory.profile_delete_async(guest_profile, True) except Exception as e: log.warning(f"Can't delete guest profile {guest_profile}: {e}") log.debug("removing guest data") @@ -474,8 +474,8 @@ else update them @raise KeyError: there is not invitation with this id_ """ - self.checkExtra(new_extra) - def gotCurrentData(current_data): + self.check_extra(new_extra) + def got_current_data(current_data): if replace: new_data = new_extra for k in EXTRA_RESERVED: @@ -500,7 +500,7 @@ self.invitations[id_] = new_data d = self.invitations[id_] - d.addCallback(gotCurrentData) + d.addCallback(got_current_data) return d def _list(self, profile=C.PROF_KEY_NONE):
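A hedged usage sketch of the renamed invitation API; the simple_create() signature comes from this hunk, while the plugin lookup key, the email, the name and the URL template (including its placeholder) are assumptions:

    async def invite_guest(host, client):
        invitations = host.plugins["EMAIL_INVITATION"]   # import name assumed
        return await invitations.simple_create(
            client,
            "guest@example.net",
            "Guest Name",
            "https://example.org/invitation/{uuid}",   # template and placeholder assumed
            {},
        )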
--- a/sat/plugins/plugin_misc_extra_pep.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_misc_extra_pep.py Sat Apr 08 13:54:42 2023 +0200
@@ -62,13 +62,13 @@
         "category_label": D_(PARAM_KEY),
         "param_name": PARAM_NAME,
         "param_label": D_(PARAM_LABEL),
-        "jids": "\n".join({elt.toXml() for elt in params.createJidElts(PARAM_DEFAULT)}),
+        "jids": "\n".join({elt.toXml() for elt in params.create_jid_elts(PARAM_DEFAULT)}),
     }

     def __init__(self, host):
         log.info(_("Plugin Extra PEP initialization"))
         self.host = host
-        host.memory.updateParams(self.params)
+        host.memory.update_params(self.params)

-    def getFollowedEntities(self, profile_key):
-        return self.host.memory.getParamA(PARAM_NAME, PARAM_KEY, profile_key=profile_key)
+    def get_followed_entities(self, profile_key):
+        return self.host.memory.param_get_a(PARAM_NAME, PARAM_KEY, profile_key=profile_key)
--- a/sat/plugins/plugin_misc_file.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_file.py Sat Apr 08 13:54:42 2023 +0200 @@ -69,24 +69,24 @@ def __init__(self, host): log.info(_("plugin File initialization")) self.host = host - host.bridge.addMethod( - "fileSend", + host.bridge.add_method( + "file_send", ".plugin", in_sign="ssssss", out_sign="a{ss}", - method=self._fileSend, + method=self._file_send, async_=True, ) self._file_managers = [] - host.importMenu( + host.import_menu( (D_("Action"), D_("send file")), - self._fileSendMenu, + self._file_send_menu, security_limit=10, help_string=D_("Send a file"), type_=C.MENU_SINGLE, ) - def _fileSend( + def _file_send( self, peer_jid_s: str, filepath: str, @@ -95,13 +95,13 @@ extra_s: str, profile: str = C.PROF_KEY_NONE ) -> defer.Deferred: - client = self.host.getClient(profile) - return defer.ensureDeferred(self.fileSend( + client = self.host.get_client(profile) + return defer.ensureDeferred(self.file_send( client, jid.JID(peer_jid_s), filepath, name or None, file_desc or None, data_format.deserialise(extra_s) )) - async def fileSend( + async def file_send( self, client, peer_jid, filepath, filename=None, file_desc=None, extra=None ): """Send a file using best available method @@ -119,7 +119,7 @@ if not filename: filename = os.path.basename(filepath) or "_" for manager, priority in self._file_managers: - if await utils.asDeferred(manager.canHandleFileSend, + if await utils.as_deferred(manager.can_handle_file_send, client, peer_jid, filepath): try: method_name = manager.name @@ -131,8 +131,8 @@ ) ) try: - progress_id = await utils.asDeferred( - manager.fileSend, client, peer_jid, filepath, filename, file_desc, + progress_id = await utils.as_deferred( + manager.file_send, client, peer_jid, filepath, filename, file_desc, extra ) except Exception as e: @@ -155,15 +155,15 @@ ).toXml() } - def _onFileChoosed(self, peer_jid, data, profile): - client = self.host.getClient(profile) + def _on_file_choosed(self, peer_jid, data, profile): + client = self.host.get_client(profile) cancelled = C.bool(data.get("cancelled", C.BOOL_FALSE)) if cancelled: return path = data["path"] - return self.fileSend(client, peer_jid, path) + return self.file_send(client, peer_jid, path) - def _fileSendMenu(self, data, profile): + def _file_send_menu(self, data, profile): """ XMLUI activated by menu: return file sending UI @param profile: %(doc_profile)s @@ -173,8 +173,8 @@ except RuntimeError: raise exceptions.DataError(_("Invalid JID")) - file_choosed_id = self.host.registerCallback( - partial(self._onFileChoosed, jid_), + file_choosed_id = self.host.register_callback( + partial(self._on_file_choosed, jid_), with_data=True, one_shot=True, ) @@ -193,7 +193,7 @@ def register(self, manager, priority: int = 0) -> None: """Register a fileSending manager - @param manager: object implementing canHandleFileSend, and fileSend methods + @param manager: object implementing can_handle_file_send, and file_send methods @param priority: pririoty of this manager, the higher available will be used """ m_data = (manager, priority) @@ -201,9 +201,9 @@ raise exceptions.ConflictError( f"Manager {manager} is already registered" ) - if not hasattr(manager, "canHandleFileSend") or not hasattr(manager, "fileSend"): + if not hasattr(manager, "can_handle_file_send") or not hasattr(manager, "file_send"): raise ValueError( - f'{manager} must have both "canHandleFileSend" and "fileSend" methods to ' + f'{manager} must have both "can_handle_file_send" and "file_send" methods to ' 'be 
registered') self._file_managers.append(m_data) self._file_managers.sort(key=lambda m: m[1], reverse=True) @@ -219,7 +219,7 @@ # Dialogs with user # the overwrite check is done here - def openFileWrite(self, client, file_path, transfer_data, file_data, stream_object): + def open_file_write(self, client, file_path, transfer_data, file_data, stream_object): """create SatFile or FileStremaObject for the requested file and fill suitable data """ if stream_object: @@ -245,15 +245,15 @@ data_cb=file_data.get("data_cb"), ) - async def _gotConfirmation( + async def _got_confirmation( self, client, data, peer_jid, transfer_data, file_data, stream_object ): """Called when the permission and dest path have been received @param peer_jid(jid.JID): jid of the file sender - @param transfer_data(dict): same as for [self.getDestDir] - @param file_data(dict): same as for [self.getDestDir] - @param stream_object(bool): same as for [self.getDestDir] + @param transfer_data(dict): same as for [self.get_dest_dir] + @param file_data(dict): same as for [self.get_dest_dir] + @param stream_object(bool): same as for [self.get_dest_dir] return (bool): True if copy is wanted and OK False if user wants to cancel if file exists ask confirmation and call again self._getDestDir if needed @@ -266,7 +266,7 @@ # we manage case where file already exists if os.path.exists(file_path): - overwrite = await xml_tools.deferConfirm( + overwrite = await xml_tools.defer_confirm( self.host, _(CONFIRM_OVERWRITE).format(file_path), _(CONFIRM_OVERWRITE_TITLE), @@ -280,12 +280,12 @@ ) if not overwrite: - return await self.getDestDir(client, peer_jid, transfer_data, file_data) + return await self.get_dest_dir(client, peer_jid, transfer_data, file_data) - self.openFileWrite(client, file_path, transfer_data, file_data, stream_object) + self.open_file_write(client, file_path, transfer_data, file_data, stream_object) return True - async def getDestDir( + async def get_dest_dir( self, client, peer_jid, transfer_data, file_data, stream_object=False ): """Request confirmation and destination dir to user @@ -296,7 +296,7 @@ @param filename(unicode): name of the file @param transfer_data(dict): data of the transfer session, it will be only used to store the file_obj. - "file_obj" (or "stream_object") key *MUST NOT* exist before using getDestDir + "file_obj" (or "stream_object") key *MUST NOT* exist before using get_dest_dir @param file_data(dict): information about the file to be transfered It MUST contain the following keys: - peer_jid (jid.JID): other peer jid @@ -314,7 +314,7 @@ a stream.FileStreamObject will be used return: True if transfer is accepted """ - cont, ret_value = await self.host.trigger.asyncReturnPoint( + cont, ret_value = await self.host.trigger.async_return_point( "FILE_getDestDir", client, peer_jid, transfer_data, file_data, stream_object ) if not cont: @@ -323,8 +323,8 @@ assert filename and not "/" in filename assert PROGRESS_ID_KEY in file_data # human readable size - file_data["size_human"] = common_utils.getHumanSize(file_data["size"]) - resp_data = await xml_tools.deferDialog( + file_data["size_human"] = common_utils.get_human_size(file_data["size"]) + resp_data = await xml_tools.defer_dialog( self.host, _(CONFIRM).format(peer=peer_jid.full(), **file_data), _(CONFIRM_TITLE), @@ -339,7 +339,7 @@ profile=client.profile, ) - accepted = await self._gotConfirmation( + accepted = await self._got_confirmation( client, resp_data, peer_jid,
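A minimal sketch of a file-sending manager compatible with the duck-typed check above: register() refuses any object lacking can_handle_file_send and file_send. The class, its name value and the plugin lookup key are assumptions; the method signatures mirror the calls made in file_send() above:

    class DummyFileSender:
        name = "dummy"   # the plugin reads manager.name for logging

        async def can_handle_file_send(self, client, peer_jid, filepath):
            return True

        async def file_send(self, client, peer_jid, filepath, filename, file_desc, extra):
            ...  # start the transfer and return a progress id
            return "some-progress-id"

    def register_dummy_sender(host):
        file_plugin = host.plugins["FILE"]   # import name assumed
        file_plugin.register(DummyFileSender(), priority=-100)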
--- a/sat/plugins/plugin_misc_forums.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_forums.py Sat Apr 08 13:54:42 2023 +0200 @@ -62,26 +62,26 @@ self._p.OPT_SEND_ITEM_SUBSCRIBE: 1, self._p.OPT_PUBLISH_MODEL: self._p.ACCESS_OPEN, } - host.registerNamespace('forums', NS_FORUMS) - host.bridge.addMethod("forumsGet", ".plugin", + host.register_namespace('forums', NS_FORUMS) + host.bridge.add_method("forums_get", ".plugin", in_sign='ssss', out_sign='s', method=self._get, async_=True) - host.bridge.addMethod("forumsSet", ".plugin", + host.bridge.add_method("forums_set", ".plugin", in_sign='sssss', out_sign='', method=self._set, async_=True) - host.bridge.addMethod("forumTopicsGet", ".plugin", + host.bridge.add_method("forum_topics_get", ".plugin", in_sign='ssa{ss}s', out_sign='(aa{ss}s)', - method=self._getTopics, + method=self._get_topics, async_=True) - host.bridge.addMethod("forumTopicCreate", ".plugin", + host.bridge.add_method("forum_topic_create", ".plugin", in_sign='ssa{ss}s', out_sign='', - method=self._createTopic, + method=self._create_topic, async_=True) @defer.inlineCallbacks - def _createForums(self, client, forums, service, node, forums_elt=None, names=None): + def _create_forums(self, client, forums, service, node, forums_elt=None, names=None): """Recursively create <forums> element(s) @param forums(list): forums which may have subforums @@ -115,7 +115,7 @@ log.info(_("creating missing forum node")) forum_node = FORUM_TOPICS_NODE_TPL.format(node=node, uuid=shortuuid.uuid()) yield self._p.createNode(client, service, forum_node, self._node_options) - value = uri.buildXMPPUri('pubsub', + value = uri.build_xmpp_uri('pubsub', path=service.full(), node=forum_node) if key in FORUM_ATTR: @@ -124,7 +124,7 @@ forum_elt.addElement(key, content=value) elif key == 'sub-forums': sub_forums_elt = forum_elt.addElement('forums') - yield self._createForums(client, value, service, node, sub_forums_elt, names=names) + yield self._create_forums(client, value, service, node, sub_forums_elt, names=names) else: log.warning(_("Unknown forum attribute: {key}").format(key=key)) if not forum_elt.getAttribute('title'): @@ -137,7 +137,7 @@ raise ValueError(_("forum need uri or sub-forums")) defer.returnValue(forums_elt) - def _parseForums(self, parent_elt=None, forums=None): + def _parse_forums(self, parent_elt=None, forums=None): """Recursivly parse a <forums> elements and return corresponding forums data @param item(domish.Element): item with <forums> element @@ -170,7 +170,7 @@ data[elt.name] = str(elt) elif elt.name == 'forums': sub_forums = data['sub-forums'] = [] - self._parseForums(elt, sub_forums) + self._parse_forums(elt, sub_forums) if not 'title' in data or not {'uri', 'sub-forums'}.intersection(data): log.warning(_("invalid forum, ignoring: {xml}").format(xml=forum_elt.toXml())) else: @@ -181,7 +181,7 @@ return forums def _get(self, service=None, node=None, forums_key=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) if service.strip(): service = jid.JID(service) else: @@ -199,14 +199,14 @@ node = NS_FORUMS if forums_key is None: forums_key = 'default' - items_data = await self._p.getItems(client, service, node, item_ids=[forums_key]) + items_data = await self._p.get_items(client, service, node, item_ids=[forums_key]) item = items_data[0][0] # we have the item and need to convert it to json - forums = self._parseForums(item) + forums = self._parse_forums(item) return forums def _set(self, forums, 
service=None, node=None, forums_key=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) forums = json.loads(forums) if service.strip(): service = jid.JID(service) @@ -241,16 +241,16 @@ node = NS_FORUMS if forums_key is None: forums_key = 'default' - forums_elt = await self._createForums(client, forums, service, node) - return await self._p.sendItem( + forums_elt = await self._create_forums(client, forums, service, node) + return await self._p.send_item( client, service, node, forums_elt, item_id=forums_key ) - def _getTopics(self, service, node, extra=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) - extra = self._p.parseExtra(extra) + def _get_topics(self, service, node, extra=None, profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) + extra = self._p.parse_extra(extra) d = defer.ensureDeferred( - self.getTopics( + self.get_topics( client, jid.JID(service), node, rsm_request=extra.rsm_request, extra=extra.extra ) @@ -260,12 +260,12 @@ ) return d - async def getTopics(self, client, service, node, rsm_request=None, extra=None): + async def get_topics(self, client, service, node, rsm_request=None, extra=None): """Retrieve topics data Topics are simple microblog URIs with some metadata duplicated from first post """ - topics_data = await self._p.getItems( + topics_data = await self._p.get_items( client, service, node, rsm_request=rsm_request, extra=extra ) topics = [] @@ -279,13 +279,13 @@ topics.append(topic) return (topics, metadata) - def _createTopic(self, service, node, mb_data, profile_key): - client = self.host.getClient(profile_key) + def _create_topic(self, service, node, mb_data, profile_key): + client = self.host.get_client(profile_key) return defer.ensureDeferred( - self.createTopic(client, jid.JID(service), node, mb_data) + self.create_topic(client, jid.JID(service), node, mb_data) ) - async def createTopic(self, client, service, node, mb_data): + async def create_topic(self, client, service, node, mb_data): try: title = mb_data['title'] content = mb_data.pop('content') @@ -296,7 +296,7 @@ topic_node = FORUM_TOPIC_NODE_TPL.format(node=node, uuid=shortuuid.uuid()) await self._p.createNode(client, service, topic_node, self._node_options) await self._m.send(client, mb_data, service, topic_node) - topic_uri = uri.buildXMPPUri('pubsub', + topic_uri = uri.build_xmpp_uri('pubsub', subtype='microblog', path=service.full(), node=topic_node) @@ -304,4 +304,4 @@ topic_elt['uri'] = topic_uri topic_elt['author'] = client.jid.userhost() topic_elt.addElement('title', content = title) - await self._p.sendItem(client, service, node, topic_elt) + await self._p.send_item(client, service, node, topic_elt)
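The forum and topic URIs above are built with the renamed URI helper. A tiny usage sketch; only the build_xmpp_uri() call shape comes from this hunk, the import path and the argument values are assumptions:

    from sat.tools.common import uri   # assumed import path of the helper

    topic_uri = uri.build_xmpp_uri(
        "pubsub",
        subtype="microblog",
        path="pubsub.example.org",      # example pubsub service
        node="some_topic_node",         # example node name
    )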
--- a/sat/plugins/plugin_misc_groupblog.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_groupblog.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,19 +61,19 @@ log.info(_("Group blog plugin initialization")) self.host = host self._p = self.host.plugins["XEP-0060"] - host.trigger.add("XEP-0277_item2data", self._item2dataTrigger) - host.trigger.add("XEP-0277_data2entry", self._data2entryTrigger) - host.trigger.add("XEP-0277_comments", self._commentsTrigger) + host.trigger.add("XEP-0277_item2data", self._item_2_data_trigger) + host.trigger.add("XEP-0277_data2entry", self._data_2_entry_trigger) + host.trigger.add("XEP-0277_comments", self._comments_trigger) ## plugin management methods ## - def getHandler(self, client): + def get_handler(self, client): return GroupBlog_handler() @defer.inlineCallbacks - def profileConnected(self, client): + def profile_connected(self, client): try: - yield self.host.checkFeatures(client, (NS_PUBSUB_GROUPBLOG,)) + yield self.host.check_features(client, (NS_PUBSUB_GROUPBLOG,)) except exceptions.FeatureNotFound: client.server_groupblog_available = False log.warning( @@ -85,21 +85,21 @@ client.server_groupblog_available = True log.info(_("Server can manage group blogs")) - def getFeatures(self, profile): + def features_get(self, profile): try: - client = self.host.getClient(profile) + client = self.host.get_client(profile) except exceptions.ProfileNotSetError: return {} try: - return {"available": C.boolConst(client.server_groupblog_available)} + return {"available": C.bool_const(client.server_groupblog_available)} except AttributeError: - if self.host.isConnected(profile): + if self.host.is_connected(profile): log.debug("Profile is not connected, service is not checked yet") else: log.error("client.server_groupblog_available should be available !") return {} - def _item2dataTrigger(self, item_elt, entry_elt, microblog_data): + def _item_2_data_trigger(self, item_elt, entry_elt, microblog_data): """Parse item to find group permission elements""" config_form = data_form.findForm(item_elt, NS_PUBSUB_ITEM_CONFIG) if config_form is None: @@ -109,7 +109,7 @@ opt = self._p.OPT_ROSTER_GROUPS_ALLOWED microblog_data['groups'] = config_form.fields[opt].values - def _data2entryTrigger(self, client, mb_data, entry_elt, item_elt): + def _data_2_entry_trigger(self, client, mb_data, entry_elt, item_elt): """Build fine access permission if needed This trigger check if "group*" key are present, @@ -130,7 +130,7 @@ form.addField(allowed) item_elt.addChild(form.toElement()) - def _commentsTrigger(self, client, mb_data, options): + def _comments_trigger(self, client, mb_data, options): """This method is called when a comments node is about to be created It changes the access mode to roster if needed, and give the authorized groups
--- a/sat/plugins/plugin_misc_identity.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_identity.py Sat Apr 08 13:54:42 2023 +0200 @@ -77,11 +77,11 @@ "avatar": { "type": dict, # convert avatar path to avatar metadata (and check validity) - "set_data_filter": self.avatarSetDataFilter, + "set_data_filter": self.avatar_set_data_filter, # update profile avatar, so all frontends are aware - "set_post_treatment": self.avatarSetPostTreatment, - "update_is_new_data": self.avatarUpdateIsNewData, - "update_data_filter": self.avatarUpdateDataFilter, + "set_post_treatment": self.avatar_set_post_treatment, + "update_is_new_data": self.avatar_update_is_new_data, + "update_data_filter": self.avatar_update_data_filter, # we store the metadata in database, to restore it on next connection # (it is stored only for roster entities) "store": True, @@ -92,70 +92,70 @@ # of returning only the data from the first successful callback "get_all": True, # append nicknames from roster, resource, etc. - "get_post_treatment": self.nicknamesGetPostTreatment, - "update_is_new_data": self.nicknamesUpdateIsNewData, + "get_post_treatment": self.nicknames_get_post_treatment, + "update_is_new_data": self.nicknames_update_is_new_data, "store": True, }, "description": { "type": str, "get_all": True, - "get_post_treatment": self.descriptionGetPostTreatment, + "get_post_treatment": self.description_get_post_treatment, "store": True, } } - host.trigger.add("roster_update", self._rosterUpdateTrigger) - host.memory.setSignalOnUpdate("avatar") - host.memory.setSignalOnUpdate("nicknames") - host.bridge.addMethod( - "identityGet", + host.trigger.add("roster_update", self._roster_update_trigger) + host.memory.set_signal_on_update("avatar") + host.memory.set_signal_on_update("nicknames") + host.bridge.add_method( + "identity_get", ".plugin", in_sign="sasbs", out_sign="s", - method=self._getIdentity, + method=self._get_identity, async_=True, ) - host.bridge.addMethod( - "identitiesGet", + host.bridge.add_method( + "identities_get", ".plugin", in_sign="asass", out_sign="s", - method=self._getIdentities, + method=self._get_identities, async_=True, ) - host.bridge.addMethod( - "identitiesBaseGet", + host.bridge.add_method( + "identities_base_get", ".plugin", in_sign="s", out_sign="s", - method=self._getBaseIdentities, + method=self._get_base_identities, async_=True, ) - host.bridge.addMethod( - "identitySet", + host.bridge.add_method( + "identity_set", ".plugin", in_sign="ss", out_sign="", - method=self._setIdentity, + method=self._set_identity, async_=True, ) - host.bridge.addMethod( - "avatarGet", + host.bridge.add_method( + "avatar_get", ".plugin", in_sign="sbs", out_sign="s", method=self._getAvatar, async_=True, ) - host.bridge.addMethod( - "avatarSet", + host.bridge.add_method( + "avatar_set", ".plugin", in_sign="sss", out_sign="", - method=self._setAvatar, + method=self._set_avatar, async_=True, ) - async def profileConnecting(self, client): + async def profile_connecting(self, client): client._identity_update_lock = [] # we restore known identities from database client._identity_storage = persistent.LazyPersistentBinaryDict( @@ -188,22 +188,22 @@ f"{value}") to_delete.append(key) continue - cache = self.host.common_cache.getMetadata(cache_uid) + cache = self.host.common_cache.get_metadata(cache_uid) if cache is None: log.debug( f"purging avatar for {entity}: it is not in cache anymore") to_delete.append(key) continue - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, entity, name, 
value, silent=True ) for key in to_delete: await client._identity_storage.adel(key) - def _rosterUpdateTrigger(self, client, roster_item): - old_item = client.roster.getItem(roster_item.jid) + def _roster_update_trigger(self, client, roster_item): + old_item = client.roster.get_item(roster_item.jid) if old_item is None or old_item.name != roster_item.name: log.debug( f"roster nickname has been updated to {roster_item.name!r} for " @@ -247,7 +247,7 @@ cb_list.append(callback) cb_list.sort(key=lambda c: c.priority, reverse=True) - def getIdentityJid(self, client, peer_jid): + def get_identity_jid(self, client, peer_jid): """Return jid to use to set identity metadata if it's a jid of a room occupant, full jid will be used @@ -260,9 +260,9 @@ if self._m is None: return peer_jid.userhostJID() else: - return self._m.getBareOrFull(client, peer_jid) + return self._m.get_bare_or_full(client, peer_jid) - def checkType(self, metadata_name, value): + def check_type(self, metadata_name, value): """Check that type used for a metadata is the one declared in self.metadata""" value_type = self.metadata[metadata_name]["type"] if not isinstance(value, value_type): @@ -270,7 +270,7 @@ f"{value} has wrong type: it is {type(value)} while {value_type} was " f"expected") - def getFieldType(self, metadata_name: str) -> str: + def get_field_type(self, metadata_name: str) -> str: """Return the type the requested field @param metadata_name: name of the field to check @@ -298,7 +298,7 @@ @param use_cache: if False, cache won't be checked @param prefilled_values: map of origin => value to use when `get_all` is set """ - entity = self.getIdentityJid(client, entity) + entity = self.get_identity_jid(client, entity) try: metadata = self.metadata[metadata_name] except KeyError: @@ -306,7 +306,7 @@ get_all = metadata.get('get_all', False) if use_cache: try: - data = self.host.memory.getEntityDatum( + data = self.host.memory.get_entity_datum( client, entity, metadata_name) except (KeyError, exceptions.UnknownEntityError): pass @@ -343,7 +343,7 @@ .format(callback=callback.get, metadata_name=metadata_name, e=e)) else: if data: - self.checkType(metadata_name, data) + self.check_type(metadata_name, data) if get_all: if isinstance(data, list): all_data.extend(data) @@ -359,9 +359,9 @@ post_treatment = metadata.get("get_post_treatment") if post_treatment is not None: - data = await utils.asDeferred(post_treatment, client, entity, data) + data = await utils.as_deferred(post_treatment, client, entity, data) - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, entity, metadata_name, data) if metadata.get('store', False): @@ -381,12 +381,12 @@ @param entity(jid.JID, None): entity for which avatar is requested None to use profile's jid """ - entity = self.getIdentityJid(client, entity) + entity = self.get_identity_jid(client, entity) metadata = self.metadata[metadata_name] data_filter = metadata.get("set_data_filter") if data_filter is not None: - data = await utils.asDeferred(data_filter, client, entity, data) - self.checkType(metadata_name, data) + data = await utils.as_deferred(data_filter, client, entity, data) + self.check_type(metadata_name, data) try: callbacks = metadata['callbacks'] @@ -411,7 +411,7 @@ post_treatment = metadata.get("set_post_treatment") if post_treatment is not None: - await utils.asDeferred(post_treatment, client, entity, data) + await utils.as_deferred(post_treatment, client, entity, data) async def update( self, @@ -426,14 +426,14 @@ This method may be called by plugins when 
an identity metadata is available. @param origin: namespace of the plugin which is source of the metadata """ - entity = self.getIdentityJid(client, entity) + entity = self.get_identity_jid(client, entity) if (entity, metadata_name) in client._identity_update_lock: log.debug(f"update is locked for {entity}'s {metadata_name}") return metadata = self.metadata[metadata_name] try: - cached_data = self.host.memory.getEntityDatum( + cached_data = self.host.memory.get_entity_datum( client, entity, metadata_name) except (KeyError, exceptions.UnknownEntityError): # metadata is not cached, we do the update @@ -443,7 +443,7 @@ try: update_is_new_data = metadata["update_is_new_data"] except KeyError: - update_is_new_data = self.defaultUpdateIsNewData + update_is_new_data = self.default_update_is_new_data if data is None: if cached_data is None: @@ -467,7 +467,7 @@ # get_all is set, meaning that we have to check all plugins # so we first delete current cache try: - self.host.memory.delEntityDatum(client, entity, metadata_name) + self.host.memory.del_entity_datum(client, entity, metadata_name) except (KeyError, exceptions.UnknownEntityError): pass # then fill it again by calling get, which will retrieve all values @@ -481,32 +481,32 @@ if data is not None: data_filter = metadata['update_data_filter'] if data_filter is not None: - data = await utils.asDeferred(data_filter, client, entity, data) - self.checkType(metadata_name, data) + data = await utils.as_deferred(data_filter, client, entity, data) + self.check_type(metadata_name, data) - self.host.memory.updateEntityData(client, entity, metadata_name, data) + self.host.memory.update_entity_data(client, entity, metadata_name, data) if metadata.get('store', False): key = f"{entity}\n{metadata_name}" await client._identity_storage.aset(key, data) - def defaultUpdateIsNewData(self, client, entity, cached_data, new_data): + def default_update_is_new_data(self, client, entity, cached_data, new_data): return new_data != cached_data def _getAvatar(self, entity, use_cache, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) entity = jid.JID(entity) if entity else None d = defer.ensureDeferred(self.get(client, "avatar", entity, use_cache)) d.addCallback(lambda data: data_format.serialise(data)) return d - def _setAvatar(self, file_path, entity, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + def _set_avatar(self, file_path, entity, profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) entity = jid.JID(entity) if entity else None return defer.ensureDeferred( self.set(client, "avatar", file_path, entity)) - def _blockingCacheAvatar( + def _blocking_cache_avatar( self, source: str, avatar_data: dict[str, Any] @@ -546,7 +546,7 @@ img_buf.seek(0) image_hash = hashlib.sha1(img_buf.read()).hexdigest() img_buf.seek(0) - with self.host.common_cache.cacheData( + with self.host.common_cache.cache_data( source, image_hash, media_type ) as f: f.write(img_buf.read()) @@ -554,21 +554,21 @@ avatar_data['filename'] = avatar_data['path'].name avatar_data['cache_uid'] = image_hash - async def cacheAvatar(self, source: str, avatar_data: Dict[str, Any]) -> None: + async def cache_avatar(self, source: str, avatar_data: Dict[str, Any]) -> None: """Resize if necessary and cache avatar @param source: source importing the avatar (usually it is plugin's import name), will be used in cache metadata - @param avatar_data: avatar metadata as build by [avatarSetDataFilter] + @param avatar_data: avatar 
metadata as build by [avatar_set_data_filter] will be updated with following keys: path: updated path using cached file filename: updated filename using cached file base64: resized and base64 encoded avatar cache_uid: SHA1 hash used as cache unique ID """ - await threads.deferToThread(self._blockingCacheAvatar, source, avatar_data) + await threads.deferToThread(self._blocking_cache_avatar, source, avatar_data) - async def avatarSetDataFilter(self, client, entity, file_path): + async def avatar_set_data_filter(self, client, entity, file_path): """Convert avatar file path to dict data""" file_path = Path(file_path) if not file_path.is_file(): @@ -583,14 +583,14 @@ raise ValueError(f"Can't identify type of image at {file_path}") if not media_type.startswith('image/'): raise ValueError(f"File at {file_path} doesn't appear to be an image") - await self.cacheAvatar(IMPORT_NAME, avatar_data) + await self.cache_avatar(IMPORT_NAME, avatar_data) return avatar_data - async def avatarSetPostTreatment(self, client, entity, avatar_data): + async def avatar_set_post_treatment(self, client, entity, avatar_data): """Update our own avatar""" await self.update(client, IMPORT_NAME, "avatar", avatar_data, entity) - def avatarBuildMetadata( + def avatar_build_metadata( self, path: Path, media_type: Optional[str] = None, @@ -622,10 +622,10 @@ "cache_uid": cache_uid, } - def avatarUpdateIsNewData(self, client, entity, cached_data, new_data): + def avatar_update_is_new_data(self, client, entity, cached_data, new_data): return new_data['path'] != cached_data['path'] - async def avatarUpdateDataFilter(self, client, entity, data): + async def avatar_update_data_filter(self, client, entity, data): if not isinstance(data, dict): raise ValueError(f"Invalid data type ({type(data)}), a dict is expected") mandatory_keys = {'path', 'filename', 'cache_uid'} @@ -633,7 +633,7 @@ raise ValueError(f"missing avatar data keys: {mandatory_keys - data.keys()}") return data - async def nicknamesGetPostTreatment(self, client, entity, plugin_nicknames): + async def nicknames_get_post_treatment(self, client, entity, plugin_nicknames): """Prepend nicknames from core locations + set default nickname nicknames are checked from many locations, there is always at least @@ -648,13 +648,13 @@ # for MUC we add resource if entity.resource: - # getIdentityJid let the resource only if the entity is a MUC room + # get_identity_jid let the resource only if the entity is a MUC room # occupant jid nicknames.append(entity.resource) # we first check roster (if we are not in a component) if not client.is_component: - roster_item = client.roster.getItem(entity.userhostJID()) + roster_item = client.roster.get_item(entity.userhostJID()) if roster_item is not None and roster_item.name: # user set name has priority over entity set name nicknames.append(roster_item.name) @@ -670,10 +670,10 @@ # we remove duplicates while preserving order with dict return list(dict.fromkeys(nicknames)) - def nicknamesUpdateIsNewData(self, client, entity, cached_data, new_nicknames): + def nicknames_update_is_new_data(self, client, entity, cached_data, new_nicknames): return not set(new_nicknames).issubset(cached_data) - async def descriptionGetPostTreatment( + async def description_get_post_treatment( self, client: SatXMPPEntity, entity: jid.JID, @@ -682,15 +682,15 @@ """Join all descriptions in a unique string""" return '\n'.join(plugin_description) - def _getIdentity(self, entity_s, metadata_filter, use_cache, profile): + def _get_identity(self, entity_s, metadata_filter, 
use_cache, profile): entity = jid.JID(entity_s) - client = self.host.getClient(profile) + client = self.host.get_client(profile) d = defer.ensureDeferred( - self.getIdentity(client, entity, metadata_filter, use_cache)) + self.get_identity(client, entity, metadata_filter, use_cache)) d.addCallback(data_format.serialise) return d - async def getIdentity( + async def get_identity( self, client: SatXMPPEntity, entity: Optional[jid.JID] = None, @@ -719,14 +719,14 @@ return id_data - def _getIdentities(self, entities_s, metadata_filter, profile): + def _get_identities(self, entities_s, metadata_filter, profile): entities = [jid.JID(e) for e in entities_s] - client = self.host.getClient(profile) - d = defer.ensureDeferred(self.getIdentities(client, entities, metadata_filter)) + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.get_identities(client, entities, metadata_filter)) d.addCallback(lambda d: data_format.serialise({str(j):i for j, i in d.items()})) return d - async def getIdentities( + async def get_identities( self, client: SatXMPPEntity, entities: List[jid.JID], @@ -735,7 +735,7 @@ """Retrieve several identities at once @param entities: entities from which identities must be retrieved - @param metadata_filter: same as for [getIdentity] + @param metadata_filter: same as for [get_identity] @return: identities metadata where key is jid if an error happens while retrieve a jid entity, it won't be present in the result (and a warning will be logged) @@ -745,7 +745,7 @@ for entity_jid in entities: get_identity_list.append( defer.ensureDeferred( - self.getIdentity( + self.get_identity( client, entity=entity_jid, metadata_filter=metadata_filter, @@ -761,13 +761,13 @@ identities[entity_jid] = identity return identities - def _getBaseIdentities(self, profile_key): - client = self.host.getClient(profile_key) - d = defer.ensureDeferred(self.getBaseIdentities(client)) + def _get_base_identities(self, profile_key): + client = self.host.get_client(profile_key) + d = defer.ensureDeferred(self.get_base_identities(client)) d.addCallback(lambda d: data_format.serialise({str(j):i for j, i in d.items()})) return d - async def getBaseIdentities( + async def get_base_identities( self, client: SatXMPPEntity, ) -> dict: @@ -779,20 +779,20 @@ if client.is_component: entities = [client.jid.userhostJID()] else: - entities = client.roster.getJids() + [client.jid.userhostJID()] + entities = client.roster.get_jids() + [client.jid.userhostJID()] - return await self.getIdentities( + return await self.get_identities( client, entities, ['avatar', 'nicknames'] ) - def _setIdentity(self, id_data_s, profile): - client = self.host.getClient(profile) + def _set_identity(self, id_data_s, profile): + client = self.host.get_client(profile) id_data = data_format.deserialise(id_data_s) - return defer.ensureDeferred(self.setIdentity(client, id_data)) + return defer.ensureDeferred(self.set_identity(client, id_data)) - async def setIdentity(self, client, id_data): + async def set_identity(self, client, id_data): """Update profile's identity @param id_data(dict): data to update, key can be one of self.metadata keys
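A hedged usage sketch of the renamed identity API above; get_identities(), the metadata filter values and roster.get_jids() all appear in this hunk, while the plugin lookup key and the surrounding coroutine are assumptions:

    async def list_contact_identities(host, client):
        identity_plugin = host.plugins["IDENTITY"]   # import name assumed
        identities = await identity_plugin.get_identities(
            client,
            client.roster.get_jids(),
            ["avatar", "nicknames"],
        )
        return {str(entity): data.get("nicknames") for entity, data in identities.items()}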
--- a/sat/plugins/plugin_misc_ip.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_ip.py Sat Apr 08 13:54:42 2023 +0200 @@ -99,7 +99,7 @@ def __init__(self, host): log.info(_("plugin IP discovery initialization")) self.host = host - host.memory.updateParams(PARAMS) + host.memory.update_params(PARAMS) # NAT-Port try: @@ -109,50 +109,50 @@ self._nat = None # XXX: cache is kept until SàT is restarted - # if IP may have changed, use self.refreshIP + # if IP may have changed, use self.refresh_ip self._external_ip_cache = None self._local_ip_cache = None - def getHandler(self, client): + def get_handler(self, client): return IPPlugin_handler() - def refreshIP(self): + def refresh_ip(self): # FIXME: use a trigger instead ? self._external_ip_cache = None self._local_ip_cache = None - def _externalAllowed(self, client): + def _external_allowed(self, client): """Return value of parameter with autorisation of user to do external requests if parameter is not set, a dialog is shown to use to get its confirmation, and parameted is set according to answer @return (defer.Deferred[bool]): True if external request is autorised """ - allow_get_ip = self.host.memory.params.getParamA( + allow_get_ip = self.host.memory.params.param_get_a( GET_IP_NAME, GET_IP_CATEGORY, use_default=False ) if allow_get_ip is None: # we don't have autorisation from user yet to use get_ip, we ask him - def setParam(allowed): - # FIXME: we need to use boolConst as setParam only manage str/unicode + def param_set(allowed): + # FIXME: we need to use bool_const as param_set only manage str/unicode # need to be fixed when params will be refactored - self.host.memory.setParam( - GET_IP_NAME, C.boolConst(allowed), GET_IP_CATEGORY + self.host.memory.param_set( + GET_IP_NAME, C.bool_const(allowed), GET_IP_CATEGORY ) return allowed - d = xml_tools.deferConfirm( + d = xml_tools.defer_confirm( self.host, _(GET_IP_CONFIRM), _(GET_IP_CONFIRM_TITLE), profile=client.profile, ) - d.addCallback(setParam) + d.addCallback(param_set) return d return defer.succeed(allow_get_ip) - def _filterAddresse(self, ip_addr): + def _filter_addresse(self, ip_addr): """Filter acceptable addresses For now, just remove IPv4 local addresses @@ -161,7 +161,7 @@ """ return not ip_addr.startswith("127.") - def _insertFirst(self, addresses, ip_addr): + def _insert_first(self, addresses, ip_addr): """Insert ip_addr as first item in addresses @param addresses(list): list of IP addresses @@ -174,7 +174,7 @@ else: addresses.insert(0, ip_addr) - async def _getIPFromExternal(self, ext_url): + async def _get_ip_from_external(self, ext_url): """Get local IP by doing a connection on an external url @param ext_utl(str): url to connect to @@ -201,7 +201,7 @@ return local_ip @defer.inlineCallbacks - def getLocalIPs(self, client): + def get_local_i_ps(self, client): """Try do discover local area network IPs @return (deferred): list of lan IP addresses @@ -225,43 +225,43 @@ continue for data in inet_list: addresse = data["addr"] - if self._filterAddresse(addresse): + if self._filter_addresse(addresse): addresses.append(addresse) # then we use our connection to server ip = client.xmlstream.transport.getHost().host - if self._filterAddresse(ip): - self._insertFirst(addresses, ip) + if self._filter_addresse(ip): + self._insert_first(addresses, ip) defer.returnValue(addresses) # if server is local, we try with NAT-Port if self._nat is not None: - nat_ip = yield self._nat.getIP(local=True) + nat_ip = yield self._nat.get_ip(local=True) if nat_ip is not None: - 
self._insertFirst(addresses, nat_ip) + self._insert_first(addresses, nat_ip) defer.returnValue(addresses) if addresses: defer.returnValue(addresses) # still not luck, we need to contact external website - allow_get_ip = yield self._externalAllowed(client) + allow_get_ip = yield self._external_allowed(client) if not allow_get_ip: defer.returnValue(addresses or localhost) try: - local_ip = yield defer.ensureDeferred(self._getIPFromExternal(GET_IP_PAGE)) + local_ip = yield defer.ensureDeferred(self._get_ip_from_external(GET_IP_PAGE)) except (internet_error.DNSLookupError, internet_error.TimeoutError): log.warning("Can't access Domain Name System") else: if local_ip is not None: - self._insertFirst(addresses, local_ip) + self._insert_first(addresses, local_ip) defer.returnValue(addresses or localhost) @defer.inlineCallbacks - def getExternalIP(self, client): + def get_external_ip(self, client): """Try to discover external IP @return (deferred): external IP address or None if it can't be discovered @@ -295,13 +295,13 @@ # then with NAT-Port if self._nat is not None: - nat_ip = yield self._nat.getIP() + nat_ip = yield self._nat.get_ip() if nat_ip is not None: self._external_ip_cache = nat_ip defer.returnValue(nat_ip) # and finally by requesting external website - allow_get_ip = yield self._externalAllowed(client) + allow_get_ip = yield self._external_allowed(client) try: ip = ((yield webclient.getPage(GET_IP_PAGE.encode('utf-8'))) if allow_get_ip else None)
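The renames in the IP plugin follow the usual camelCase to snake_case conversion, and ``getLocalIPs`` becoming ``get_local_i_ps`` shows how a mixed-case acronym ("IPs") ends up split letter by letter. A conversion along these lines (a sketch, not necessarily the exact tool used for this changeset) reproduces the names seen in this hunk::

    import re

    def camel_to_snake(name: str) -> str:
        # split before an uppercase letter followed by lowercase, then before
        # any uppercase preceded by a lowercase/digit, and lowercase the result
        s1 = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", name)
        return re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s1).lower()

    assert camel_to_snake("refreshIP") == "refresh_ip"
    assert camel_to_snake("getExternalIP") == "get_external_ip"
    assert camel_to_snake("getLocalIPs") == "get_local_i_ps"  # the "IPs" artifact
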
--- a/sat/plugins/plugin_misc_lists.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_lists.py Sat Apr 08 13:54:42 2023 +0200 @@ -208,16 +208,16 @@ log.info(_("Pubsub lists plugin initialization")) self.host = host self._s = self.host.plugins["XEP-0346"] - self.namespace = self._s.getSubmittedNS(APP_NS_TICKETS) - host.registerNamespace("tickets", APP_NS_TICKETS) - host.registerNamespace("tickets_type", NS_TICKETS_TYPE) + self.namespace = self._s.get_submitted_ns(APP_NS_TICKETS) + host.register_namespace("tickets", APP_NS_TICKETS) + host.register_namespace("tickets_type", NS_TICKETS_TYPE) self.host.plugins["PUBSUB_INVITATION"].register( APP_NS_TICKETS, self ) self._p = self.host.plugins["XEP-0060"] self._m = self.host.plugins["XEP-0277"] - host.bridge.addMethod( - "listGet", + host.bridge.add_method( + "list_get", ".plugin", in_sign="ssiassss", out_sign="s", @@ -232,68 +232,68 @@ default_node=self.namespace, form_ns=APP_NS_TICKETS, filters={ - "author": self._s.valueOrPublisherFilter, - "created": self._s.dateFilter, - "updated": self._s.dateFilter, - "time_limit": self._s.dateFilter, + "author": self._s.value_or_publisher_filter, + "created": self._s.date_filter, + "updated": self._s.date_filter, + "time_limit": self._s.date_filter, }, profile_key=profile_key), async_=True, ) - host.bridge.addMethod( - "listSet", + host.bridge.add_method( + "list_set", ".plugin", in_sign="ssa{sas}ssss", out_sign="s", method=self._set, async_=True, ) - host.bridge.addMethod( - "listDeleteItem", + host.bridge.add_method( + "list_delete_item", ".plugin", in_sign="sssbs", out_sign="", method=self._delete, async_=True, ) - host.bridge.addMethod( - "listSchemaGet", + host.bridge.add_method( + "list_schema_get", ".plugin", in_sign="sss", out_sign="s", - method=lambda service, nodeIdentifier, profile_key: self._s._getUISchema( + method=lambda service, nodeIdentifier, profile_key: self._s._get_ui_schema( service, nodeIdentifier, default_node=self.namespace, profile_key=profile_key), async_=True, ) - host.bridge.addMethod( - "listsList", + host.bridge.add_method( + "lists_list", ".plugin", in_sign="sss", out_sign="s", - method=self._listsList, + method=self._lists_list, async_=True, ) - host.bridge.addMethod( - "listTemplatesNamesGet", + host.bridge.add_method( + "list_templates_names_get", ".plugin", in_sign="ss", out_sign="s", - method=self._getTemplatesNames, + method=self._get_templates_names, ) - host.bridge.addMethod( - "listTemplateGet", + host.bridge.add_method( + "list_template_get", ".plugin", in_sign="sss", out_sign="s", - method=self._getTemplate, + method=self._get_template, ) - host.bridge.addMethod( - "listTemplateCreate", + host.bridge.add_method( + "list_template_create", ".plugin", in_sign="ssss", out_sign="(ss)", - method=self._createTemplate, + method=self._create_template, async_=True, ) @@ -309,7 +309,7 @@ item_elt: domish.Element ) -> None: try: - schema = await self._s.getSchemaForm(client, service, node) + schema = await self._s.get_schema_form(client, service, node) except Exception as e: log.warning(f"Can't retrive node schema as {node!r} [{service}]: {e}") else: @@ -323,7 +323,7 @@ def _set(self, service, node, values, schema=None, item_id=None, extra_s='', profile_key=C.PROF_KEY_NONE): - client, service, node, schema, item_id, extra = self._s.prepareBridgeSet( + client, service, node, schema, item_id, extra = self._s.prepare_bridge_set( service, node, schema, item_id, extra_s, profile_key ) d = defer.ensureDeferred(self.set( @@ -346,22 +346,22 @@ 'created' and 'updated' 
will be forced to current time: - 'created' is set if item_id is None, i.e. if it's a new ticket - 'updated' is set everytime - @param extra(dict, None): same as for [XEP-0060.sendItem] with additional keys: + @param extra(dict, None): same as for [XEP-0060.send_item] with additional keys: - update(bool): if True, get previous item data to merge with current one if True, item_id must be set - other arguments are same as for [self._s.sendDataFormItem] + other arguments are same as for [self._s.send_data_form_item] @return (unicode): id of the created item """ if not node: node = self.namespace if not item_id: - comments_service = await self._m.getCommentsService(client, service) + comments_service = await self._m.get_comments_service(client, service) # we need to use uuid for comments node, because we don't know item id in # advance (we don't want to set it ourselves to let the server choose, so we # can have a nicer id if serial ids is activated) - comments_node = self._m.getCommentsNode( + comments_node = self._m.get_comments_node( node + "_" + str(shortuuid.uuid()) ) options = { @@ -372,7 +372,7 @@ self._p.OPT_PUBLISH_MODEL: self._p.ACCESS_OPEN, } await self._p.createNode(client, comments_service, comments_node, options) - values["comments_uri"] = uri.buildXMPPUri( + values["comments_uri"] = uri.build_xmpp_uri( "pubsub", subtype="microblog", path=comments_service.full(), @@ -386,7 +386,7 @@ def _delete( self, service_s, nodeIdentifier, itemIdentifier, notify, profile_key ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return defer.ensureDeferred(self.delete( client, jid.JID(service_s) if service_s else None, @@ -405,26 +405,26 @@ ) -> None: if not node: node = self.namespace - return await self._p.retractItems( + return await self._p.retract_items( service, node, (itemIdentifier,), notify, client.profile ) - def _listsList(self, service, node, profile): + def _lists_list(self, service, node, profile): service = jid.JID(service) if service else None node = node or None - client = self.host.getClient(profile) - d = defer.ensureDeferred(self.listsList(client, service, node)) + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.lists_list(client, service, node)) d.addCallback(data_format.serialise) return d - async def listsList( + async def lists_list( self, client, service: Optional[jid.JID], node: Optional[str]=None ) -> List[dict]: """Retrieve list of pubsub lists registered in personal interests @return list: list of lists metadata """ - items, metadata = await self.host.plugins['LIST_INTEREST'].listInterests( + items, metadata = await self.host.plugins['LIST_INTEREST'].list_interests( client, service, node, namespace=APP_NS_TICKETS) lists = [] for item in items: @@ -454,34 +454,34 @@ return lists - def _getTemplatesNames(self, language, profile): - client = self.host.getClient(profile) - return data_format.serialise(self.getTemplatesNames(client, language)) + def _get_templates_names(self, language, profile): + client = self.host.get_client(profile) + return data_format.serialise(self.get_templates_names(client, language)) - def getTemplatesNames(self, client, language: str) -> list: + def get_templates_names(self, client, language: str) -> list: """Retrieve well known list templates""" templates = [{"id": tpl_id, "name": d["name"], "icon": d["icon"]} for tpl_id, d in TEMPLATES.items()] return templates - def _getTemplate(self, name, language, profile): - client = self.host.getClient(profile) - return 
data_format.serialise(self.getTemplate(client, name, language)) + def _get_template(self, name, language, profile): + client = self.host.get_client(profile) + return data_format.serialise(self.get_template(client, name, language)) - def getTemplate(self, client, name: str, language: str) -> dict: + def get_template(self, client, name: str, language: str) -> dict: """Retrieve a well known template""" return TEMPLATES[name] - def _createTemplate(self, template_id, name, access_model, profile): - client = self.host.getClient(profile) - d = defer.ensureDeferred(self.createTemplate( + def _create_template(self, template_id, name, access_model, profile): + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.create_template( client, template_id, name, access_model )) d.addCallback(lambda node_data: (node_data[0].full(), node_data[1])) return d - async def createTemplate( + async def create_template( self, client, template_id: str, name: str, access_model: str ) -> Tuple[jid.JID, str]: """Create a list from a template""" @@ -493,12 +493,12 @@ 0, {"type": "hidden", "name": NS_TICKETS_TYPE, "value": template_id} ) - schema = xml_tools.dataDict2dataForm( + schema = xml_tools.data_dict_2_data_form( {"namespace": APP_NS_TICKETS, "fields": fields} ).toElement() service = client.jid.userhostJID() - node = self._s.getSubmittedNS(f"{APP_NS_TICKETS}_{name}") + node = self._s.get_submitted_ns(f"{APP_NS_TICKETS}_{name}") options = { self._p.OPT_ACCESS_MODEL: access_model, } @@ -507,11 +507,11 @@ # XXX: should node options be in TEMPLATE? options[self._p.OPT_OVERWRITE_POLICY] = self._p.OWPOL_ANY_PUB await self._p.createNode(client, service, node, options) - await self._s.setSchema(client, service, node, schema) + await self._s.set_schema(client, service, node, schema) list_elt = domish.Element((APP_NS_TICKETS, "list")) list_elt["type"] = template_id try: - await self.host.plugins['LIST_INTEREST'].registerPubsub( + await self.host.plugins['LIST_INTEREST'].register_pubsub( client, APP_NS_TICKETS, service, node, creator=True, name=name, element=list_elt) except Exception as e:
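Most of the lists plugin hunk is bridge registration: each ``add_method`` call pairs the new snake_case bridge name with a D-Bus style signature. The stub below is only an illustration of the shape of those calls (the real bridge exposes the methods over D-Bus; this stand-in just records them)::

    from twisted.internet import defer

    class StubBridge:
        """Illustrative stand-in for host.bridge (records registrations only)."""
        def __init__(self):
            self.methods = {}

        def add_method(self, name, int_suffix, in_sign, out_sign, method,
                       async_=False):
            self.methods[name] = {"in": in_sign, "out": out_sign,
                                  "cb": method, "async": async_}

    bridge = StubBridge()
    bridge.add_method(
        "list_get",           # was "listGet"
        ".plugin",
        in_sign="ssiassss",   # 2 strings, an int32, a string array, 4 strings
        out_sign="s",
        method=lambda *a: defer.succeed("{}"),
        async_=True,
    )
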
--- a/sat/plugins/plugin_misc_merge_requests.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_merge_requests.py Sat Apr 08 13:54:42 2023 +0200 @@ -69,35 +69,35 @@ log.info(_("Merge requests plugin initialization")) self.host = host self._s = self.host.plugins["XEP-0346"] - self.namespace = self._s.getSubmittedNS(APP_NS_MERGE_REQUESTS) - host.registerNamespace('merge_requests', self.namespace) + self.namespace = self._s.get_submitted_ns(APP_NS_MERGE_REQUESTS) + host.register_namespace('merge_requests', self.namespace) self._p = self.host.plugins["XEP-0060"] self._t = self.host.plugins["LISTS"] self._handlers = {} self._handlers_list = [] # handlers sorted by priority self._type_handlers = {} # data type => handler map - host.bridge.addMethod("mergeRequestsGet", ".plugin", + host.bridge.add_method("merge_requests_get", ".plugin", in_sign='ssiassss', out_sign='s', method=self._get, async_=True ) - host.bridge.addMethod("mergeRequestSet", ".plugin", + host.bridge.add_method("merge_request_set", ".plugin", in_sign='ssssa{sas}ssss', out_sign='s', method=self._set, async_=True) - host.bridge.addMethod("mergeRequestsSchemaGet", ".plugin", + host.bridge.add_method("merge_requests_schema_get", ".plugin", in_sign='sss', out_sign='s', method=lambda service, nodeIdentifier, profile_key: - self._s._getUISchema(service, + self._s._get_ui_schema(service, nodeIdentifier, default_node=self.namespace, profile_key=profile_key), async_=True) - host.bridge.addMethod("mergeRequestParseData", ".plugin", + host.bridge.add_method("merge_request_parse_data", ".plugin", in_sign='ss', out_sign='aa{ss}', - method=self._parseData, + method=self._parse_data, async_=True) - host.bridge.addMethod("mergeRequestsImport", ".plugin", + host.bridge.add_method("merge_requests_import", ".plugin", in_sign='ssssa{ss}s', out_sign='', method=self._import, async_=True @@ -141,7 +141,7 @@ def serialise(self, get_data): tickets_xmlui, metadata, items_patches = get_data - tickets_xmlui_s, metadata = self._p.transItemsData((tickets_xmlui, metadata)) + tickets_xmlui_s, metadata = self._p.trans_items_data((tickets_xmlui, metadata)) return data_format.serialise({ "items": tickets_xmlui_s, "metadata": metadata, @@ -151,7 +151,7 @@ def _get(self, service='', node='', max_items=10, item_ids=None, sub_id=None, extra="", profile_key=C.PROF_KEY_NONE): extra = data_format.deserialise(extra) - client, service, node, max_items, extra, sub_id = self._s.prepareBridgeGet( + client, service, node, max_items, extra, sub_id = self._s.prepare_bridge_get( service, node, max_items, sub_id, extra, profile_key) d = self.get(client, service, node or None, max_items, item_ids, sub_id or None, extra.rsm_request, extra.extra) @@ -178,11 +178,11 @@ # XXX: Q&D way to get list for labels when displaying them, but text when we # have to modify them if C.bool(extra.get('labels_as_list', C.BOOL_FALSE)): - filters = {'labels': self._s.textbox2ListFilter} + filters = {'labels': self._s.textbox_2_list_filter} else: filters = {} tickets_xmlui, metadata = yield defer.ensureDeferred( - self._s.getDataFormItems( + self._s.get_data_form_items( client, service, node, @@ -199,13 +199,13 @@ for ticket in tickets_xmlui: request_type = ticket.named_widgets[FIELD_DATA_TYPE].value request_data = ticket.named_widgets[FIELD_DATA].value - parsed_data = yield self.parseData(request_type, request_data) + parsed_data = yield self.parse_data(request_type, request_data) parsed_patches.append(parsed_data) defer.returnValue((tickets_xmlui, metadata, parsed_patches)) def _set(self, 
service, node, repository, method, values, schema=None, item_id=None, extra="", profile_key=C.PROF_KEY_NONE): - client, service, node, schema, item_id, extra = self._s.prepareBridgeSet( + client, service, node, schema, item_id, extra = self._s.prepare_bridge_set( service, node, schema, item_id, extra, profile_key) d = defer.ensureDeferred( self.set( @@ -290,13 +290,13 @@ deserialise, form_ns=APP_NS_MERGE_REQUESTS) return item_id - def _parseData(self, data_type, data): - d = self.parseData(data_type, data) + def _parse_data(self, data_type, data): + d = self.parse_data(data_type, data) d.addCallback(lambda parsed_patches: {key: str(value) for key, value in parsed_patches.items()}) return d - def parseData(self, data_type, data): + def parse_data(self, data_type, data): """Parse a merge request data according to type @param data_type(unicode): type of the data to parse @@ -314,7 +314,7 @@ def _import(self, repository, item_id, service=None, node=None, extra=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else None d = self.import_request(client, repository, item_id, service, node or None, extra=extra or None) @@ -323,14 +323,14 @@ @defer.inlineCallbacks def import_request(self, client, repository, item, service=None, node=None, extra=None): - """Import a merge request in specified directory + """import a merge request in specified directory @param repository(unicode): path to the repository where the code stands """ if not node: node = self.namespace tickets_xmlui, metadata = yield defer.ensureDeferred( - self._s.getDataFormItems( + self._s.get_data_form_items( client, service, node,
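Because the bridge names themselves change (e.g. ``mergeRequestsGet`` to ``merge_requests_get``), every frontend call site has to be updated in step with the backend; the changeset does not ship any aliasing. If a transition shim were wanted, it would amount to a simple lookup table. The sketch below is purely hypothetical and not part of the changeset; the name pairs are taken from the hunk above::

    # hypothetical compatibility table: old camelCase bridge names from this
    # file mapped to their new snake_case equivalents (NOT provided upstream)
    RENAMED_BRIDGE_METHODS = {
        "mergeRequestsGet": "merge_requests_get",
        "mergeRequestSet": "merge_request_set",
        "mergeRequestsSchemaGet": "merge_requests_schema_get",
        "mergeRequestParseData": "merge_request_parse_data",
        "mergeRequestsImport": "merge_requests_import",
    }

    def resolve_bridge_name(name: str) -> str:
        """Translate a legacy bridge method name, leaving new names untouched."""
        return RENAMED_BRIDGE_METHODS.get(name, name)
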
--- a/sat/plugins/plugin_misc_nat_port.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_nat_port.py Sat Apr 08 13:54:42 2023 +0200 @@ -75,7 +75,7 @@ def unload(self): if self._to_unmap: log.info("Cleaning mapped ports") - return threads.deferToThread(self._unmapPortsBlocking) + return threads.deferToThread(self._unmap_ports_blocking) def _init_failed(self, failure_): e = failure_.trap(exceptions.NotFound, exceptions.FeatureNotFound) @@ -98,23 +98,23 @@ except Exception: raise failure.Failure(exceptions.FeatureNotFound()) - def getIP(self, local=False): + def get_ip(self, local=False): """Return IP address found with UPnP-IGD @param local(bool): True to get external IP address, False to get local network one @return (None, str): found IP address, or None of something got wrong """ - def getIP(__): + def get_ip(__): if self._upnp is None: return None # lanaddr can be the empty string if not found, # we need to return None in this case return (self._upnp.lanaddr or None) if local else self._external_ip - return self._initialised.addCallback(getIP) + return self._initialised.addCallback(get_ip) - def _unmapPortsBlocking(self): + def _unmap_ports_blocking(self): """Unmap ports mapped in this session""" self._mutex.acquire() try: @@ -137,7 +137,7 @@ finally: self._mutex.release() - def _mapPortBlocking(self, int_port, ext_port, protocol, desc): + def _map_port_blocking(self, int_port, ext_port, protocol, desc): """Internal blocking method to map port @param int_port(int): internal port to use @@ -186,7 +186,7 @@ return ext_port - def mapPort(self, int_port, ext_port=None, protocol="TCP", desc=DEFAULT_DESC): + def map_port(self, int_port, ext_port=None, protocol="TCP", desc=DEFAULT_DESC): """Add a port mapping @param int_port(int): internal port to use @@ -199,7 +199,7 @@ if self._upnp is None: return defer.succeed(None) - def mappingCb(ext_port): + def mapping_cb(ext_port): log.info( "{protocol} mapping from {int_port} to {ext_port} successful".format( protocol=protocol, int_port=int_port, ext_port=ext_port @@ -207,16 +207,16 @@ ) return ext_port - def mappingEb(failure_): + def mapping_eb(failure_): failure_.trap(MappingError) log.warning("Can't map internal {int_port}".format(int_port=int_port)) - def mappingUnknownEb(failure_): + def mapping_unknown_eb(failure_): log.error(_("error while trying to map ports: {msg}").format(msg=failure_)) d = threads.deferToThread( - self._mapPortBlocking, int_port, ext_port, protocol, desc + self._map_port_blocking, int_port, ext_port, protocol, desc ) - d.addCallbacks(mappingCb, mappingEb) - d.addErrback(mappingUnknownEb) + d.addCallbacks(mapping_cb, mapping_eb) + d.addErrback(mapping_unknown_eb) return d
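The NAT-Port plugin keeps its structure: the blocking UPnP work stays in a ``*_blocking`` method pushed to a thread pool with ``threads.deferToThread``, and only the names of the method and of its callback/errback change. A self-contained sketch of that pattern, with the UPnP call stubbed out (a running reactor is needed for the Deferred to actually fire)::

    from twisted.internet import threads

    def _map_port_blocking(int_port, ext_port=None):
        # placeholder for the blocking miniupnpc mapping done by the real plugin
        return ext_port or int_port

    def map_port(int_port, ext_port=None):
        def mapping_cb(port):
            print(f"mapping to external port {port} successful")
            return port

        def mapping_eb(failure_):
            print(f"can't map internal port {int_port}: {failure_}")

        d = threads.deferToThread(_map_port_blocking, int_port, ext_port)
        d.addCallbacks(mapping_cb, mapping_eb)
        return d
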
--- a/sat/plugins/plugin_misc_quiz.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_quiz.py Sat Apr 08 13:54:42 2023 +0200 @@ -44,7 +44,7 @@ class Quiz(object): - def inheritFromRoomGame(self, host): + def inherit_from_room_game(self, host): global RoomGame RoomGame = host.plugins["ROOM-GAME"].__class__ self.__class__ = type( @@ -53,7 +53,7 @@ def __init__(self, host): log.info(_("Plugin Quiz initialization")) - self.inheritFromRoomGame(host) + self.inherit_from_room_game(host) RoomGame._init_( self, host, @@ -62,39 +62,39 @@ game_init={"stage": None}, player_init={"score": 0}, ) - host.bridge.addMethod( - "quizGameLaunch", + host.bridge.add_method( + "quiz_game_launch", ".plugin", in_sign="asss", out_sign="", - method=self._prepareRoom, + method=self._prepare_room, ) # args: players, room_jid, profile - host.bridge.addMethod( - "quizGameCreate", + host.bridge.add_method( + "quiz_game_create", ".plugin", in_sign="sass", out_sign="", - method=self._createGame, + method=self._create_game, ) # args: room_jid, players, profile - host.bridge.addMethod( - "quizGameReady", + host.bridge.add_method( + "quiz_game_ready", ".plugin", in_sign="sss", out_sign="", - method=self._playerReady, + method=self._player_ready, ) # args: player, referee, profile - host.bridge.addMethod( - "quizGameAnswer", + host.bridge.add_method( + "quiz_game_answer", ".plugin", in_sign="ssss", out_sign="", - method=self.playerAnswer, + method=self.player_answer, ) - host.bridge.addSignal( - "quizGameStarted", ".plugin", signature="ssass" + host.bridge.add_signal( + "quiz_game_started", ".plugin", signature="ssass" ) # args: room_jid, referee, players, profile - host.bridge.addSignal( - "quizGameNew", + host.bridge.add_signal( + "quiz_game_new", ".plugin", signature="sa{ss}s", doc={ @@ -104,8 +104,8 @@ "param_2": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGameQuestion", + host.bridge.add_signal( + "quiz_game_question", ".plugin", signature="sssis", doc={ @@ -117,8 +117,8 @@ "param_4": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGamePlayerBuzzed", + host.bridge.add_signal( + "quiz_game_player_buzzed", ".plugin", signature="ssbs", doc={ @@ -129,8 +129,8 @@ "param_3": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGamePlayerSays", + host.bridge.add_signal( + "quiz_game_player_says", ".plugin", signature="sssis", doc={ @@ -142,8 +142,8 @@ "param_4": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGameAnswerResult", + host.bridge.add_signal( + "quiz_game_answer_result", ".plugin", signature="ssba{si}s", doc={ @@ -155,8 +155,8 @@ "param_4": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGameTimerExpired", + host.bridge.add_signal( + "quiz_game_timer_expired", ".plugin", signature="ss", doc={ @@ -165,8 +165,8 @@ "param_1": "%(doc_profile)s", }, ) - host.bridge.addSignal( - "quizGameTimerRestarted", + host.bridge.add_signal( + "quiz_game_timer_restarted", ".plugin", signature="sis", doc={ @@ -238,7 +238,7 @@ def __start_play(self, room_jid, game_data, profile): """Start the game (tell to the first player after dealer to play""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data["stage"] = "play" next_player_idx = game_data["current_player"] = ( game_data["init_player"] + 1 @@ -251,9 +251,9 @@ mess.firstChildElement().addElement("your_turn") client.send(mess) - def playerAnswer(self, player, referee, answer, profile_key=C.PROF_KEY_NONE): + def player_answer(self, player, referee, answer, profile_key=C.PROF_KEY_NONE): """Called 
when a player give an answer""" - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) log.debug( "new player answer (%(profile)s): %(answer)s" % {"profile": client.profile, "answer": answer} @@ -264,17 +264,17 @@ answer_elt.addContent(answer) client.send(mess) - def timerExpired(self, room_jid, profile): + def timer_expired(self, room_jid, profile): """Called when nobody answered the question in time""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data = self.games[room_jid] game_data["stage"] = "expired" mess = self.createGameElt(room_jid) mess.firstChildElement().addElement("timer_expired") client.send(mess) - reactor.callLater(4, self.askQuestion, room_jid, client.profile) + reactor.callLater(4, self.ask_question, room_jid, client.profile) - def pauseTimer(self, room_jid): + def pause_timer(self, room_jid): """Stop the timer and save the time left""" game_data = self.games[room_jid] left = max(0, game_data["timer"].getTime() - time()) @@ -283,9 +283,9 @@ game_data["previous_stage"] = game_data["stage"] game_data["stage"] = "paused" - def restartTimer(self, room_jid, profile): + def restart_timer(self, room_jid, profile): """Restart a timer with the saved time""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data = self.games[room_jid] assert game_data["time_left"] is not None mess = self.createGameElt(room_jid) @@ -293,15 +293,15 @@ jabber_client.restarted_elt["time_left"] = str(game_data["time_left"]) client.send(mess) game_data["timer"] = reactor.callLater( - game_data["time_left"], self.timerExpired, room_jid, profile + game_data["time_left"], self.timer_expired, room_jid, profile ) game_data["time_left"] = None game_data["stage"] = game_data["previous_stage"] del game_data["previous_stage"] - def askQuestion(self, room_jid, profile): + def ask_question(self, room_jid, profile): """Ask a new question""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data = self.games[room_jid] game_data["stage"] = "question" game_data["question_id"] = "1" @@ -314,13 +314,13 @@ ) client.send(mess) game_data["timer"] = reactor.callLater( - timer, self.timerExpired, room_jid, profile + timer, self.timer_expired, room_jid, profile ) game_data["time_left"] = None - def checkAnswer(self, room_jid, player, answer, profile): + def check_answer(self, room_jid, player, answer, profile): """Check if the answer given is right""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) game_data = self.games[room_jid] players_data = game_data["players_data"] good_answer = game_data["question_id"] == "1" and answer == "42" @@ -334,11 +334,11 @@ client.send(mess) if good_answer: - reactor.callLater(4, self.askQuestion, room_jid, profile) + reactor.callLater(4, self.ask_question, room_jid, profile) else: - reactor.callLater(4, self.restartTimer, room_jid, profile) + reactor.callLater(4, self.restart_timer, room_jid, profile) - def newGame(self, room_jid, profile): + def new_game(self, room_jid, profile): """Launch a new round""" common_data = {"game_score": 0} new_game_data = { @@ -349,11 +349,11 @@ ) } msg_elts = self.__game_data_to_xml(new_game_data) - RoomGame.newRound(self, room_jid, (common_data, msg_elts), profile) - reactor.callLater(10, self.askQuestion, room_jid, profile) + RoomGame.new_round(self, room_jid, (common_data, msg_elts), profile) + reactor.callLater(10, self.ask_question, room_jid, profile) def room_game_cmd(self, 
mess_elt, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) from_jid = jid.JID(mess_elt["from"]) room_jid = jid.JID(from_jid.userhost()) game_elt = mess_elt.firstChildElement() @@ -367,7 +367,7 @@ players = [] for player in elt.elements(): players.append(str(player)) - self.host.bridge.quizGameStarted( + self.host.bridge.quiz_game_started( room_jid.userhost(), from_jid.full(), players, profile ) @@ -383,15 +383,15 @@ if ( list(status.values()).count("ready") == nb_players ): # everybody is ready, we can start the game - self.newGame(room_jid, profile) + self.new_game(room_jid, profile) elif elt.name == "game_data": - self.host.bridge.quizGameNew( + self.host.bridge.quiz_game_new( room_jid.userhost(), self.__xml_to_game_data(elt), profile ) elif elt.name == "question": # A question is asked - self.host.bridge.quizGameQuestion( + self.host.bridge.quiz_game_question( room_jid.userhost(), elt["id"], str(elt), @@ -411,7 +411,7 @@ buzzer_elt["pause"] = str(pause) client.send(mess) if pause: - self.pauseTimer(room_jid) + self.pause_timer(room_jid) # and we send the player answer mess = self.createGameElt(room_jid) _answer = str(elt) @@ -421,16 +421,16 @@ say_elt["delay"] = "3" reactor.callLater(2, client.send, mess) reactor.callLater( - 6, self.checkAnswer, room_jid, player, _answer, profile=profile + 6, self.check_answer, room_jid, player, _answer, profile=profile ) elif elt.name == "player_buzzed": - self.host.bridge.quizGamePlayerBuzzed( + self.host.bridge.quiz_game_player_buzzed( room_jid.userhost(), elt["player"], elt["pause"] == str(True), profile ) elif elt.name == "player_says": - self.host.bridge.quizGamePlayerSays( + self.host.bridge.quiz_game_player_says( room_jid.userhost(), elt["player"], str(elt), @@ -440,15 +440,15 @@ elif elt.name == "answer_result": player, good_answer, score = self.__answer_result_to_signal_args(elt) - self.host.bridge.quizGameAnswerResult( + self.host.bridge.quiz_game_answer_result( room_jid.userhost(), player, good_answer, score, profile ) elif elt.name == "timer_expired": - self.host.bridge.quizGameTimerExpired(room_jid.userhost(), profile) + self.host.bridge.quiz_game_timer_expired(room_jid.userhost(), profile) elif elt.name == "timer_restarted": - self.host.bridge.quizGameTimerRestarted( + self.host.bridge.quiz_game_timer_restarted( room_jid.userhost(), int(elt["time_left"]), profile )
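Renaming the quiz methods is not only a definition-site change: the plugin schedules its own methods by reference through ``reactor.callLater``, so ``askQuestion``, ``timerExpired``, ``restartTimer`` and friends have to be renamed at every scheduling site as well, otherwise the timer fires into a missing attribute at runtime. A standalone illustration of that coupling::

    from twisted.internet import reactor

    class Game:
        def ask_question(self, room_jid):      # was askQuestion
            print(f"new question for {room_jid}")

        def timer_expired(self, room_jid):     # was timerExpired
            # the method is passed by reference, so the rename must be applied
            # here too, not only at the definition
            reactor.callLater(1, self.ask_question, room_jid)

    game = Game()
    reactor.callLater(0, game.timer_expired, "room@muc.example.org")
    reactor.callLater(2, reactor.stop)
    reactor.run()
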
--- a/sat/plugins/plugin_misc_radiocol.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_radiocol.py Sat Apr 08 13:54:42 2023 +0200 @@ -65,7 +65,7 @@ class Radiocol(object): - def inheritFromRoomGame(self, host): + def inherit_from_room_game(self, host): global RoomGame RoomGame = host.plugins["ROOM-GAME"].__class__ self.__class__ = type( @@ -74,7 +74,7 @@ def __init__(self, host): log.info(_("Radio collective initialization")) - self.inheritFromRoomGame(host) + self.inherit_from_room_game(host) RoomGame._init_( self, host, @@ -89,49 +89,49 @@ }, ) self.host = host - host.bridge.addMethod( - "radiocolLaunch", + host.bridge.add_method( + "radiocol_launch", ".plugin", in_sign="asss", out_sign="", - method=self._prepareRoom, + method=self._prepare_room, async_=True, ) - host.bridge.addMethod( - "radiocolCreate", + host.bridge.add_method( + "radiocol_create", ".plugin", in_sign="sass", out_sign="", - method=self._createGame, + method=self._create_game, ) - host.bridge.addMethod( - "radiocolSongAdded", + host.bridge.add_method( + "radiocol_song_added", ".plugin", in_sign="sss", out_sign="", - method=self._radiocolSongAdded, + method=self._radiocol_song_added, async_=True, ) - host.bridge.addSignal( - "radiocolPlayers", ".plugin", signature="ssass" + host.bridge.add_signal( + "radiocol_players", ".plugin", signature="ssass" ) # room_jid, referee, players, profile - host.bridge.addSignal( - "radiocolStarted", ".plugin", signature="ssasais" + host.bridge.add_signal( + "radiocol_started", ".plugin", signature="ssasais" ) # room_jid, referee, players, [QUEUE_TO_START, QUEUE_LIMIT], profile - host.bridge.addSignal( - "radiocolSongRejected", ".plugin", signature="sss" + host.bridge.add_signal( + "radiocol_song_rejected", ".plugin", signature="sss" ) # room_jid, reason, profile - host.bridge.addSignal( - "radiocolPreload", ".plugin", signature="ssssssss" + host.bridge.add_signal( + "radiocol_preload", ".plugin", signature="ssssssss" ) # room_jid, timestamp, filename, title, artist, album, profile - host.bridge.addSignal( - "radiocolPlay", ".plugin", signature="sss" + host.bridge.add_signal( + "radiocol_play", ".plugin", signature="sss" ) # room_jid, filename, profile - host.bridge.addSignal( - "radiocolNoUpload", ".plugin", signature="ss" + host.bridge.add_signal( + "radiocol_no_upload", ".plugin", signature="ss" ) # room_jid, profile - host.bridge.addSignal( - "radiocolUploadOk", ".plugin", signature="ss" + host.bridge.add_signal( + "radiocol_upload_ok", ".plugin", signature="ss" ) # room_jid, profile def __create_preload_elt(self, sender, song_added_elt): @@ -143,10 +143,10 @@ # XXX: the frontend should know the temporary directory where file is put return preload_elt - def _radiocolSongAdded(self, referee_s, song_path, profile): - return self.radiocolSongAdded(jid.JID(referee_s), song_path, profile) + def _radiocol_song_added(self, referee_s, song_path, profile): + return self.radiocol_song_added(jid.JID(referee_s), song_path, profile) - def radiocolSongAdded(self, referee, song_path, profile): + def radiocol_song_added(self, referee, song_path, profile): """This method is called by libervia when a song has been uploaded @param referee (jid.JID): JID of the referee in the room (room userhost + '/' + nick) @param song_path (unicode): absolute path of the song added @@ -174,7 +174,7 @@ song = OggVorbis(song_path) except (OggVorbisHeaderError, HeaderNotFoundError): # this file is not ogg vorbis nor mp3, we reject it - self.deleteFile(song_path) # FIXME: same host trick (see note above) + 
self.delete_file(song_path) # FIXME: same host trick (see note above) return defer.fail( exceptions.DataError( D_( @@ -200,7 +200,7 @@ ) # FIXME: works only because of the same host trick, see the note under the docstring return self.send(referee, ("", "song_added"), attrs, profile=profile) - def playNext(self, room_jid, profile): + def play_next(self, room_jid, profile): """"Play next song in queue if exists, and put a timer which trigger after the song has been played to play next one""" # TODO: songs need to be erased once played or found invalids @@ -210,7 +210,7 @@ log.debug(_("No more participants in the radiocol: cleaning data")) radio_data["queue"] = [] for filename in radio_data["to_delete"]: - self.deleteFile(filename, radio_data) + self.delete_file(filename, radio_data) radio_data["to_delete"] = {} queue = radio_data["queue"] if not queue: @@ -228,13 +228,13 @@ self.send(room_jid, ("", "upload_ok"), profile=profile) radio_data["upload"] = True - reactor.callLater(length, self.playNext, room_jid, profile) + reactor.callLater(length, self.play_next, room_jid, profile) # we wait more than the song length to delete the file, to manage poorly reactive networks/clients reactor.callLater( - length + 90, self.deleteFile, filename, radio_data + length + 90, self.delete_file, filename, radio_data ) # FIXME: same host trick (see above) - def deleteFile(self, filename, radio_data=None): + def delete_file(self, filename, radio_data=None): """ Delete a previously uploaded file. @param filename: filename to delete, or full filepath if radio_data is None @@ -263,16 +263,16 @@ def room_game_cmd(self, mess_elt, profile): from_jid = jid.JID(mess_elt["from"]) room_jid = from_jid.userhostJID() - nick = self.host.plugins["XEP-0045"].getRoomNick(room_jid, profile) + nick = self.host.plugins["XEP-0045"].get_room_nick(room_jid, profile) radio_elt = mess_elt.firstChildElement() radio_data = self.games[room_jid] if "queue" in radio_data: queue = radio_data["queue"] - from_referee = self.isReferee(room_jid, from_jid.resource) - to_referee = self.isReferee(room_jid, jid.JID(mess_elt["to"]).user) - is_player = self.isPlayer(room_jid, nick) + from_referee = self.is_referee(room_jid, from_jid.resource) + to_referee = self.is_referee(room_jid, jid.JID(mess_elt["to"]).user) + is_player = self.is_player(room_jid, nick) for elt in radio_elt.elements(): if not from_referee and not (to_referee and elt.name == "song_added"): continue # sender must be referee, expect when a song is submitted @@ -287,9 +287,9 @@ for player in elt.elements(): players.append(str(player)) signal = ( - self.host.bridge.radiocolStarted + self.host.bridge.radiocol_started if elt.name == "started" - else self.host.bridge.radiocolPlayers + else self.host.bridge.radiocol_players ) signal( room_jid.userhost(), @@ -299,7 +299,7 @@ profile, ) elif elt.name == "preload": # a song is in queue and must be preloaded - self.host.bridge.radiocolPreload( + self.host.bridge.radiocol_preload( room_jid.userhost(), elt["timestamp"], elt["filename"], @@ -310,17 +310,17 @@ profile, ) elif elt.name == "play": - self.host.bridge.radiocolPlay( + self.host.bridge.radiocol_play( room_jid.userhost(), elt["filename"], profile ) elif elt.name == "song_rejected": # a song has been refused - self.host.bridge.radiocolSongRejected( + self.host.bridge.radiocol_song_rejected( room_jid.userhost(), elt["reason"], profile ) elif elt.name == "no_upload": - self.host.bridge.radiocolNoUpload(room_jid.userhost(), profile) + self.host.bridge.radiocol_no_upload(room_jid.userhost(), 
profile) elif elt.name == "upload_ok": - self.host.bridge.radiocolUploadOk(room_jid.userhost(), profile) + self.host.bridge.radiocol_upload_ok(room_jid.userhost(), profile) elif elt.name == "song_added": # a song has been added # FIXME: we are KISS for the proof of concept: every song is added, to a limit of 3 in queue. # Need to manage some sort of rules to allow peoples to send songs @@ -348,11 +348,11 @@ if not radio_data["playing"] and len(queue) == QUEUE_TO_START: # We have not started playing yet, and we have QUEUE_TO_START # songs in queue. We can now start the party :) - self.playNext(room_jid, profile) + self.play_next(room_jid, profile) else: log.error(_("Unmanaged game element: %s") % elt.name) - def getSyncDataForPlayer(self, room_jid, nick): + def get_sync_data_for_player(self, room_jid, nick): game_data = self.games[room_jid] elements = [] if game_data["playing"]:
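The radiocol signals have the same naming coupling as the bridge methods: ``add_signal`` creates the emitter that is later called as an attribute of ``host.bridge``, so the registered name and every emission site (``self.host.bridge.radiocol_play(...)`` and so on) must stay in sync. A toy bridge makes that coupling explicit; the real bridge emits D-Bus signals instead of printing::

    class StubBridge:
        """Illustrative stand-in: add_signal exposes an emitter of the same name."""
        def add_signal(self, name, int_suffix, signature):
            def emit(*args):
                print(f"signal {name}{args} (signature {signature!r})")
            setattr(self, name, emit)

    bridge = StubBridge()
    bridge.add_signal("radiocol_play", ".plugin", signature="sss")
    # the emission site must use the exact registered (snake_case) name:
    bridge.radiocol_play("room@muc.example.org", "song.ogg", "some_profile")
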
--- a/sat/plugins/plugin_misc_register_account.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_register_account.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,14 +49,14 @@ log.info(_("Plugin Register Account initialization")) self.host = host self._sessions = Sessions() - host.registerCallback( - self.registerNewAccountCB, with_data=True, force_id="registerNewAccount" + host.register_callback( + self.register_new_account_cb, with_data=True, force_id="register_new_account" ) - self.__register_account_id = host.registerCallback( - self._registerConfirmation, with_data=True + self.__register_account_id = host.register_callback( + self._register_confirmation, with_data=True ) - def registerNewAccountCB(self, data, profile): + def register_new_account_cb(self, data, profile): """Called when the user click on the "New account" button.""" session_data = {} @@ -81,7 +81,7 @@ session_data["user"], host, resource = jid.parse(session_data["JabberID"]) session_data["server"] = session_data[C.FORCE_SERVER_PARAM] or host - session_id, __ = self._sessions.newSession(session_data, profile=profile) + session_id, __ = self._sessions.new_session(session_data, profile=profile) form_ui = xml_tools.XMLUI( "form", title=D_("Register new account"), @@ -95,30 +95,30 @@ ) return {"xmlui": form_ui.toXml()} - def _registerConfirmation(self, data, profile): + def _register_confirmation(self, data, profile): """Save the related parameters and proceed the registration.""" - session_data = self._sessions.profileGet(data["session_id"], profile) + session_data = self._sessions.profile_get(data["session_id"], profile) - self.host.memory.setParam( + self.host.memory.param_set( "JabberID", session_data["JabberID"], "Connection", profile_key=profile ) - self.host.memory.setParam( + self.host.memory.param_set( "Password", session_data["Password"], "Connection", profile_key=profile ) - self.host.memory.setParam( + self.host.memory.param_set( C.FORCE_SERVER_PARAM, session_data[C.FORCE_SERVER_PARAM], "Connection", profile_key=profile, ) - self.host.memory.setParam( + self.host.memory.param_set( C.FORCE_PORT_PARAM, session_data[C.FORCE_PORT_PARAM], "Connection", profile_key=profile, ) - d = self._registerNewAccount( + d = self._register_new_account( jid.JID(session_data["JabberID"]), session_data["Password"], None, @@ -127,14 +127,14 @@ del self._sessions[data["session_id"]] return d - def _registerNewAccount(self, client, jid_, password, email, server): + def _register_new_account(self, client, jid_, password, email, server): # FIXME: port is not set here - def registeredCb(__): + def registered_cb(__): xmlui = xml_tools.XMLUI("popup", title=D_("Confirmation")) xmlui.addText(D_("Registration successful.")) return {"xmlui": xmlui.toXml()} - def registeredEb(failure): + def registered_eb(failure): xmlui = xml_tools.XMLUI("popup", title=D_("Failure")) xmlui.addText(D_("Registration failed: %s") % failure.getErrorMessage()) try: @@ -146,8 +146,8 @@ pass return {"xmlui": xmlui.toXml()} - registered_d = self.host.plugins["XEP-0077"].registerNewAccount( + registered_d = self.host.plugins["XEP-0077"].register_new_account( client, jid_, password, email=email, host=server, port=C.XMPP_C2S_PORT ) - registered_d.addCallbacks(registeredCb, registeredEb) + registered_d.addCallbacks(registered_cb, registered_eb) return registered_d
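A rename of this scale easily leaves stale camelCase call sites behind, especially for methods only reached through callbacks, signals or timers. A rough scan for remaining camelCase attribute calls can help with review, keeping in mind that Twisted and Wokkel APIs (``userhostJID``, ``addCallbacks``, ``firstChildElement``, ...) legitimately stay camelCase and will show up as false positives::

    import re
    from pathlib import Path

    # attribute calls that still contain an inner lowercase-to-uppercase transition
    CAMEL_CALL = re.compile(r"\.\w*[a-z][A-Z]\w*\(")

    def find_camel_calls(root="sat/plugins"):
        for path in Path(root).rglob("*.py"):
            for lineno, line in enumerate(path.read_text().splitlines(), 1):
                if CAMEL_CALL.search(line):
                    yield f"{path}:{lineno}: {line.strip()}"

    for hit in find_camel_calls():
        print(hit)
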
--- a/sat/plugins/plugin_misc_room_game.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_room_game.py Sat Apr 08 13:54:42 2023 +0200 @@ -56,9 +56,9 @@ class RoomGame(object): """This class is used to help launching a MUC game. - Bridge methods callbacks: _prepareRoom, _playerReady, _createGame - Triggered methods: userJoinedTrigger, userLeftTrigger - Also called from subclasses: newRound + bridge methods callbacks: _prepare_room, _player_ready, _create_game + Triggered methods: user_joined_trigger, user_left_trigger + Also called from subclasses: new_round For examples of messages sequences, please look in sub-classes. """ @@ -81,13 +81,13 @@ class MyGame(object): - def inheritFromRoomGame(self, host): + def inherit_from_room_game(self, host): global RoomGame RoomGame = host.plugins["ROOM-GAME"].__class__ self.__class__ = type(self.__class__.__name__, (self.__class__, RoomGame, object), {}) def __init__(self, host): - self.inheritFromRoomGame(host) + self.inherit_from_room_game(host) RoomGame._init_(self, host, ...) """ @@ -125,41 +125,41 @@ # by an arbitrary value. If needed, this attribute would be set to True from the testcase. self.testing = False - host.trigger.add("MUC user joined", self.userJoinedTrigger) - host.trigger.add("MUC user left", self.userLeftTrigger) + host.trigger.add("MUC user joined", self.user_joined_trigger) + host.trigger.add("MUC user left", self.user_left_trigger) - def _createOrInvite(self, room_jid, other_players, profile): + def _create_or_invite(self, room_jid, other_players, profile): """ This is called only when someone explicitly wants to play. The game will not be created if one already exists in the room, also its creation could be postponed until all the expected players - join the room (in that case it will be created from userJoinedTrigger). + join the room (in that case it will be created from user_joined_trigger). @param room (wokkel.muc.Room): the room @param other_players (list[jid.JID]): list of the other players JID (bare) """ # FIXME: broken ! 
raise NotImplementedError("To be fixed") - client = self.host.getClient(profile) - user_jid = self.host.getJidNStream(profile)[0] - nick = self.host.plugins["XEP-0045"].getRoomNick(client, room_jid) + client = self.host.get_client(profile) + user_jid = self.host.get_jid_n_stream(profile)[0] + nick = self.host.plugins["XEP-0045"].get_room_nick(client, room_jid) nicks = [nick] - if self._gameExists(room_jid): - if not self._checkJoinAuth(room_jid, user_jid, nick): + if self._game_exists(room_jid): + if not self._check_join_auth(room_jid, user_jid, nick): return - nicks.extend(self._invitePlayers(room_jid, other_players, nick, profile)) - self._updatePlayers(room_jid, nicks, True, profile) + nicks.extend(self._invite_players(room_jid, other_players, nick, profile)) + self._update_players(room_jid, nicks, True, profile) else: - self._initGame(room_jid, nick) - (auth, waiting, missing) = self._checkWaitAuth(room_jid, other_players) + self._init_game(room_jid, nick) + (auth, waiting, missing) = self._check_wait_auth(room_jid, other_players) nicks.extend(waiting) - nicks.extend(self._invitePlayers(room_jid, missing, nick, profile)) + nicks.extend(self._invite_players(room_jid, missing, nick, profile)) if auth: - self.createGame(room_jid, nicks, profile) + self.create_game(room_jid, nicks, profile) else: - self._updatePlayers(room_jid, nicks, False, profile) + self._update_players(room_jid, nicks, False, profile) - def _initGame(self, room_jid, referee_nick): + def _init_game(self, room_jid, referee_nick): """ @param room_jid (jid.JID): JID of the room @@ -167,7 +167,7 @@ """ # Important: do not add the referee to 'players' yet. For a # <players /> message to be emitted whenever a new player is joining, - # it is necessary to not modify 'players' outside of _updatePlayers. + # it is necessary to not modify 'players' outside of _update_players. referee_jid = jid.JID(room_jid.userhost() + "/" + referee_nick) self.games[room_jid] = { "referee": referee_jid, @@ -178,19 +178,19 @@ self.games[room_jid].update(copy.deepcopy(self.game_init)) self.invitations.setdefault(room_jid, []) - def _gameExists(self, room_jid, started=False): + def _game_exists(self, room_jid, started=False): """Return True if a game has been initialized/started. @param started: if False, the game must be initialized to return True, - otherwise it must be initialized and started with createGame. + otherwise it must be initialized and started with create_game. @return: True if a game is initialized/started in that room""" return room_jid in self.games and (not started or self.games[room_jid]["started"]) - def _checkJoinAuth(self, room_jid, user_jid=None, nick="", verbose=False): + def _check_join_auth(self, room_jid, user_jid=None, nick="", verbose=False): """Checks if this profile is allowed to join the game. The parameter nick is used to check if the user is already a player in that game. When this method is called from - userJoinedTrigger, nick is also used to check the user + user_joined_trigger, nick is also used to check the user identity instead of user_jid_s (see TODO comment below). 
@param room_jid (jid.JID): the JID of the room hosting the game @param user_jid (jid.JID): JID of the user @@ -198,9 +198,9 @@ @return: True if this profile can join the game """ auth = False - if not self._gameExists(room_jid): + if not self._game_exists(room_jid): auth = False - elif self.join_mode == self.ALL or self.isPlayer(room_jid, nick): + elif self.join_mode == self.ALL or self.is_player(room_jid, nick): auth = True elif self.join_mode == self.INVITED: # considering all the batches of invitations @@ -227,7 +227,7 @@ ) return auth - def _updatePlayers(self, room_jid, nicks, sync, profile): + def _update_players(self, room_jid, nicks, sync, profile): """Update the list of players and signal to the room that some players joined the game. If sync is True, the news players are synchronized with the game data they have missed. Remark: self.games[room_jid]['players'] should not be modified outside this method. @@ -251,16 +251,16 @@ sync = ( sync - and self._gameExists(room_jid, True) + and self._game_exists(room_jid, True) and len(self.games[room_jid]["players"]) > 0 ) setStatus("desync" if sync else "init") self.games[room_jid]["players"].extend(new_nicks) - self._synchronizeRoom(room_jid, [room_jid], profile) + self._synchronize_room(room_jid, [room_jid], profile) if sync: setStatus("init") - def _synchronizeRoom(self, room_jid, recipients, profile): + def _synchronize_room(self, room_jid, recipients, profile): """Communicate the list of players to the whole room or only to some users, also send the synchronization data to the players who recently joined the game. @param room_jid (jid.JID): JID of the room @@ -269,16 +269,16 @@ - room JID + "/" + user nick @param profile (unicode): %(doc_profile)s """ - if self._gameExists(room_jid, started=True): - element = self._createStartElement(self.games[room_jid]["players"]) + if self._game_exists(room_jid, started=True): + element = self._create_start_element(self.games[room_jid]["players"]) else: - element = self._createStartElement( + element = self._create_start_element( self.games[room_jid]["players"], name="players" ) elements = [(element, None, None)] sync_args = [] - sync_data = self._getSyncData(room_jid) + sync_data = self._get_sync_data(room_jid) for nick in sync_data: user_jid = jid.JID(room_jid.userhost() + "/" + nick) if user_jid in recipients: @@ -291,19 +291,19 @@ sync_args.append(([user_jid, user_elements], {"profile": profile})) for recipient in recipients: - self._sendElements(recipient, elements, profile=profile) + self._send_elements(recipient, elements, profile=profile) for args, kwargs in sync_args: - self._sendElements(*args, **kwargs) + self._send_elements(*args, **kwargs) - def _getSyncData(self, room_jid, force_nicks=None): + def _get_sync_data(self, room_jid, force_nicks=None): """The synchronization data are returned for each player who has the state 'desync' or if he's been contained by force_nicks. @param room_jid (jid.JID): JID of the room @param force_nicks: force the synchronization for this list of the nicks @return: a mapping between player nicks and a list of elements to - be sent by self._synchronizeRoom for the game to be synchronized. + be sent by self._synchronize_room for the game to be synchronized. 
""" - if not self._gameExists(room_jid): + if not self._game_exists(room_jid): return {} data = {} status = self.games[room_jid]["status"] @@ -314,12 +314,12 @@ if nick not in nicks: nicks.append(nick) for nick in nicks: - elements = self.getSyncDataForPlayer(room_jid, nick) + elements = self.get_sync_data_for_player(room_jid, nick) if elements: data[nick] = elements return data - def getSyncDataForPlayer(self, room_jid, nick): + def get_sync_data_for_player(self, room_jid, nick): """This method may (and should probably) be overwritten by a child class. @param room_jid (jid.JID): JID of the room @param nick: the nick of the player to be synchronized @@ -327,7 +327,7 @@ """ return [] - def _invitePlayers(self, room_jid, other_players, nick, profile): + def _invite_players(self, room_jid, other_players, nick, profile): """Invite players to a room, associated game may exist or not. @param other_players (list[jid.JID]): list of the players to invite @@ -336,7 +336,7 @@ """ raise NotImplementedError("Need to be fixed !") # FIXME: this is broken and unsecure ! - if not self._checkInviteAuth(room_jid, nick): + if not self._check_invite_auth(room_jid, nick): return [] # TODO: remove invitation waiting for too long, using the time data self.invitations[room_jid].append( @@ -356,7 +356,7 @@ nicks.append(other_nick) return nicks - def _checkInviteAuth(self, room_jid, nick, verbose=False): + def _check_invite_auth(self, room_jid, nick, verbose=False): """Checks if this user is allowed to invite players @param room_jid (jid.JID): JID of the room @@ -365,16 +365,16 @@ @return: True if the user is allowed to invite other players """ auth = False - if self.invite_mode == self.FROM_ALL or not self._gameExists(room_jid): + if self.invite_mode == self.FROM_ALL or not self._game_exists(room_jid): auth = True elif self.invite_mode == self.FROM_NONE: - auth = not self._gameExists(room_jid, started=True) and self.isReferee( + auth = not self._game_exists(room_jid, started=True) and self.is_referee( room_jid, nick ) elif self.invite_mode == self.FROM_REFEREE: - auth = self.isReferee(room_jid, nick) + auth = self.is_referee(room_jid, nick) elif self.invite_mode == self.FROM_PLAYERS: - auth = self.isPlayer(room_jid, nick) + auth = self.is_player(room_jid, nick) if not auth and (verbose or _DEBUG): log.debug( _("%(user)s not allowed to invite for the game %(game)s in %(room)s") @@ -382,31 +382,31 @@ ) return auth - def isReferee(self, room_jid, nick): + def is_referee(self, room_jid, nick): """Checks if the player with this nick is the referee for the game in this room" @param room_jid (jid.JID): room JID @param nick: user nick in the room @return: True if the user is the referee of the game in this room """ - if not self._gameExists(room_jid): + if not self._game_exists(room_jid): return False return ( jid.JID(room_jid.userhost() + "/" + nick) == self.games[room_jid]["referee"] ) - def isPlayer(self, room_jid, nick): + def is_player(self, room_jid, nick): """Checks if the user with this nick is a player for the game in this room. 
@param room_jid (jid.JID): JID of the room @param nick: user nick in the room @return: True if the user is a player of the game in this room """ - if not self._gameExists(room_jid): + if not self._game_exists(room_jid): return False # Important: the referee is not in the 'players' list right after - # the game initialization, that's why we do also check with isReferee - return nick in self.games[room_jid]["players"] or self.isReferee(room_jid, nick) + # the game initialization, that's why we do also check with is_referee + return nick in self.games[room_jid]["players"] or self.is_referee(room_jid, nick) - def _checkWaitAuth(self, room, other_players, verbose=False): + def _check_wait_auth(self, room, other_players, verbose=False): """Check if we must wait for other players before starting the game. @param room (wokkel.muc.Room): the room @@ -441,26 +441,26 @@ ) return result - def getUniqueName(self, muc_service=None, profile_key=C.PROF_KEY_NONE): + def get_unique_name(self, muc_service=None, profile_key=C.PROF_KEY_NONE): """Generate unique room name @param muc_service (jid.JID): you can leave empty to autofind the muc service @param profile_key (unicode): %(doc_profile_key)s @return: jid.JID (unique name for a new room to be created) """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) # FIXME: jid.JID must be used instead of strings - room = self.host.plugins["XEP-0045"].getUniqueName(client, muc_service) + room = self.host.plugins["XEP-0045"].get_unique_name(client, muc_service) return jid.JID("sat_%s_%s" % (self.name.lower(), room.userhost())) - def _prepareRoom( + def _prepare_room( self, other_players=None, room_jid_s="", profile_key=C.PROF_KEY_NONE ): room_jid = jid.JID(room_jid_s) if room_jid_s else None other_players = [jid.JID(player).userhostJID() for player in other_players] - return self.prepareRoom(other_players, room_jid, profile_key) + return self.prepare_room(other_players, room_jid, profile_key) - def prepareRoom(self, other_players=None, room_jid=None, profile_key=C.PROF_KEY_NONE): + def prepare_room(self, other_players=None, room_jid=None, profile_key=C.PROF_KEY_NONE): """Prepare the room for a game: create it if it doesn't exist and invite players. @param other_players (list[JID]): list of other players JID (bare) @@ -468,9 +468,9 @@ @param profile_key (unicode): %(doc_profile_key)s """ # FIXME: need to be refactored - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) log.debug(_("Preparing room for %s game") % self.name) - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: log.error(_("Unknown profile")) return defer.succeed(None) @@ -479,19 +479,19 @@ # Create/join the given room, or a unique generated one if no room is specified. 
if room_jid is None: - room_jid = self.getUniqueName(profile_key=profile_key) + room_jid = self.get_unique_name(profile_key=profile_key) else: - self.host.plugins["XEP-0045"].checkRoomJoined(client, room_jid) - self._createOrInvite(client, room_jid, other_players) + self.host.plugins["XEP-0045"].check_room_joined(client, room_jid) + self._create_or_invite(client, room_jid, other_players) return defer.succeed(None) - user_jid = self.host.getJidNStream(profile)[0] + user_jid = self.host.get_jid_n_stream(profile)[0] d = self.host.plugins["XEP-0045"].join(room_jid, user_jid.user, {}, profile) return d.addCallback( - lambda __: self._createOrInvite(client, room_jid, other_players) + lambda __: self._create_or_invite(client, room_jid, other_players) ) - def userJoinedTrigger(self, room, user, profile): + def user_joined_trigger(self, room, user, profile): """This trigger is used to check if the new user can take part of a game, create the game if we were waiting for him or just update the players list. @room: wokkel.muc.Room object. room.roster is a dict{wokkel.muc.User.nick: wokkel.muc.User} @@ -500,13 +500,13 @@ """ room_jid = room.occupantJID.userhostJID() profile_nick = room.occupantJID.resource - if not self.isReferee(room_jid, profile_nick): + if not self.is_referee(room_jid, profile_nick): return True # profile is not the referee - if not self._checkJoinAuth( + if not self._check_join_auth( room_jid, user.entity if user.entity else None, user.nick ): # user not allowed but let him know that we are playing :p - self._synchronizeRoom( + self._synchronize_room( room_jid, [jid.JID(room_jid.userhost() + "/" + user.nick)], profile ) return True @@ -520,17 +520,17 @@ ) return True other_players = self.invitations[room_jid][batch][1] - (auth, nicks, __) = self._checkWaitAuth(room, other_players) + (auth, nicks, __) = self._check_wait_auth(room, other_players) if auth: del self.invitations[room_jid][batch] nicks.insert(0, profile_nick) # add the referee - self.createGame(room_jid, nicks, profile_key=profile) + self.create_game(room_jid, nicks, profile_key=profile) return True # let the room know that a new player joined - self._updatePlayers(room_jid, [user.nick], True, profile) + self._update_players(room_jid, [user.nick], True, profile) return True - def userLeftTrigger(self, room, user, profile): + def user_left_trigger(self, room, user, profile): """This trigger is used to update or stop the game when a user leaves. @room: wokkel.muc.Room object. room.roster is a dict{wokkel.muc.User.nick: wokkel.muc.User} @@ -539,9 +539,9 @@ """ room_jid = room.occupantJID.userhostJID() profile_nick = room.occupantJID.resource - if not self.isReferee(room_jid, profile_nick): + if not self.is_referee(room_jid, profile_nick): return True # profile is not the referee - if self.isPlayer(room_jid, user.nick): + if self.is_player(room_jid, user.nick): try: self.games[room_jid]["players"].remove(user.nick) except ValueError: @@ -559,7 +559,7 @@ self.invitations[room_jid][batch][1].append(user_jid) return True - def _checkCreateGameAndInit(self, room_jid, profile): + def _check_create_game_and_init(self, room_jid, profile): """Check if that profile can create the game. If the game can be created but is not initialized yet, this method will also do the initialization. 
@@ -569,16 +569,16 @@ - create: set to True to allow the game creation - sync: set to True to advice a game synchronization """ - user_nick = self.host.plugins["XEP-0045"].getRoomNick(room_jid, profile) + user_nick = self.host.plugins["XEP-0045"].get_room_nick(room_jid, profile) if not user_nick: log.error( "Internal error: profile %s has not joined the room %s" % (profile, room_jid.userhost()) ) return False, False - if self._gameExists(room_jid): - is_referee = self.isReferee(room_jid, user_nick) - if self._gameExists(room_jid, started=True): + if self._game_exists(room_jid): + is_referee = self.is_referee(room_jid, user_nick) + if self._game_exists(room_jid, started=True): log.info( _("%(game)s game already created in room %(room)s") % {"game": self.name, "room": room_jid.userhost()} @@ -591,13 +591,13 @@ ) return False, False else: - self._initGame(room_jid, user_nick) + self._init_game(room_jid, user_nick) return True, False - def _createGame(self, room_jid_s, nicks=None, profile_key=C.PROF_KEY_NONE): - self.createGame(jid.JID(room_jid_s), nicks, profile_key) + def _create_game(self, room_jid_s, nicks=None, profile_key=C.PROF_KEY_NONE): + self.create_game(jid.JID(room_jid_s), nicks, profile_key) - def createGame(self, room_jid, nicks=None, profile_key=C.PROF_KEY_NONE): + def create_game(self, room_jid, nicks=None, profile_key=C.PROF_KEY_NONE): """Create a new game. This can be called directly from a frontend and skips all the checks and invitation system, @@ -610,19 +610,19 @@ _("Creating %(game)s game in room %(room)s") % {"game": self.name, "room": room_jid} ) - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: log.error(_("profile %s is unknown") % profile_key) return - (create, sync) = self._checkCreateGameAndInit(room_jid, profile) + (create, sync) = self._check_create_game_and_init(room_jid, profile) if nicks is None: nicks = [] if not create: if sync: - self._updatePlayers(room_jid, nicks, True, profile) + self._update_players(room_jid, nicks, True, profile) return self.games[room_jid]["started"] = True - self._updatePlayers(room_jid, nicks, False, profile) + self._update_players(room_jid, nicks, False, profile) if self.player_init: # specific data to each player (score, private data) self.games[room_jid].setdefault("players_data", {}) @@ -632,16 +632,16 @@ self.player_init ) - def _playerReady(self, player_nick, referee_jid_s, profile_key=C.PROF_KEY_NONE): - self.playerReady(player_nick, jid.JID(referee_jid_s), profile_key) + def _player_ready(self, player_nick, referee_jid_s, profile_key=C.PROF_KEY_NONE): + self.player_ready(player_nick, jid.JID(referee_jid_s), profile_key) - def playerReady(self, player_nick, referee_jid, profile_key=C.PROF_KEY_NONE): + def player_ready(self, player_nick, referee_jid, profile_key=C.PROF_KEY_NONE): """Must be called when player is ready to start a new game @param player: the player nick in the room @param referee_jid (jid.JID): JID of the referee """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: log.error(_("profile %s is unknown") % profile_key) return @@ -649,7 +649,7 @@ # TODO: we probably need to add the game and room names in the sent message self.send(referee_jid, "player_ready", {"player": player_nick}, profile=profile) - def newRound(self, room_jid, data, profile): + def new_round(self, room_jid, data, profile): """Launch a new round (reinit the user data) @param 
room_jid: room userhost @@ -681,7 +681,7 @@ for player in players: players_data[player].update(copy.deepcopy(common_data)) - def _createGameElt(self, to_jid): + def _create_game_elt(self, to_jid): """Create a generic domish Element for the game messages @param to_jid: JID of the recipient @@ -694,7 +694,7 @@ elt.addElement(self.ns_tag) return elt - def _createStartElement(self, players=None, name="started"): + def _create_start_element(self, players=None, name="started"): """Create a domish Element listing the game users @param players: list of the players @@ -715,7 +715,7 @@ started_elt.addChild(player_elt) return started_elt - def _sendElements(self, to_jid, data, profile=None): + def _send_elements(self, to_jid, data, profile=None): """ TODO @param to_jid: recipient JID @@ -729,8 +729,8 @@ @param profile: the profile from which the message is sent @return: a Deferred instance """ - client = self.host.getClient(profile) - msg = self._createGameElt(to_jid) + client = self.host.get_client(profile) + msg = self._create_game_elt(to_jid) for elem, attrs, content in data: if elem is not None: if isinstance(elem, domish.Element): @@ -757,9 +757,9 @@ @param profile: the profile from which the message is sent @return: a Deferred instance """ - return self._sendElements(to_jid, [(elem, attrs, content)], profile) + return self._send_elements(to_jid, [(elem, attrs, content)], profile) - def getHandler(self, client): + def get_handler(self, client): return RoomGameHandler(self)
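A child game plugin is expected to override the renamed hooks, in particular ``get_sync_data_for_player`` (formerly ``getSyncDataForPlayer``), whose base implementation returns an empty list. A minimal sketch with the new names, subclassing directly for brevity (the Tarot plugin below shows the actual runtime-injection pattern; ``DemoGame`` and its payload are hypothetical)::

    class DemoGame(RoomGame):
        """Hypothetical child game, showing only the renamed hook."""

        def get_sync_data_for_player(self, room_jid, nick):
            # resend the players list to a late joiner; returned values must be
            # domish elements, as produced by the renamed helpers
            if not self._game_exists(room_jid, started=True):
                return []
            players = self.games[room_jid]["players"]
            return [self._create_start_element(players)]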
--- a/sat/plugins/plugin_misc_static_blog.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_static_blog.py Sat Apr 08 13:54:42 2023 +0200 @@ -76,13 +76,13 @@ def __init__(self, host): try: # TODO: remove this attribute when all blogs can be retrieved - self.domain = host.plugins["MISC-ACCOUNT"].getNewAccountDomain() + self.domain = host.plugins["MISC-ACCOUNT"].account_domain_new_get() except KeyError: self.domain = None - host.memory.updateParams(self.params) - # host.importMenu((D_("User"), D_("Public blog")), self._displayPublicBlog, security_limit=1, help_string=D_("Display public blog page"), type_=C.MENU_JID_CONTEXT) + host.memory.update_params(self.params) + # host.import_menu((D_("User"), D_("Public blog")), self._display_public_blog, security_limit=1, help_string=D_("Display public blog page"), type_=C.MENU_JID_CONTEXT) - def _displayPublicBlog(self, menu_data, profile): + def _display_public_blog(self, menu_data, profile): """Check if the blog can be displayed and answer the frontend. @param menu_data: %(menu_data)s
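For readability, the single-line commented call kept above expands to the following; this only illustrates the renamed ``import_menu`` signature, re-enabling the menu is not part of this changeset::

    host.import_menu(
        (D_("User"), D_("Public blog")),      # menu path
        self._display_public_blog,            # callback(menu_data, profile)
        security_limit=1,
        help_string=D_("Display public blog page"),
        type_=C.MENU_JID_CONTEXT,
    )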
--- a/sat/plugins/plugin_misc_tarot.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_tarot.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,7 +49,7 @@ class Tarot(object): - def inheritFromRoomGame(self, host): + def inherit_from_room_game(self, host): global RoomGame RoomGame = host.plugins["ROOM-GAME"].__class__ self.__class__ = type( @@ -59,7 +59,7 @@ def __init__(self, host): log.info(_("Plugin Tarot initialization")) self._sessions = memory.Sessions() - self.inheritFromRoomGame(host) + self.inherit_from_room_game(host) RoomGame._init_( self, host, @@ -81,61 +81,61 @@ _("Garde Sans"), _("Garde Contre"), ] - host.bridge.addMethod( - "tarotGameLaunch", + host.bridge.add_method( + "tarot_game_launch", ".plugin", in_sign="asss", out_sign="", - method=self._prepareRoom, + method=self._prepare_room, async_=True, ) # args: players, room_jid, profile - host.bridge.addMethod( - "tarotGameCreate", + host.bridge.add_method( + "tarot_game_create", ".plugin", in_sign="sass", out_sign="", - method=self._createGame, + method=self._create_game, ) # args: room_jid, players, profile - host.bridge.addMethod( - "tarotGameReady", + host.bridge.add_method( + "tarot_game_ready", ".plugin", in_sign="sss", out_sign="", - method=self._playerReady, + method=self._player_ready, ) # args: player, referee, profile - host.bridge.addMethod( - "tarotGamePlayCards", + host.bridge.add_method( + "tarot_game_play_cards", ".plugin", in_sign="ssa(ss)s", out_sign="", method=self.play_cards, ) # args: player, referee, cards, profile - host.bridge.addSignal( - "tarotGamePlayers", ".plugin", signature="ssass" + host.bridge.add_signal( + "tarot_game_players", ".plugin", signature="ssass" ) # args: room_jid, referee, players, profile - host.bridge.addSignal( - "tarotGameStarted", ".plugin", signature="ssass" + host.bridge.add_signal( + "tarot_game_started", ".plugin", signature="ssass" ) # args: room_jid, referee, players, profile - host.bridge.addSignal( - "tarotGameNew", ".plugin", signature="sa(ss)s" + host.bridge.add_signal( + "tarot_game_new", ".plugin", signature="sa(ss)s" ) # args: room_jid, hand, profile - host.bridge.addSignal( - "tarotGameChooseContrat", ".plugin", signature="sss" + host.bridge.add_signal( + "tarot_game_choose_contrat", ".plugin", signature="sss" ) # args: room_jid, xml_data, profile - host.bridge.addSignal( - "tarotGameShowCards", ".plugin", signature="ssa(ss)a{ss}s" + host.bridge.add_signal( + "tarot_game_show_cards", ".plugin", signature="ssa(ss)a{ss}s" ) # args: room_jid, type ["chien", "poignée",...], cards, data[dict], profile - host.bridge.addSignal( - "tarotGameCardsPlayed", ".plugin", signature="ssa(ss)s" + host.bridge.add_signal( + "tarot_game_cards_played", ".plugin", signature="ssa(ss)s" ) # args: room_jid, player, type ["chien", "poignée",...], cards, data[dict], profile - host.bridge.addSignal( - "tarotGameYourTurn", ".plugin", signature="ss" + host.bridge.add_signal( + "tarot_game_your_turn", ".plugin", signature="ss" ) # args: room_jid, profile - host.bridge.addSignal( - "tarotGameScore", ".plugin", signature="ssasass" + host.bridge.add_signal( + "tarot_game_score", ".plugin", signature="ssasass" ) # args: room_jid, xml_data, winners (list of nicks), loosers (list of nicks), profile - host.bridge.addSignal( - "tarotGameInvalidCards", ".plugin", signature="ssa(ss)a(ss)s" + host.bridge.add_signal( + "tarot_game_invalid_cards", ".plugin", signature="ssa(ss)a(ss)s" ) # args: room_jid, game phase, played_cards, invalid_cards, profile self.deck_ordered = [] for value in ["excuse"] + 
list(map(str, list(range(1, 22)))): @@ -143,10 +143,10 @@ for suit in ["pique", "coeur", "carreau", "trefle"]: for value in list(map(str, list(range(1, 11)))) + ["valet", "cavalier", "dame", "roi"]: self.deck_ordered.append(TarotCard((suit, value))) - self.__choose_contrat_id = host.registerCallback( - self._contratChoosed, with_data=True + self.__choose_contrat_id = host.register_callback( + self._contrat_choosed, with_data=True ) - self.__score_id = host.registerCallback(self._scoreShowed, with_data=True) + self.__score_id = host.register_callback(self._score_showed, with_data=True) def __card_list_to_xml(self, cards_list, elt_name): """Convert a card list to domish element""" @@ -519,13 +519,13 @@ to_jid = jid.JID(room_jid.userhost() + "/" + next_player) # FIXME: gof: self.send(to_jid, "your_turn", profile=profile) - def _contratChoosed(self, raw_data, profile): + def _contrat_choosed(self, raw_data, profile): """Will be called when the contrat is selected @param raw_data: contains the choosed session id and the chosen contrat @param profile_key: profile """ try: - session_data = self._sessions.profileGet(raw_data["session_id"], profile) + session_data = self._sessions.profile_get(raw_data["session_id"], profile) except KeyError: log.warning(_("session id doesn't exist, session has probably expired")) # TODO: send error dialog @@ -533,8 +533,8 @@ room_jid = session_data["room_jid"] referee_jid = self.games[room_jid]["referee"] - player = self.host.plugins["XEP-0045"].getRoomNick(room_jid, profile) - data = xml_tools.XMLUIResult2DataFormResult(raw_data) + player = self.host.plugins["XEP-0045"].get_room_nick(room_jid, profile) + data = xml_tools.xmlui_result_2_data_form_result(raw_data) contrat = data["contrat"] log.debug( _("contrat [%(contrat)s] choosed by %(profile)s") @@ -551,13 +551,13 @@ del self._sessions[raw_data["session_id"]] return d - def _scoreShowed(self, raw_data, profile): + def _score_showed(self, raw_data, profile): """Will be called when the player closes the score dialog @param raw_data: nothing to retrieve from here but the session id @param profile_key: profile """ try: - session_data = self._sessions.profileGet(raw_data["session_id"], profile) + session_data = self._sessions.profile_get(raw_data["session_id"], profile) except KeyError: log.warning(_("session id doesn't exist, session has probably expired")) # TODO: send error dialog @@ -565,7 +565,7 @@ room_jid_s = session_data["room_jid"].userhost() # XXX: empty hand means to the frontend "reset the display"... 
- self.host.bridge.tarotGameNew(room_jid_s, [], profile) + self.host.bridge.tarot_game_new(room_jid_s, [], profile) del self._sessions[raw_data["session_id"]] return defer.succeed({}) @@ -576,7 +576,7 @@ @cards: cards played (list of tuples) @profile_key: profile """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: log.error(_("profile %s is unknown") % profile_key) return @@ -587,7 +587,7 @@ elem = self.__card_list_to_xml(TarotCard.from_tuples(cards), "cards_played") self.send(jid.JID(referee), elem, {"player": player}, profile=profile) - def newRound(self, room_jid, profile): + def new_round(self, room_jid, profile): game_data = self.games[room_jid] players = game_data["players"] game_data["first_player"] = None # first player for the current trick @@ -613,7 +613,7 @@ for player in players: msg_elts[player] = self.__card_list_to_xml(hand[player], "hand") - RoomGame.newRound(self, room_jid, (common_data, msg_elts), profile) + RoomGame.new_round(self, room_jid, (common_data, msg_elts), profile) pl_idx = game_data["current_player"] = (game_data["init_player"] + 1) % len( players @@ -626,14 +626,14 @@ """ @param mess_elt: instance of twisted.words.xish.domish.Element """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) from_jid = jid.JID(mess_elt["from"]) room_jid = jid.JID(from_jid.userhost()) - nick = self.host.plugins["XEP-0045"].getRoomNick(client, room_jid) + nick = self.host.plugins["XEP-0045"].get_room_nick(client, room_jid) game_elt = mess_elt.firstChildElement() game_data = self.games[room_jid] - is_player = self.isPlayer(room_jid, nick) + is_player = self.is_player(room_jid, nick) if "players_data" in game_data: players_data = game_data["players_data"] @@ -649,9 +649,9 @@ for player in elt.elements(): players.append(str(player)) signal = ( - self.host.bridge.tarotGameStarted + self.host.bridge.tarot_game_started if elt.name == "started" - else self.host.bridge.tarotGamePlayers + else self.host.bridge.tarot_game_players ) signal(room_jid.userhost(), from_jid.full(), players, profile) @@ -667,21 +667,21 @@ if ( list(status.values()).count("ready") == nb_players ): # everybody is ready, we can start the game - self.newRound(room_jid, profile) + self.new_round(room_jid, profile) elif elt.name == "hand": # a new hand has been received - self.host.bridge.tarotGameNew( + self.host.bridge.tarot_game_new( room_jid.userhost(), self.__xml_to_list(elt), profile ) elif elt.name == "contrat": # it's time to choose contrat form = data_form.Form.fromElement(elt.firstChildElement()) - session_id, session_data = self._sessions.newSession(profile=profile) + session_id, session_data = self._sessions.new_session(profile=profile) session_data["room_jid"] = room_jid - xml_data = xml_tools.dataForm2XMLUI( + xml_data = xml_tools.data_form_2_xmlui( form, self.__choose_contrat_id, session_id ).toXml() - self.host.bridge.tarotGameChooseContrat( + self.host.bridge.tarot_game_choose_contrat( room_jid.userhost(), xml_data, profile ) @@ -752,7 +752,7 @@ data = {"attaquant": elt["attaquant"]} game_data["stage"] = "ecart" game_data["attaquant"] = elt["attaquant"] - self.host.bridge.tarotGameShowCards( + self.host.bridge.tarot_game_show_cards( room_jid.userhost(), "chien", self.__xml_to_list(elt), data, profile ) @@ -790,7 +790,7 @@ cards = TarotCard.from_tuples(self.__xml_to_list(elt)) if mess_elt["type"] == "groupchat": - self.host.bridge.tarotGameCardsPlayed( + 
self.host.bridge.tarot_game_cards_played( room_jid.userhost(), elt["player"], self.__xml_to_list(elt), @@ -858,7 +858,7 @@ self.send(to_jid, "your_turn", profile=profile) elif elt.name == "your_turn": - self.host.bridge.tarotGameYourTurn(room_jid.userhost(), profile) + self.host.bridge.tarot_game_your_turn(room_jid.userhost(), profile) elif elt.name == "score": form_elt = next(elt.elements(name="x", uri="jabber:x:data")) @@ -869,12 +869,12 @@ for looser in elt.elements(name="looser", uri=NS_CG): loosers.append(str(looser)) form = data_form.Form.fromElement(form_elt) - session_id, session_data = self._sessions.newSession(profile=profile) + session_id, session_data = self._sessions.new_session(profile=profile) session_data["room_jid"] = room_jid - xml_data = xml_tools.dataForm2XMLUI( + xml_data = xml_tools.data_form_2_xmlui( form, self.__score_id, session_id ).toXml() - self.host.bridge.tarotGameScore( + self.host.bridge.tarot_game_score( room_jid.userhost(), xml_data, winners, loosers, profile ) elif elt.name == "error": @@ -885,7 +885,7 @@ invalid_cards = self.__xml_to_list( next(elt.elements(name="invalid", uri=NS_CG)) ) - self.host.bridge.tarotGameInvalidCards( + self.host.bridge.tarot_game_invalid_cards( room_jid.userhost(), elt["phase"], played_cards, @@ -897,5 +897,5 @@ else: log.error(_("Unmanaged card game element: %s") % elt.name) - def getSyncDataForPlayer(self, room_jid, nick): + def get_sync_data_for_player(self, room_jid, nick): return []
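On the frontend side, the renamed signals deliver plain values matching the declared signatures; a standalone sketch of two handlers (the room JID, profile and ``print`` calls are illustrative only, suits and values follow the deck built in ``__init__``)::

    def on_tarot_game_new(room_jid_s, hand, profile):
        # signature "sa(ss)s": room JID, hand as (suit, value) pairs, profile
        for suit, value in hand:
            print(f"[{profile}] {room_jid_s}: got {value} of {suit}")

    def on_tarot_game_your_turn(room_jid_s, profile):
        # signature "ss": room JID, profile
        print(f"[{profile}] it is your turn in {room_jid_s}")

    on_tarot_game_new("room@chat.example.org", [("pique", "1"), ("coeur", "roi")], "louise")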
--- a/sat/plugins/plugin_misc_text_commands.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_text_commands.py Sat Apr 08 13:54:42 2023 +0200 @@ -66,12 +66,12 @@ log.info(_("Text commands initialization")) self.host = host # this is internal command, so we set high priority - host.trigger.add("sendMessage", self.sendMessageTrigger, priority=1000000) + host.trigger.add("sendMessage", self.send_message_trigger, priority=1000000) self._commands = {} self._whois = [] - self.registerTextCommands(self) + self.register_text_commands(self) - def _parseDocString(self, cmd, cmd_name): + def _parse_doc_string(self, cmd, cmd_name): """Parse a docstring to get text command data @param cmd: function or method callback for the command, @@ -150,7 +150,7 @@ return data - def registerTextCommands(self, instance): + def register_text_commands(self, instance): """ Add a text command @param instance: instance of a class containing text commands @@ -176,10 +176,10 @@ ) cmd_name = new_name self._commands[cmd_name] = cmd_data = {"callback": cmd} - cmd_data.update(self._parseDocString(cmd, cmd_name)) + cmd_data.update(self._parse_doc_string(cmd, cmd_name)) log.info(_("Registered text command [%s]") % cmd_name) - def addWhoIsCb(self, callback, priority=0): + def add_who_is_cb(self, callback, priority=0): """Add a callback which give information to the /whois command @param callback: a callback which will be called with the following arguments @@ -193,14 +193,14 @@ self._whois.append((priority, callback)) self._whois.sort(key=lambda item: item[0], reverse=True) - def sendMessageTrigger( + def send_message_trigger( self, client, mess_data, pre_xml_treatments, post_xml_treatments ): """Install SendMessage command hook """ - pre_xml_treatments.addCallback(self._sendMessageCmdHook, client) + pre_xml_treatments.addCallback(self._send_message_cmd_hook, client) return True - def _sendMessageCmdHook(self, mess_data, client): + def _send_message_cmd_hook(self, mess_data, client): """ Check text commands in message, and react consequently msg starting with / are potential command. If a command is found, it is executed, @@ -239,7 +239,7 @@ d = None command = msg[1:].partition(" ")[0].lower().strip() if not command.isidentifier(): - self.feedBack( + self.feed_back( client, _("Invalid command /%s. ") % command + self.HELP_SUGGESTION, mess_data, @@ -247,7 +247,7 @@ raise failure.Failure(exceptions.CancelError()) # looks like an actual command, we try to call the corresponding method - def retHandling(ret): + def ret_handling(ret): """ Handle command return value: if ret is True, normally send message (possibly modified by command) else, abord message sending @@ -258,12 +258,12 @@ log.debug("text command detected ({})".format(command)) raise failure.Failure(exceptions.CancelError()) - def genericErrback(failure): + def generic_errback(failure): try: msg = "with condition {}".format(failure.value.condition) except AttributeError: msg = "with error {}".format(failure.value) - self.feedBack(client, "Command failed {}".format(msg), mess_data) + self.feed_back(client, "Command failed {}".format(msg), mess_data) return False mess_data["unparsed"] = msg[ @@ -272,7 +272,7 @@ try: cmd_data = self._commands[command] except KeyError: - self.feedBack( + self.feed_back( client, _("Unknown command /%s. 
") % command + self.HELP_SUGGESTION, mess_data, @@ -280,7 +280,7 @@ log.debug("text command help message") raise failure.Failure(exceptions.CancelError()) else: - if not self._contextValid(mess_data, cmd_data): + if not self._context_valid(mess_data, cmd_data): # The command is not launched in the right context, we throw a message with help instructions context_txt = ( _("group discussions") @@ -290,23 +290,23 @@ feedback = _("/{command} command only applies in {context}.").format( command=command, context=context_txt ) - self.feedBack( + self.feed_back( client, "{} {}".format(feedback, self.HELP_SUGGESTION), mess_data ) log.debug("text command invalid message") raise failure.Failure(exceptions.CancelError()) else: - d = utils.asDeferred(cmd_data["callback"], client, mess_data) - d.addErrback(genericErrback) - d.addCallback(retHandling) + d = utils.as_deferred(cmd_data["callback"], client, mess_data) + d.addErrback(generic_errback) + d.addCallback(ret_handling) return d - def _contextValid(self, mess_data, cmd_data): + def _context_valid(self, mess_data, cmd_data): """Tell if a command can be used in the given context @param mess_data(dict): message data as given in sendMessage trigger - @param cmd_data(dict): command data as returned by self._parseDocString + @param cmd_data(dict): command data as returned by self._parse_doc_string @return (bool): True if command can be used in this context """ if (cmd_data["type"] == "group" and mess_data["type"] != "groupchat") or ( @@ -315,7 +315,7 @@ return False return True - def getRoomJID(self, arg, service_jid): + def get_room_jid(self, arg, service_jid): """Return a room jid with a shortcut @param arg: argument: can be a full room jid (e.g.: sat@chat.jabberfr.org) @@ -329,7 +329,7 @@ return jid.JID(arg + service_jid) return jid.JID(f"{arg}@{service_jid}") - def feedBack(self, client, message, mess_data, info_type=FEEDBACK_INFO_TYPE): + def feed_back(self, client, message, mess_data, info_type=FEEDBACK_INFO_TYPE): """Give a message back to the user""" if mess_data["type"] == "groupchat": to_ = mess_data["to"].userhostJID() @@ -342,7 +342,7 @@ mess_data["type"] = C.MESS_TYPE_INFO mess_data["message"] = {"": message} mess_data["extra"]["info_type"] = info_type - client.messageSendToBridge(mess_data) + client.message_send_to_bridge(mess_data) def cmd_whois(self, client, mess_data): """show informations on entity @@ -358,7 +358,7 @@ if mess_data["type"] == "groupchat": room = mess_data["to"].userhostJID() try: - if self.host.plugins["XEP-0045"].isNickInRoom(client, room, entity): + if self.host.plugins["XEP-0045"].is_nick_in_room(client, room, entity): entity = "%s/%s" % (room, entity) except KeyError: log.warning("plugin XEP-0045 is not present") @@ -371,11 +371,11 @@ if not target_jid.user or not target_jid.host: raise jid.InvalidFormat except (RuntimeError, jid.InvalidFormat, AttributeError): - self.feedBack(client, _("Invalid jid, can't whois"), mess_data) + self.feed_back(client, _("Invalid jid, can't whois"), mess_data) return False if not target_jid.resource: - target_jid.resource = self.host.memory.getMainResource(client, target_jid) + target_jid.resource = self.host.memory.main_resource_get(client, target_jid) whois_msg = [_("whois for %(jid)s") % {"jid": target_jid}] @@ -385,14 +385,14 @@ lambda __: callback(client, whois_msg, mess_data, target_jid) ) - def feedBack(__): - self.feedBack(client, "\n".join(whois_msg), mess_data) + def feed_back(__): + self.feed_back(client, "\n".join(whois_msg), mess_data) return False - 
d.addCallback(feedBack) + d.addCallback(feed_back) return d - def _getArgsHelp(self, cmd_data): + def _get_args_help(self, cmd_data): """Return help string for args of cmd_name, according to docstring data @param cmd_data: command data @@ -420,7 +420,7 @@ def cmd_whoami(self, client, mess_data): """give your own jid""" - self.feedBack(client, client.jid.full(), mess_data) + self.feed_back(client, client.jid.full(), mess_data) def cmd_help(self, client, mess_data): """show help on available commands @@ -432,7 +432,7 @@ if cmd_name and cmd_name[0] == "/": cmd_name = cmd_name[1:] if cmd_name and cmd_name not in self._commands: - self.feedBack( + self.feed_back( client, _("Invalid command name [{}]\n".format(cmd_name)), mess_data ) cmd_name = "" @@ -443,7 +443,7 @@ for command in sorted(self._commands): cmd_data = self._commands[command] - if not self._contextValid(mess_data, cmd_data): + if not self._context_valid(mess_data, cmd_data): continue spaces = (longuest - len(command)) * " " help_cmds.append( @@ -464,8 +464,8 @@ short_help=cmd_data["doc_short_help"], syntax=_(" " * 4 + "syntax: {}\n").format(syntax) if syntax else "", args_help="\n".join( - [" " * 8 + "{}".format(line) for line in self._getArgsHelp(cmd_data)] + [" " * 8 + "{}".format(line) for line in self._get_args_help(cmd_data)] ), ) - self.feedBack(client, help_mess, mess_data) + self.feed_back(client, help_mess, mess_data)
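Other plugins contribute commands by handing an instance with ``cmd_*`` methods to the renamed ``register_text_commands``; a sketch under those conventions (the ``/roll`` command, the plugin lookup key and the reply text are hypothetical)::

    import random

    class RollCommand:
        """Hypothetical holder of a /roll text command."""

        def __init__(self, host):
            self.host = host
            # the TEXT_COMMANDS lookup key is an assumption
            host.plugins["TEXT_COMMANDS"].register_text_commands(self)

        def cmd_roll(self, client, mess_data):
            """roll a six-sided die and send the result"""
            mess_data["message"] = {"": f"rolled a {random.randint(1, 6)}"}
            # True lets the (modified) message be sent normally, anything else
            # aborts the sending (see ret_handling above)
            return True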
--- a/sat/plugins/plugin_misc_text_syntaxes.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_text_syntaxes.py Sat Apr 08 13:54:42 2023 +0200 @@ -196,7 +196,7 @@ "syntaxes": self.syntaxes, } - self.addSyntax( + self.add_syntax( self.SYNTAX_XHTML, lambda xhtml: defer.succeed(xhtml), lambda xhtml: defer.succeed(xhtml), @@ -204,10 +204,10 @@ ) # TODO: text => XHTML should add <a/> to url like in frontends # it's probably best to move sat_frontends.tools.strings to sat.tools.common or similar - self.addSyntax( + self.add_syntax( self.SYNTAX_TEXT, lambda text: escape(text), - lambda xhtml: self._removeMarkups(xhtml), + lambda xhtml: self._remove_markups(xhtml), [TextSyntaxes.OPT_HIDDEN], ) try: @@ -217,7 +217,7 @@ # XXX: we disable raw HTML parsing by default, to avoid parsing error # when the user is not aware of markdown and HTML class EscapeHTML(Extension): - def extendMarkdown(self, md): + def extend_markdown(self, md): md.preprocessors.deregister('html_block') md.inlinePatterns.deregister('html') @@ -226,7 +226,7 @@ h.body_width = 0 # do not truncate the lines, it breaks the long URLs return h.handle(html) - self.addSyntax( + self.add_syntax( self.SYNTAX_MARKDOWN, partial(markdown.markdown, extensions=[ @@ -251,22 +251,22 @@ "You can download/install them from https://pythonhosted.org/Markdown/ " "and https://github.com/Alir3z4/html2text/" ) - host.bridge.addMethod( - "syntaxConvert", + host.bridge.add_method( + "syntax_convert", ".plugin", in_sign="sssbs", out_sign="s", async_=True, method=self.convert, ) - host.bridge.addMethod( - "syntaxGet", ".plugin", in_sign="s", out_sign="s", method=self.getSyntax + host.bridge.add_method( + "syntax_get", ".plugin", in_sign="s", out_sign="s", method=self.get_syntax ) - if xml_tools.cleanXHTML is None: + if xml_tools.clean_xhtml is None: log.debug("Installing cleaning method") - xml_tools.cleanXHTML = self.cleanXHTML + xml_tools.clean_xhtml = self.clean_xhtml - def _updateParamOptions(self): + def _update_param_options(self): data_synt = self.syntaxes default_synt = TextSyntaxes.default_syntax syntaxes = [] @@ -284,23 +284,23 @@ options.append('<option value="%s" %s/>' % (syntax, selected)) self.params_data["options"] = "\n".join(options) - self.host.memory.updateParams(self.params % self.params_data) + self.host.memory.update_params(self.params % self.params_data) - def getCurrentSyntax(self, profile): + def get_current_syntax(self, profile): """ Return the selected syntax for the given profile @param profile: %(doc_profile)s @return: profile selected syntax """ - return self.host.memory.getParamA(NAME, CATEGORY, profile_key=profile) + return self.host.memory.param_get_a(NAME, CATEGORY, profile_key=profile) - def _logError(self, failure, action="converting syntax"): + def _log_error(self, failure, action="converting syntax"): log.error( "Error while {action}: {failure}".format(action=action, failure=failure) ) return failure - def cleanStyle(self, styles_raw: str) -> str: + def clean_style(self, styles_raw: str) -> str: """"Clean unsafe CSS styles Remove styles not in the whitelist, or where the value doesn't match the regex @@ -327,7 +327,7 @@ ["%s: %s" % (key_, value_) for key_, value_ in cleaned_styles] ) - def cleanClasses(self, classes_raw: str) -> str: + def clean_classes(self, classes_raw: str) -> str: """Remove any non whitelisted class @param classes_raw: classes set on an element @@ -335,7 +335,7 @@ """ return " ".join(SAFE_CLASSES.intersection(classes_raw.split())) - def cleanXHTML(self, xhtml): + def clean_xhtml(self, 
xhtml): """Clean XHTML text by removing potentially dangerous/malicious parts @param xhtml(unicode, lxml.etree._Element): raw HTML/XHTML text to clean @@ -360,9 +360,9 @@ ) xhtml_elt = cleaner.clean_html(xhtml_elt) for elt in xhtml_elt.xpath("//*[@style]"): - elt.set("style", self.cleanStyle(elt.get("style"))) + elt.set("style", self.clean_style(elt.get("style"))) for elt in xhtml_elt.xpath("//*[@class]"): - elt.set("class", self.cleanClasses(elt.get("class"))) + elt.set("class", self.clean_classes(elt.get("class"))) # we remove self-closing elements for non-void elements for element in xhtml_elt.iter(tag=etree.Element): if not element.text: @@ -389,11 +389,11 @@ # TODO: a way for parser to return parsing errors/warnings if syntax_from == _SYNTAX_CURRENT: - syntax_from = self.getCurrentSyntax(profile) + syntax_from = self.get_current_syntax(profile) else: syntax_from = syntax_from.lower().strip() if syntax_to == _SYNTAX_CURRENT: - syntax_to = self.getCurrentSyntax(profile) + syntax_to = self.get_current_syntax(profile) else: syntax_to = syntax_to.lower().strip() syntaxes = self.syntaxes @@ -411,7 +411,7 @@ # TODO: keep only body element and change it to a div here ? if safe: - d.addCallback(self.cleanXHTML) + d.addCallback(self.clean_xhtml) if TextSyntaxes.OPT_NO_THREAD in syntaxes[syntax_to]["flags"]: d.addCallback(syntaxes[syntax_to]["from"]) @@ -422,7 +422,7 @@ d.addCallback(lambda text: text.rstrip()) return d - def addSyntax(self, name, to_xhtml_cb, from_xhtml_cb, flags=None): + def add_syntax(self, name, to_xhtml_cb, from_xhtml_cb, flags=None): """Add a new syntax to the manager @param name: unique name of the syntax @@ -456,9 +456,9 @@ if TextSyntaxes.OPT_DEFAULT in flags: TextSyntaxes.default_syntax = key - self._updateParamOptions() + self._update_param_options() - def getSyntax(self, name): + def get_syntax(self, name): """get syntax key corresponding to a name @raise exceptions.NotFound: syntax doesn't exist @@ -468,7 +468,7 @@ return key raise exceptions.NotFound - def _removeMarkups(self, xhtml): + def _remove_markups(self, xhtml): """Remove XHTML markups from the given string. @param xhtml: the XHTML string to be cleaned
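Additional syntaxes are registered the same way as the built-in ones; a minimal sketch of the renamed ``add_syntax`` (the "shouting" syntax and the plugin lookup key are assumptions, the callbacks mirror the XHTML/raw text registrations above)::

    from xml.sax.saxutils import escape
    from twisted.internet import defer

    syntaxes = host.plugins["TEXT_SYNTAXES"]
    syntaxes.add_syntax(
        "shouting",                                        # unique syntax name
        lambda text: defer.succeed(escape(text.upper())),  # text -> XHTML
        lambda xhtml: syntaxes._remove_markups(xhtml),     # XHTML -> text
        [syntaxes.OPT_HIDDEN],                             # keep it out of the param options
    )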
--- a/sat/plugins/plugin_misc_upload.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_upload.py Sat Apr 08 13:54:42 2023 +0200 @@ -57,28 +57,28 @@ def __init__(self, host): log.info(_("plugin Upload initialization")) self.host = host - host.bridge.addMethod( - "fileUpload", + host.bridge.add_method( + "file_upload", ".plugin", in_sign="sssss", out_sign="a{ss}", - method=self._fileUpload, + method=self._file_upload, async_=True, ) self._upload_callbacks = [] - def _fileUpload( + def _file_upload( self, filepath, filename, upload_jid_s="", options='', profile=C.PROF_KEY_NONE ): - client = self.host.getClient(profile) + client = self.host.get_client(profile) upload_jid = jid.JID(upload_jid_s) if upload_jid_s else None options = data_format.deserialise(options) - return defer.ensureDeferred(self.fileUpload( + return defer.ensureDeferred(self.file_upload( client, filepath, filename or None, upload_jid, options )) - async def fileUpload(self, client, filepath, filename, upload_jid, options): + async def file_upload(self, client, filepath, filename, upload_jid, options): """Send a file using best available method parameters are the same as for [upload] @@ -159,7 +159,7 @@ components) exceptions.NotFound must be raised if no entity has been found @param upload_cb(callable): method to upload a file - must have the same signature as [fileUpload] + must have the same signature as [file_upload] must return a tuple with progress_id and a Deferred which fire download URL when upload is finished @param priority(int): pririoty of this method, the higher available will be used
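From a coroutine in another plugin, the renamed ``file_upload`` is awaited directly and resolves to a str→str mapping (per the bridge ``out_sign``); a hedged sketch, where the path and the empty options dict are illustrative and the ``UPLOAD`` lookup key is an assumption::

    upload = host.plugins["UPLOAD"]
    result = await upload.file_upload(
        client,
        "/tmp/report.pdf",  # filepath
        None,               # filename: None, as the bridge wrapper passes when empty
        None,               # upload_jid: None, i.e. no explicit upload service
        {},                 # options, already deserialised to a dict
    )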
--- a/sat/plugins/plugin_misc_uri_finder.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_uri_finder.py Sat Apr 08 13:54:42 2023 +0200 @@ -51,7 +51,7 @@ def __init__(self, host): log.info(_("URI finder plugin initialization")) self.host = host - host.bridge.addMethod("URIFind", ".plugin", + host.bridge.add_method("uri_find", ".plugin", in_sign='sas', out_sign='a{sa{ss}}', method=self.find, async_=True)
--- a/sat/plugins/plugin_misc_watched.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_watched.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,25 +61,25 @@ def __init__(self, host): log.info(_("Watched initialisation")) self.host = host - host.memory.updateParams(self.params) - host.trigger.add("presence_received", self._presenceReceivedTrigger) + host.memory.update_params(self.params) + host.trigger.add("presence_received", self._presence_received_trigger) - def _presenceReceivedTrigger(self, client, entity, show, priority, statuses): + def _presence_received_trigger(self, client, entity, show, priority, statuses): if show == C.PRESENCE_UNAVAILABLE: return True # we check that the previous presence was unavailable (no notification else) try: - old_show = self.host.memory.getEntityDatum( + old_show = self.host.memory.get_entity_datum( client, entity, "presence").show except (KeyError, exceptions.UnknownEntityError): old_show = C.PRESENCE_UNAVAILABLE if old_show == C.PRESENCE_UNAVAILABLE: - watched = self.host.memory.getParamA( + watched = self.host.memory.param_get_a( NAME, CATEGORY, profile_key=client.profile) if entity in watched or entity.userhostJID() in watched: - self.host.actionNew( + self.host.action_new( { "xmlui": xml_tools.note( _(NOTIF).format(entity=entity.full())
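The trigger contract is unchanged by the rename: the callback receives the client plus the presence fields and returns True to let normal processing continue. A standalone sketch (the bookkeeping is invented)::

    seen = set()

    def _presence_received_trigger(client, entity, show, priority, statuses):
        # entity is a jid.JID; True keeps the normal presence handling going
        seen.add(entity.userhostJID())
        return True

    host.trigger.add("presence_received", _presence_received_trigger)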
--- a/sat/plugins/plugin_misc_welcome.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_welcome.py Sat Apr 08 13:54:42 2023 +0200 @@ -74,12 +74,12 @@ def __init__(self, host): log.info(_("plugin Welcome initialization")) self.host = host - host.memory.updateParams(PARAMS) + host.memory.update_params(PARAMS) - def profileConnected(self, client): + def profile_connected(self, client): # XXX: if you want to try first_start again, you'll have to remove manually # the welcome value from your profile params in sat.db - welcome = self.host.memory.params.getParamA( + welcome = self.host.memory.params.param_get_a( WELCOME_PARAM_NAME, WELCOME_PARAM_CATEGORY, use_default=False, @@ -93,8 +93,8 @@ if welcome: xmlui = xml_tools.note(WELCOME_MSG, WELCOME_MSG_TITLE) - self.host.actionNew({"xmlui": xmlui.toXml()}, profile=client.profile) - self.host.memory.setParam( + self.host.action_new({"xmlui": xmlui.toXml()}, profile=client.profile) + self.host.memory.param_set( WELCOME_PARAM_NAME, C.BOOL_FALSE, WELCOME_PARAM_CATEGORY,
--- a/sat/plugins/plugin_misc_xmllog.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_misc_xmllog.py Sat Apr 08 13:54:42 2023 +0200 @@ -54,29 +54,29 @@ def __init__(self, host): log.info(_("Plugin XML Log initialization")) self.host = host - host.memory.updateParams(self.params) - host.bridge.addSignal( - "xmlLog", ".plugin", signature="sss" + host.memory.update_params(self.params) + host.bridge.add_signal( + "xml_log", ".plugin", signature="sss" ) # args: direction("IN" or "OUT"), xml_data, profile - host.trigger.add("stream_hooks", self.addHooks) + host.trigger.add("stream_hooks", self.add_hooks) - def addHooks(self, client, receive_hooks, send_hooks): - self.do_log = self.host.memory.getParamA("Xml log", "Debug") + def add_hooks(self, client, receive_hooks, send_hooks): + self.do_log = self.host.memory.param_get_a("Xml log", "Debug") if self.do_log: - receive_hooks.append(partial(self.onReceive, client=client)) - send_hooks.append(partial(self.onSend, client=client)) + receive_hooks.append(partial(self.on_receive, client=client)) + send_hooks.append(partial(self.on_send, client=client)) log.info(_("XML log activated")) return True - def onReceive(self, element, client): - self.host.bridge.xmlLog("IN", element.toXml(), client.profile) + def on_receive(self, element, client): + self.host.bridge.xml_log("IN", element.toXml(), client.profile) - def onSend(self, obj, client): + def on_send(self, obj, client): if isinstance(obj, str): xml_log = obj elif isinstance(obj, domish.Element): xml_log = obj.toXml() else: log.error(_("INTERNAL ERROR: Unmanaged XML type")) - self.host.bridge.xmlLog("OUT", xml_log, client.profile) + self.host.bridge.xml_log("OUT", xml_log, client.profile)
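The ``stream_hooks`` trigger hands out two lists of callables: receive hooks get the parsed element, send hooks get either a string or a ``domish.Element`` (as handled in ``on_send`` above). A counting sketch with invented counters::

    counters = {"IN": 0, "OUT": 0}

    def count_in(element):
        counters["IN"] += 1

    def count_out(obj):  # str or domish.Element
        counters["OUT"] += 1

    def add_hooks(client, receive_hooks, send_hooks):
        receive_hooks.append(count_in)
        send_hooks.append(count_out)
        return True

    host.trigger.add("stream_hooks", add_hooks)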
--- a/sat/plugins/plugin_pubsub_cache.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_pubsub_cache.py Sat Apr 08 13:54:42 2023 +0200 @@ -65,7 +65,7 @@ def __init__(self, host): log.info(_("PubSub Cache initialization")) - strategy = host.memory.getConfig(None, "pubsub_cache_strategy") + strategy = host.memory.config_get(None, "pubsub_cache_strategy") if strategy == "no_cache": log.info( _( @@ -81,47 +81,47 @@ self.analysers = {} # map for caching in progress (node, service) => Deferred self.in_progress = {} - self.host.trigger.add("XEP-0060_getItems", self._getItemsTrigger) - self._p.addManagedNode( + self.host.trigger.add("XEP-0060_getItems", self._get_items_trigger) + self._p.add_managed_node( "", - items_cb=self.onItemsEvent, - delete_cb=self.onDeleteEvent, - purge_db=self.onPurgeEvent, + items_cb=self.on_items_event, + delete_cb=self.on_delete_event, + purge_db=self.on_purge_event, ) - host.bridge.addMethod( - "psCacheGet", + host.bridge.add_method( + "ps_cache_get", ".plugin", in_sign="ssiassss", out_sign="s", - method=self._getItemsFromCache, + method=self._get_items_from_cache, async_=True, ) - host.bridge.addMethod( - "psCacheSync", + host.bridge.add_method( + "ps_cache_sync", ".plugin", "sss", out_sign="", method=self._synchronise, async_=True, ) - host.bridge.addMethod( - "psCachePurge", + host.bridge.add_method( + "ps_cache_purge", ".plugin", "s", out_sign="", method=self._purge, async_=True, ) - host.bridge.addMethod( - "psCacheReset", + host.bridge.add_method( + "ps_cache_reset", ".plugin", "", out_sign="", method=self._reset, async_=True, ) - host.bridge.addMethod( - "psCacheSearch", + host.bridge.add_method( + "ps_cache_search", ".plugin", "s", out_sign="s", @@ -129,7 +129,7 @@ async_=True, ) - def registerAnalyser(self, analyser: dict) -> None: + def register_analyser(self, analyser: dict) -> None: """Register a new pubsub node analyser @param analyser: An analyser is a dictionary which may have the following keys @@ -203,7 +203,7 @@ ) self.analysers[name] = analyser - async def cacheItems( + async def cache_items( self, client: SatXMPPEntity, pubsub_node: PubsubNode, @@ -216,7 +216,7 @@ if parser is not None: parsed_items = [ - await utils.asDeferred( + await utils.as_deferred( parser, client, item, @@ -228,16 +228,16 @@ else: parsed_items = None - await self.host.memory.storage.cachePubsubItems( + await self.host.memory.storage.cache_pubsub_items( client, pubsub_node, items, parsed_items ) - async def _cacheNode( + async def _cache_node( self, client: SatXMPPEntity, pubsub_node: PubsubNode ) -> None: - await self.host.memory.storage.updatePubsubNodeSyncState( + await self.host.memory.storage.update_pubsub_node_sync_state( pubsub_node, SyncState.IN_PROGRESS ) service, node = pubsub_node.service, pubsub_node.name @@ -274,7 +274,7 @@ ) try: - await self.host.checkFeatures( + await self.host.check_features( client, [rsm.NS_RSM, self._p.DISCO_RSM], pubsub_node.service ) except error.StanzaError as e: @@ -286,7 +286,7 @@ items, __ = await client.pubsub_client.items( pubsub_node.service, pubsub_node.name, maxItems=20 ) - await self.cacheItems( + await self.cache_items( client, pubsub_node, items ) else: @@ -299,7 +299,7 @@ items, __ = await client.pubsub_client.items( pubsub_node.service, pubsub_node.name, maxItems=20 ) - await self.cacheItems( + await self.cache_items( client, pubsub_node, items ) else: @@ -310,7 +310,7 @@ items, rsm_response = await client.pubsub_client.items( service, node, rsm_request=rsm_request ) - await self.cacheItems( + await 
self.cache_items( client, pubsub_node, items ) for item in items: @@ -334,11 +334,11 @@ ) rsm_request = None break - rsm_request = rsm_p.getNextRequest(rsm_request, rsm_response) + rsm_request = rsm_p.get_next_request(rsm_request, rsm_response) if rsm_request is None: break - await self.host.memory.storage.updatePubsubNodeSyncState( + await self.host.memory.storage.update_pubsub_node_sync_state( pubsub_node, SyncState.COMPLETED ) except Exception as e: @@ -347,27 +347,27 @@ log.error( f"Can't cache node {node!r} at {service} for {client.profile}: {e}\n{tb}" ) - await self.host.memory.storage.updatePubsubNodeSyncState( + await self.host.memory.storage.update_pubsub_node_sync_state( pubsub_node, SyncState.ERROR ) - await self.host.memory.storage.deletePubsubItems(pubsub_node) + await self.host.memory.storage.delete_pubsub_items(pubsub_node) raise e - def _cacheNodeClean(self, __, pubsub_node): + def _cache_node_clean(self, __, pubsub_node): del self.in_progress[(pubsub_node.service, pubsub_node.name)] - def cacheNode( + def cache_node( self, client: SatXMPPEntity, pubsub_node: PubsubNode ) -> None: """Launch node caching as a background task""" - d = defer.ensureDeferred(self._cacheNode(client, pubsub_node)) - d.addBoth(self._cacheNodeClean, pubsub_node=pubsub_node) + d = defer.ensureDeferred(self._cache_node(client, pubsub_node)) + d.addBoth(self._cache_node_clean, pubsub_node=pubsub_node) self.in_progress[(pubsub_node.service, pubsub_node.name)] = d return d - async def analyseNode( + async def analyse_node( self, client: SatXMPPEntity, service: jid.JID, @@ -451,26 +451,26 @@ except KeyError: pass else: - await utils.asDeferred(match_cb, client, analyse) + await utils.as_deferred(match_cb, client, analyse) return analyse - def _getItemsFromCache( + def _get_items_from_cache( self, service="", node="", max_items=10, item_ids=None, sub_id=None, extra="", profile_key=C.PROF_KEY_NONE ): - d = defer.ensureDeferred(self._aGetItemsFromCache( + d = defer.ensureDeferred(self._a_get_items_from_cache( service, node, max_items, item_ids, sub_id, extra, profile_key )) - d.addCallback(self._p.transItemsData) - d.addCallback(self._p.serialiseItems) + d.addCallback(self._p.trans_items_data) + d.addCallback(self._p.serialise_items) return d - async def _aGetItemsFromCache( + async def _a_get_items_from_cache( self, service, node, max_items, item_ids, sub_id, extra, profile_key ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else client.jid.userhostJID() - pubsub_node = await self.host.memory.storage.getPubsubNode( + pubsub_node = await self.host.memory.storage.get_pubsub_node( client, service, node ) if pubsub_node is None: @@ -478,8 +478,8 @@ f"{node!r} at {service} doesn't exist in cache for {client.profile!r}" ) max_items = None if max_items == C.NO_LIMIT else max_items - extra = self._p.parseExtra(data_format.deserialise(extra)) - items, metadata = await self.getItemsFromCache( + extra = self._p.parse_extra(data_format.deserialise(extra)) + items, metadata = await self.get_items_from_cache( client, pubsub_node, max_items, @@ -490,7 +490,7 @@ ) return [i.data for i in items], metadata - async def getItemsFromCache( + async def get_items_from_cache( self, client: SatXMPPEntity, node: PubsubNode, @@ -507,7 +507,7 @@ raise NotImplementedError("MAM queries are not supported yet") if max_items is None and rsm_request is None: max_items = 20 - pubsub_items, metadata = await self.host.memory.storage.getItems( + pubsub_items, 
metadata = await self.host.memory.storage.get_items( node, max_items=max_items, item_ids=item_ids or None, order_by=extra.get(C.KEY_ORDER_BY) ) @@ -520,7 +520,7 @@ raise exceptions.InternalError( "Pubsub max items and item IDs must not be used at the same time" ) - pubsub_items, metadata = await self.host.memory.storage.getItems( + pubsub_items, metadata = await self.host.memory.storage.get_items( node, max_items=max_items, order_by=extra.get(C.KEY_ORDER_BY) ) else: @@ -530,7 +530,7 @@ desc = True else: before = rsm_request.before - pubsub_items, metadata = await self.host.memory.storage.getItems( + pubsub_items, metadata = await self.host.memory.storage.get_items( node, max_items=rsm_request.max, before=before, after=rsm_request.after, from_index=rsm_request.index, order_by=extra.get(C.KEY_ORDER_BY), desc=desc, force_rsm=True, @@ -538,8 +538,8 @@ return pubsub_items, metadata - async def onItemsEvent(self, client, event): - node = await self.host.memory.storage.getPubsubNode( + async def on_items_event(self, client, event): + node = await self.host.memory.storage.get_pubsub_node( client, event.sender, event.nodeIdentifier ) if node is None: @@ -555,45 +555,45 @@ if not item_id: log.warning( "Ignoring invalid retract item element: " - f"{xml_tools.pFmtElt(elt)}" + f"{xml_tools.p_fmt_elt(elt)}" ) continue retract_ids.append(elt["id"]) else: log.warning( - f"Unexpected Pubsub event element: {xml_tools.pFmtElt(elt)}" + f"Unexpected Pubsub event element: {xml_tools.p_fmt_elt(elt)}" ) if items: log.debug(f"[{client.profile}] caching new items received from {node}") - await self.cacheItems( + await self.cache_items( client, node, items ) if retract_ids: log.debug(f"deleting retracted items from {node}") - await self.host.memory.storage.deletePubsubItems( + await self.host.memory.storage.delete_pubsub_items( node, items_names=retract_ids ) - async def onDeleteEvent(self, client, event): + async def on_delete_event(self, client, event): log.debug( f"deleting node {event.nodeIdentifier} from {event.sender} for " f"{client.profile}" ) - await self.host.memory.storage.deletePubsubNode( + await self.host.memory.storage.delete_pubsub_node( [client.profile], [event.sender], [event.nodeIdentifier] ) - async def onPurgeEvent(self, client, event): - node = await self.host.memory.storage.getPubsubNode( + async def on_purge_event(self, client, event): + node = await self.host.memory.storage.get_pubsub_node( client, event.sender, event.nodeIdentifier ) if node is None: return log.debug(f"purging node {node} for {client.profile}") - await self.host.memory.storage.deletePubsubItems(node) + await self.host.memory.storage.delete_pubsub_items(node) - async def _getItemsTrigger( + async def _get_items_trigger( self, client: SatXMPPEntity, service: Optional[jid.JID], @@ -613,17 +613,17 @@ if service is None: service = client.jid.userhostJID() for __ in range(5): - pubsub_node = await self.host.memory.storage.getPubsubNode( + pubsub_node = await self.host.memory.storage.get_pubsub_node( client, service, node ) if pubsub_node is not None and pubsub_node.sync_state == SyncState.COMPLETED: analyse = {"to_sync": True} else: - analyse = await self.analyseNode(client, service, node) + analyse = await self.analyse_node(client, service, node) if pubsub_node is None: try: - pubsub_node = await self.host.memory.storage.setPubsubNode( + pubsub_node = await self.host.memory.storage.set_pubsub_node( client, service, node, @@ -650,7 +650,7 @@ if "mam" in extra: log.debug("MAM caching is not supported yet, skipping cache") return 
True, None - pubsub_items, metadata = await self.getItemsFromCache( + pubsub_items, metadata = await self.get_items_from_cache( client, pubsub_node, max_items, item_ids, sub_id, rsm_request, extra ) return False, ([i.data for i in pubsub_items], metadata) @@ -663,7 +663,7 @@ "restarted. Resetting the status, caching will be done again." ) pubsub_node.sync_state = None - await self.host.memory.storage.deletePubsubItems(pubsub_node) + await self.host.memory.storage.delete_pubsub_items(pubsub_node) elif time.time() - pubsub_node.sync_state_updated > PROGRESS_DEADLINE: log.warning( f"{pubsub_node} is in progress for too long " @@ -672,7 +672,7 @@ ) self.in_progress.pop[(service, node)].cancel() pubsub_node.sync_state = None - await self.host.memory.storage.deletePubsubItems(pubsub_node) + await self.host.memory.storage.delete_pubsub_items(pubsub_node) else: log.debug( f"{pubsub_node} synchronisation is already in progress, skipping" @@ -684,7 +684,7 @@ f"There is already a caching in progress for {pubsub_node}, this " "should not happen" ) - self.cacheNode(client, pubsub_node) + self.cache_node(client, pubsub_node) elif pubsub_node.sync_state == SyncState.ERROR: log.debug( f"{pubsub_node} synchronisation has previously failed, skipping" @@ -692,7 +692,7 @@ return True, None - async def _subscribeTrigger( + async def _subscribe_trigger( self, client: SatXMPPEntity, service: jid.JID, @@ -703,7 +703,7 @@ ) -> None: pass - async def _unsubscribeTrigger( + async def _unsubscribe_trigger( self, client: SatXMPPEntity, service: jid.JID, @@ -715,7 +715,7 @@ pass def _synchronise(self, service, node, profile_key): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = client.jid.userhostJID() if not service else jid.JID(service) return defer.ensureDeferred(self.synchronise(client, service, node)) @@ -735,7 +735,7 @@ resynchronised (all items will be deleted and re-downloaded). 
""" - pubsub_node = await self.host.memory.storage.getPubsubNode( + pubsub_node = await self.host.memory.storage.get_pubsub_node( client, service, node ) if pubsub_node is None: @@ -744,8 +744,8 @@ "Synchronising the new node {node} at {service}" ).format(node=node, service=service.full) ) - analyse = await self.analyseNode(client, service, node) - pubsub_node = await self.host.memory.storage.setPubsubNode( + analyse = await self.analyse_node(client, service, node) + pubsub_node = await self.host.memory.storage.set_pubsub_node( client, service, node, @@ -772,8 +772,8 @@ ) # we first delete and recreate the node (will also delete its items) await self.host.memory.storage.delete(pubsub_node) - analyse = await self.analyseNode(client, service, node) - pubsub_node = await self.host.memory.storage.setPubsubNode( + analyse = await self.analyse_node(client, service, node) + pubsub_node = await self.host.memory.storage.set_pubsub_node( client, service, node, @@ -781,7 +781,7 @@ type_=analyse.get("type"), ) # then we can put node in cache - await self.cacheNode(client, pubsub_node) + await self.cache_node(client, pubsub_node) async def purge(self, purge_filters: dict) -> None: """Remove items according to filters @@ -804,7 +804,7 @@ datetime before which items must have been updated last to be deleted """ purge_filters["names"] = purge_filters.pop("nodes", None) - await self.host.memory.storage.purgePubsubItems(**purge_filters) + await self.host.memory.storage.purge_pubsub_items(**purge_filters) def _purge(self, purge_filters: str) -> None: purge_filters = data_format.deserialise(purge_filters) @@ -820,16 +820,16 @@ After calling this method, cache will be refilled progressively as if it where new """ - await self.host.memory.storage.deletePubsubNode(None, None, None) + await self.host.memory.storage.delete_pubsub_node(None, None, None) def _reset(self) -> defer.Deferred: return defer.ensureDeferred(self.reset()) async def search(self, query: dict) -> List[PubsubItem]: """Search pubsub items in cache""" - return await self.host.memory.storage.searchPubsubItems(query) + return await self.host.memory.storage.search_pubsub_items(query) - async def serialisableSearch(self, query: dict) -> List[dict]: + async def serialisable_search(self, query: dict) -> List[dict]: """Search pubsub items in cache and returns parsed data The returned data can be serialised. @@ -844,7 +844,7 @@ parsed["pubsub_node"] = item.node.name if query.get("with_payload"): parsed["item_payload"] = item.data.toXml() - parsed["node_profile"] = self.host.memory.storage.getProfileById( + parsed["node_profile"] = self.host.memory.storage.get_profile_by_id( item.node.profile_id ) @@ -856,6 +856,6 @@ services = query.get("services") if services: query["services"] = [jid.JID(s) for s in services] - d = defer.ensureDeferred(self.serialisableSearch(query)) + d = defer.ensureDeferred(self.serialisable_search(query)) d.addCallback(data_format.serialise) return d
--- a/sat/plugins/plugin_sec_aesgcm.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_sec_aesgcm.py Sat Apr 08 13:54:42 2023 +0200 @@ -67,10 +67,10 @@ "aesgcm", self.download ) self._attach.register( - self.canHandleAttachment, self.attach, encrypted=True) + self.can_handle_attachment, self.attach, encrypted=True) host.trigger.add("XEP-0363_upload_pre_slot", self._upload_pre_slot) host.trigger.add("XEP-0363_upload", self._upload_trigger) - host.trigger.add("messageReceived", self._messageReceivedTrigger) + host.trigger.add("messageReceived", self._message_received_trigger) async def download(self, client, uri_parsed, dest_path, options): fragment = bytes.fromhex(uri_parsed.fragment) @@ -123,7 +123,7 @@ resp = await treq_client.get(download_url, unbuffered=True) if resp.code == 200: d = treq.collect(resp, partial( - self.onDataDownload, + self.on_data_download, client=client, file_obj=file_obj, decryptor=decryptor)) @@ -132,9 +132,9 @@ self.host.plugins["DOWNLOAD"].errback_download(file_obj, d, resp) return progress_id, d - async def canHandleAttachment(self, client, data): + async def can_handle_attachment(self, client, data): try: - await self._http_upload.getHTTPUploadEntity(client) + await self._http_upload.get_http_upload_entity(client) except exceptions.NotFound: return False else: @@ -190,7 +190,7 @@ # nothing left to send, we can cancel the message raise exceptions.CancelError("Cancelled by AESGCM attachment handling") - def onDataDownload(self, data, client, file_obj, decryptor): + def on_data_download(self, data, client, file_obj, decryptor): if file_obj.tell() + len(data) > file_obj.size: # we're reaching end of file with this bunch of data # we may still have a last bunch if the tag is incomplete @@ -277,20 +277,20 @@ return True - def _popAESGCMLinks(self, match, links): + def _pop_aesgcm_links(self, match, links): link = match.group() if link not in links: links.append(link) return "" - def _checkAESGCMAttachments(self, client, data): + def _check_aesgcm_attachments(self, client, data): if not data.get('message'): return data links = [] for lang, message in list(data['message'].items()): message = AESGCM_RE.sub( - partial(self._popAESGCMLinks, links=links), + partial(self._pop_aesgcm_links, links=links), message) if links: message = message.strip() @@ -319,9 +319,9 @@ return data - def _messageReceivedTrigger(self, client, message_elt, post_treat): + def _message_received_trigger(self, client, message_elt, post_treat): # we use a post_treat callback instead of "message_parse" trigger because we need # to check if the "encrypted" flag is set to decide if we add the same flag to the # attachment - post_treat.addCallback(partial(self._checkAESGCMAttachments, client)) + post_treat.addCallback(partial(self._check_aesgcm_attachments, client)) return True
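The link extraction above is a plain ``re.sub`` with a callback that moves matches into a list; the same pattern in isolation, with a simplified stand-in for the real ``AESGCM_RE``::

    import re
    from functools import partial

    AESGCM_RE = re.compile(r"aesgcm://\S+")  # simplified stand-in

    def pop_links(match, links):
        link = match.group()
        if link not in links:
            links.append(link)
        return ""  # strip the link from the message body

    links = []
    body = AESGCM_RE.sub(
        partial(pop_links, links=links),
        "photo: aesgcm://files.example.org/x.jpg#0f0f have a look",
    )
    # body == "photo:  have a look"
    # links == ["aesgcm://files.example.org/x.jpg#0f0f"]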
--- a/sat/plugins/plugin_sec_otr.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_sec_otr.py Sat Apr 08 13:54:42 2023 +0200 @@ -92,7 +92,7 @@ def _p_carbons(self): return self.context_manager.parent._p_carbons - def getPolicy(self, key): + def get_policy(self, key): if key in DEFAULT_POLICY_FLAGS: return DEFAULT_POLICY_FLAGS[key] else: @@ -123,10 +123,10 @@ "extra": {}, "timestamp": time.time(), } - client.generateMessageXML(mess_data) + client.generate_message_xml(mess_data) xml = mess_data['xml'] - self._p_carbons.setPrivate(xml) - self._p_hints.addHintElements(xml, [ + self._p_carbons.set_private(xml) + self._p_hints.add_hint_elements(xml, [ self._p_hints.HINT_NO_COPY, self._p_hints.HINT_NO_PERMANENT_STORE]) client.send(mess_data["xml"]) @@ -135,19 +135,19 @@ assert message_elt.name == "message" message_elt.addElement("body", content=msg) - def stopCb(self, __, feedback): + def stop_cb(self, __, feedback): client = self.user.client - self.host.bridge.otrState( + self.host.bridge.otr_state( OTR_STATE_UNENCRYPTED, self.peer.full(), client.profile ) client.feedback(self.peer, feedback) - def stopEb(self, failure_): + def stop_eb(self, failure_): # encryption may be already stopped in case of manual stop if not failure_.check(exceptions.NotFound): log.error("Error while stopping OTR encryption: {msg}".format(msg=failure_)) - def isTrusted(self): + def is_trusted(self): # we have to check value because potr code says that a 2-tuples should be # returned while in practice it's either None or u"trusted" trusted = self.getCurrentTrust() @@ -160,24 +160,24 @@ value=trusted)) return False - def setState(self, state): + def set_state(self, state): client = self.user.client old_state = self.state - super(Context, self).setState(state) - log.debug("setState: %s (old_state=%s)" % (state, old_state)) + super(Context, self).set_state(state) + log.debug("set_state: %s (old_state=%s)" % (state, old_state)) if state == potr.context.STATE_PLAINTEXT: feedback = _("/!\\ conversation with %(other_jid)s is now UNENCRYPTED") % { "other_jid": self.peer.full() } d = defer.ensureDeferred(client.encryption.stop(self.peer, NS_OTR)) - d.addCallback(self.stopCb, feedback=feedback) - d.addErrback(self.stopEb) + d.addCallback(self.stop_cb, feedback=feedback) + d.addErrback(self.stop_eb) return elif state == potr.context.STATE_ENCRYPTED: defer.ensureDeferred(client.encryption.start(self.peer, NS_OTR)) try: - trusted = self.isTrusted() + trusted = self.is_trusted() except TypeError: trusted = False trusted_str = _("trusted") if trusted else _("untrusted") @@ -195,7 +195,7 @@ other_jid=self.peer.full(), extra_info=NO_ADV_FEATURES, ) - self.host.bridge.otrState( + self.host.bridge.otr_state( OTR_STATE_ENCRYPTED, self.peer.full(), client.profile ) elif state == potr.context.STATE_FINISHED: @@ -203,8 +203,8 @@ other_jid=self.peer.full() ) d = defer.ensureDeferred(client.encryption.stop(self.peer, NS_OTR)) - d.addCallback(self.stopCb, feedback=feedback) - d.addErrback(self.stopEb) + d.addCallback(self.stop_cb, feedback=feedback) + d.addErrback(self.stop_eb) return else: log.error(D_("Unknown OTR state")) @@ -240,19 +240,19 @@ self.host = host self.client = client - def loadPrivkey(self): - log.debug("loadPrivkey") + def load_privkey(self): + log.debug("load_privkey") return self.privkey - def savePrivkey(self): - log.debug("savePrivkey") + def save_privkey(self): + log.debug("save_privkey") if self.privkey is None: raise exceptions.InternalError(_("Save is called but privkey is None !")) priv_key = 
hexlify(self.privkey.serializePrivateKey()) - encrypted_priv_key = self.host.memory.encryptValue(priv_key, self.client.profile) + encrypted_priv_key = self.host.memory.encrypt_value(priv_key, self.client.profile) self.client._otr_data[PRIVATE_KEY] = encrypted_priv_key - def loadTrusts(self): + def load_trusts(self): trust_data = self.client._otr_data.get("trust", {}) for jid_, jid_data in trust_data.items(): for fingerprint, trust_level in jid_data.items(): @@ -263,12 +263,12 @@ ) self.trusts.setdefault(jid.JID(jid_), {})[fingerprint] = trust_level - def saveTrusts(self): + def save_trusts(self): log.debug("saving trusts for {profile}".format(profile=self.client.profile)) log.debug("trusts = {}".format(self.client._otr_data["trust"])) self.client._otr_data.force("trust") - def setTrust(self, other_jid, fingerprint, trustLevel): + def set_trust(self, other_jid, fingerprint, trustLevel): try: trust_data = self.client._otr_data["trust"] except KeyError: @@ -276,7 +276,7 @@ self.client._otr_data["trust"] = trust_data jid_data = trust_data.setdefault(other_jid.full(), {}) jid_data[fingerprint] = trustLevel - super(Account, self).setTrust(other_jid, fingerprint, trustLevel) + super(Account, self).set_trust(other_jid, fingerprint, trustLevel) class ContextManager(object): @@ -289,18 +289,18 @@ def host(self): return self.parent.host - def startContext(self, other_jid): + def start_context(self, other_jid): assert isinstance(other_jid, jid.JID) context = self.contexts.setdefault( other_jid, Context(self, other_jid) ) return context - def getContextForUser(self, other): - log.debug("getContextForUser [%s]" % other) + def get_context_for_user(self, other): + log.debug("get_context_for_user [%s]" % other) if not other.resource: - log.warning("getContextForUser called with a bare jid: %s" % other.full()) - return self.startContext(other) + log.warning("get_context_for_user called with a bare jid: %s" % other.full()) + return self.start_context(other) class OTR(object): @@ -314,48 +314,48 @@ ) # FIXME: OTR should not be skipped per profile, this need to be refactored self._p_hints = host.plugins["XEP-0334"] self._p_carbons = host.plugins["XEP-0280"] - host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=100000) - host.trigger.add("sendMessage", self.sendMessageTrigger, priority=100000) - host.trigger.add("sendMessageData", self._sendMessageDataTrigger) - host.bridge.addMethod( - "skipOTR", ".plugin", in_sign="s", out_sign="", method=self._skipOTR + host.trigger.add("messageReceived", self.message_received_trigger, priority=100000) + host.trigger.add("sendMessage", self.send_message_trigger, priority=100000) + host.trigger.add("send_message_data", self._send_message_data_trigger) + host.bridge.add_method( + "skip_otr", ".plugin", in_sign="s", out_sign="", method=self._skip_otr ) # FIXME: must be removed, must be done on per-message basis - host.bridge.addSignal( - "otrState", ".plugin", signature="sss" + host.bridge.add_signal( + "otr_state", ".plugin", signature="sss" ) # args: state, destinee_jid, profile # XXX: menus are disabled in favor to the new more generic encryption menu # there are let here commented for a little while as a reference - # host.importMenu( + # host.import_menu( # (OTR_MENU, D_(u"Start/Refresh")), - # self._otrStartRefresh, + # self._otr_start_refresh, # security_limit=0, # help_string=D_(u"Start or refresh an OTR session"), # type_=C.MENU_SINGLE, # ) - # host.importMenu( + # host.import_menu( # (OTR_MENU, D_(u"End session")), - # self._otrSessionEnd, + # 
self._otr_session_end, # security_limit=0, # help_string=D_(u"Finish an OTR session"), # type_=C.MENU_SINGLE, # ) - # host.importMenu( + # host.import_menu( # (OTR_MENU, D_(u"Authenticate")), - # self._otrAuthenticate, + # self._otr_authenticate, # security_limit=0, # help_string=D_(u"Authenticate user/see your fingerprint"), # type_=C.MENU_SINGLE, # ) - # host.importMenu( + # host.import_menu( # (OTR_MENU, D_(u"Drop private key")), - # self._dropPrivKey, + # self._drop_priv_key, # security_limit=0, # type_=C.MENU_SINGLE, # ) - host.trigger.add("presence_received", self._presenceReceivedTrigger) - self.host.registerEncryptionPlugin(self, "OTR", NS_OTR, directed=True) + host.trigger.add("presence_received", self._presence_received_trigger) + self.host.register_encryption_plugin(self, "OTR", NS_OTR, directed=True) - def _skipOTR(self, profile): + def _skip_otr(self, profile): """Tell the backend to not handle OTR for this profile. @param profile (str): %(doc_profile)s @@ -366,7 +366,7 @@ self.skipped_profiles.add(profile) @defer.inlineCallbacks - def profileConnecting(self, client): + def profile_connecting(self, client): if client.profile in self.skipped_profiles: return ctxMng = client._otr_context_manager = ContextManager(self, client) @@ -374,7 +374,7 @@ yield client._otr_data.load() encrypted_priv_key = client._otr_data.get(PRIVATE_KEY, None) if encrypted_priv_key is not None: - priv_key = self.host.memory.decryptValue( + priv_key = self.host.memory.decrypt_value( encrypted_priv_key, client.profile ) ctxMng.account.privkey = potr.crypt.PK.parsePrivateKey( @@ -382,9 +382,9 @@ )[0] else: ctxMng.account.privkey = None - ctxMng.account.loadTrusts() + ctxMng.account.load_trusts() - def profileDisconnected(self, client): + def profile_disconnected(self, client): if client.profile in self.skipped_profiles: self.skipped_profiles.remove(client.profile) return @@ -394,20 +394,20 @@ # encryption plugin methods - def startEncryption(self, client, entity_jid): - self.startRefresh(client, entity_jid) + def start_encryption(self, client, entity_jid): + self.start_refresh(client, entity_jid) - def stopEncryption(self, client, entity_jid): - self.endSession(client, entity_jid) + def stop_encryption(self, client, entity_jid): + self.end_session(client, entity_jid) - def getTrustUI(self, client, entity_jid): + def get_trust_ui(self, client, entity_jid): if not entity_jid.resource: - entity_jid.resource = self.host.memory.getMainResource( + entity_jid.resource = self.host.memory.main_resource_get( client, entity_jid ) # FIXME: temporary and unsecure, must be changed when frontends # are refactored ctxMng = client._otr_context_manager - otrctx = ctxMng.getContextForUser(entity_jid) + otrctx = ctxMng.get_context_for_user(entity_jid) priv_key = ctxMng.account.privkey if priv_key is None: @@ -444,21 +444,21 @@ ) return dialog - def setTrust(raw_data, profile): - if xml_tools.isXMLUICancelled(raw_data): + def set_trust(raw_data, profile): + if xml_tools.is_xmlui_cancelled(raw_data): return {} # This method is called when authentication form is submited - data = xml_tools.XMLUIResult2DataFormResult(raw_data) + data = xml_tools.xmlui_result_2_data_form_result(raw_data) if data["match"] == "yes": otrctx.setCurrentTrust(OTR_STATE_TRUSTED) note_msg = _("Your correspondent {correspondent} is now TRUSTED") - self.host.bridge.otrState( + self.host.bridge.otr_state( OTR_STATE_TRUSTED, entity_jid.full(), client.profile ) else: otrctx.setCurrentTrust("") note_msg = _("Your correspondent {correspondent} is now UNTRUSTED") 
- self.host.bridge.otrState( + self.host.bridge.otr_state( OTR_STATE_UNTRUSTED, entity_jid.full(), client.profile ) note = xml_tools.XMLUI( @@ -470,8 +470,8 @@ ) return {"xmlui": note.toXml()} - submit_id = self.host.registerCallback(setTrust, with_data=True, one_shot=True) - trusted = otrctx.isTrusted() + submit_id = self.host.register_callback(set_trust, with_data=True, one_shot=True) + trusted = otrctx.is_trusted() xmlui = xml_tools.XMLUI( C.XMLUI_FORM, @@ -489,29 +489,29 @@ ) ) xmlui.addDivider("blank") - xmlui.changeContainer("pairs") + xmlui.change_container("pairs") xmlui.addLabel(D_("Is your correspondent fingerprint the same as here ?")) xmlui.addList( "match", [("yes", _("yes")), ("no", _("no"))], ["yes" if trusted else "no"] ) return xmlui - def _otrStartRefresh(self, menu_data, profile): + def _otr_start_refresh(self, menu_data, profile): """Start or refresh an OTR session @param menu_data: %(menu_data)s @param profile: %(doc_profile)s """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) try: to_jid = jid.JID(menu_data["jid"]) except KeyError: log.error(_("jid key is not present !")) return defer.fail(exceptions.DataError) - self.startRefresh(client, to_jid) + self.start_refresh(client, to_jid) return {} - def startRefresh(self, client, to_jid): + def start_refresh(self, client, to_jid): """Start or refresh an OTR session @param to_jid(jid.JID): jid to start encrypted session with @@ -522,47 +522,47 @@ "Can't start an OTR session, there is already an encrypted session " "with {name}").format(name=encrypted_session['plugin'].name)) if not to_jid.resource: - to_jid.resource = self.host.memory.getMainResource( + to_jid.resource = self.host.memory.main_resource_get( client, to_jid ) # FIXME: temporary and unsecure, must be changed when frontends # are refactored - otrctx = client._otr_context_manager.getContextForUser(to_jid) + otrctx = client._otr_context_manager.get_context_for_user(to_jid) query = otrctx.sendMessage(0, b"?OTRv?") otrctx.inject(query) - def _otrSessionEnd(self, menu_data, profile): + def _otr_session_end(self, menu_data, profile): """End an OTR session @param menu_data: %(menu_data)s @param profile: %(doc_profile)s """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) try: to_jid = jid.JID(menu_data["jid"]) except KeyError: log.error(_("jid key is not present !")) return defer.fail(exceptions.DataError) - self.endSession(client, to_jid) + self.end_session(client, to_jid) return {} - def endSession(self, client, to_jid): + def end_session(self, client, to_jid): """End an OTR session""" if not to_jid.resource: - to_jid.resource = self.host.memory.getMainResource( + to_jid.resource = self.host.memory.main_resource_get( client, to_jid ) # FIXME: temporary and unsecure, must be changed when frontends # are refactored - otrctx = client._otr_context_manager.getContextForUser(to_jid) + otrctx = client._otr_context_manager.get_context_for_user(to_jid) otrctx.disconnect() return {} - def _otrAuthenticate(self, menu_data, profile): + def _otr_authenticate(self, menu_data, profile): """End an OTR session @param menu_data: %(menu_data)s @param profile: %(doc_profile)s """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) try: to_jid = jid.JID(menu_data["jid"]) except KeyError: @@ -572,20 +572,20 @@ def authenticate(self, client, to_jid): """Authenticate other user and see our own fingerprint""" - xmlui = self.getTrustUI(client, to_jid) + xmlui = self.get_trust_ui(client, to_jid) 
return {"xmlui": xmlui.toXml()} - def _dropPrivKey(self, menu_data, profile): + def _drop_priv_key(self, menu_data, profile): """Drop our private Key @param menu_data: %(menu_data)s @param profile: %(doc_profile)s """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) try: to_jid = jid.JID(menu_data["jid"]) if not to_jid.resource: - to_jid.resource = self.host.memory.getMainResource( + to_jid.resource = self.host.memory.main_resource_get( client, to_jid ) # FIXME: temporary and unsecure, must be changed when frontends # are refactored @@ -599,7 +599,7 @@ "xmlui": xml_tools.note(_("You don't have a private key yet !")).toXml() } - def dropKey(data, profile): + def drop_key(data, profile): if C.bool(data["answer"]): # we end all sessions for context in list(ctxMng.contexts.values()): @@ -614,7 +614,7 @@ } return {} - submit_id = self.host.registerCallback(dropKey, with_data=True, one_shot=True) + submit_id = self.host.register_callback(drop_key, with_data=True, one_shot=True) confirm = xml_tools.XMLUI( C.XMLUI_DIALOG, @@ -624,10 +624,10 @@ ) return {"xmlui": confirm.toXml()} - def _receivedTreatment(self, data, client): + def _received_treatment(self, data, client): from_jid = data["from"] - log.debug("_receivedTreatment [from_jid = %s]" % from_jid) - otrctx = client._otr_context_manager.getContextForUser(from_jid) + log.debug("_received_treatment [from_jid = %s]" % from_jid) + otrctx = client._otr_context_manager.get_context_for_user(from_jid) try: message = ( @@ -699,17 +699,17 @@ exceptions.CancelError("Cancelled by OTR") ) # no message at all (no history, no signal) - client.encryption.markAsEncrypted(data, namespace=NS_OTR) - trusted = otrctx.isTrusted() + client.encryption.mark_as_encrypted(data, namespace=NS_OTR) + trusted = otrctx.is_trusted() if trusted: - client.encryption.markAsTrusted(data) + client.encryption.mark_as_trusted(data) else: - client.encryption.markAsUntrusted(data) + client.encryption.mark_as_untrusted(data) return data - def _receivedTreatmentForSkippedProfiles(self, data): + def _received_treatment_for_skipped_profiles(self, data): """This profile must be skipped because the frontend manages OTR itself, but we still need to check if the message must be stored in history or not @@ -731,7 +731,7 @@ data["history"] = C.HISTORY_SKIP return data - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): if client.is_component: return True if message_elt.getAttribute("type") == C.MESS_TYPE_GROUPCHAT: @@ -742,12 +742,12 @@ # OTR is only usable when resources are present return True if client.profile in self.skipped_profiles: - post_treat.addCallback(self._receivedTreatmentForSkippedProfiles) + post_treat.addCallback(self._received_treatment_for_skipped_profiles) else: - post_treat.addCallback(self._receivedTreatment, client) + post_treat.addCallback(self._received_treatment, client) return True - def _sendMessageDataTrigger(self, client, mess_data): + def _send_message_data_trigger(self, client, mess_data): if client.is_component: return True encryption = mess_data.get(C.MESS_KEY_ENCRYPTION) @@ -755,10 +755,10 @@ return to_jid = mess_data['to'] if not to_jid.resource: - to_jid.resource = self.host.memory.getMainResource( + to_jid.resource = self.host.memory.main_resource_get( client, to_jid ) # FIXME: temporary and unsecure, must be changed when frontends - otrctx = client._otr_context_manager.getContextForUser(to_jid) + otrctx = 
client._otr_context_manager.get_context_for_user(to_jid) message_elt = mess_data["xml"] if otrctx.state == potr.context.STATE_ENCRYPTED: log.debug("encrypting message") @@ -776,8 +776,8 @@ if body is None: log.warning("No message found") else: - self._p_carbons.setPrivate(message_elt) - self._p_hints.addHintElements(message_elt, [ + self._p_carbons.set_private(message_elt) + self._p_hints.add_hint_elements(message_elt, [ self._p_hints.HINT_NO_COPY, self._p_hints.HINT_NO_PERMANENT_STORE]) otrctx.sendMessage(0, str(body).encode("utf-8"), appdata=mess_data) @@ -791,7 +791,7 @@ client.feedback(to_jid, feedback) raise failure.Failure(exceptions.CancelError("Cancelled by OTR plugin")) - def sendMessageTrigger(self, client, mess_data, pre_xml_treatments, + def send_message_trigger(self, client, mess_data, pre_xml_treatments, post_xml_treatments): if client.is_component: return True @@ -808,32 +808,32 @@ return True if not to_jid.resource: - to_jid.resource = self.host.memory.getMainResource( + to_jid.resource = self.host.memory.main_resource_get( client, to_jid ) # FIXME: full jid may not be known - otrctx = client._otr_context_manager.getContextForUser(to_jid) + otrctx = client._otr_context_manager.get_context_for_user(to_jid) if otrctx.state != potr.context.STATE_PLAINTEXT: defer.ensureDeferred(client.encryption.start(to_jid, NS_OTR)) - client.encryption.setEncryptionFlag(mess_data) + client.encryption.set_encryption_flag(mess_data) if not mess_data["to"].resource: # if not resource was given, we force it here mess_data["to"] = to_jid return True - def _presenceReceivedTrigger(self, client, entity, show, priority, statuses): + def _presence_received_trigger(self, client, entity, show, priority, statuses): if show != C.PRESENCE_UNAVAILABLE: return True if not entity.resource: try: - entity.resource = self.host.memory.getMainResource( + entity.resource = self.host.memory.main_resource_get( client, entity ) # FIXME: temporary and unsecure, must be changed when frontends # are refactored except exceptions.UnknownEntityError: return True # entity was not connected if entity in client._otr_context_manager.contexts: - otrctx = client._otr_context_manager.getContextForUser(entity) + otrctx = client._otr_context_manager.get_context_for_user(entity) otrctx.disconnect() return True
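The trust bookkeeping renamed above (`load_trusts` / `save_trusts` / `set_trust`) stores a nested mapping of bare JID to fingerprint to trust level in `client._otr_data["trust"]`. A stand-alone sketch of that structure (the JID and fingerprint below are placeholders):

    # client._otr_data["trust"] maps a bare JID to the trust level recorded for
    # each known fingerprint ("trusted" or "", as checked by is_trusted()).
    def set_trust(trust_data: dict, bare_jid: str, fingerprint: str, level: str) -> None:
        trust_data.setdefault(bare_jid, {})[fingerprint] = level

    trust_data = {}
    # placeholder JID and fingerprint, for illustration only
    set_trust(trust_data, "louise@example.org", "0A1B2C3D4E5F60718293A4B5C6D7E8F901234567", "trusted")
    print(trust_data)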
--- a/sat/plugins/plugin_sec_oxps.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_sec_oxps.py Sat Apr 08 13:54:42 2023 +0200 @@ -83,7 +83,7 @@ def __init__(self, host): log.info(_("OpenPGP for XMPP Pubsub plugin initialization")) - host.registerNamespace("oxps", NS_OXPS) + host.register_namespace("oxps", NS_OXPS) self.host = host self._p = host.plugins["XEP-0060"] self._h = host.plugins["XEP-0334"] @@ -94,32 +94,32 @@ "messageReceived", self._message_received_trigger, ) - host.bridge.addMethod( - "psSecretShare", + host.bridge.add_method( + "ps_secret_share", ".plugin", in_sign="sssass", out_sign="", method=self._ps_secret_share, async_=True, ) - host.bridge.addMethod( - "psSecretRevoke", + host.bridge.add_method( + "ps_secret_revoke", ".plugin", in_sign="sssass", out_sign="", method=self._ps_secret_revoke, async_=True, ) - host.bridge.addMethod( - "psSecretRotate", + host.bridge.add_method( + "ps_secret_rotate", ".plugin", in_sign="ssass", out_sign="", method=self._ps_secret_rotate, async_=True, ) - host.bridge.addMethod( - "psSecretsList", + host.bridge.add_method( + "ps_secrets_list", ".plugin", in_sign="sss", out_sign="s", @@ -127,10 +127,10 @@ async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return PubsubEncryption_Handler() - async def profileConnecting(self, client): + async def profile_connecting(self, client): client.__storage = persistent.LazyPersistentBinaryDict( IMPORT_NAME, client.profile ) @@ -239,7 +239,7 @@ ) -> defer.Deferred: return defer.ensureDeferred( self.revoke( - self.host.getClient(profile_key), + self.host.get_client(profile_key), jid.JID(service) if service else None, node, secret_id, @@ -267,7 +267,7 @@ """ if service is None: service = client.jid.userhostJID() - node_uri = uri.buildXMPPUri("pubsub", path=service.full(), node=node) + node_uri = uri.build_xmpp_uri("pubsub", path=service.full(), node=node) shared_secrets = await self.load_secrets(client, node_uri) if not shared_secrets: raise exceptions.NotFound(f"No shared secret is known for {node_uri}") @@ -327,7 +327,7 @@ message_elt["from"] = client.jid.full() message_elt["to"] = recipient.full() message_elt.addChild((openpgp_elt)) - self._h.addHintElements(message_elt, [self._h.HINT_STORE]) + self._h.add_hint_elements(message_elt, [self._h.HINT_STORE]) client.send(message_elt) def _ps_secret_share( @@ -340,7 +340,7 @@ ) -> defer.Deferred: return defer.ensureDeferred( self.share_secrets( - self.host.getClient(profile_key), + self.host.get_client(profile_key), jid.JID(recipient), jid.JID(service) if service else None, node, @@ -377,7 +377,7 @@ message_elt["from"] = client.jid.full() message_elt["to"] = recipient.full() message_elt.addChild((openpgp_elt)) - self._h.addHintElements(message_elt, [self._h.HINT_STORE]) + self._h.add_hint_elements(message_elt, [self._h.HINT_STORE]) client.send(message_elt) shared_secret.shared_with.add(recipient) @@ -399,7 +399,7 @@ """ if service is None: service = client.jid.userhostJID() - node_uri = uri.buildXMPPUri("pubsub", path=service.full(), node=node) + node_uri = uri.build_xmpp_uri("pubsub", path=service.full(), node=node) shared_secrets = await self.load_secrets(client, node_uri) if shared_secrets is None: # no secret shared yet, let's generate one @@ -429,7 +429,7 @@ ) -> defer.Deferred: return defer.ensureDeferred( self.rotate_secret( - self.host.getClient(profile_key), + self.host.get_client(profile_key), jid.JID(service) if service else None, node, [jid.JID(r) for r in recipients] or None @@ -453,7 +453,7 @@ """ if service is 
None: service = client.jid.userhostJID() - node_uri = uri.buildXMPPUri("pubsub", path=service.full(), node=node) + node_uri = uri.build_xmpp_uri("pubsub", path=service.full(), node=node) shared_secrets = await self.load_secrets(client, node_uri) if shared_secrets is None: shared_secrets = {} @@ -489,7 +489,7 @@ ) -> defer.Deferred: d = defer.ensureDeferred( self.list_shared_secrets( - self.host.getClient(profile_key), + self.host.get_client(profile_key), jid.JID(service) if service else None, node, ) @@ -512,7 +512,7 @@ """ if service is None: service = client.jid.userhostJID() - node_uri = uri.buildXMPPUri("pubsub", path=service.full(), node=node) + node_uri = uri.build_xmpp_uri("pubsub", path=service.full(), node=node) shared_secrets = await self.load_secrets(client, node_uri) if shared_secrets is None: raise exceptions.NotFound(f"No shared secrets found for {node_uri}") @@ -541,7 +541,7 @@ f"ignoring invalid <revoke> element: {e}\n{revoke_elt.toXml()}" ) return - node_uri = uri.buildXMPPUri("pubsub", path=service.full(), node=node) + node_uri = uri.build_xmpp_uri("pubsub", path=service.full(), node=node) shared_secrets = await self.load_secrets(client, node_uri) if shared_secrets is None: log.warning( @@ -604,7 +604,7 @@ shared_secret = SharedSecret( id=secret_id, key=key, timestamp=timestamp, origin=sender, revoked=revoked ) - node_uri = uri.buildXMPPUri("pubsub", path=service.full(), node=node) + node_uri = uri.build_xmpp_uri("pubsub", path=service.full(), node=node) shared_secrets = await self.load_secrets(client, node_uri) if shared_secrets is None: shared_secrets = {} @@ -636,7 +636,7 @@ ) -> bool: if not items or not extra.get("encrypted"): return True - node_uri = uri.buildXMPPUri("pubsub", path=service.full(), node=node) + node_uri = uri.build_xmpp_uri("pubsub", path=service.full(), node=node) shared_secrets = await self.load_secrets(client, node_uri) if shared_secrets is None: shared_secrets = {} @@ -713,7 +713,7 @@ ) continue if shared_secrets is None: - node_uri = uri.buildXMPPUri("pubsub", path=service.full(), node=node) + node_uri = uri.build_xmpp_uri("pubsub", path=service.full(), node=node) shared_secrets = await self.load_secrets(client, node_uri) if shared_secrets is None: log.warning(
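Shared secrets in this plugin are kept per pubsub node URI, and rotation marks the previous secrets as revoked before a new one is generated. A hypothetical helper showing how the current secret could be selected from such a map; the fields mirror the `SharedSecret` usage visible above, but the "newest non-revoked" rule is an assumption:

    from dataclasses import dataclass

    @dataclass
    class SharedSecret:
        # mirrors the fields used above (origin/shared_with omitted for brevity)
        id: str
        key: bytes
        timestamp: float
        revoked: bool = False

    def pick_current_secret(shared_secrets: dict) -> SharedSecret:
        """Return the most recent non-revoked secret of a node (assumed rule)."""
        candidates = [s for s in shared_secrets.values() if not s.revoked]
        if not candidates:
            raise LookupError("no usable shared secret for this node")
        return max(candidates, key=lambda s: s.timestamp)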
--- a/sat/plugins/plugin_sec_pte.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_sec_pte.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,13 +55,13 @@ def __init__(self, host): log.info(_("Pubsub Targeted Encryption plugin initialization")) - host.registerNamespace("pte", NS_PTE) + host.register_namespace("pte", NS_PTE) self.host = host self._o = host.plugins["XEP-0384"] host.trigger.add("XEP-0060_publish", self._publish_trigger) host.trigger.add("XEP-0060_items", self._items_trigger) - def getHandler(self, client): + def get_handler(self, client): return PTE_Handler() async def _publish_trigger(
--- a/sat/plugins/plugin_sec_pubsub_signing.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_sec_pubsub_signing.py Sat Apr 08 13:54:42 2023 +0200 @@ -66,7 +66,7 @@ def __init__(self, host): log.info(_("Pubsub Signing plugin initialization")) - host.registerNamespace("pubsub-signing", NS_PUBSUB_SIGNING) + host.register_namespace("pubsub-signing", NS_PUBSUB_SIGNING) self.host = host self._p = host.plugins["XEP-0060"] self._ox = host.plugins["XEP-0373"] @@ -75,8 +75,8 @@ "signature", NS_PUBSUB_SIGNING, self.signature_get, self.signature_set ) host.trigger.add("XEP-0060_publish", self._publish_trigger) - host.bridge.addMethod( - "psSignatureCheck", + host.bridge.add_method( + "ps_signature_check", ".plugin", in_sign="sssss", out_sign="s", @@ -84,7 +84,7 @@ async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return PubsubSigning_Handler() def get_data_to_sign( @@ -137,7 +137,7 @@ ) -> defer.Deferred: d = defer.ensureDeferred( self.check( - self.host.getClient(profile_key), + self.host.get_client(profile_key), jid.JID(service), node, item_id, @@ -155,7 +155,7 @@ item_id: str, signature_data: Dict[str, Any], ) -> Dict[str, Any]: - items, __ = await self._p.getItems( + items, __ = await self._p.get_items( client, service, node, item_ids=[item_id] ) if not items != 1: @@ -254,7 +254,7 @@ if item_elt is None: node = attachments_data["node"] item_id = attachments_data["id"] - items, __ = await self._p.getItems( + items, __ = await self._p.get_items( client, service, node, item_ids=[item_id] ) if not items != 1:
--- a/sat/plugins/plugin_syntax_wiki_dotclear.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_syntax_wiki_dotclear.py Sat Apr 08 13:54:42 2023 +0200 @@ -317,7 +317,7 @@ setattr( self, "parser_h{}".format(i), - lambda elt, buf, level=i: self.parserHeading(elt, buf, level), + lambda elt, buf, level=i: self.parser_heading(elt, buf, level), ) def parser_a(self, elt, buf): @@ -331,7 +331,7 @@ # we don't want empty values raise KeyError except KeyError: - self.parserGeneric(elt, buf) + self.parser_generic(elt, buf) else: buf.append("~~{}~~".format(id_)) return @@ -358,7 +358,7 @@ title = elt["title"] except KeyError: log.debug("Acronyme without title, using generic parser") - self.parserGeneric(elt, buf) + self.parser_generic(elt, buf) return buf.append("??{}|{}??".format(str(elt), title)) @@ -370,7 +370,7 @@ if len(children) == 1 and children[0].name == "p": elt = children[0] tmp_buf = [] - self.parseChildren(elt, tmp_buf) + self.parse_children(elt, tmp_buf) blockquote = "> " + "\n> ".join("".join(tmp_buf).split("\n")) buf.append(blockquote) @@ -379,23 +379,23 @@ def parser_code(self, elt, buf): buf.append("@@") - self.parseChildren(elt, buf) + self.parse_children(elt, buf) buf.append("@@") def parser_del(self, elt, buf): buf.append("--") - self.parseChildren(elt, buf) + self.parse_children(elt, buf) buf.append("--") def parser_div(self, elt, buf): if elt.getAttribute("class") == "footnotes": - self.parserFootnote(elt, buf) + self.parser_footnote(elt, buf) else: - self.parseChildren(elt, buf, block=True) + self.parse_children(elt, buf, block=True) def parser_em(self, elt, buf): buf.append("''") - self.parseChildren(elt, buf) + self.parse_children(elt, buf) buf.append("''") def parser_h6(self, elt, buf): @@ -449,7 +449,7 @@ def parser_ins(self, elt, buf): buf.append("++") - self.parseChildren(elt, buf) + self.parse_children(elt, buf) buf.append("++") def parser_li(self, elt, buf): @@ -474,14 +474,14 @@ buf.extend(bullets) buf.append(" ") - self.parseChildren(elt, buf) + self.parse_children(elt, buf) buf.append("\n") def parser_ol(self, elt, buf): - self.parserList(elt, buf, FLAG_OL) + self.parser_list(elt, buf, FLAG_OL) def parser_p(self, elt, buf): - self.parseChildren(elt, buf) + self.parse_children(elt, buf) buf.append("\n\n") def parser_pre(self, elt, buf): @@ -513,11 +513,11 @@ buf.append("}}") def parser_span(self, elt, buf): - self.parseChildren(elt, buf, block=True) + self.parse_children(elt, buf, block=True) def parser_strong(self, elt, buf): buf.append("__") - self.parseChildren(elt, buf) + self.parse_children(elt, buf) buf.append("__") def parser_sup(self, elt, buf): @@ -535,7 +535,7 @@ note_id = url[url.find("#") + 1 :] if not note_id: log.warning("bad link found in footnote") - self.parserGeneric(elt, buf) + self.parser_generic(elt, buf) return # this looks like a footnote buf.append("$$") @@ -543,14 +543,14 @@ self.footnotes[note_id] = len(buf) - 1 buf.append("$$") else: - self.parserGeneric(elt, buf) + self.parser_generic(elt, buf) def parser_ul(self, elt, buf): - self.parserList(elt, buf, FLAG_UL) + self.parser_list(elt, buf, FLAG_UL) - def parserList(self, elt, buf, type_): + def parser_list(self, elt, buf, type_): self.flags.append(type_) - self.parseChildren(elt, buf, block=True) + self.parse_children(elt, buf, block=True) idx = 0 for flag in reversed(self.flags): idx -= 1 @@ -561,14 +561,14 @@ if idx == 0: raise exceptions.InternalError("flag has been removed by an other parser") - def parserHeading(self, elt, buf, level): + def parser_heading(self, elt, buf, 
level): buf.append((6 - level) * "!") for child in elt.children: # we ignore other elements for a Hx title - self.parserText(child, buf) + self.parser_text(child, buf) buf.append("\n") - def parserFootnote(self, elt, buf): + def parser_footnote(self, elt, buf): for elt in elt.elements(): # all children other than <p/> are ignored if elt.name == "p": @@ -587,11 +587,11 @@ a_idx = elt.children.index(a_elt) dummy_elt.children = elt.children[a_idx + 1 :] note_buf = [] - self.parseChildren(dummy_elt, note_buf) + self.parse_children(dummy_elt, note_buf) # now we can replace the placeholder buf[note_idx] = "".join(note_buf) - def parserText(self, txt, buf, keep_whitespaces=False): + def parser_text(self, txt, buf, keep_whitespaces=False): txt = str(txt) if not keep_whitespaces: # we get text and only let one inter word space @@ -601,12 +601,12 @@ buf.append(txt) return txt - def parserGeneric(self, elt, buf): + def parser_generic(self, elt, buf): # as dotclear wiki syntax handle arbitrary XHTML code # we use this feature to add elements that we don't know buf.append("\n\n///html\n{}\n///\n\n".format(elt.toXml())) - def parseChildren(self, elt, buf, block=False): + def parse_children(self, elt, buf, block=False): first_visible = True for child in elt.children: if not block and not first_visible and buf and buf[-1][-1] not in (" ", "\n"): @@ -616,7 +616,7 @@ self._parse(child, buf) first_visible = False else: - appended = self.parserText(child, buf) + appended = self.parser_text(child, buf) if appended: first_visible = False @@ -626,7 +626,7 @@ if style and elt_name not in ELT_WITH_STYLE: # if we have style we use generic parser to put raw HTML # to avoid losing it - parser = self.parserGeneric + parser = self.parser_generic else: try: parser = getattr(self, "parser_{}".format(elt_name)) @@ -634,7 +634,7 @@ log.debug( "Can't find parser for {} element, using generic one".format(elt.name) ) - parser = self.parserGeneric + parser = self.parser_generic parser(elt, buf) def parse(self, elt): @@ -666,13 +666,13 @@ self._dc_parser = DCWikiParser() self._xhtml_parser = XHTMLParser() self._stx = self.host.plugins["TEXT_SYNTAXES"] - self._stx.addSyntax( - self.SYNTAX_NAME, self.parseWiki, self.parseXHTML, [self._stx.OPT_NO_THREAD] + self._stx.add_syntax( + self.SYNTAX_NAME, self.parse_wiki, self.parse_xhtml, [self._stx.OPT_NO_THREAD] ) - def parseWiki(self, wiki_stx): + def parse_wiki(self, wiki_stx): div_elt = self._dc_parser.parse(wiki_stx) return div_elt.toXml() - def parseXHTML(self, xhtml): + def parse_xhtml(self, xhtml): return self._xhtml_parser.parseString(xhtml)
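The renamed `parser_*` methods each wrap their children with a fixed Dotclear marker (`@@` for code, `--` for del, `''` for em, `++` for ins, `__` for strong, and `(6 - level) * "!"` for headings). A tiny stand-alone equivalent for plain strings, kept here as a quick reference:

    # Inline markers emitted by the parser_* methods above.
    DC_INLINE_MARKERS = {
        "code": "@@",
        "del": "--",
        "em": "''",
        "ins": "++",
        "strong": "__",
    }

    def wrap_inline(tag: str, text: str) -> str:
        marker = DC_INLINE_MARKERS[tag]
        return f"{marker}{text}{marker}"

    def heading(level: int, text: str) -> str:
        return (6 - level) * "!" + text

    assert wrap_inline("strong", "hello") == "__hello__"
    assert heading(1, "Title") == "!!!!!Title"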
--- a/sat/plugins/plugin_tickets_import.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_tickets_import.py Sat Apr 08 13:54:42 2023 +0200 @@ -54,7 +54,7 @@ OPT_DEFAULTS = {} def __init__(self, host): - log.info(_("plugin Tickets Import initialization")) + log.info(_("plugin Tickets import initialization")) self.host = host self._importers = {} self._p = host.plugins["XEP-0060"] @@ -63,7 +63,7 @@ host.plugins["IMPORT"].initialize(self, "tickets") @defer.inlineCallbacks - def importItem( + def import_item( self, client, item_import_data, session, options, return_data, service, node ): """ @@ -121,19 +121,19 @@ if session["root_node"] is None: session["root_node"] = NS_TICKETS if not "schema" in session: - session["schema"] = yield self._s.getSchemaForm( + session["schema"] = yield self._s.get_schema_form( client, service, node or session["root_node"] ) defer.returnValue(item_import_data) @defer.inlineCallbacks - def importSubItems(self, client, item_import_data, ticket_data, session, options): + def import_sub_items(self, client, item_import_data, ticket_data, session, options): # TODO: force "open" permission (except if private, check below) # TODO: handle "private" metadata, to have non public access for node # TODO: node access/publish model should be customisable comments = ticket_data.get("comments", []) - service = yield self._m.getCommentsService(client) - node = self._m.getCommentsNode(session["root_node"] + "_" + ticket_data["id"]) + service = yield self._m.get_comments_service(client) + node = self._m.get_comments_node(session["root_node"] + "_" + ticket_data["id"]) node_options = { self._p.OPT_ACCESS_MODEL: self._p.ACCESS_OPEN, self._p.OPT_PERSIST_ITEMS: 1, @@ -141,8 +141,8 @@ self._p.OPT_SEND_ITEM_SUBSCRIBE: 1, self._p.OPT_PUBLISH_MODEL: self._p.ACCESS_OPEN, } - yield self._p.createIfNewNode(client, service, node, options=node_options) - ticket_data["comments_uri"] = uri.buildXMPPUri( + yield self._p.create_if_new_node(client, service, node, options=node_options) + ticket_data["comments_uri"] = uri.build_xmpp_uri( "pubsub", subtype="microblog", path=service.full(), node=node ) for comment in comments: @@ -151,7 +151,7 @@ comment["updated"] = comment["published"] yield self._m.send(client, comment, service, node) - def publishItem(self, client, ticket_data, service, node, session): + def publish_item(self, client, ticket_data, service, node, session): if node is None: node = NS_TICKETS id_ = ticket_data.pop("id", None) @@ -161,12 +161,12 @@ ) ) return defer.ensureDeferred( - self._s.sendDataFormItem( + self._s.send_data_form_item( client, service, node, ticket_data, session["schema"], id_ ) ) - def itemFilters(self, client, ticket_data, session, options): + def item_filters(self, client, ticket_data, session, options): mapping = options.get(OPT_MAPPING) if mapping is not None: if not isinstance(mapping, dict):
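`item_filters` above only checks that the `mapping` import option is a dict; a rough sketch of what applying such a mapping to a ticket could look like (field names are placeholders, and whether the real filter renames or copies fields is not visible in this hunk):

    def apply_mapping(ticket_data: dict, mapping: dict) -> dict:
        """Rename source-tracker fields according to an OPT_MAPPING-style dict."""
        mapped = dict(ticket_data)
        for src_field, dest_field in mapping.items():
            if src_field in mapped:
                mapped[dest_field] = mapped.pop(src_field)
        return mapped

    # hypothetical source field "summary" mapped to the schema field "title"
    print(apply_mapping({"summary": "crash on start"}, {"summary": "title"}))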
--- a/sat/plugins/plugin_tickets_import_bugzilla.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_tickets_import_bugzilla.py Sat Apr 08 13:54:42 2023 +0200 @@ -125,13 +125,13 @@ class BugzillaImport(object): def __init__(self, host): - log.info(_("Bugilla Import plugin initialization")) + log.info(_("Bugilla import plugin initialization")) self.host = host host.plugins["TICKETS_IMPORT"].register( - "bugzilla", self.Import, SHORT_DESC, LONG_DESC + "bugzilla", self.import_, SHORT_DESC, LONG_DESC ) - def Import(self, client, location, options=None): + def import_(self, client, location, options=None): if not os.path.isabs(location): raise exceptions.DataError( "An absolute path to XML data need to be given as location"
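Importer backends plug themselves in through the same `register()` call as the Bugzilla importer above. A skeletal, hypothetical example following that pattern (plugin boilerplate such as `PLUGIN_INFO` is omitted, and the importer name and descriptions are made up):

    SHORT_DESC = "import tickets from a JSON dump"    # hypothetical importer
    LONG_DESC = "Example importer registered like the Bugzilla one above."

    class JsonTicketsImport:
        def __init__(self, host):
            self.host = host
            host.plugins["TICKETS_IMPORT"].register(
                "json_dump", self.import_, SHORT_DESC, LONG_DESC
            )

        def import_(self, client, location, options=None):
            # location points to the data to import, as for the Bugzilla backend
            raise NotImplementedError("parsing of the dump would go here")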
--- a/sat/plugins/plugin_tmp_directory_subscription.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_tmp_directory_subscription.py Sat Apr 08 13:54:42 2023 +0200 @@ -46,7 +46,7 @@ def __init__(self, host): log.info(_("Directory subscription plugin initialization")) self.host = host - host.importMenu( + host.import_menu( (D_("Service"), D_("Directory subscription")), self.subscribe, security_limit=1, @@ -60,16 +60,16 @@ @param profile (unicode): %(doc_profile)s @return: a deferred dict{unicode: unicode} """ - d = self.host.plugins["XEP-0055"]._getHostServices(profile) + d = self.host.plugins["XEP-0055"]._get_host_services(profile) def got_services(services): service_jid = services[0] session_id, session_data = self.host.plugins[ "XEP-0050" - ].requesting.newSession(profile=profile) + ].requesting.new_session(profile=profile) session_data["jid"] = service_jid session_data["node"] = CMD_UPDATE_SUBSCRIBTION data = {"session_id": session_id} - return self.host.plugins["XEP-0050"]._requestingEntity(data, profile) + return self.host.plugins["XEP-0050"]._requesting_entity(data, profile) return d.addCallback(got_services)
--- a/sat/plugins/plugin_xep_0020.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0020.py Sat Apr 08 13:54:42 2023 +0200 @@ -51,10 +51,10 @@ def __init__(self, host): log.info(_("Plugin XEP_0020 initialization")) - def getHandler(self, client): + def get_handler(self, client): return XEP_0020_handler() - def getFeatureElt(self, elt): + def get_feature_elt(self, elt): """Check element's children to find feature elements @param elt(domish.Element): parent element of the feature element @@ -67,7 +67,7 @@ raise exceptions.NotFound return feature_elt - def _getForm(self, elt, namespace): + def _get_form(self, elt, namespace): """Return the first child data form @param elt(domish.Element): parent of the data form @@ -84,7 +84,7 @@ else: return data_form.findForm(elt, namespace) - def getChoosedOptions(self, feature_elt, namespace): + def get_choosed_options(self, feature_elt, namespace): """Return choosed feature for feature element @param feature_elt(domish.Element): feature domish element @@ -92,7 +92,7 @@ @return (dict): feature name as key, and choosed option as value @raise exceptions.NotFound: not data form is found """ - form = self._getForm(feature_elt, namespace) + form = self._get_form(feature_elt, namespace) if form is None: raise exceptions.NotFound result = {} @@ -117,14 +117,14 @@ @param namespace (None, unicode): form namespace or None to ignore @raise KeyError: name is not found in data form fields """ - form = self._getForm(feature_elt, namespace) + form = self._get_form(feature_elt, namespace) options = [option.value for option in form.fields[name].options] for value in negotiable_values: if value in options: return value return None - def chooseOption(self, options, namespace): + def choose_option(self, options, namespace): """Build a feature element with choosed options @param options(dict): dict with feature as key and choosed option as value @@ -136,7 +136,7 @@ feature_elt.addChild(x_form.toElement()) return feature_elt - def proposeFeatures(self, options_dict, namespace): + def propose_features(self, options_dict, namespace): """Build a feature element with options to propose @param options_dict(dict): dict with feature as key and iterable of acceptable options as value
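`propose_features` and `choose_option` above wrap a data form inside a XEP-0020 `<feature/>` element. A simplified stand-alone sketch with wokkel, offering every option as a `list-single` field (the SI/IBB namespaces in the usage lines are just illustrative values):

    from twisted.words.xish import domish
    from wokkel import data_form

    NS_FEATURE_NEG = "http://jabber.org/protocol/feature-neg"

    def propose_features_sketch(options_dict: dict, namespace: str) -> domish.Element:
        """Build a <feature/> element offering negotiable options."""
        feature_elt = domish.Element((NS_FEATURE_NEG, "feature"))
        form = data_form.Form("form", formNamespace=namespace)
        for name, values in options_dict.items():
            form.addField(data_form.Field(
                "list-single",
                var=name,
                options=[data_form.Option(v) for v in values],
            ))
        feature_elt.addChild(form.toElement())
        return feature_elt

    elt = propose_features_sketch(
        {"stream-method": ["http://jabber.org/protocol/ibb"]},
        "http://jabber.org/protocol/si",
    )
    print(elt.toXml())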
--- a/sat/plugins/plugin_xep_0033.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0033.py Sat Apr 08 13:54:42 2023 +0200 @@ -77,11 +77,11 @@ self.host = host self.internal_data = {} host.trigger.add( - "sendMessage", self.sendMessageTrigger, trigger.TriggerManager.MIN_PRIORITY + "sendMessage", self.send_message_trigger, trigger.TriggerManager.MIN_PRIORITY ) - host.trigger.add("messageReceived", self.messageReceivedTrigger) + host.trigger.add("messageReceived", self.message_received_trigger) - def sendMessageTrigger( + def send_message_trigger( self, client, mess_data, pre_xml_treatments, post_xml_treatments ): """Process the XEP-0033 related data to be sent""" @@ -91,7 +91,7 @@ if not "address" in mess_data["extra"]: return mess_data - def discoCallback(entities): + def disco_callback(entities): if not entities: log.warning( _("XEP-0033 is being used but the server doesn't support it!") @@ -126,18 +126,18 @@ ) ) # when the prosody plugin is completed, we can immediately return mess_data from here - self.sendAndStoreMessage(mess_data, entries, profile) + self.send_and_store_message(mess_data, entries, profile) log.debug("XEP-0033 took over") raise failure.Failure(exceptions.CancelError("Cancelled by XEP-0033")) - d = self.host.findFeaturesSet(client, [NS_ADDRESS]) - d.addCallbacks(discoCallback, lambda __: discoCallback(None)) + d = self.host.find_features_set(client, [NS_ADDRESS]) + d.addCallbacks(disco_callback, lambda __: disco_callback(None)) return d post_xml_treatments.addCallback(treatment) return True - def sendAndStoreMessage(self, mess_data, entries, profile): + def send_and_store_message(self, mess_data, entries, profile): """Check if target servers support XEP-0033, send and store the messages @return: a friendly failure to let the core know that we sent the message already @@ -148,24 +148,24 @@ Ideas: - fix Prosody plugin to check if target server support the feature - redesign the database to save only one entry to the database - - change the messageNew signal to eventually pass more than one recipient + - change the message_new signal to eventually pass more than one recipient """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) def send(mess_data, skip_send=False): d = defer.Deferred() if not skip_send: d.addCallback( - lambda ret: defer.ensureDeferred(client.sendMessageData(ret)) + lambda ret: defer.ensureDeferred(client.send_message_data(ret)) ) d.addCallback( - lambda ret: defer.ensureDeferred(client.messageAddToHistory(ret)) + lambda ret: defer.ensureDeferred(client.message_add_to_history(ret)) ) - d.addCallback(client.messageSendToBridge) + d.addCallback(client.message_send_to_bridge) d.addErrback(lambda failure: failure.trap(exceptions.CancelError)) return d.callback(mess_data) - def discoCallback(entities, to_jid_s): + def disco_callback(entities, to_jid_s): history_data = copy.deepcopy(mess_data) history_data["to"] = JID(to_jid_s) history_data["xml"]["to"] = to_jid_s @@ -183,7 +183,7 @@ send(history_data) def errback(failure, to_jid): - discoCallback(None, to_jid) + disco_callback(None, to_jid) timestamp = time() self.internal_data[timestamp] = [] @@ -191,17 +191,17 @@ for type_, jid_ in entries: d = defer.Deferred() d.addCallback( - self.host.findFeaturesSet, client=client, jid_=JID(JID(jid_).host) + self.host.find_features_set, client=client, jid_=JID(JID(jid_).host) ) d.addCallbacks( - discoCallback, errback, callbackArgs=[jid_], errbackArgs=[jid_] + disco_callback, errback, callbackArgs=[jid_], errbackArgs=[jid_] ) 
d.callback([NS_ADDRESS]) defer_list.append(d) d = defer.Deferred().addCallback(lambda __: self.internal_data.pop(timestamp)) defer.DeferredList(defer_list).chainDeferred(d) - def messageReceivedTrigger(self, client, message, post_treat): + def message_received_trigger(self, client, message, post_treat): """In order to save the addressing information in the history""" def post_treat_addr(data, addresses): @@ -224,7 +224,7 @@ post_treat.addCallback(post_treat_addr, addresses.children) return True - def getHandler(self, client): + def get_handler(self, client): return XEP_0033_handler(self, client.profile)
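The addressing information handled by this plugin is the XEP-0033 `<addresses/>` payload. A small helper building that element with twisted's domish, independent of the plugin's own `mess_data` plumbing:

    from twisted.words.xish import domish

    NS_ADDRESS = "http://jabber.org/protocol/address"

    def build_addresses(entries) -> domish.Element:
        """Build a XEP-0033 <addresses/> payload from (type, jid) pairs."""
        addresses_elt = domish.Element((NS_ADDRESS, "addresses"))
        for type_, jid_ in entries:
            address_elt = addresses_elt.addElement("address")
            address_elt["type"] = type_
            address_elt["jid"] = jid_
        return addresses_elt

    # example JIDs are placeholders
    print(build_addresses([("to", "louise@example.org"), ("cc", "pierre@example.net")]).toXml())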
--- a/sat/plugins/plugin_xep_0045.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0045.py Sat Apr 08 13:54:42 2023 +0200 @@ -87,93 +87,93 @@ log.info(_("Plugin XEP_0045 initialization")) self.host = host self._sessions = memory.Sessions() - # return same arguments as mucRoomJoined + a boolean set to True is the room was + # return same arguments as muc_room_joined + a boolean set to True is the room was # already joined (first argument) - host.bridge.addMethod( - "mucJoin", ".plugin", in_sign='ssa{ss}s', out_sign='(bsa{sa{ss}}ssass)', + host.bridge.add_method( + "muc_join", ".plugin", in_sign='ssa{ss}s', out_sign='(bsa{sa{ss}}ssass)', method=self._join, async_=True) - host.bridge.addMethod( - "mucNick", ".plugin", in_sign='sss', out_sign='', method=self._nick) - host.bridge.addMethod( - "mucNickGet", ".plugin", in_sign='ss', out_sign='s', method=self._getRoomNick) - host.bridge.addMethod( - "mucLeave", ".plugin", in_sign='ss', out_sign='', method=self._leave, + host.bridge.add_method( + "muc_nick", ".plugin", in_sign='sss', out_sign='', method=self._nick) + host.bridge.add_method( + "muc_nick_get", ".plugin", in_sign='ss', out_sign='s', method=self._get_room_nick) + host.bridge.add_method( + "muc_leave", ".plugin", in_sign='ss', out_sign='', method=self._leave, async_=True) - host.bridge.addMethod( - "mucOccupantsGet", ".plugin", in_sign='ss', out_sign='a{sa{ss}}', - method=self._getRoomOccupants) - host.bridge.addMethod( - "mucSubject", ".plugin", in_sign='sss', out_sign='', method=self._subject) - host.bridge.addMethod( - "mucGetRoomsJoined", ".plugin", in_sign='s', out_sign='a(sa{sa{ss}}ssas)', - method=self._getRoomsJoined) - host.bridge.addMethod( - "mucGetUniqueRoomName", ".plugin", in_sign='ss', out_sign='s', - method=self._getUniqueName) - host.bridge.addMethod( - "mucConfigureRoom", ".plugin", in_sign='ss', out_sign='s', - method=self._configureRoom, async_=True) - host.bridge.addMethod( - "mucGetDefaultService", ".plugin", in_sign='', out_sign='s', - method=self.getDefaultMUC) - host.bridge.addMethod( - "mucGetService", ".plugin", in_sign='ss', out_sign='s', - method=self._getMUCService, async_=True) + host.bridge.add_method( + "muc_occupants_get", ".plugin", in_sign='ss', out_sign='a{sa{ss}}', + method=self._get_room_occupants) + host.bridge.add_method( + "muc_subject", ".plugin", in_sign='sss', out_sign='', method=self._subject) + host.bridge.add_method( + "muc_get_rooms_joined", ".plugin", in_sign='s', out_sign='a(sa{sa{ss}}ssas)', + method=self._get_rooms_joined) + host.bridge.add_method( + "muc_get_unique_room_name", ".plugin", in_sign='ss', out_sign='s', + method=self._get_unique_name) + host.bridge.add_method( + "muc_configure_room", ".plugin", in_sign='ss', out_sign='s', + method=self._configure_room, async_=True) + host.bridge.add_method( + "muc_get_default_service", ".plugin", in_sign='', out_sign='s', + method=self.get_default_muc) + host.bridge.add_method( + "muc_get_service", ".plugin", in_sign='ss', out_sign='s', + method=self._get_muc_service, async_=True) # called when a room will be joined but must be locked until join is received # (room is prepared, history is getting retrieved) # args: room_jid, profile - host.bridge.addSignal( - "mucRoomPrepareJoin", ".plugin", signature='ss') + host.bridge.add_signal( + "muc_room_prepare_join", ".plugin", signature='ss') # args: room_jid, occupants, user_nick, subject, profile - host.bridge.addSignal( - "mucRoomJoined", ".plugin", signature='sa{sa{ss}}ssass') + host.bridge.add_signal( + "muc_room_joined", 
".plugin", signature='sa{sa{ss}}ssass') # args: room_jid, profile - host.bridge.addSignal( - "mucRoomLeft", ".plugin", signature='ss') + host.bridge.add_signal( + "muc_room_left", ".plugin", signature='ss') # args: room_jid, old_nick, new_nick, profile - host.bridge.addSignal( - "mucRoomUserChangedNick", ".plugin", signature='ssss') + host.bridge.add_signal( + "muc_room_user_changed_nick", ".plugin", signature='ssss') # args: room_jid, subject, profile - host.bridge.addSignal( - "mucRoomNewSubject", ".plugin", signature='sss') - self.__submit_conf_id = host.registerCallback( - self._submitConfiguration, with_data=True) - self._room_join_id = host.registerCallback(self._UIRoomJoinCb, with_data=True) - host.importMenu( - (D_("MUC"), D_("configure")), self._configureRoomMenu, security_limit=0, + host.bridge.add_signal( + "muc_room_new_subject", ".plugin", signature='sss') + self.__submit_conf_id = host.register_callback( + self._submit_configuration, with_data=True) + self._room_join_id = host.register_callback(self._ui_room_join_cb, with_data=True) + host.import_menu( + (D_("MUC"), D_("configure")), self._configure_room_menu, security_limit=0, help_string=D_("Configure Multi-User Chat room"), type_=C.MENU_ROOM) try: self.text_cmds = self.host.plugins[C.TEXT_CMDS] except KeyError: log.info(_("Text commands not available")) else: - self.text_cmds.registerTextCommands(self) - self.text_cmds.addWhoIsCb(self._whois, 100) + self.text_cmds.register_text_commands(self) + self.text_cmds.add_who_is_cb(self._whois, 100) self._mam = self.host.plugins.get("XEP-0313") self._si = self.host.plugins["XEP-0359"] - host.trigger.add("presence_available", self.presenceTrigger) - host.trigger.add("presence_received", self.presenceReceivedTrigger) - host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=1000000) - host.trigger.add("message_parse", self._message_parseTrigger) + host.trigger.add("presence_available", self.presence_trigger) + host.trigger.add("presence_received", self.presence_received_trigger) + host.trigger.add("messageReceived", self.message_received_trigger, priority=1000000) + host.trigger.add("message_parse", self._message_parse_trigger) - async def profileConnected(self, client): - client.muc_service = await self.get_MUC_service(client) + async def profile_connected(self, client): + client.muc_service = await self.get_muc_service(client) - def _message_parseTrigger(self, client, message_elt, data): + def _message_parse_trigger(self, client, message_elt, data): """Add stanza-id from the room if present""" if message_elt.getAttribute("type") != C.MESS_TYPE_GROUPCHAT: return True - # stanza_id will not be filled by parseMessage because the emitter + # stanza_id will not be filled by parse_message because the emitter # is the room and not our server, so we have to parse it here room_jid = data["from"].userhostJID() - stanza_id = self._si.getStanzaId(message_elt, room_jid) + stanza_id = self._si.get_stanza_id(message_elt, room_jid) if stanza_id: data["extra"]["stanza_id"] = stanza_id - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): if message_elt.getAttribute("type") == C.MESS_TYPE_GROUPCHAT: if message_elt.subject or message_elt.delay: return False @@ -200,7 +200,7 @@ return False return True - def getRoom(self, client: SatXMPPEntity, room_jid: jid.JID) -> muc.Room: + def get_room(self, client: SatXMPPEntity, room_jid: jid.JID) -> muc.Room: """Retrieve Room instance from its jid @param 
room_jid: jid of the room @@ -211,7 +211,7 @@ except KeyError: raise exceptions.NotFound(_("This room has not been joined")) - def checkRoomJoined(self, client, room_jid): + def check_room_joined(self, client, room_jid): """Check that given room has been joined in current session @param room_jid (JID): room JID @@ -219,78 +219,78 @@ if room_jid not in client._muc_client.joined_rooms: raise exceptions.NotFound(_("This room has not been joined")) - def isJoinedRoom(self, client: SatXMPPEntity, room_jid: jid.JID) -> bool: + def is_joined_room(self, client: SatXMPPEntity, room_jid: jid.JID) -> bool: """Tell if a jid is a known and joined room @room_jid: jid of the room """ try: - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) except exceptions.NotFound: return False else: return True - def isRoom(self, client, entity_jid): + def is_room(self, client, entity_jid): """Tell if a jid is a joined MUC - similar to isJoinedRoom but returns a boolean + similar to is_joined_room but returns a boolean @param entity_jid(jid.JID): full or bare jid of the entity check @return (bool): True if the bare jid of the entity is a room jid """ try: - self.checkRoomJoined(client, entity_jid.userhostJID()) + self.check_room_joined(client, entity_jid.userhostJID()) except exceptions.NotFound: return False else: return True - def getBareOrFull(self, client, peer_jid): + def get_bare_or_full(self, client, peer_jid): """use full jid if peer_jid is an occupant of a room, bare jid else @param peer_jid(jid.JID): entity to test @return (jid.JID): bare or full jid """ if peer_jid.resource: - if not self.isRoom(client, peer_jid): + if not self.is_room(client, peer_jid): return peer_jid.userhostJID() return peer_jid - def _getRoomJoinedArgs(self, room, profile): + def _get_room_joined_args(self, room, profile): return [ room.roomJID.userhost(), - XEP_0045._getOccupants(room), + XEP_0045._get_occupants(room), room.nick, room.subject, [s.name for s in room.statuses], profile ] - def _UIRoomJoinCb(self, data, profile): + def _ui_room_join_cb(self, data, profile): room_jid = jid.JID(data['index']) - client = self.host.getClient(profile) + client = self.host.get_client(profile) self.join(client, room_jid) return {} - def _passwordUICb(self, data, client, room_jid, nick): + def _password_ui_cb(self, data, client, room_jid, nick): """Called when the user has given room password (or cancelled)""" if C.bool(data.get(C.XMLUI_DATA_CANCELLED, "false")): log.info("room join for {} is cancelled".format(room_jid.userhost())) raise failure.Failure(exceptions.CancelError(D_("Room joining cancelled by user"))) - password = data[xml_tools.formEscape('password')] - return client._muc_client.join(room_jid, nick, password).addCallbacks(self._joinCb, self._joinEb, (client, room_jid, nick), errbackArgs=(client, room_jid, nick, password)) + password = data[xml_tools.form_escape('password')] + return client._muc_client.join(room_jid, nick, password).addCallbacks(self._join_cb, self._join_eb, (client, room_jid, nick), errbackArgs=(client, room_jid, nick, password)) - def _showListUI(self, items, client, service): + def _show_list_ui(self, items, client, service): xmlui = xml_tools.XMLUI(title=D_('Rooms in {}'.format(service.full()))) - adv_list = xmlui.changeContainer('advanced_list', columns=1, selectable='single', callback_id=self._room_join_id) + adv_list = xmlui.change_container('advanced_list', columns=1, selectable='single', callback_id=self._room_join_id) items = sorted(items, key=lambda i: i.name.lower()) for 
item in items: - adv_list.setRowIndex(item.entity.full()) + adv_list.set_row_index(item.entity.full()) xmlui.addText(item.name) adv_list.end() - self.host.actionNew({'xmlui': xmlui.toXml()}, profile=client.profile) + self.host.action_new({'xmlui': xmlui.toXml()}, profile=client.profile) - def _joinCb(self, room, client, room_jid, nick): + def _join_cb(self, room, client, room_jid, nick): """Called when the user is in the requested room""" if room.locked: # FIXME: the current behaviour is to create an instant room @@ -298,11 +298,11 @@ # a proper configuration management should be done log.debug(_("room locked !")) d = client._muc_client.configure(room.roomJID, {}) - d.addErrback(self.host.logErrback, + d.addErrback(self.host.log_errback, msg=_('Error while configuring the room: {failure_}')) return room.fully_joined - def _joinEb(self, failure_, client, room_jid, nick, password): + def _join_eb(self, failure_, client, room_jid, nick, password): """Called when something is going wrong when joining the room""" try: condition = failure_.value.condition @@ -312,14 +312,14 @@ if condition == 'conflict': # we have a nickname conflict, we try again with "_" suffixed to current nickname nick += '_' - return client._muc_client.join(room_jid, nick, password).addCallbacks(self._joinCb, self._joinEb, (client, room_jid, nick), errbackArgs=(client, room_jid, nick, password)) + return client._muc_client.join(room_jid, nick, password).addCallbacks(self._join_cb, self._join_eb, (client, room_jid, nick), errbackArgs=(client, room_jid, nick, password)) elif condition == 'not-allowed': # room is restricted, we need a password password_ui = xml_tools.XMLUI("form", title=D_('Room {} is restricted').format(room_jid.userhost()), submit_id='') password_ui.addText(D_("This room is restricted, please enter the password")) password_ui.addPassword('password') - d = xml_tools.deferXMLUI(self.host, password_ui, profile=client.profile) - d.addCallback(self._passwordUICb, client, room_jid, nick) + d = xml_tools.defer_xmlui(self.host, password_ui, profile=client.profile) + d.addCallback(self._password_ui_cb, client, room_jid, nick) return d msg_suffix = ' with condition "{}"'.format(failure_.value.condition) @@ -328,34 +328,34 @@ room = room_jid.userhost(), suffix = msg_suffix)) log.warning(mess) xmlui = xml_tools.note(mess, D_("Group chat error"), level=C.XMLUI_DATA_LVL_ERROR) - self.host.actionNew({'xmlui': xmlui.toXml()}, profile=client.profile) + self.host.action_new({'xmlui': xmlui.toXml()}, profile=client.profile) @staticmethod - def _getOccupants(room): + def _get_occupants(room): """Get occupants of a room in a form suitable for bridge""" return {u.nick: {k:str(getattr(u,k) or '') for k in OCCUPANT_KEYS} for u in list(room.roster.values())} - def _getRoomOccupants(self, room_jid_s, profile_key): - client = self.host.getClient(profile_key) + def _get_room_occupants(self, room_jid_s, profile_key): + client = self.host.get_client(profile_key) room_jid = jid.JID(room_jid_s) - return self.getRoomOccupants(client, room_jid) + return self.get_room_occupants(client, room_jid) - def getRoomOccupants(self, client, room_jid): - room = self.getRoom(client, room_jid) - return self._getOccupants(room) + def get_room_occupants(self, client, room_jid): + room = self.get_room(client, room_jid) + return self._get_occupants(room) - def _getRoomsJoined(self, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) - return self.getRoomsJoined(client) + def _get_rooms_joined(self, profile_key=C.PROF_KEY_NONE): + client 
= self.host.get_client(profile_key) + return self.get_rooms_joined(client) - def getRoomsJoined(self, client): + def get_rooms_joined(self, client): """Return rooms where user is""" result = [] for room in list(client._muc_client.joined_rooms.values()): if room.state == ROOM_STATE_LIVE: result.append( (room.roomJID.userhost(), - self._getOccupants(room), + self._get_occupants(room), room.nick, room.subject, [s.name for s in room.statuses], @@ -363,11 +363,11 @@ ) return result - def _getRoomNick(self, room_jid_s, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) - return self.getRoomNick(client, jid.JID(room_jid_s)) + def _get_room_nick(self, room_jid_s, profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) + return self.get_room_nick(client, jid.JID(room_jid_s)) - def getRoomNick(self, client, room_jid): + def get_room_nick(self, client, room_jid): """return nick used in room by user @param room_jid (jid.JID): JID of the room @@ -375,70 +375,70 @@ @return: nick or empty string in case of error @raise exceptions.Notfound: use has not joined the room """ - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) return client._muc_client.joined_rooms[room_jid].nick - def _configureRoom(self, room_jid_s, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) - d = self.configureRoom(client, jid.JID(room_jid_s)) + def _configure_room(self, room_jid_s, profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) + d = self.configure_room(client, jid.JID(room_jid_s)) d.addCallback(lambda xmlui: xmlui.toXml()) return d - def _configureRoomMenu(self, menu_data, profile): + def _configure_room_menu(self, menu_data, profile): """Return room configuration form @param menu_data: %(menu_data)s @param profile: %(doc_profile)s """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) try: room_jid = jid.JID(menu_data['room_jid']) except KeyError: log.error(_("room_jid key is not present !")) return defer.fail(exceptions.DataError) - def xmluiReceived(xmlui): + def xmlui_received(xmlui): if not xmlui: msg = D_("No configuration available for this room") return {"xmlui": xml_tools.note(msg).toXml()} return {"xmlui": xmlui.toXml()} - return self.configureRoom(client, room_jid).addCallback(xmluiReceived) + return self.configure_room(client, room_jid).addCallback(xmlui_received) - def configureRoom(self, client, room_jid): + def configure_room(self, client, room_jid): """return the room configuration form @param room: jid of the room to configure @return: configuration form as XMLUI """ - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) - def config2XMLUI(result): + def config_2_xmlui(result): if not result: return "" - session_id, session_data = self._sessions.newSession(profile=client.profile) + session_id, session_data = self._sessions.new_session(profile=client.profile) session_data["room_jid"] = room_jid - xmlui = xml_tools.dataForm2XMLUI(result, submit_id=self.__submit_conf_id) + xmlui = xml_tools.data_form_2_xmlui(result, submit_id=self.__submit_conf_id) xmlui.session_id = session_id return xmlui d = client._muc_client.getConfiguration(room_jid) - d.addCallback(config2XMLUI) + d.addCallback(config_2_xmlui) return d - def _submitConfiguration(self, raw_data, profile): + def _submit_configuration(self, raw_data, profile): cancelled = C.bool(raw_data.get("cancelled", C.BOOL_FALSE)) if cancelled: return defer.succeed({}) - client = 
self.host.getClient(profile) + client = self.host.get_client(profile) try: - session_data = self._sessions.profileGet(raw_data["session_id"], profile) + session_data = self._sessions.profile_get(raw_data["session_id"], profile) except KeyError: log.warning(D_("Session ID doesn't exist, session has probably expired.")) _dialog = xml_tools.XMLUI('popup', title=D_('Room configuration failed')) _dialog.addText(D_("Session ID doesn't exist, session has probably expired.")) return defer.succeed({'xmlui': _dialog.toXml()}) - data = xml_tools.XMLUIResult2DataFormResult(raw_data) + data = xml_tools.xmlui_result_2_data_form_result(raw_data) d = client._muc_client.configure(session_data['room_jid'], data) _dialog = xml_tools.XMLUI('popup', title=D_('Room configuration succeed')) _dialog.addText(D_("The new settings have been saved.")) @@ -446,18 +446,18 @@ del self._sessions[raw_data["session_id"]] return d - def isNickInRoom(self, client, room_jid, nick): + def is_nick_in_room(self, client, room_jid, nick): """Tell if a nick is currently present in a room""" - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) return client._muc_client.joined_rooms[room_jid].inRoster(muc.User(nick)) - def _getMUCService(self, jid_=None, profile=C.PROF_KEY_NONE): - client = self.host.getClient(profile) - d = defer.ensureDeferred(self.get_MUC_service(client, jid_ or None)) + def _get_muc_service(self, jid_=None, profile=C.PROF_KEY_NONE): + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.get_muc_service(client, jid_ or None)) d.addCallback(lambda service_jid: service_jid.full() if service_jid is not None else '') return d - async def get_MUC_service( + async def get_muc_service( self, client: SatXMPPEntity, jid_: Optional[jid.JID] = None) -> Optional[jid.JID]: @@ -474,7 +474,7 @@ else: # we have a cached value, we return it return muc_service - services = await self.host.findServiceEntities(client, "conference", "text", jid_) + services = await self.host.find_service_entities(client, "conference", "text", jid_) for service in services: if ".irc." not in service.userhost(): # FIXME: @@ -486,11 +486,11 @@ muc_service = None return muc_service - def _getUniqueName(self, muc_service="", profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) - return self.getUniqueName(client, muc_service or None).full() + def _get_unique_name(self, muc_service="", profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) + return self.get_unique_name(client, muc_service or None).full() - def getUniqueName(self, client, muc_service=None): + def get_unique_name(self, client, muc_service=None): """Return unique name for a room, avoiding collision @param muc_service (jid.JID) : leave empty string to use the default service @@ -510,24 +510,24 @@ muc_service = muc_service.userhost() return jid.JID("{}@{}".format(room_name, muc_service)) - def getDefaultMUC(self): + def get_default_muc(self): """Return the default MUC. 
@return: unicode """ - return self.host.memory.getConfig(CONFIG_SECTION, 'default_muc', default_conf['default_muc']) + return self.host.memory.config_get(CONFIG_SECTION, 'default_muc', default_conf['default_muc']) def _join_eb(self, failure_, client): failure_.trap(AlreadyJoined) room = failure_.value.room - return [True] + self._getRoomJoinedArgs(room, client.profile) + return [True] + self._get_room_joined_args(room, client.profile) def _join(self, room_jid_s, nick, options, profile_key=C.PROF_KEY_NONE): """join method used by bridge - @return (tuple): already_joined boolean + room joined arguments (see [_getRoomJoinedArgs]) + @return (tuple): already_joined boolean + room joined arguments (see [_get_room_joined_args]) """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) if room_jid_s: muc_service = client.muc_service try: @@ -539,10 +539,10 @@ if not room_jid.user: room_jid.user, room_jid.host = room_jid.host, muc_service else: - room_jid = self.getUniqueName(profile_key=client.profile) + room_jid = self.get_unique_name(profile_key=client.profile) # TODO: error management + signal in bridge d = self.join(client, room_jid, nick, options or None) - d.addCallback(lambda room: [False] + self._getRoomJoinedArgs(room, client.profile)) + d.addCallback(lambda room: [False] + self._get_room_joined_args(room, client.profile)) d.addErrback(self._join_eb, client) return d @@ -564,23 +564,23 @@ raise AlreadyJoined(room) log.info(_("[{profile}] is joining room {room} with nick {nick}").format( profile=client.profile, room=room_jid.userhost(), nick=nick)) - self.host.bridge.mucRoomPrepareJoin(room_jid.userhost(), client.profile) + self.host.bridge.muc_room_prepare_join(room_jid.userhost(), client.profile) password = options.get("password") try: room = await client._muc_client.join(room_jid, nick, password) except Exception as e: - room = await utils.asDeferred( - self._joinEb(failure.Failure(e), client, room_jid, nick, password) + room = await utils.as_deferred( + self._join_eb(failure.Failure(e), client, room_jid, nick, password) ) else: await defer.ensureDeferred( - self._joinCb(room, client, room_jid, nick) + self._join_cb(room, client, room_jid, nick) ) return room - def popRooms(self, client): + def pop_rooms(self, client): """Remove rooms and return data needed to re-join them This methods is to be called before a hot reconnection @@ -594,31 +594,31 @@ return args_list def _nick(self, room_jid_s, nick, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return self.nick(client, jid.JID(room_jid_s), nick) def nick(self, client, room_jid, nick): """Change nickname in a room""" - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) return client._muc_client.nick(room_jid, nick) def _leave(self, room_jid, profile_key): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return self.leave(client, jid.JID(room_jid)) def leave(self, client, room_jid): - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) return client._muc_client.leave(room_jid) def _subject(self, room_jid_s, new_subject, profile_key): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return self.subject(client, jid.JID(room_jid_s), new_subject) def subject(self, client, room_jid, subject): - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) return 
client._muc_client.subject(room_jid, subject) - def getHandler(self, client): + def get_handler(self, client): # create a MUC client and associate it with profile' session muc_client = client._muc_client = LiberviaMUCClient(self) return muc_client @@ -632,7 +632,7 @@ """ if options is None: options = {} - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) return client._muc_client.kick(room_jid, nick, reason=options.get('reason', None)) def ban(self, client, entity_jid, room_jid, options=None): @@ -642,7 +642,7 @@ @param room_jid (JID): jid of the room @param options: attribute with extra info (reason, password) as in #XEP-0045 """ - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) if options is None: options = {} assert not entity_jid.resource @@ -656,7 +656,7 @@ @param room_jid_s (JID): jid of the room @param options: attribute with extra info (reason, nick) as in #XEP-0045 """ - self.checkRoomJoined(client, room_jid) + self.check_room_joined(client, room_jid) assert not entity_jid.resource assert not room_jid.resource assert 'affiliation' in options @@ -686,17 +686,17 @@ """ room_raw = mess_data["unparsed"].strip() if room_raw: - if self.isJoinedRoom(client, mess_data["to"]): + if self.is_joined_room(client, mess_data["to"]): # we use the same service as the one from the room where the command has # been entered if full jid is not entered muc_service = mess_data["to"].host - nick = self.getRoomNick(client, mess_data["to"]) or client.jid.user + nick = self.get_room_nick(client, mess_data["to"]) or client.jid.user else: # the command has been entered in a one2one conversation, so we use # our server MUC service as default service muc_service = client.muc_service or "" nick = client.jid.user - room_jid = self.text_cmds.getRoomJID(room_raw, muc_service) + room_jid = self.text_cmds.get_room_jid(room_raw, muc_service) self.join(client, room_jid, nick, {}) return False @@ -709,7 +709,7 @@ """ room_raw = mess_data["unparsed"].strip() if room_raw: - room = self.text_cmds.getRoomJID(room_raw, mess_data["to"].host) + room = self.text_cmds.get_room_jid(room_raw, mess_data["to"].host) else: room = mess_data["to"] @@ -734,10 +734,10 @@ options = mess_data["unparsed"].strip().split() try: nick = options[0] - assert self.isNickInRoom(client, mess_data["to"], nick) + assert self.is_nick_in_room(client, mess_data["to"], nick) except (IndexError, AssertionError): feedback = _("You must provide a member's nick to kick.") - self.text_cmds.feedBack(client, feedback, mess_data) + self.text_cmds.feed_back(client, feedback, mess_data) return False reason = ' '.join(options[1:]) if len(options) > 1 else None @@ -750,7 +750,7 @@ feedback_msg += _(' for the following reason: {reason}').format( reason=reason ) - self.text_cmds.feedBack(client, feedback_msg, mess_data) + self.text_cmds.feed_back(client, feedback_msg, mess_data) return True d.addCallback(cb) return d @@ -773,7 +773,7 @@ feedback = _( "You must provide a valid JID to ban, like in '/ban contact@example.net'" ) - self.text_cmds.feedBack(client, feedback, mess_data) + self.text_cmds.feed_back(client, feedback, mess_data) return False reason = ' '.join(options[1:]) if len(options) > 1 else None @@ -786,7 +786,7 @@ feedback_msg += _(' for the following reason: {reason}').format( reason=reason ) - self.text_cmds.feedBack(client, feedback_msg, mess_data) + self.text_cmds.feed_back(client, feedback_msg, mess_data) return True d.addCallback(cb) return d @@ -810,13 +810,13 @@ 
assert(entity_jid.host) except (RuntimeError, jid.InvalidFormat, AttributeError, IndexError, AssertionError): feedback = _("You must provide a valid JID to affiliate, like in '/affiliate contact@example.net member'") - self.text_cmds.feedBack(client, feedback, mess_data) + self.text_cmds.feed_back(client, feedback, mess_data) return False affiliation = options[1] if len(options) > 1 else 'none' if affiliation not in AFFILIATIONS: feedback = _("You must provide a valid affiliation: %s") % ' '.join(AFFILIATIONS) - self.text_cmds.feedBack(client, feedback, mess_data) + self.text_cmds.feed_back(client, feedback, mess_data) return False d = self.affiliate(client, entity_jid, mess_data["to"], {'affiliation': affiliation}) @@ -824,7 +824,7 @@ def cb(__): feedback_msg = _('New affiliation for {entity}: {affiliation}').format( entity=entity_jid, affiliation=affiliation) - self.text_cmds.feedBack(client, feedback_msg, mess_data) + self.text_cmds.feed_back(client, feedback_msg, mess_data) return True d.addCallback(cb) return d @@ -871,14 +871,14 @@ else: msg = D_("No known default MUC service {unparsed}").format( unparsed=unparsed) - self.text_cmds.feedBack(client, msg, mess_data) + self.text_cmds.feed_back(client, msg, mess_data) return False except jid.InvalidFormat: msg = D_("{} is not a valid JID!".format(unparsed)) - self.text_cmds.feedBack(client, msg, mess_data) + self.text_cmds.feed_back(client, msg, mess_data) return False d = self.host.getDiscoItems(client, service) - d.addCallback(self._showListUI, client, service) + d.addCallback(self._show_list_ui, client, service) return False @@ -904,17 +904,17 @@ if user.show: whois_msg.append(_("Show: %s") % user.show) - def presenceTrigger(self, presence_elt, client): + def presence_trigger(self, presence_elt, client): # FIXME: should we add a privacy parameters in settings to activate before # broadcasting the presence to all MUC rooms ? 
muc_client = client._muc_client for room_jid, room in muc_client.joined_rooms.items(): - elt = xml_tools.elementCopy(presence_elt) + elt = xml_tools.element_copy(presence_elt) elt['to'] = room_jid.userhost() + '/' + room.nick client.presence.send(elt) return True - def presenceReceivedTrigger(self, client, entity, show, priority, statuses): + def presence_received_trigger(self, client, entity, show, priority, statuses): entity_bare = entity.userhostJID() muc_client = client._muc_client if entity_bare in muc_client.joined_rooms: @@ -953,7 +953,7 @@ def _si(self): return self.plugin_parent._si - def changeRoomState(self, room, new_state): + def change_room_state(self, room, new_state): """Check that room is in expected state, and change it @param new_state: one of ROOM_STATE_* @@ -995,7 +995,7 @@ password: Optional[str] ) -> muc.Room: """Join room an retrieve history with legacy method""" - mess_data_list = await self.host.memory.historyGet( + mess_data_list = await self.host.memory.history_get( room_jid, client.jid.userhostJID(), limit=1, @@ -1018,7 +1018,7 @@ room._history_d.callback(None) return room - async def _get_MAM_history( + async def _get_mam_history( self, client: SatXMPPEntity, room: muc.Room, @@ -1030,7 +1030,7 @@ # and in order history_d.callback(None) - last_mess = await self.host.memory.historyGet( + last_mess = await self.host.memory.history_get( room_jid, None, limit=1, @@ -1056,7 +1056,7 @@ count = 0 while not complete: try: - mam_data = await self._mam.getArchives(client, mam_req, + mam_data = await self._mam.get_archives(client, mam_req, service=room_jid) except xmpp_error.StanzaError as e: if last_mess and e.condition == 'item-not-found': @@ -1083,7 +1083,7 @@ for mess_elt in elt_list: try: - fwd_message_elt = self._mam.getMessageFromResult( + fwd_message_elt = self._mam.get_message_from_result( client, mess_elt, mam_req, service=room_jid) except exceptions.DataError: continue @@ -1093,7 +1093,7 @@ 'forbidden by specifications') fwd_message_elt["to"] = client.jid.full() try: - mess_data = client.messageProt.parseMessage(fwd_message_elt) + mess_data = client.messageProt.parse_message(fwd_message_elt) except Exception as e: log.error( f"Can't parse message, ignoring it: {e}\n" @@ -1101,7 +1101,7 @@ ) continue # we attache parsed message data to element, to avoid parsing - # again in _addToHistory + # again in _add_to_history fwd_message_elt._mess_data = mess_data # and we inject to MUC workflow client._muc_client._onGroupChat(fwd_message_elt) @@ -1118,14 +1118,14 @@ # for legacy history, the following steps are done in receivedSubject but for MAM # the order is different (we have to join then get MAM archive, so subject # is received before archive), so we change state and add the callbacks here. 
- self.changeRoomState(room, ROOM_STATE_LIVE) - history_d.addCallbacks(self._historyCb, self._historyEb, [room], + self.change_room_state(room, ROOM_STATE_LIVE) + history_d.addCallbacks(self._history_cb, self._history_eb, [room], errbackArgs=[room]) # we wait for all callbacks to be processed await history_d - async def _join_MAM( + async def _join_mam( self, client: SatXMPPEntity, room_jid: jid.JID, @@ -1140,7 +1140,7 @@ room._history_type = HISTORY_MAM # MAM history retrieval can be very long, and doesn't need to be sync, so we don't # wait for it - defer.ensureDeferred(self._get_MAM_history(client, room, room_jid)) + defer.ensureDeferred(self._get_mam_history(client, room, room_jid)) room.fully_joined.callback(room) return room @@ -1151,7 +1151,7 @@ if not self._mam or not has_mam: return await self._join_legacy(self.client, room_jid, nick, password) else: - return await self._join_MAM(self.client, room_jid, nick, password) + return await self._join_mam(self.client, room_jid, nick, password) ## presence/roster ## @@ -1213,7 +1213,7 @@ if muc.STATUS_CODE.NEW_NICK in presence.mucStatuses: self._changing_nicks.add(presence.nick) - self.userChangedNick(room, user, presence.nick) + self.user_changed_nick(room, user, presence.nick) else: self._changing_nicks.discard(presence.nick) self.userLeftRoom(room, user) @@ -1222,12 +1222,12 @@ if user.nick == room.nick: # we have received our own nick, # this mean that the full room roster was received - self.changeRoomState(room, ROOM_STATE_SELF_PRESENCE) + self.change_room_state(room, ROOM_STATE_SELF_PRESENCE) log.debug("room {room} joined with nick {nick}".format( room=room.occupantJID.userhost(), nick=user.nick)) # we set type so we don't have to use a deferred # with disco to check entity type - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( self.client, room.roomJID, C.ENTITY_TYPE, C.ENTITY_TYPE_MUC ) elif room.state not in (ROOM_STATE_OCCUPANTS, ROOM_STATE_LIVE): @@ -1272,8 +1272,8 @@ # FIXME: we disable presence in history as it's taking a lot of space # while not indispensable. In the future an option may allow # to re-enable it - # self.client.messageAddToHistory(mess_data) - self.client.messageSendToBridge(mess_data) + # self.client.message_add_to_history(mess_data) + self.client.message_send_to_bridge(mess_data) def userLeftRoom(self, room, user): @@ -1284,8 +1284,8 @@ room_jid_s = room.roomJID.userhost() log.info(_("Room ({room}) left ({profile})").format( room = room_jid_s, profile = self.client.profile)) - self.host.memory.delEntityCache(room.roomJID, profile_key=self.client.profile) - self.host.bridge.mucRoomLeft(room.roomJID.userhost(), self.client.profile) + self.host.memory.del_entity_cache(room.roomJID, profile_key=self.client.profile) + self.host.bridge.muc_room_left(room.roomJID.userhost(), self.client.profile) elif room.state != ROOM_STATE_LIVE: log.warning("Received user presence data in a room before its initialisation (current state: {state})," "this is not standard! 
Ignoring it: {room} ({nick})".format( @@ -1315,11 +1315,11 @@ "timestamp": time.time(), } # FIXME: disable history, see userJoinRoom comment - # self.client.messageAddToHistory(mess_data) - self.client.messageSendToBridge(mess_data) + # self.client.message_add_to_history(mess_data) + self.client.message_send_to_bridge(mess_data) - def userChangedNick(self, room, user, new_nick): - self.host.bridge.mucRoomUserChangedNick(room.roomJID.userhost(), user.nick, new_nick, self.client.profile) + def user_changed_nick(self, room, user, new_nick): + self.host.bridge.muc_room_user_changed_nick(room.roomJID.userhost(), user.nick, new_nick, self.client.profile) def userUpdatedStatus(self, room, user, show, status): entity = jid.JID(tuple=(room.roomJID.user, room.roomJID.host, user.nick)) @@ -1337,7 +1337,7 @@ } return statuses = {C.PRESENCE_STATUSES_DEFAULT: status or ''} - self.host.bridge.presenceUpdate( + self.host.bridge.presence_update( entity.full(), show or '', 0, statuses, self.client.profile) ## messages ## @@ -1345,21 +1345,21 @@ def receivedGroupChat(self, room, user, body): log.debug('receivedGroupChat: room=%s user=%s body=%s' % (room.roomJID.full(), user, body)) - def _addToHistory(self, __, user, message): + def _add_to_history(self, __, user, message): try: # message can be already parsed (with MAM), in this case mess_data # it attached to the element mess_data = message.element._mess_data except AttributeError: - mess_data = self.client.messageProt.parseMessage(message.element) + mess_data = self.client.messageProt.parse_message(message.element) if mess_data['message'] or mess_data['subject']: return defer.ensureDeferred( - self.host.memory.addToHistory(self.client, mess_data) + self.host.memory.add_to_history(self.client, mess_data) ) else: return defer.succeed(None) - def _addToHistoryEb(self, failure): + def _add_to_history_eb(self, failure): failure.trap(exceptions.CancelError) def receivedHistory(self, room, user, message): @@ -1386,17 +1386,17 @@ for c in message.element.elements(): if c.uri is None: c.uri = C.NS_CLIENT - mess_data = self.client.messageProt.parseMessage(message.element) + mess_data = self.client.messageProt.parse_message(message.element) message.element._mess_data = mess_data - self._addToHistory(None, user, message) + self._add_to_history(None, user, message) if mess_data['message'] or mess_data['subject']: - self.host.bridge.messageNew( - *self.client.messageGetBridgeArgs(mess_data), + self.host.bridge.message_new( + *self.client.message_get_bridge_args(mess_data), profile=self.client.profile ) return - room._history_d.addCallback(self._addToHistory, user, message) - room._history_d.addErrback(self._addToHistoryEb) + room._history_d.addCallback(self._add_to_history, user, message) + room._history_d.addErrback(self._add_to_history_eb) ## subject ## @@ -1426,7 +1426,7 @@ def subject(self, room, subject): return muc.MUCClientProtocol.subject(self, room, subject) - def _historyCb(self, __, room): + def _history_cb(self, __, room): """Called when history have been written to database and subject is received this method will finish joining by: @@ -1435,8 +1435,8 @@ - sending stanza put in cache - cleaning variables not needed anymore """ - args = self.plugin_parent._getRoomJoinedArgs(room, self.client.profile) - self.host.bridge.mucRoomJoined(*args) + args = self.plugin_parent._get_room_joined_args(room, self.client.profile) + self.host.bridge.muc_room_joined(*args) if room._history_type == HISTORY_LEGACY: room.fully_joined.callback(room) del room._history_d @@ 
-1449,15 +1449,15 @@ self.client.xmlstream.dispatch(elem) for presence_data in cache_presence.values(): if not presence_data['show'] and not presence_data['status']: - # occupants are already sent in mucRoomJoined, so if we don't have + # occupants are already sent in muc_room_joined, so if we don't have # extra information like show or statuses, we can discard the signal continue else: self.userUpdatedStatus(**presence_data) - def _historyEb(self, failure_, room): + def _history_eb(self, failure_, room): log.error("Error while managing history: {}".format(failure_)) - self._historyCb(None, room) + self._history_cb(None, room) def receivedSubject(self, room, user, subject): # when subject is received, we know that we have whole roster and history @@ -1465,12 +1465,12 @@ room.subject = subject # FIXME: subject doesn't handle xml:lang if room.state != ROOM_STATE_LIVE: if room._history_type == HISTORY_LEGACY: - self.changeRoomState(room, ROOM_STATE_LIVE) - room._history_d.addCallbacks(self._historyCb, self._historyEb, [room], errbackArgs=[room]) + self.change_room_state(room, ROOM_STATE_LIVE) + room._history_d.addCallbacks(self._history_cb, self._history_eb, [room], errbackArgs=[room]) else: # the subject has been changed log.debug(_("New subject for room ({room_id}): {subject}").format(room_id = room.roomJID.full(), subject = subject)) - self.host.bridge.mucRoomNewSubject(room.roomJID.userhost(), subject, self.client.profile) + self.host.bridge.muc_room_new_subject(room.roomJID.userhost(), subject, self.client.profile) ## disco ##
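
The MUC plugin hunks above are typical of the whole changeset: methods, callbacks and bridge names are converted mechanically from camelCase to snake_case (``getRoomsJoined`` → ``get_rooms_joined``, ``mucRoomUserChangedNick`` → ``muc_room_user_changed_nick``), while a few accessors are also reordered by hand (``getConfig`` → ``config_get``, ``setPresence`` → ``presence_set``). The mechanical part of that rule can be sketched as a standalone helper; the function name and the asserts below are illustrative and not code from the repository::

    import re

    def to_snake_case(name: str) -> str:
        """Convert a camelCase identifier to snake_case (illustrative helper)."""
        # underscore between a lowercase letter/digit and the next uppercase letter
        name = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name)
        # split acronym runs followed by a capitalised word, e.g. "MUCService"
        name = re.sub(r"(?<=[A-Z])(?=[A-Z][a-z])", "_", name)
        return name.lower()

    assert to_snake_case("getRoomsJoined") == "get_rooms_joined"
    assert to_snake_case("mucRoomUserChangedNick") == "muc_room_user_changed_nick"
    assert to_snake_case("getMUCService") == "get_muc_service"
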
--- a/sat/plugins/plugin_xep_0047.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0047.py Sat Apr 08 13:54:42 2023 +0200 @@ -71,13 +71,13 @@ log.info(_("In-Band Bytestreams plugin initialization")) self.host = host - def getHandler(self, client): + def get_handler(self, client): return XEP_0047_handler(self) - def profileConnected(self, client): + def profile_connected(self, client): client.xep_0047_current_stream = {} # key: stream_id, value: data(dict) - def _timeOut(self, sid, client): + def _time_out(self, sid, client): """Delete current_stream id, called after timeout @param sid(unicode): session id of client.xep_0047_current_stream @@ -88,9 +88,9 @@ sid=sid, profile=client.profile ) ) - self._killSession(sid, client, "TIMEOUT") + self._kill_session(sid, client, "TIMEOUT") - def _killSession(self, sid, client, failure_reason=None): + def _kill_session(self, sid, client, failure_reason=None): """Delete a current_stream id, clean up associated observers @param sid(unicode): session id @@ -124,18 +124,18 @@ else: stream_d.errback(failure.Failure(exceptions.DataError(failure_reason))) - def createSession(self, *args, **kwargs): - """like [_createSession] but return the session deferred instead of the whole session + def create_session(self, *args, **kwargs): + """like [_create_session] but return the session deferred instead of the whole session session deferred is fired when transfer is finished """ - return self._createSession(*args, **kwargs)[DEFER_KEY] + return self._create_session(*args, **kwargs)[DEFER_KEY] - def _createSession(self, client, stream_object, local_jid, to_jid, sid): + def _create_session(self, client, stream_object, local_jid, to_jid, sid): """Called when a bytestream is imminent @param stream_object(IConsumer): stream object where data will be written - @param local_jid(jid.JID): same as [startStream] + @param local_jid(jid.JID): same as [start_stream] @param to_jid(jid.JId): jid of the other peer @param sid(unicode): session id @return (dict): session data @@ -149,12 +149,12 @@ "to": to_jid, "stream_object": stream_object, "seq": -1, - "timer": reactor.callLater(TIMEOUT, self._timeOut, sid, client), + "timer": reactor.callLater(TIMEOUT, self._time_out, sid, client), } return session_data - def _onIBBOpen(self, iq_elt, client): + def _on_ibb_open(self, iq_elt, client): """"Called when an IBB <open> element is received @param iq_elt(domish.Element): the whole <iq> stanza @@ -186,18 +186,18 @@ session_data["event_data"] = event_data = ( IBB_MESSAGE_DATA if stanza == "message" else IBB_IQ_DATA ).format(sid) - session_data["observer_cb"] = observer_cb = self._onIBBData + session_data["observer_cb"] = observer_cb = self._on_ibb_data event_close = IBB_CLOSE.format(sid) # we now set the stream observer to look after data packet # FIXME: if we never get the events, the observers stay. 
# would be better to have generic observer and check id once triggered client.xmlstream.addObserver(event_data, observer_cb, client=client) - client.xmlstream.addOnetimeObserver(event_close, self._onIBBClose, client=client) + client.xmlstream.addOnetimeObserver(event_close, self._on_ibb_close, client=client) # finally, we send the accept stanza iq_result_elt = xmlstream.toResponse(iq_elt, "result") client.send(iq_result_elt) - def _onIBBClose(self, iq_elt, client): + def _on_ibb_close(self, iq_elt, client): """"Called when an IBB <close> element is received @param iq_elt(domish.Element): the whole <iq> stanza @@ -210,9 +210,9 @@ iq_result_elt = xmlstream.toResponse(iq_elt, "result") client.send(iq_result_elt) - self._killSession(sid, client) + self._kill_session(sid, client) - def _onIBBData(self, element, client): + def _on_ibb_data(self, element, client): """Observer called on <iq> or <message> stanzas with data element Manage the data elelement (check validity and write to the stream_object) @@ -247,7 +247,7 @@ if element.name == "iq": reason = "not-acceptable" self._sendError(reason, sid, element, client) - self.terminateStream(session_data, client, reason) + self.terminate_stream(session_data, client, reason) return # we reset the timeout: @@ -261,7 +261,7 @@ log.warning(_("Invalid base64 data")) if element.name == "iq": self._sendError("not-acceptable", sid, element, client) - self.terminateStream(session_data, client, reason) + self.terminate_stream(session_data, client, reason) return # we can now ack success @@ -284,10 +284,10 @@ ) ) if sid is not None: - self._killSession(sid, client, error_condition) + self._kill_session(sid, client, error_condition) client.send(iq_elt) - def startStream(self, client, stream_object, local_jid, to_jid, sid, block_size=None): + def start_stream(self, client, stream_object, local_jid, to_jid, sid, block_size=None): """Launch the stream workflow @param stream_object(ifaces.IStreamProducer): stream object to send @@ -299,7 +299,7 @@ @param sid(unicode): Stream session id @param block_size(int, None): size of the block (or None for default) """ - session_data = self._createSession(client, stream_object, local_jid, to_jid, sid) + session_data = self._create_session(client, stream_object, local_jid, to_jid, sid) if block_size is None: block_size = XEP_0047.BLOCK_SIZE @@ -315,10 +315,10 @@ open_elt["stanza"] = "iq" # TODO: manage <message> stanza ? 
args = [session_data, client] d = iq_elt.send() - d.addCallbacks(self._IQDataStreamCb, self._IQDataStreamEb, args, None, args) + d.addCallbacks(self._iq_data_stream_cb, self._iq_data_stream_eb, args, None, args) return session_data[DEFER_KEY] - def _IQDataStreamCb(self, iq_elt, session_data, client): + def _iq_data_stream_cb(self, iq_elt, session_data, client): """Called during the whole data streaming @param iq_elt(domish.Element): iq result @@ -340,18 +340,18 @@ data_elt.addContent(base64.b64encode(buffer_).decode()) args = [session_data, client] d = next_iq_elt.send() - d.addCallbacks(self._IQDataStreamCb, self._IQDataStreamEb, args, None, args) + d.addCallbacks(self._iq_data_stream_cb, self._iq_data_stream_eb, args, None, args) else: - self.terminateStream(session_data, client) + self.terminate_stream(session_data, client) - def _IQDataStreamEb(self, failure, session_data, client): + def _iq_data_stream_eb(self, failure, session_data, client): if failure.check(error.StanzaError): log.warning("IBB transfer failed: {}".format(failure.value)) else: log.error("IBB transfer failed: {}".format(failure.value)) - self.terminateStream(session_data, client, "IQ_ERROR") + self.terminate_stream(session_data, client, "IQ_ERROR") - def terminateStream(self, session_data, client, failure_reason=None): + def terminate_stream(self, session_data, client, failure_reason=None): """Terminate the stream session @param session_data(dict): data of this streaming session @@ -364,7 +364,7 @@ close_elt = iq_elt.addElement((NS_IBB, "close")) close_elt["sid"] = session_data["id"] iq_elt.send() - self._killSession(session_data["id"], client, failure_reason) + self._kill_session(session_data["id"], client, failure_reason) @implementer(iwokkel.IDisco) @@ -375,7 +375,7 @@ def connectionInitialized(self): self.xmlstream.addObserver( - IBB_OPEN, self.plugin_parent._onIBBOpen, client=self.parent + IBB_OPEN, self.plugin_parent._on_ibb_open, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""):
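
The renamed ``start_stream`` / ``_iq_data_stream_cb`` pair above reads the stream object block by block, base64-encodes each block and ships it in an IBB ``<data/>`` element whose ``seq`` counter increases with every block. A small sketch of that wire format, built with ``domish`` only; the helper name, session id and payload are examples, and sending plus error handling stay in the plugin::

    import base64

    from twisted.words.xish import domish

    NS_IBB = "http://jabber.org/protocol/ibb"

    def build_ibb_data_elt(sid: str, seq: int, chunk: bytes) -> domish.Element:
        """Build the <data/> payload carried by one IBB block (sketch only)."""
        data_elt = domish.Element((NS_IBB, "data"))
        data_elt["sid"] = sid
        data_elt["seq"] = str(seq)  # 16-bit counter per XEP-0047, wraps after 65535
        data_elt.addContent(base64.b64encode(chunk).decode())
        return data_elt

    print(build_ibb_data_elt("ibb-example", 0, b"hello world").toXml())
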
--- a/sat/plugins/plugin_xep_0048.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0048.py Sat Apr 08 13:54:42 2023 +0200 @@ -57,39 +57,39 @@ def __init__(self, host): log.info(_("Bookmarks plugin initialization")) self.host = host - # self.__menu_id = host.registerCallback(self._bookmarksMenu, with_data=True) - self.__bm_save_id = host.registerCallback(self._bookmarksSaveCb, with_data=True) - host.importMenu( + # self.__menu_id = host.register_callback(self._bookmarks_menu, with_data=True) + self.__bm_save_id = host.register_callback(self._bookmarks_save_cb, with_data=True) + host.import_menu( (D_("Groups"), D_("Bookmarks")), - self._bookmarksMenu, + self._bookmarks_menu, security_limit=0, help_string=D_("Use and manage bookmarks"), ) - self.__selected_id = host.registerCallback( - self._bookmarkSelectedCb, with_data=True + self.__selected_id = host.register_callback( + self._bookmark_selected_cb, with_data=True ) - host.bridge.addMethod( - "bookmarksList", + host.bridge.add_method( + "bookmarks_list", ".plugin", in_sign="sss", out_sign="a{sa{sa{ss}}}", - method=self._bookmarksList, + method=self._bookmarks_list, async_=True, ) - host.bridge.addMethod( - "bookmarksRemove", + host.bridge.add_method( + "bookmarks_remove", ".plugin", in_sign="ssss", out_sign="", - method=self._bookmarksRemove, + method=self._bookmarks_remove, async_=True, ) - host.bridge.addMethod( - "bookmarksAdd", + host.bridge.add_method( + "bookmarks_add", ".plugin", in_sign="ssa{ss}ss", out_sign="", - method=self._bookmarksAdd, + method=self._bookmarks_add, async_=True, ) try: @@ -97,11 +97,11 @@ except KeyError: self.private_plg = None try: - self.host.plugins[C.TEXT_CMDS].registerTextCommands(self) + self.host.plugins[C.TEXT_CMDS].register_text_commands(self) except KeyError: log.info(_("Text commands not available")) - async def profileConnected(self, client): + async def profile_connected(self, client): local = client.bookmarks_local = PersistentBinaryDict( NS_BOOKMARKS, client.profile ) @@ -109,7 +109,7 @@ if not local: local[XEP_0048.MUC_TYPE] = dict() local[XEP_0048.URL_TYPE] = dict() - private = await self._getServerBookmarks("private", client.profile) + private = await self._get_server_bookmarks("private", client.profile) pubsub = client.bookmarks_pubsub = None for bookmarks in (local, private, pubsub): @@ -125,7 +125,7 @@ # slow down a lot the connection process, and result in a bad user experience. 
@defer.inlineCallbacks - def _getServerBookmarks(self, storage_type, profile): + def _get_server_bookmarks(self, storage_type, profile): """Get distants bookmarks update also the client.bookmarks_[type] key, with None if service is not available @@ -135,13 +135,13 @@ @param profile: %(doc_profile)s @return: data dictionary, or None if feature is not available """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) if storage_type == "private": try: - bookmarks_private_xml = yield self.private_plg.privateXMLGet( + bookmarks_private_xml = yield self.private_plg.private_xml_get( "storage", NS_BOOKMARKS, profile ) - data = client.bookmarks_private = self._bookmarkElt2Dict( + data = client.bookmarks_private = self._bookmark_elt_2_dict( bookmarks_private_xml ) except (StanzaError, AttributeError): @@ -154,7 +154,7 @@ defer.returnValue(data) @defer.inlineCallbacks - def _setServerBookmarks(self, storage_type, bookmarks_elt, profile): + def _set_server_bookmarks(self, storage_type, bookmarks_elt, profile): """Save bookmarks on server @param storage_type: storage type, can be: @@ -164,19 +164,19 @@ @param profile: %(doc_profile)s """ if storage_type == "private": - yield self.private_plg.privateXMLStore(bookmarks_elt, profile) + yield self.private_plg.private_xml_store(bookmarks_elt, profile) elif storage_type == "pubsub": raise NotImplementedError else: raise ValueError("storage_type must be 'private' or 'pubsub'") - def _bookmarkElt2Dict(self, storage_elt): + def _bookmark_elt_2_dict(self, storage_elt): """Parse bookmarks to get dictionary @param storage_elt (domish.Element): bookmarks storage @return (dict): bookmark data (key: bookmark type, value: list) where key can be: - XEP_0048.MUC_TYPE - XEP_0048.URL_TYPE - - value (dict): data as for addBookmark + - value (dict): data as for add_bookmark """ conf_data = {} url_data = {} @@ -218,12 +218,12 @@ return {XEP_0048.MUC_TYPE: conf_data, XEP_0048.URL_TYPE: url_data} - def _dict2BookmarkElt(self, type_, data): + def _dict_2_bookmark_elt(self, type_, data): """Construct a bookmark element from a data dict @param data (dict): bookmark data (key: bookmark type, value: list) where key can be: - XEP_0048.MUC_TYPE - XEP_0048.URL_TYPE - - value (dict): data as for addBookmark + - value (dict): data as for add_bookmark @return (domish.Element): bookmark element """ rooms_data = data.get(XEP_0048.MUC_TYPE, {}) @@ -253,7 +253,7 @@ return storage_elt - def _bookmarkSelectedCb(self, data, profile): + def _bookmark_selected_cb(self, data, profile): try: room_jid_s, nick = data["index"].split(" ", 1) room_jid = jid.JID(room_jid_s) @@ -261,7 +261,7 @@ log.warning(_("No room jid selected")) return {} - client = self.host.getClient(profile) + client = self.host.get_client(profile) d = self.host.plugins["XEP-0045"].join(client, room_jid, nick, {}) def join_eb(failure): @@ -272,14 +272,14 @@ d.addCallbacks(lambda __: {}, join_eb) return d - def _bookmarksMenu(self, data, profile): + def _bookmarks_menu(self, data, profile): """ XMLUI activated by menu: return Gateways UI @param profile: %(doc_profile)s """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) xmlui = xml_tools.XMLUI(title=_("Bookmarks manager")) - adv_list = xmlui.changeContainer( + adv_list = xmlui.change_container( "advanced_list", columns=3, selectable="single", @@ -297,7 +297,7 @@ key=lambda item: item[1].get("name", item[0].user), ): room_jid_s = room_jid.full() - adv_list.setRowIndex( + adv_list.set_row_index( "%s %s" % (room_jid_s, 
data.get("nick") or client.jid.user) ) xmlui.addText(data.get("name", "")) @@ -309,7 +309,7 @@ adv_list.end() xmlui.addDivider("dash") xmlui.addText(_("add a bookmark")) - xmlui.changeContainer("pairs") + xmlui.change_container("pairs") xmlui.addLabel(_("Name")) xmlui.addString("name") xmlui.addLabel(_("jid")) @@ -318,22 +318,22 @@ xmlui.addString("nick", client.jid.user) xmlui.addLabel(_("Autojoin")) xmlui.addBool("autojoin") - xmlui.changeContainer("vertical") + xmlui.change_container("vertical") xmlui.addButton(self.__bm_save_id, _("Save"), ("name", "jid", "nick", "autojoin")) return {"xmlui": xmlui.toXml()} - def _bookmarksSaveCb(self, data, profile): - bm_data = xml_tools.XMLUIResult2DataFormResult(data) + def _bookmarks_save_cb(self, data, profile): + bm_data = xml_tools.xmlui_result_2_data_form_result(data) try: location = jid.JID(bm_data.pop("jid")) except KeyError: raise exceptions.InternalError("Can't find mandatory key") - d = self.addBookmark(XEP_0048.MUC_TYPE, location, bm_data, profile_key=profile) + d = self.add_bookmark(XEP_0048.MUC_TYPE, location, bm_data, profile_key=profile) d.addCallback(lambda __: {}) return d @defer.inlineCallbacks - def addBookmark( + def add_bookmark( self, type_, location, data, storage_type="auto", profile_key=C.PROF_KEY_NONE ): """Store a new bookmark @@ -357,7 +357,7 @@ assert storage_type in ("auto", "pubsub", "private", "local") if type_ == XEP_0048.URL_TYPE and {"autojoin", "nick"}.intersection(list(data.keys())): raise ValueError("autojoin or nick can't be used with URLs") - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) if storage_type == "auto": if client.bookmarks_pubsub is not None: storage_type = "pubsub" @@ -372,13 +372,13 @@ client.bookmarks_local[type_][location] = data yield client.bookmarks_local.force(type_) else: - bookmarks = yield self._getServerBookmarks(storage_type, client.profile) + bookmarks = yield self._get_server_bookmarks(storage_type, client.profile) bookmarks[type_][location] = data - bookmark_elt = self._dict2BookmarkElt(type_, bookmarks) - yield self._setServerBookmarks(storage_type, bookmark_elt, client.profile) + bookmark_elt = self._dict_2_bookmark_elt(type_, bookmarks) + yield self._set_server_bookmarks(storage_type, bookmark_elt, client.profile) @defer.inlineCallbacks - def removeBookmark( + def remove_bookmark( self, type_, location, storage_type="all", profile_key=C.PROF_KEY_NONE ): """Remove a stored bookmark @@ -395,7 +395,7 @@ @param profile_key: %(doc_profile_key)s """ assert storage_type in ("all", "pubsub", "private", "local") - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) if storage_type in ("all", "local"): try: @@ -405,18 +405,18 @@ log.debug("Bookmark is not present in local storage") if storage_type in ("all", "private"): - bookmarks = yield self._getServerBookmarks("private", client.profile) + bookmarks = yield self._get_server_bookmarks("private", client.profile) try: del bookmarks[type_][location] - bookmark_elt = self._dict2BookmarkElt(type_, bookmarks) - yield self._setServerBookmarks("private", bookmark_elt, client.profile) + bookmark_elt = self._dict_2_bookmark_elt(type_, bookmarks) + yield self._set_server_bookmarks("private", bookmark_elt, client.profile) except KeyError: log.debug("Bookmark is not present in private storage") if storage_type == "pubsub": raise NotImplementedError - def _bookmarksList(self, type_, storage_location, profile_key=C.PROF_KEY_NONE): + def _bookmarks_list(self, type_, 
storage_location, profile_key=C.PROF_KEY_NONE): """Return stored bookmarks @param type_: bookmark type, one of: @@ -431,11 +431,11 @@ @param return (dict): (key: storage_location, value dict) with: - value (dict): (key: bookmark_location, value: bookmark data) """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) ret = {} ret_d = defer.succeed(ret) - def fillBookmarks(__, _storage_location): + def fill_bookmarks(__, _storage_location): bookmarks_ori = getattr(client, "bookmarks_" + _storage_location) if bookmarks_ori is None: return ret @@ -452,15 +452,15 @@ ret[_storage_location] = {} if _storage_location in ("private",): # we update distant bookmarks, just in case an other client added something - d = self._getServerBookmarks(_storage_location, client.profile) + d = self._get_server_bookmarks(_storage_location, client.profile) else: d = defer.succeed(None) - d.addCallback(fillBookmarks, _storage_location) + d.addCallback(fill_bookmarks, _storage_location) ret_d.addCallback(lambda __: d) return ret_d - def _bookmarksRemove( + def _bookmarks_remove( self, type_, location, storage_location, profile_key=C.PROF_KEY_NONE ): """Return stored bookmarks @@ -478,14 +478,14 @@ """ if type_ == XEP_0048.MUC_TYPE: location = jid.JID(location) - return self.removeBookmark(type_, location, storage_location, profile_key) + return self.remove_bookmark(type_, location, storage_location, profile_key) - def _bookmarksAdd( + def _bookmarks_add( self, type_, location, data, storage_type="auto", profile_key=C.PROF_KEY_NONE ): if type_ == XEP_0048.MUC_TYPE: location = jid.JID(location) - return self.addBookmark(type_, location, data, storage_type, profile_key) + return self.add_bookmark(type_, location, data, storage_type, profile_key) def cmd_bookmark(self, client, mess_data): """(Un)bookmark a MUC room @@ -498,14 +498,14 @@ options = mess_data["unparsed"].strip().split() if options and options[0] not in ("autojoin", "remove"): - txt_cmd.feedBack(client, _("Bad arguments"), mess_data) + txt_cmd.feed_back(client, _("Bad arguments"), mess_data) return False room_jid = mess_data["to"].userhostJID() if "remove" in options: - self.removeBookmark(XEP_0048.MUC_TYPE, room_jid, profile_key=client.profile) - txt_cmd.feedBack( + self.remove_bookmark(XEP_0048.MUC_TYPE, room_jid, profile_key=client.profile) + txt_cmd.feed_back( client, _("All [%s] bookmarks are being removed") % room_jid.full(), mess_data, @@ -517,7 +517,7 @@ "nick": client.jid.user, "autojoin": "true" if "autojoin" in options else "false", } - self.addBookmark(XEP_0048.MUC_TYPE, room_jid, data, profile_key=client.profile) - txt_cmd.feedBack(client, _("Bookmark added"), mess_data) + self.add_bookmark(XEP_0048.MUC_TYPE, room_jid, data, profile_key=client.profile) + txt_cmd.feed_back(client, _("Bookmark added"), mess_data) return False
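
With the bridge methods renamed to ``bookmarks_list`` / ``bookmarks_add`` / ``bookmarks_remove``, other plugins call the Python API through the new snake_case names as well. A hedged usage sketch, assuming the plugin is reachable under the "XEP-0048" key as other XEP plugins are in this diff; the room JID, nick and profile are placeholders::

    from twisted.words.protocols.jabber import jid

    def bookmark_room(host, profile):
        """Store a MUC bookmark through the renamed XEP-0048 API (sketch only)."""
        bookmarks_plg = host.plugins["XEP-0048"]
        room_jid = jid.JID("room@chat.example.org")  # placeholder room
        return bookmarks_plg.add_bookmark(
            bookmarks_plg.MUC_TYPE,      # bookmark type (MUC room, not URL)
            room_jid,                    # bookmark location
            {"name": "Example room", "nick": "louise", "autojoin": "true"},
            storage_type="auto",         # local, private XML or pubsub, chosen automatically
            profile_key=profile,
        )
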
--- a/sat/plugins/plugin_xep_0049.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0049.py Sat Apr 08 13:54:42 2023 +0200 @@ -45,21 +45,21 @@ log.info(_("Plugin XEP-0049 initialization")) self.host = host - def privateXMLStore(self, element, profile_key): + def private_xml_store(self, element, profile_key): """Store private data @param element: domish.Element to store (must have a namespace) @param profile_key: %(doc_profile_key)s """ assert isinstance(element, domish.Element) - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) # XXX: feature announcement in disco#info is not mandatory in XEP-0049, so we have to try to use private XML, and react according to the answer iq_elt = compat.IQ(client.xmlstream) query_elt = iq_elt.addElement("query", XEP_0049.NS_PRIVATE) query_elt.addChild(element) return iq_elt.send() - def privateXMLGet(self, node_name, namespace, profile_key): + def private_xml_get(self, node_name, namespace, profile_key): """Store private data @param node_name: name of the node to get @param namespace: namespace of the node to get @@ -67,16 +67,16 @@ @return (domish.Element): a deferred which fire the stored data """ - client = self.host.getClient(profile_key) - # XXX: see privateXMLStore note about feature checking + client = self.host.get_client(profile_key) + # XXX: see private_xml_store note about feature checking iq_elt = compat.IQ(client.xmlstream, "get") query_elt = iq_elt.addElement("query", XEP_0049.NS_PRIVATE) query_elt.addElement(node_name, namespace) - def getCb(answer_iq_elt): + def get_cb(answer_iq_elt): answer_query_elt = next(answer_iq_elt.elements(XEP_0049.NS_PRIVATE, "query")) return answer_query_elt.firstChildElement() d = iq_elt.send() - d.addCallback(getCb) + d.addCallback(get_cb) return d
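
``private_xml_store`` / ``private_xml_get`` keep their behaviour: the first pushes any namespaced ``domish`` element into private XML storage, the second fetches it back by node name and namespace. A round-trip sketch, assuming the plugin is registered as "XEP-0049"; the namespace and element below are invented for the example::

    from twisted.words.xish import domish

    EXAMPLE_NS = "urn:example:libervia:notes:0"  # invented namespace for the sketch

    def store_and_fetch_notes(host, profile):
        """Round-trip a private XML element through the renamed helpers (sketch only)."""
        xep_0049 = host.plugins["XEP-0049"]
        notes_elt = domish.Element((EXAMPLE_NS, "notes"))
        notes_elt.addElement("note", content="remember to rename everything")
        d = xep_0049.private_xml_store(notes_elt, profile)
        # once stored, fetch the element back by node name and namespace
        d.addCallback(lambda __: xep_0049.private_xml_get("notes", EXAMPLE_NS, profile))
        return d
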
--- a/sat/plugins/plugin_xep_0050.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0050.py Sat Apr 08 13:54:42 2023 +0200 @@ -102,18 +102,18 @@ def getName(self, xml_lang=None): return self.label - def isAuthorised(self, requestor): + def is_authorised(self, requestor): if "@ALL@" in self.allowed_magics: return True forbidden = set(self.forbidden_jids) for group in self.forbidden_groups: - forbidden.update(self.client.roster.getJidsFromGroup(group)) + forbidden.update(self.client.roster.get_jids_from_group(group)) if requestor.userhostJID() in forbidden: return False allowed = set(self.allowed_jids) for group in self.allowed_groups: try: - allowed.update(self.client.roster.getJidsFromGroup(group)) + allowed.update(self.client.roster.get_jids_from_group(group)) except exceptions.UnknownGroupError: log.warning(_("The groups [{group}] is unknown for profile [{profile}])") .format(group=group, profile=self.client.profile)) @@ -194,7 +194,7 @@ self.client.send(iq_elt) del self.sessions[session_id] - def _requestEb(self, failure_, request, session_id): + def _request_eb(self, failure_, request, session_id): if failure_.check(AdHocError): error_constant = failure_.value.callback_error else: @@ -203,8 +203,8 @@ self._sendError(error_constant, session_id, request) - def onRequest(self, command_elt, requestor, action, session_id): - if not self.isAuthorised(requestor): + def on_request(self, command_elt, requestor, action, session_id): + if not self.is_authorised(requestor): return self._sendError( XEP_0050.ERROR.FORBIDDEN, session_id, command_elt.parent ) @@ -220,12 +220,12 @@ XEP_0050.ERROR.FORBIDDEN, session_id, command_elt.parent ) else: - session_id, session_data = self.sessions.newSession() + session_id, session_data = self.sessions.new_session() session_data["requestor"] = requestor if action == XEP_0050.ACTION.CANCEL: d = defer.succeed((None, XEP_0050.STATUS.CANCELED, None, None)) else: - d = utils.asDeferred( + d = utils.as_deferred( self.callback, self.client, command_elt, @@ -234,7 +234,7 @@ self.node, ) d.addCallback(self._sendAnswer, session_id, command_elt.parent) - d.addErrback(self._requestEb, command_elt.parent, session_id) + d.addErrback(self._request_eb, command_elt.parent, session_id) class XEP_0050(object): @@ -276,49 +276,49 @@ log.info(_("plugin XEP-0050 initialization")) self.host = host self.requesting = Sessions() - host.bridge.addMethod( - "adHocRun", + host.bridge.add_method( + "ad_hoc_run", ".plugin", in_sign="sss", out_sign="s", method=self._run, async_=True, ) - host.bridge.addMethod( - "adHocList", + host.bridge.add_method( + "ad_hoc_list", ".plugin", in_sign="ss", out_sign="s", - method=self._listUI, + method=self._list_ui, async_=True, ) - host.bridge.addMethod( - "adHocSequence", + host.bridge.add_method( + "ad_hoc_sequence", ".plugin", in_sign="ssss", out_sign="s", method=self._sequence, async_=True, ) - self.__requesting_id = host.registerCallback( - self._requestingEntity, with_data=True + self.__requesting_id = host.register_callback( + self._requesting_entity, with_data=True ) - host.importMenu( + host.import_menu( (D_("Service"), D_("Commands")), - self._commandsMenu, + self._commands_menu, security_limit=2, help_string=D_("Execute ad-hoc commands"), ) - host.registerNamespace('commands', NS_COMMANDS) + host.register_namespace('commands', NS_COMMANDS) - def getHandler(self, client): + def get_handler(self, client): return XEP_0050_handler(self) - def profileConnected(self, client): + def profile_connected(self, client): # map from node to 
AdHocCommand instance client._XEP_0050_commands = {} if not client.is_component: - self.addAdHocCommand(client, self._statusCallback, _("Status")) + self.add_ad_hoc_command(client, self._status_callback, _("Status")) def do(self, client, entity, node, action=ACTION.EXECUTE, session_id=None, form_values=None, timeout=30): @@ -349,20 +349,20 @@ d = iq_elt.send() return d - def getCommandElt(self, iq_elt): + def get_command_elt(self, iq_elt): try: return next(iq_elt.elements(NS_COMMANDS, "command")) except StopIteration: raise exceptions.NotFound(_("Missing command element")) - def adHocError(self, error_type): + def ad_hoc_error(self, error_type): """Shortcut to raise an AdHocError @param error_type(unicode): one of XEP_0050.ERROR """ raise AdHocError(error_type) - def _items2XMLUI(self, items, no_instructions): + def _items_2_xmlui(self, items, no_instructions): """Convert discovery items to XMLUI dialog """ # TODO: manage items on different jids form_ui = xml_tools.XMLUI("form", submit_id=self.__requesting_id) @@ -374,7 +374,7 @@ form_ui.addList("node", options) return form_ui - def _getDataLvl(self, type_): + def _get_data_lvl(self, type_): """Return the constant corresponding to <note/> type attribute value @param type_: note type (see XEP-0050 §4.3) @@ -389,7 +389,7 @@ log.warning(_("Invalid note type [%s], using info") % type_) return C.XMLUI_DATA_LVL_INFO - def _mergeNotes(self, notes): + def _merge_notes(self, notes): """Merge notes with level prefix (e.g. "ERROR: the message") @param notes (list): list of tuple (level, message) @@ -402,8 +402,8 @@ } return ["%s%s" % (lvl_map[lvl], msg) for lvl, msg in notes] - def parseCommandAnswer(self, iq_elt): - command_elt = self.getCommandElt(iq_elt) + def parse_command_answer(self, iq_elt): + command_elt = self.get_command_elt(iq_elt) data = {} data["status"] = command_elt.getAttribute("status", XEP_0050.STATUS.EXECUTING) data["session_id"] = command_elt.getAttribute("sessionid") @@ -411,7 +411,7 @@ for note_elt in command_elt.elements(NS_COMMANDS, "note"): notes.append( ( - self._getDataLvl(note_elt.getAttribute("type", "info")), + self._get_data_lvl(note_elt.getAttribute("type", "info")), str(note_elt), ) ) @@ -419,14 +419,14 @@ return command_elt, data - def _commandsAnswer2XMLUI(self, iq_elt, session_id, session_data): + def _commands_answer_2_xmlui(self, iq_elt, session_id, session_data): """Convert command answer to an ui for frontend @param iq_elt: command result @param session_id: id of the session used with the frontend @param profile_key: %(doc_profile_key)s """ - command_elt, answer_data = self.parseCommandAnswer(iq_elt) + command_elt, answer_data = self.parse_command_answer(iq_elt) status = answer_data["status"] if status in [XEP_0050.STATUS.COMPLETED, XEP_0050.STATUS.CANCELED]: # the command session is finished, we purge our session @@ -468,68 +468,68 @@ C.XMLUI_DIALOG, dialog_opt={ C.XMLUI_DATA_TYPE: C.XMLUI_DIALOG_NOTE, - C.XMLUI_DATA_MESS: "\n".join(self._mergeNotes(notes)), + C.XMLUI_DATA_MESS: "\n".join(self._merge_notes(notes)), C.XMLUI_DATA_LVL: dlg_level, }, session_id=session_id, ) if session_id is None: - xmlui = xml_tools.dataFormEltResult2XMLUI(data_elt) + xmlui = xml_tools.data_form_elt_result_2_xmlui(data_elt) if notes: for level, note in notes: if level != "info": note = f"[{level}] {note}" - xmlui.addWidget("text", note) + xmlui.add_widget("text", note) return xmlui form = data_form.Form.fromElement(data_elt) # we add any present note to the instructions - form.instructions.extend(self._mergeNotes(notes)) - return 
xml_tools.dataForm2XMLUI(form, self.__requesting_id, session_id=session_id) + form.instructions.extend(self._merge_notes(notes)) + return xml_tools.data_form_2_xmlui(form, self.__requesting_id, session_id=session_id) - def _requestingEntity(self, data, profile): + def _requesting_entity(self, data, profile): def serialise(ret_data): if "xmlui" in ret_data: ret_data["xmlui"] = ret_data["xmlui"].toXml() return ret_data - d = self.requestingEntity(data, profile) + d = self.requesting_entity(data, profile) d.addCallback(serialise) return d - def requestingEntity(self, data, profile): + def requesting_entity(self, data, profile): """Request and entity and create XMLUI accordingly. @param data: data returned by previous XMLUI (first one must come from - self._commandsMenu) + self._commands_menu) @param profile: %(doc_profile)s @return: callback dict result (with "xmlui" corresponding to the answering dialog, or empty if it's finished without error) """ if C.bool(data.get("cancelled", C.BOOL_FALSE)): return defer.succeed({}) - data_form_values = xml_tools.XMLUIResult2DataFormResult(data) - client = self.host.getClient(profile) + data_form_values = xml_tools.xmlui_result_2_data_form_result(data) + client = self.host.get_client(profile) # TODO: cancel, prev and next are not managed # TODO: managed answerer errors # TODO: manage nodes with a non data form payload if "session_id" not in data: # we just had the jid, we now request it for the available commands - session_id, session_data = self.requesting.newSession(profile=client.profile) + session_id, session_data = self.requesting.new_session(profile=client.profile) entity = jid.JID(data[xml_tools.SAT_FORM_PREFIX + "jid"]) session_data["jid"] = entity - d = self.listUI(client, entity) + d = self.list_ui(client, entity) - def sendItems(xmlui): + def send_items(xmlui): xmlui.session_id = session_id # we need to keep track of the session return {"xmlui": xmlui} - d.addCallback(sendItems) + d.addCallback(send_items) else: # we have started a several forms sessions try: - session_data = self.requesting.profileGet( + session_data = self.requesting.profile_get( data["session_id"], client.profile ) except KeyError: @@ -552,24 +552,24 @@ # we request execute node's command d = self.do(client, entity, session_data["node"], action=XEP_0050.ACTION.EXECUTE, session_id=remote_id, form_values=data_form_values) - d.addCallback(self._commandsAnswer2XMLUI, session_id, session_data) + d.addCallback(self._commands_answer_2_xmlui, session_id, session_data) d.addCallback(lambda xmlui: {"xmlui": xmlui} if xmlui is not None else {}) return d - def _commandsMenu(self, menu_data, profile): + def _commands_menu(self, menu_data, profile): """First XMLUI activated by menu: ask for target jid @param profile: %(doc_profile)s """ form_ui = xml_tools.XMLUI("form", submit_id=self.__requesting_id) form_ui.addText(_("Please enter target jid"), "instructions") - form_ui.changeContainer("pairs") + form_ui.change_container("pairs") form_ui.addLabel("jid") - form_ui.addString("jid", value=self.host.getClient(profile).jid.host) + form_ui.addString("jid", value=self.host.get_client(profile).jid.host) return {"xmlui": form_ui.toXml()} - def _statusCallback(self, client, command_elt, session_data, action, node): + def _status_callback(self, client, command_elt, session_data, action, node): """Ad-hoc command used to change the "show" part of status""" actions = session_data.setdefault("actions", []) actions.append(action) @@ -596,25 +596,25 @@ answer_form = data_form.Form.fromElement(x_elt) 
show = answer_form["show"] except (KeyError, StopIteration): - self.adHocError(XEP_0050.ERROR.BAD_PAYLOAD) + self.ad_hoc_error(XEP_0050.ERROR.BAD_PAYLOAD) if show not in SHOWS: - self.adHocError(XEP_0050.ERROR.BAD_PAYLOAD) + self.ad_hoc_error(XEP_0050.ERROR.BAD_PAYLOAD) if show == "disconnect": self.host.disconnect(client.profile) else: - self.host.setPresence(show=show, profile_key=client.profile) + self.host.presence_set(show=show, profile_key=client.profile) # job done, we can end the session status = XEP_0050.STATUS.COMPLETED payload = None note = (self.NOTE.INFO, _("Status updated")) else: - self.adHocError(XEP_0050.ERROR.INTERNAL) + self.ad_hoc_error(XEP_0050.ERROR.INTERNAL) return (payload, status, None, note) def _run(self, service_jid_s="", node="", profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service_jid = jid.JID(service_jid_s) if service_jid_s else None d = defer.ensureDeferred(self.run(client, service_jid, node or None)) d.addCallback(lambda xmlui: xmlui.toXml()) @@ -631,13 +631,13 @@ """ if service_jid is None: service_jid = jid.JID(client.jid.host) - session_id, session_data = self.requesting.newSession(profile=client.profile) + session_id, session_data = self.requesting.new_session(profile=client.profile) session_data["jid"] = service_jid if node is None: - xmlui = await self.listUI(client, service_jid) + xmlui = await self.list_ui(client, service_jid) else: session_data["node"] = node - cb_data = await self.requestingEntity( + cb_data = await self.requesting_entity( {"session_id": session_id}, client.profile ) xmlui = cb_data["xmlui"] @@ -655,14 +655,14 @@ d = self.host.getDiscoItems(client, to_jid, NS_COMMANDS) return d - def _listUI(self, to_jid_s, profile_key): - client = self.host.getClient(profile_key) + def _list_ui(self, to_jid_s, profile_key): + client = self.host.get_client(profile_key) to_jid = jid.JID(to_jid_s) if to_jid_s else None - d = self.listUI(client, to_jid, no_instructions=True) + d = self.list_ui(client, to_jid, no_instructions=True) d.addCallback(lambda xmlui: xmlui.toXml()) return d - def listUI(self, client, to_jid, no_instructions=False): + def list_ui(self, client, to_jid, no_instructions=False): """Request available commands and generate XMLUI @param to_jid(jid.JID, None): the entity answering the commands @@ -671,12 +671,12 @@ @return D(xml_tools.XMLUI): UI with the commands """ d = self.list(client, to_jid) - d.addCallback(self._items2XMLUI, no_instructions) + d.addCallback(self._items_2_xmlui, no_instructions) return d def _sequence(self, sequence, node, service_jid_s="", profile_key=C.PROF_KEY_NONE): sequence = data_format.deserialise(sequence, type_check=list) - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service_jid = jid.JID(service_jid_s) if service_jid_s else None d = defer.ensureDeferred(self.sequence(client, sequence, node, service_jid)) d.addCallback(lambda data: data_format.serialise(data)) @@ -711,12 +711,12 @@ session_id=session_id, form_values=data_to_send, ) - __, answer_data = self.parseCommandAnswer(iq_result_elt) + __, answer_data = self.parse_command_answer(iq_result_elt) session_id = answer_data.pop("session_id") return answer_data - def addAdHocCommand(self, client, callback, label, node=None, features=None, + def add_ad_hoc_command(self, client, callback, label, node=None, features=None, timeout=600, allowed_jids=None, allowed_groups=None, allowed_magics=None, forbidden_jids=None, forbidden_groups=None, ): 
@@ -782,7 +782,7 @@ commands = client._XEP_0050_commands commands[node] = ad_hoc_command - def onCmdRequest(self, request, client): + def on_cmd_request(self, request, client): request.handled = True requestor = jid.JID(request["from"]) command_elt = next(request.elements(NS_COMMANDS, "command")) @@ -798,7 +798,7 @@ except KeyError: client.sendError(request, "item-not-found") return - command.onRequest(command_elt, requestor, action, sessionid) + command.on_request(command_elt, requestor, action, sessionid) @implementer(iwokkel.IDisco) @@ -813,7 +813,7 @@ def connectionInitialized(self): self.xmlstream.addObserver( - CMD_REQUEST, self.plugin_parent.onCmdRequest, client=self.parent + CMD_REQUEST, self.plugin_parent.on_cmd_request, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""): @@ -828,7 +828,7 @@ if nodeIdentifier == NS_COMMANDS: commands = self.client._XEP_0050_commands for command in list(commands.values()): - if command.isAuthorised(requestor): + if command.is_authorised(requestor): ret.append( disco.DiscoItem(self.parent.jid, command.node, command.getName()) ) # TODO: manage name language
--- a/sat/plugins/plugin_xep_0054.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0054.py Sat Apr 08 13:54:42 2023 +0200 @@ -79,14 +79,14 @@ log.info(_("Plugin XEP_0054 initialization")) self.host = host self._i = host.plugins['IDENTITY'] - self._i.register(IMPORT_NAME, 'avatar', self.getAvatar, self.setAvatar) - self._i.register(IMPORT_NAME, 'nicknames', self.getNicknames, self.setNicknames) - host.trigger.add("presence_available", self.presenceAvailableTrigger) + self._i.register(IMPORT_NAME, 'avatar', self.get_avatar, self.set_avatar) + self._i.register(IMPORT_NAME, 'nicknames', self.get_nicknames, self.set_nicknames) + host.trigger.add("presence_available", self.presence_available_trigger) - def getHandler(self, client): + def get_handler(self, client): return XEP_0054_handler(self) - def presenceAvailableTrigger(self, presence_elt, client): + def presence_available_trigger(self, presence_elt, client): try: avatar_hash = client._xep_0054_avatar_hashes[client.jid.userhost()] except KeyError: @@ -99,12 +99,12 @@ presence_elt.addChild(x_elt) return True - async def profileConnecting(self, client): + async def profile_connecting(self, client): client._xep_0054_avatar_hashes = persistent.PersistentDict( NS_VCARD, client.profile) await client._xep_0054_avatar_hashes.load() - def savePhoto(self, client, photo_elt, entity): + def save_photo(self, client, photo_elt, entity): """Parse a <PHOTO> photo_elt and save the picture""" # XXX: this method is launched in a separate thread try: @@ -149,7 +149,7 @@ raise Failure(exceptions.DataError(msg)) image_hash = sha1(decoded).hexdigest() - with self.host.common_cache.cacheData( + with self.host.common_cache.cache_data( PLUGIN_INFO["import_name"], image_hash, mime_type, @@ -157,7 +157,7 @@ f.write(decoded) return image_hash - async def vCard2Dict(self, client, vcard_elt, entity_jid): + async def v_card_2_dict(self, client, vcard_elt, entity_jid): """Convert a VCard_elt to a dict, and save binaries""" log.debug(("parsing vcard_elt")) vcard_dict = {} @@ -184,7 +184,7 @@ # TODO: handle EXTVAL try: avatar_hash = await threads.deferToThread( - self.savePhoto, client, elem, entity_jid + self.save_photo, client, elem, entity_jid ) except (exceptions.DataError, exceptions.NotFound): avatar_hash = "" @@ -199,7 +199,7 @@ entity_jid.full(), avatar_hash) if avatar_hash: - avatar_cache = self.host.common_cache.getMetadata(avatar_hash) + avatar_cache = self.host.common_cache.get_metadata(avatar_hash) await self._i.update( client, IMPORT_NAME, @@ -220,7 +220,7 @@ return vcard_dict - async def getVCardElement(self, client, entity_jid): + async def get_vcard_element(self, client, entity_jid): """Retrieve domish.Element of a VCard @param entity_jid(jid.JID): entity from who we need the vCard @@ -239,14 +239,14 @@ ).format(entity_jid=entity_jid, xml=iq_ret_elt.toXml())) raise exceptions.DataError(f"no vCard element found for {entity_jid}") - async def updateVCardElt(self, client, entity_jid, to_replace): + async def update_vcard_elt(self, client, entity_jid, to_replace): """Create a vcard element to replace some metadata @param to_replace(list[str]): list of vcard element names to remove """ try: # we first check if a vcard already exists, to keep data - vcard_elt = await self.getVCardElement(client, entity_jid) + vcard_elt = await self.get_vcard_element(client, entity_jid) except error.StanzaError as e: if e.condition == "item-not-found": vcard_elt = domish.Element((NS_VCARD, "vCard")) @@ -266,16 +266,16 @@ return vcard_elt - async def getCard(self, 
client, entity_jid): + async def get_card(self, client, entity_jid): """Ask server for VCard @param entity_jid(jid.JID): jid from which we want the VCard @result(dict): vCard data """ - entity_jid = self._i.getIdentityJid(client, entity_jid) + entity_jid = self._i.get_identity_jid(client, entity_jid) log.debug(f"Asking for {entity_jid}'s VCard") try: - vcard_elt = await self.getVCardElement(client, entity_jid) + vcard_elt = await self.get_vcard_element(client, entity_jid) except exceptions.DataError: self._i.update(client, IMPORT_NAME, "avatar", None, entity_jid) except Exception as e: @@ -284,9 +284,9 @@ ).format(entity_jid=entity_jid, e=e)) else: log.debug(_("VCard found")) - return await self.vCard2Dict(client, vcard_elt, entity_jid) + return await self.v_card_2_dict(client, vcard_elt, entity_jid) - async def getAvatar( + async def get_avatar( self, client: SatXMPPEntity, entity_jid: jid.JID @@ -296,9 +296,9 @@ @param entity: entity to get avatar from @return: avatar metadata, or None if no avatar has been found """ - entity_jid = self._i.getIdentityJid(client, entity_jid) + entity_jid = self._i.get_identity_jid(client, entity_jid) hashes_cache = client._xep_0054_avatar_hashes - vcard = await self.getCard(client, entity_jid) + vcard = await self.get_card(client, entity_jid) if vcard is None: return None try: @@ -312,18 +312,18 @@ if not avatar_hash: return None - avatar_cache = self.host.common_cache.getMetadata(avatar_hash) - return self._i.avatarBuildMetadata( + avatar_cache = self.host.common_cache.get_metadata(avatar_hash) + return self._i.avatar_build_metadata( avatar_cache['path'], avatar_cache['mime_type'], avatar_hash) - async def setAvatar(self, client, avatar_data, entity): + async def set_avatar(self, client, avatar_data, entity): """Set avatar of the profile @param avatar_data(dict): data of the image to use as avatar, as built by IDENTITY plugin. @param entity(jid.JID): entity whose avatar must be changed """ - vcard_elt = await self.updateVCardElt(client, entity, ['PHOTO']) + vcard_elt = await self.update_vcard_elt(client, entity, ['PHOTO']) iq_elt = client.IQ() iq_elt.addChild(vcard_elt) @@ -337,19 +337,19 @@ # FIXME: should send the current presence, not always "available" ! 
await client.presence.available() - async def getNicknames(self, client, entity): + async def get_nicknames(self, client, entity): """get nick from cache, or check vCard @param entity(jid.JID): entity to get nick from @return(list[str]): nicknames found """ - vcard_data = await self.getCard(client, entity) + vcard_data = await self.get_card(client, entity) try: return [vcard_data['nickname']] except (KeyError, TypeError): return [] - async def setNicknames(self, client, nicknames, entity): + async def set_nicknames(self, client, nicknames, entity): """Update our vCard and set a nickname @param nicknames(list[str]): new nicknames to use @@ -357,7 +357,7 @@ """ nick = nicknames[0].strip() - vcard_elt = await self.updateVCardElt(client, entity, ['NICKNAME']) + vcard_elt = await self.update_vcard_elt(client, entity, ['NICKNAME']) if nick: vcard_elt.addElement((NS_VCARD, "NICKNAME"), content=nick) @@ -389,7 +389,7 @@ @param presence(domish.Element): <presence/> stanza """ client = self.parent - entity_jid = self.plugin_parent._i.getIdentityJid( + entity_jid = self.plugin_parent._i.get_identity_jid( client, jid.JID(presence["from"])) try: @@ -414,14 +414,14 @@ # no change, we can return… if given_hash: # …but we double check that avatar is in cache - avatar_cache = self.host.common_cache.getMetadata(given_hash) + avatar_cache = self.host.common_cache.get_metadata(given_hash) if avatar_cache is None: log.debug( f"Avatar for [{entity_jid}] is known but not in cache, we get " f"it" ) - # getCard will put the avatar in cache - await self.plugin_parent.getCard(client, entity_jid) + # get_card will put the avatar in cache + await self.plugin_parent.get_card(client, entity_jid) else: log.debug(f"avatar for {entity_jid} is already in cache") return @@ -438,7 +438,7 @@ # the avatar has been removed, no need to go further return - avatar_cache = self.host.common_cache.getMetadata(given_hash) + avatar_cache = self.host.common_cache.get_metadata(given_hash) if avatar_cache is not None: log.debug( f"New avatar found for [{entity_jid}], it's already in cache, we use it" @@ -458,7 +458,7 @@ log.debug( "New avatar found for [{entity_jid}], requesting vcard" ) - vcard = await self.plugin_parent.getCard(client, entity_jid) + vcard = await self.plugin_parent.get_card(client, entity_jid) if vcard is None: log.warning(f"Unexpected empty vCard for {entity_jid}") return
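As a usage reminder for the renamed vCard helpers, here is a minimal sketch built from the coroutine signatures shown above (``get_card``, ``get_nicknames``, ``get_avatar``); the plugin key ``"XEP-0054"`` and the helper itself are assumptions for illustration::

    # hypothetical helper showing the renamed XEP-0054 coroutines
    from twisted.words.protocols.jabber import jid


    async def show_contact_card(host, client, entity_str):
        vcard_plg = host.plugins["XEP-0054"]  # assumed plugin key
        entity = jid.JID(entity_str)

        # renamed from getCard: returns a dict of vCard data (or None)
        vcard = await vcard_plg.get_card(client, entity)
        if vcard and "nickname" in vcard:
            print(f"nickname: {vcard['nickname']}")

        # renamed from getNicknames: list of nicknames found in the vCard
        nicks = await vcard_plg.get_nicknames(client, entity)

        # renamed from getAvatar: avatar metadata (built by the IDENTITY
        # plugin's avatar_build_metadata) or None if no avatar is set
        avatar = await vcard_plg.get_avatar(client, entity)
        return nicks, avatar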
--- a/sat/plugins/plugin_xep_0055.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0055.py Sat Apr 08 13:54:42 2023 +0200 @@ -73,49 +73,49 @@ # default search services (config file + hard-coded lists) self.services = [ jid.JID(entry) - for entry in host.memory.getConfig( + for entry in host.memory.config_get( CONFIG_SECTION, CONFIG_SERVICE_LIST, DEFAULT_SERVICE_LIST ) ] - host.bridge.addMethod( - "searchGetFieldsUI", + host.bridge.add_method( + "search_fields_ui_get", ".plugin", in_sign="ss", out_sign="s", - method=self._getFieldsUI, + method=self._get_fields_ui, async_=True, ) - host.bridge.addMethod( - "searchRequest", + host.bridge.add_method( + "search_request", ".plugin", in_sign="sa{ss}s", out_sign="s", - method=self._searchRequest, + method=self._search_request, async_=True, ) - self.__search_menu_id = host.registerCallback(self._getMainUI, with_data=True) - host.importMenu( + self.__search_menu_id = host.register_callback(self._get_main_ui, with_data=True) + host.import_menu( (D_("Contacts"), D_("Search directory")), - self._getMainUI, + self._get_main_ui, security_limit=1, help_string=D_("Search user directory"), ) - def _getHostServices(self, profile): + def _get_host_services(self, profile): """Return the jabber search services associated to the user host. @param profile (unicode): %(doc_profile)s @return: list[jid.JID] """ - client = self.host.getClient(profile) - d = self.host.findFeaturesSet(client, [NS_SEARCH]) + client = self.host.get_client(profile) + d = self.host.find_features_set(client, [NS_SEARCH]) return d.addCallback(lambda set_: list(set_)) ## Main search UI (menu item callback) ## - def _getMainUI(self, raw_data, profile): + def _get_main_ui(self, raw_data, profile): """Get the XMLUI for selecting a service and searching the directory. @param raw_data (dict): data received from the frontend @@ -123,10 +123,10 @@ @return: a deferred XMLUI string representation """ # check if the user's server offers some search services - d = self._getHostServices(profile) - return d.addCallback(lambda services: self.getMainUI(services, raw_data, profile)) + d = self._get_host_services(profile) + return d.addCallback(lambda services: self.get_main_ui(services, raw_data, profile)) - def getMainUI(self, services, raw_data, profile): + def get_main_ui(self, services, raw_data, profile): """Get the XMLUI for selecting a service and searching the directory. @param services (list[jid.JID]): search services offered by the user server @@ -136,7 +136,7 @@ """ # extend services offered by user's server with the default services services.extend([service for service in self.services if service not in services]) - data = xml_tools.XMLUIResult2DataFormResult(raw_data) + data = xml_tools.xmlui_result_2_data_form_result(raw_data) main_ui = xml_tools.XMLUI( C.XMLUI_WINDOW, container="tabs", @@ -144,13 +144,13 @@ submit_id=self.__search_menu_id, ) - d = self._addSimpleSearchUI(services, main_ui, data, profile) + d = self._add_simple_search_ui(services, main_ui, data, profile) d.addCallback( - lambda __: self._addAdvancedSearchUI(services, main_ui, data, profile) + lambda __: self._add_advanced_search_ui(services, main_ui, data, profile) ) return d.addCallback(lambda __: {"xmlui": main_ui.toXml()}) - def _addSimpleSearchUI(self, services, main_ui, data, profile): + def _add_simple_search_ui(self, services, main_ui, data, profile): """Add to the main UI a tab for the simple search. 
Display a single input field and search on the main service (it actually does one search per search field and then compile the results). @@ -176,13 +176,13 @@ ) ) - sub_cont = main_ui.main_container.addTab( + sub_cont = main_ui.main_container.add_tab( "simple_search", label=_("Simple search"), container=xml_tools.VerticalContainer, ) - main_ui.changeContainer(sub_cont.append(xml_tools.PairsContainer(main_ui))) - xml_tools.dataForm2Widgets(main_ui, form) + main_ui.change_container(sub_cont.append(xml_tools.PairsContainer(main_ui))) + xml_tools.data_form_2_widgets(main_ui, form) # FIXME: add colspan attribute to divider? (we are in a PairsContainer) main_ui.addDivider("blank") @@ -197,19 +197,19 @@ } if simple_data: log.debug("Simple search with %s on %s" % (simple_data, service_jid)) - sub_cont.parent.setSelected(True) - main_ui.changeContainer(sub_cont.append(xml_tools.VerticalContainer(main_ui))) + sub_cont.parent.set_selected(True) + main_ui.change_container(sub_cont.append(xml_tools.VerticalContainer(main_ui))) main_ui.addDivider("dash") - d = self.searchRequest(service_jid, simple_data, profile) + d = self.search_request(service_jid, simple_data, profile) d.addCallbacks( - lambda elt: self._displaySearchResult(main_ui, elt), + lambda elt: self._display_search_result(main_ui, elt), lambda failure: main_ui.addText(failure.getErrorMessage()), ) return d return defer.succeed(None) - def _addAdvancedSearchUI(self, services, main_ui, data, profile): + def _add_advanced_search_ui(self, services, main_ui, data, profile): """Add to the main UI a tab for the advanced search. Display a service selector and allow to search on all the fields that are implemented by the selected service. @@ -221,7 +221,7 @@ @return: a __ Deferred """ - sub_cont = main_ui.main_container.addTab( + sub_cont = main_ui.main_container.add_tab( "advanced_search", label=_("Advanced search"), container=xml_tools.VerticalContainer, @@ -230,7 +230,7 @@ if "service_jid_extra" in data: # refresh button has been pushed, select the tab - sub_cont.parent.setSelected(True) + sub_cont.parent.set_selected(True) # get the selected service service_jid_s = data.get("service_jid_extra", "") if not service_jid_s: @@ -242,7 +242,7 @@ if service_jid_s not in services_s: services_s.append(service_jid_s) - main_ui.changeContainer(sub_cont.append(xml_tools.PairsContainer(main_ui))) + main_ui.change_container(sub_cont.append(xml_tools.PairsContainer(main_ui))) main_ui.addLabel(_("Search on")) main_ui.addList("service_jid", options=services_s, selected=service_jid_s) main_ui.addLabel(_("Other service")) @@ -262,17 +262,17 @@ main_ui.addDivider("dash") main_ui.addDivider("dash") - main_ui.changeContainer(sub_cont.append(xml_tools.VerticalContainer(main_ui))) + main_ui.change_container(sub_cont.append(xml_tools.VerticalContainer(main_ui))) service_jid = jid.JID(service_jid_s) - d = self.getFieldsUI(service_jid, profile) + d = self.get_fields_ui(service_jid, profile) d.addCallbacks( - self._addAdvancedForm, + self._add_advanced_form, lambda failure: main_ui.addText(failure.getErrorMessage()), [service_jid, main_ui, sub_cont, data, profile], ) return d - def _addAdvancedForm(self, form_elt, service_jid, main_ui, sub_cont, data, profile): + def _add_advanced_form(self, form_elt, service_jid, main_ui, sub_cont, data, profile): """Add the search form and the search results (if there is some to display). 
@param form_elt (domish.Element): form element listing the fields @@ -288,7 +288,7 @@ adv_fields = [field.var for field in field_list if field.var] adv_data = {key: value for key, value in data.items() if key in adv_fields} - xml_tools.dataForm2Widgets(main_ui, data_form.Form.fromElement(form_elt)) + xml_tools.data_form_2_widgets(main_ui, data_form.Form.fromElement(form_elt)) # refill the submitted values # FIXME: wokkel's data_form.Form.fromElement doesn't parse the values, so we do it directly in XMLUI for now @@ -309,26 +309,26 @@ if adv_data: # display the search results log.debug("Advanced search with %s on %s" % (adv_data, service_jid)) - sub_cont.parent.setSelected(True) - main_ui.changeContainer(sub_cont.append(xml_tools.VerticalContainer(main_ui))) + sub_cont.parent.set_selected(True) + main_ui.change_container(sub_cont.append(xml_tools.VerticalContainer(main_ui))) main_ui.addDivider("dash") - d = self.searchRequest(service_jid, adv_data, profile) + d = self.search_request(service_jid, adv_data, profile) d.addCallbacks( - lambda elt: self._displaySearchResult(main_ui, elt), + lambda elt: self._display_search_result(main_ui, elt), lambda failure: main_ui.addText(failure.getErrorMessage()), ) return d return defer.succeed(None) - def _displaySearchResult(self, main_ui, elt): + def _display_search_result(self, main_ui, elt): """Display the search results. @param main_ui (XMLUI): the main XMLUI instance @param elt (domish.Element): form result element """ if [child for child in elt.children if child.name == "item"]: - headers, xmlui_data = xml_tools.dataFormEltResult2XMLUIData(elt) + headers, xmlui_data = xml_tools.data_form_elt_result_2_xmlui_data(elt) if "jid" in headers: # use XMLUI JidsListWidget to display the results values = {} for i in range(len(xmlui_data)): @@ -341,41 +341,41 @@ main_ui.addJidsList(jids=values["jid"], name=D_("Search results")) # TODO: also display the values other than JID else: - xml_tools.XMLUIData2AdvancedList(main_ui, headers, xmlui_data) + xml_tools.xmlui_data_2_advanced_list(main_ui, headers, xmlui_data) else: main_ui.addText(D_("The search gave no result")) ## Retrieve the search fields ## - def _getFieldsUI(self, to_jid_s, profile_key): + def _get_fields_ui(self, to_jid_s, profile_key): """Ask a service to send us the list of the form fields it manages. @param to_jid_s (unicode): XEP-0055 compliant search entity @param profile_key (unicode): %(doc_profile_key)s @return: a deferred XMLUI instance """ - d = self.getFieldsUI(jid.JID(to_jid_s), profile_key) - d.addCallback(lambda form: xml_tools.dataFormEltResult2XMLUI(form).toXml()) + d = self.get_fields_ui(jid.JID(to_jid_s), profile_key) + d.addCallback(lambda form: xml_tools.data_form_elt_result_2_xmlui(form).toXml()) return d - def getFieldsUI(self, to_jid, profile_key): + def get_fields_ui(self, to_jid, profile_key): """Ask a service to send us the list of the form fields it manages. 
@param to_jid (jid.JID): XEP-0055 compliant search entity @param profile_key (unicode): %(doc_profile_key)s @return: a deferred domish.Element """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) fields_request = IQ(client.xmlstream, "get") fields_request["from"] = client.jid.full() fields_request["to"] = to_jid.full() fields_request.addElement("query", NS_SEARCH) d = fields_request.send(to_jid.full()) - d.addCallbacks(self._getFieldsUICb, self._getFieldsUIEb) + d.addCallbacks(self._get_fields_ui_cb, self._get_fields_ui_eb) return d - def _getFieldsUICb(self, answer): - """Callback for self.getFieldsUI. + def _get_fields_ui_cb(self, answer): + """Callback for self.get_fields_ui. @param answer (domish.Element): search query element @return: domish.Element @@ -394,8 +394,8 @@ ) return form_elt - def _getFieldsUIEb(self, failure): - """Errback to self.getFieldsUI. + def _get_fields_ui_eb(self, failure): + """Errback to self.get_fields_ui. @param failure (defer.failure.Failure): twisted failure @raise: the unchanged defer.failure.Failure @@ -405,35 +405,35 @@ ## Do the search ## - def _searchRequest(self, to_jid_s, search_data, profile_key): + def _search_request(self, to_jid_s, search_data, profile_key): """Actually do a search, according to filled data. @param to_jid_s (unicode): XEP-0055 compliant search entity - @param search_data (dict): filled data, corresponding to the form obtained in getFieldsUI + @param search_data (dict): filled data, corresponding to the form obtained in get_fields_ui @param profile_key (unicode): %(doc_profile_key)s @return: a deferred XMLUI string representation """ - d = self.searchRequest(jid.JID(to_jid_s), search_data, profile_key) - d.addCallback(lambda form: xml_tools.dataFormEltResult2XMLUI(form).toXml()) + d = self.search_request(jid.JID(to_jid_s), search_data, profile_key) + d.addCallback(lambda form: xml_tools.data_form_elt_result_2_xmlui(form).toXml()) return d - def searchRequest(self, to_jid, search_data, profile_key): + def search_request(self, to_jid, search_data, profile_key): """Actually do a search, according to filled data. @param to_jid (jid.JID): XEP-0055 compliant search entity - @param search_data (dict): filled data, corresponding to the form obtained in getFieldsUI + @param search_data (dict): filled data, corresponding to the form obtained in get_fields_ui @param profile_key (unicode): %(doc_profile_key)s @return: a deferred domish.Element """ if FIELD_SINGLE in search_data: value = search_data[FIELD_SINGLE] - d = self.getFieldsUI(to_jid, profile_key) + d = self.get_fields_ui(to_jid, profile_key) d.addCallback( - lambda elt: self.searchRequestMulti(to_jid, value, elt, profile_key) + lambda elt: self.search_request_multi(to_jid, value, elt, profile_key) ) return d - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) search_request = IQ(client.xmlstream, "set") search_request["from"] = client.jid.full() search_request["to"] = to_jid.full() @@ -443,10 +443,10 @@ query_elt.addChild(x_form.toElement()) # TODO: XEP-0059 could be used here (with the needed new method attributes) d = search_request.send(to_jid.full()) - d.addCallbacks(self._searchOk, self._searchErr) + d.addCallbacks(self._search_ok, self._search_err) return d - def searchRequestMulti(self, to_jid, value, form_elt, profile_key): + def search_request_multi(self, to_jid, value, form_elt, profile_key): """Search for a value simultaneously in all fields, returns the results compilation. 
@param to_jid (jid.JID): XEP-0055 compliant search entity @@ -459,7 +459,7 @@ d_list = [] for field in [field.var for field in form.fieldList if field.var]: - d_list.append(self.searchRequest(to_jid, {field: value}, profile_key)) + d_list.append(self.search_request(to_jid, {field: value}, profile_key)) def cb(result): # return the results compiled in one domish element result_elt = None @@ -481,8 +481,8 @@ return defer.DeferredList(d_list).addCallback(cb) - def _searchOk(self, answer): - """Callback for self.searchRequest. + def _search_ok(self, answer): + """Callback for self.search_request. @param answer (domish.Element): search query element @return: domish.Element @@ -501,8 +501,8 @@ ) return form_elt - def _searchErr(self, failure): - """Errback to self.searchRequest. + def _search_err(self, failure): + """Errback to self.search_request. @param failure (defer.failure.Failure): twisted failure @raise: the unchanged defer.failure.Failure
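The search plugin keeps the same call flow after the renaming, only with snake_case names: ``get_fields_ui`` to discover the searchable fields, then ``search_request`` with the filled values. A minimal sketch, assuming the plugin key ``"XEP-0055"`` and a directory that advertises a ``nick`` field::

    # hypothetical helper using the renamed XEP-0055 methods
    from twisted.words.protocols.jabber import jid


    def find_users(host, profile, directory_jid_s, text):
        search_plg = host.plugins["XEP-0055"]  # assumed plugin key
        service = jid.JID(directory_jid_s)

        # renamed from searchRequest: fires the raw result element, the same
        # one that data_form_elt_result_2_xmlui consumes in the hunks above
        d = search_plg.search_request(service, {"nick": text}, profile)
        d.addCallback(lambda result_elt: print(result_elt.toXml()))
        return d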
--- a/sat/plugins/plugin_xep_0059.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat/plugins/plugin_xep_0059.py Sat Apr 08 13:54:42 2023 +0200
@@ -51,10 +51,10 @@
     def __init__(self, host):
         log.info(_("Result Set Management plugin initialization"))
 
-    def getHandler(self, client):
+    def get_handler(self, client):
         return XEP_0059_handler()
 
-    def parseExtra(self, extra):
+    def parse_extra(self, extra):
         """Parse extra dictionnary to retrieve RSM arguments
 
         @param extra(dict): data for parse
@@ -102,7 +102,7 @@
             data["index"] = rsm_response.index
         return data
 
-    def getNextRequest(
+    def get_next_request(
         self,
         rsm_request: rsm.RSMRequest,
         rsm_response: rsm.RSMResponse,
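``get_next_request`` (whose signature appears above) takes a request and the matching response, which corresponds to the usual XEP-0059 paging pattern: each response carries ``first``/``last``/``count`` markers, and the next request asks for the items coming after the previous ``last`` id. A self-contained sketch of that pattern, using plain dicts as stand-ins for wokkel's ``RSMRequest``/``RSMResponse`` objects and a fake query::

    # self-contained illustration of RSM paging; fetch_page stands in for a
    # real pubsub/MAM query limited to `max_` items per page
    def fetch_page(after=None, max_=2):
        items = [f"item-{i}" for i in range(5)]  # fake result set
        start = 0 if after is None else items.index(after) + 1
        page = items[start:start + max_]
        return {
            "items": page,
            "first": page[0] if page else None,
            "last": page[-1] if page else None,
            "count": len(items),
        }

    collected = []
    response = fetch_page()
    while response["items"]:
        collected.extend(response["items"])
        # XEP-0059: request the page starting after the last received id
        response = fetch_page(after=response["last"])
    print(collected)  # ['item-0', ..., 'item-4']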
--- a/sat/plugins/plugin_xep_0060.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0060.py Sat Apr 08 13:54:42 2023 +0200 @@ -111,224 +111,224 @@ self._mam = host.plugins.get("XEP-0313") self._node_cb = {} # dictionnary of callbacks for node (key: node, value: list of callbacks) self.rt_sessions = sat_defer.RTDeferredSessions() - host.bridge.addMethod( - "psNodeCreate", + host.bridge.add_method( + "ps_node_create", ".plugin", in_sign="ssa{ss}s", out_sign="s", - method=self._createNode, + method=self._create_node, async_=True, ) - host.bridge.addMethod( - "psNodeConfigurationGet", + host.bridge.add_method( + "ps_node_configuration_get", ".plugin", in_sign="sss", out_sign="a{ss}", - method=self._getNodeConfiguration, + method=self._get_node_configuration, async_=True, ) - host.bridge.addMethod( - "psNodeConfigurationSet", + host.bridge.add_method( + "ps_node_configuration_set", ".plugin", in_sign="ssa{ss}s", out_sign="", - method=self._setNodeConfiguration, + method=self._set_node_configuration, async_=True, ) - host.bridge.addMethod( - "psNodeAffiliationsGet", + host.bridge.add_method( + "ps_node_affiliations_get", ".plugin", in_sign="sss", out_sign="a{ss}", - method=self._getNodeAffiliations, + method=self._get_node_affiliations, async_=True, ) - host.bridge.addMethod( - "psNodeAffiliationsSet", + host.bridge.add_method( + "ps_node_affiliations_set", ".plugin", in_sign="ssa{ss}s", out_sign="", - method=self._setNodeAffiliations, + method=self._set_node_affiliations, async_=True, ) - host.bridge.addMethod( - "psNodeSubscriptionsGet", + host.bridge.add_method( + "ps_node_subscriptions_get", ".plugin", in_sign="sss", out_sign="a{ss}", - method=self._getNodeSubscriptions, + method=self._get_node_subscriptions, async_=True, ) - host.bridge.addMethod( - "psNodeSubscriptionsSet", + host.bridge.add_method( + "ps_node_subscriptions_set", ".plugin", in_sign="ssa{ss}s", out_sign="", - method=self._setNodeSubscriptions, + method=self._set_node_subscriptions, async_=True, ) - host.bridge.addMethod( - "psNodePurge", + host.bridge.add_method( + "ps_node_purge", ".plugin", in_sign="sss", out_sign="", - method=self._purgeNode, + method=self._purge_node, async_=True, ) - host.bridge.addMethod( - "psNodeDelete", + host.bridge.add_method( + "ps_node_delete", ".plugin", in_sign="sss", out_sign="", - method=self._deleteNode, + method=self._delete_node, async_=True, ) - host.bridge.addMethod( - "psNodeWatchAdd", + host.bridge.add_method( + "ps_node_watch_add", ".plugin", in_sign="sss", out_sign="", method=self._addWatch, async_=False, ) - host.bridge.addMethod( - "psNodeWatchRemove", + host.bridge.add_method( + "ps_node_watch_remove", ".plugin", in_sign="sss", out_sign="", - method=self._removeWatch, + method=self._remove_watch, async_=False, ) - host.bridge.addMethod( - "psAffiliationsGet", + host.bridge.add_method( + "ps_affiliations_get", ".plugin", in_sign="sss", out_sign="a{ss}", - method=self._getAffiliations, + method=self._get_affiliations, async_=True, ) - host.bridge.addMethod( - "psItemsGet", + host.bridge.add_method( + "ps_items_get", ".plugin", in_sign="ssiassss", out_sign="s", - method=self._getItems, + method=self._get_items, async_=True, ) - host.bridge.addMethod( - "psItemSend", + host.bridge.add_method( + "ps_item_send", ".plugin", in_sign="ssssss", out_sign="s", - method=self._sendItem, + method=self._send_item, async_=True, ) - host.bridge.addMethod( - "psItemsSend", + host.bridge.add_method( + "ps_items_send", ".plugin", in_sign="ssasss", out_sign="as", - 
method=self._sendItems, + method=self._send_items, async_=True, ) - host.bridge.addMethod( - "psItemRetract", + host.bridge.add_method( + "ps_item_retract", ".plugin", in_sign="sssbs", out_sign="", - method=self._retractItem, + method=self._retract_item, async_=True, ) - host.bridge.addMethod( - "psItemsRetract", + host.bridge.add_method( + "ps_items_retract", ".plugin", in_sign="ssasbs", out_sign="", - method=self._retractItems, + method=self._retract_items, async_=True, ) - host.bridge.addMethod( - "psItemRename", + host.bridge.add_method( + "ps_item_rename", ".plugin", in_sign="sssss", out_sign="", - method=self._renameItem, + method=self._rename_item, async_=True, ) - host.bridge.addMethod( - "psSubscribe", + host.bridge.add_method( + "ps_subscribe", ".plugin", in_sign="ssss", out_sign="s", method=self._subscribe, async_=True, ) - host.bridge.addMethod( - "psUnsubscribe", + host.bridge.add_method( + "ps_unsubscribe", ".plugin", in_sign="sss", out_sign="", method=self._unsubscribe, async_=True, ) - host.bridge.addMethod( - "psSubscriptionsGet", + host.bridge.add_method( + "ps_subscriptions_get", ".plugin", in_sign="sss", out_sign="s", method=self._subscriptions, async_=True, ) - host.bridge.addMethod( - "psSubscribeToMany", + host.bridge.add_method( + "ps_subscribe_to_many", ".plugin", in_sign="a(ss)sa{ss}s", out_sign="s", - method=self._subscribeToMany, + method=self._subscribe_to_many, ) - host.bridge.addMethod( - "psGetSubscribeRTResult", + host.bridge.add_method( + "ps_get_subscribe_rt_result", ".plugin", in_sign="ss", out_sign="(ua(sss))", - method=self._manySubscribeRTResult, + method=self._many_subscribe_rt_result, async_=True, ) - host.bridge.addMethod( - "psGetFromMany", + host.bridge.add_method( + "ps_get_from_many", ".plugin", in_sign="a(ss)iss", out_sign="s", - method=self._getFromMany, + method=self._get_from_many, ) - host.bridge.addMethod( - "psGetFromManyRTResult", + host.bridge.add_method( + "ps_get_from_many_rt_result", ".plugin", in_sign="ss", out_sign="(ua(sssasa{ss}))", - method=self._getFromManyRTResult, + method=self._get_from_many_rt_result, async_=True, ) # high level observer method - host.bridge.addSignal( - "psEvent", ".plugin", signature="ssssss" + host.bridge.add_signal( + "ps_event", ".plugin", signature="ssssss" ) # args: category, service(jid), node, type (C.PS_ITEMS, C.PS_DELETE), data, profile # low level observer method, used if service/node is in watching list (see psNodeWatch* methods) - host.bridge.addSignal( - "psEventRaw", ".plugin", signature="sssass" + host.bridge.add_signal( + "ps_event_raw", ".plugin", signature="sssass" ) # args: service(jid), node, type (C.PS_ITEMS, C.PS_DELETE), list of item_xml, profile - def getHandler(self, client): + def get_handler(self, client): client.pubsub_client = SatPubSubClient(self.host, self) return client.pubsub_client - async def profileConnected(self, client): + async def profile_connected(self, client): client.pubsub_watching = set() try: client.pubsub_service = jid.JID( - self.host.memory.getConfig("", "pubsub_service") + self.host.memory.config_get("", "pubsub_service") ) except RuntimeError: log.info( @@ -337,11 +337,11 @@ "we find" ) ) - pubsub_services = await self.host.findServiceEntities( + pubsub_services = await self.host.find_service_entities( client, "pubsub", "service" ) for service_jid in pubsub_services: - infos = await self.host.memory.disco.getInfos(client, service_jid) + infos = await self.host.memory.disco.get_infos(client, service_jid) if not 
DEFAULT_PUBSUB_MIN_FEAT.issubset(infos.features): continue names = {(n or "").lower() for n in infos.identities.values()} @@ -367,9 +367,9 @@ ) log.info(f"default pubsub service: {pubsub_service_str}") - def getFeatures(self, profile): + def features_get(self, profile): try: - client = self.host.getClient(profile) + client = self.host.get_client(profile) except exceptions.ProfileNotSetError: return {} try: @@ -379,13 +379,13 @@ else "" } except AttributeError: - if self.host.isConnected(profile): + if self.host.is_connected(profile): log.debug("Profile is not connected, service is not checked yet") else: log.error("Service should be available !") return {} - def parseExtra(self, extra): + def parse_extra(self, extra): """Parse extra dictionnary used bridge's extra dictionnaries @@ -407,13 +407,13 @@ if self._rsm is None: rsm_request = None else: - rsm_request = self._rsm.parseExtra(extra) + rsm_request = self._rsm.parse_extra(extra) # mam if self._mam is None: mam_request = None else: - mam_request = self._mam.parseExtra(extra, with_rsm=False) + mam_request = self._mam.parse_extra(extra, with_rsm=False) if mam_request is not None: assert "mam" not in extra @@ -421,7 +421,7 @@ return Extra(rsm_request, extra) - def addManagedNode( + def add_managed_node( self, node: str, priority: int = 0, @@ -449,7 +449,7 @@ cb_list.append((cb, priority)) cb_list.sort(key=lambda c: c[1], reverse=True) - def removeManagedNode(self, node, *args): + def remove_managed_node(self, node, *args): """Add a handler for a node @param node(unicode): node to monitor @@ -490,7 +490,7 @@ # @param profile (str): %(doc_profile)s # @return: deferred which fire a list of nodes # """ - # client = self.host.getClient(profile) + # client = self.host.get_client(profile) # d = self.host.getDiscoItems(client, service, nodeIdentifier) # d.addCallback(lambda result: [item.getAttribute('node') for item in result.toElement().children if item.hasAttribute('node')]) # return d @@ -512,21 +512,21 @@ # d.addCallback(lambda subs: [sub.getAttribute('node') for sub in subs if sub.getAttribute('subscription') == filter_]) # return d - def _sendItem(self, service, nodeIdentifier, payload, item_id=None, extra_ser="", + def _send_item(self, service, nodeIdentifier, payload, item_id=None, extra_ser="", profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) payload = xml_tools.parse(payload) extra = data_format.deserialise(extra_ser) - d = defer.ensureDeferred(self.sendItem( + d = defer.ensureDeferred(self.send_item( client, service, nodeIdentifier, payload, item_id or None, extra )) d.addCallback(lambda ret: ret or "") return d - def _sendItems(self, service, nodeIdentifier, items, extra_ser=None, + def _send_items(self, service, nodeIdentifier, items, extra_ser=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) try: items = [xml_tools.parse(item) for item in items] @@ -534,11 +534,11 @@ raise exceptions.DataError(_("Can't parse items: {msg}").format( msg=e)) extra = data_format.deserialise(extra_ser) - return defer.ensureDeferred(self.sendItems( + return defer.ensureDeferred(self.send_items( client, service, nodeIdentifier, items, extra=extra )) - async def sendItem( + async def send_item( self, client: SatXMPPClient, service: Union[jid.JID, None], @@ -561,7 +561,7 @@ if item_id is not None: item_elt['id'] 
= item_id item_elt.addChild(payload) - published_ids = await self.sendItems( + published_ids = await self.send_items( client, service, nodeIdentifier, @@ -573,7 +573,7 @@ except IndexError: return item_id - async def sendItems( + async def send_items( self, client: SatXMPPEntity, service: Optional[jid.JID], @@ -676,7 +676,7 @@ sender = client.jid if extra is None: extra = {} - if not await self.host.trigger.asyncPoint( + if not await self.host.trigger.async_point( "XEP-0060_publish", client, service, nodeIdentifier, items, options, sender, extra ): @@ -687,7 +687,7 @@ ) return iq_result_elt - def _unwrapMAMMessage(self, message_elt): + def _unwrap_mam_message(self, message_elt): try: item_elt = reduce( lambda elt, ns_name: next(elt.elements(*ns_name)), @@ -703,22 +703,22 @@ raise exceptions.DataError("Can't find Item in MAM message element") return item_elt - def serialiseItems(self, items_data): + def serialise_items(self, items_data): items, metadata = items_data metadata['items'] = items return data_format.serialise(metadata) - def _getItems(self, service="", node="", max_items=10, item_ids=None, sub_id=None, + def _get_items(self, service="", node="", max_items=10, item_ids=None, sub_id=None, extra="", profile_key=C.PROF_KEY_NONE): """Get items from pubsub node @param max_items(int): maximum number of item to get, C.NO_LIMIT for no limit """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else None max_items = None if max_items == C.NO_LIMIT else max_items - extra = self.parseExtra(data_format.deserialise(extra)) - d = defer.ensureDeferred(self.getItems( + extra = self.parse_extra(data_format.deserialise(extra)) + d = defer.ensureDeferred(self.get_items( client, service, node, @@ -728,11 +728,11 @@ extra.rsm_request, extra.extra, )) - d.addCallback(self.transItemsData) - d.addCallback(self.serialiseItems) + d.addCallback(self.trans_items_data) + d.addCallback(self.serialise_items) return d - async def getItems( + async def get_items( self, client: SatXMPPEntity, service: Optional[jid.JID], @@ -767,7 +767,7 @@ raise ValueError("items_id can't be used with rsm") if extra is None: extra = {} - cont, ret = await self.host.trigger.asyncReturnPoint( + cont, ret = await self.host.trigger.async_return_point( "XEP-0060_getItems", client, service, node, max_items, item_ids, sub_id, rsm_request, extra ) @@ -788,7 +788,7 @@ extra = extra )) # we have no MAM data here, so we add None - d.addErrback(sat_defer.stanza2NotFound) + d.addErrback(sat_defer.stanza_2_not_found) d.addTimeout(TIMEOUT, reactor) items, rsm_response = await d mam_response = None @@ -804,7 +804,7 @@ mam_query.node = node elif mam_query.node != node: raise exceptions.DataError( - "MAM query node is incoherent with getItems's node" + "MAM query node is incoherent with get_items's node" ) if mam_query.rsm is None: mam_query.rsm = rsm_request @@ -813,8 +813,8 @@ raise exceptions.DataError( "Conflict between RSM request and MAM's RSM request" ) - items, rsm_response, mam_response = await self._mam.getArchives( - client, mam_query, service, self._unwrapMAMMessage + items, rsm_response, mam_response = await self._mam.get_archives( + client, mam_query, service, self._unwrap_mam_message ) try: @@ -835,7 +835,7 @@ metadata = { "service": service_jid, "node": node, - "uri": self.getNodeURI(service_jid, node), + "uri": self.get_node_uri(service_jid, node), } if mam_response is not None: # mam_response is a dict with "complete" and "stable" keys @@ -875,32 +875,32 
@@ # - list of items # - RSM response data # """ - # client = self.host.getClient(profile_key) + # client = self.host.get_client(profile_key) # found_nodes = yield self.listNodes(service, profile=client.profile) # d_dict = {} # for publisher, node in data.items(): # if node not in found_nodes: # log.debug(u"Skip the items retrieval for [{node}]: node doesn't exist".format(node=node)) # continue # avoid pubsub "item-not-found" error - # d_dict[publisher] = self.getItems(service, node, max_items, None, sub_id, rsm, client.profile) + # d_dict[publisher] = self.get_items(service, node, max_items, None, sub_id, rsm, client.profile) # defer.returnValue(d_dict) def getOptions(self, service, nodeIdentifier, subscriber, subscriptionIdentifier=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return client.pubsub_client.getOptions( service, nodeIdentifier, subscriber, subscriptionIdentifier ) def setOptions(self, service, nodeIdentifier, subscriber, options, subscriptionIdentifier=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return client.pubsub_client.setOptions( service, nodeIdentifier, subscriber, options, subscriptionIdentifier ) - def _createNode(self, service_s, nodeIdentifier, options, profile_key): - client = self.host.getClient(profile_key) + def _create_node(self, service_s, nodeIdentifier, options, profile_key): + client = self.host.get_client(profile_key) return self.createNode( client, jid.JID(service_s) if service_s else None, nodeIdentifier, options ) @@ -924,7 +924,7 @@ return client.pubsub_client.createNode(service, nodeIdentifier, options) @defer.inlineCallbacks - def createIfNewNode(self, client, service, nodeIdentifier, options=None): + def create_if_new_node(self, client, service, nodeIdentifier, options=None): """Helper method similar to createNode, but will not fail in case of conflict""" try: yield self.createNode(client, service, nodeIdentifier, options) @@ -934,8 +934,8 @@ else: raise e - def _getNodeConfiguration(self, service_s, nodeIdentifier, profile_key): - client = self.host.getClient(profile_key) + def _get_node_configuration(self, service_s, nodeIdentifier, profile_key): + client = self.host.get_client(profile_key) d = self.getConfiguration( client, jid.JID(service_s) if service_s else None, nodeIdentifier ) @@ -969,8 +969,8 @@ form.makeFields(options) return form - def _setNodeConfiguration(self, service_s, nodeIdentifier, options, profile_key): - client = self.host.getClient(profile_key) + def _set_node_configuration(self, service_s, nodeIdentifier, options, profile_key): + client = self.host.get_client(profile_key) d = self.setConfiguration( client, jid.JID(service_s) if service_s else None, nodeIdentifier, options ) @@ -987,14 +987,14 @@ d = request.send(client.xmlstream) return d - def _getAffiliations(self, service_s, nodeIdentifier, profile_key): - client = self.host.getClient(profile_key) - d = self.getAffiliations( + def _get_affiliations(self, service_s, nodeIdentifier, profile_key): + client = self.host.get_client(profile_key) + d = self.get_affiliations( client, jid.JID(service_s) if service_s else None, nodeIdentifier or None ) return d - def getAffiliations(self, client, service, nodeIdentifier=None): + def get_affiliations(self, client, service, nodeIdentifier=None): """Retrieve affiliations of an entity @param nodeIdentifier(unicode, None): node to get affiliation from @@ -1031,9 +1031,9 @@ 
d.addCallback(cb) return d - def _getNodeAffiliations(self, service_s, nodeIdentifier, profile_key): - client = self.host.getClient(profile_key) - d = self.getNodeAffiliations( + def _get_node_affiliations(self, service_s, nodeIdentifier, profile_key): + client = self.host.get_client(profile_key) + d = self.get_node_affiliations( client, jid.JID(service_s) if service_s else None, nodeIdentifier ) d.addCallback( @@ -1041,7 +1041,7 @@ ) return d - def getNodeAffiliations(self, client, service, nodeIdentifier): + def get_node_affiliations(self, client, service, nodeIdentifier): """Retrieve affiliations of a node owned by profile""" request = pubsub.PubSubRequest("affiliationsGet") request.recipient = service @@ -1076,14 +1076,14 @@ d.addCallback(cb) return d - def _setNodeAffiliations( + def _set_node_affiliations( self, service_s, nodeIdentifier, affiliations, profile_key=C.PROF_KEY_NONE ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) affiliations = { jid.JID(jid_): affiliation for jid_, affiliation in affiliations.items() } - d = self.setNodeAffiliations( + d = self.set_node_affiliations( client, jid.JID(service_s) if service_s else None, nodeIdentifier, @@ -1091,7 +1091,7 @@ ) return d - def setNodeAffiliations(self, client, service, nodeIdentifier, affiliations): + def set_node_affiliations(self, client, service, nodeIdentifier, affiliations): """Update affiliations of a node owned by profile @param affiliations(dict[jid.JID, unicode]): affiliations to set @@ -1104,17 +1104,17 @@ d = request.send(client.xmlstream) return d - def _purgeNode(self, service_s, nodeIdentifier, profile_key): - client = self.host.getClient(profile_key) - return self.purgeNode( + def _purge_node(self, service_s, nodeIdentifier, profile_key): + client = self.host.get_client(profile_key) + return self.purge_node( client, jid.JID(service_s) if service_s else None, nodeIdentifier ) - def purgeNode(self, client, service, nodeIdentifier): - return client.pubsub_client.purgeNode(service, nodeIdentifier) + def purge_node(self, client, service, nodeIdentifier): + return client.pubsub_client.purge_node(service, nodeIdentifier) - def _deleteNode(self, service_s, nodeIdentifier, profile_key): - client = self.host.getClient(profile_key) + def _delete_node(self, service_s, nodeIdentifier, profile_key): + client = self.host.get_client(profile_key) return self.deleteNode( client, jid.JID(service_s) if service_s else None, nodeIdentifier ) @@ -1132,31 +1132,31 @@ This method should only be called from bridge """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service_s) if service_s else client.jid.userhostJID() client.pubsub_watching.add((service, node)) - def _removeWatch(self, service_s, node, profile_key): + def _remove_watch(self, service_s, node, profile_key): """remove a node watch This method should only be called from bridge """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service_s) if service_s else client.jid.userhostJID() client.pubsub_watching.remove((service, node)) - def _retractItem( + def _retract_item( self, service_s, nodeIdentifier, itemIdentifier, notify, profile_key ): - return self._retractItems( + return self._retract_items( service_s, nodeIdentifier, (itemIdentifier,), notify, profile_key ) - def _retractItems( + def _retract_items( self, service_s, nodeIdentifier, itemIdentifiers, notify, profile_key ): - client = 
self.host.getClient(profile_key) - return self.retractItems( + client = self.host.get_client(profile_key) + return self.retract_items( client, jid.JID(service_s) if service_s else None, nodeIdentifier, @@ -1164,7 +1164,7 @@ notify, ) - def retractItems( + def retract_items( self, client: SatXMPPClient, service: jid.JID, @@ -1176,7 +1176,7 @@ service, nodeIdentifier, itemIdentifiers, notify=notify ) - def _renameItem( + def _rename_item( self, service, node, @@ -1184,13 +1184,13 @@ new_id, profile_key=C.PROF_KEY_NONE, ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else None - return defer.ensureDeferred(self.renameItem( + return defer.ensureDeferred(self.rename_item( client, service, node, item_id, new_id )) - async def renameItem( + async def rename_item( self, client: SatXMPPEntity, service: Optional[jid.JID], @@ -1207,12 +1207,12 @@ raise ValueError("item_id and new_id must not be empty") # retract must be done last, so if something goes wrong, the exception will stop # the workflow and no accidental delete should happen - item_elt = (await self.getItems(client, service, node, item_ids=[item_id]))[0][0] - await self.sendItem(client, service, node, item_elt.firstChildElement(), new_id) - await self.retractItems(client, service, node, [item_id]) + item_elt = (await self.get_items(client, service, node, item_ids=[item_id]))[0][0] + await self.send_item(client, service, node, item_elt.firstChildElement(), new_id) + await self.retract_items(client, service, node, [item_id]) def _subscribe(self, service, nodeIdentifier, options, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) d = defer.ensureDeferred( self.subscribe( @@ -1236,7 +1236,7 @@ # TODO: reimplement a subscribtion cache, checking that we have not subscription before trying to subscribe if service is None: service = client.jid.userhostJID() - cont, trigger_sub = await self.host.trigger.asyncReturnPoint( + cont, trigger_sub = await self.host.trigger.async_return_point( "XEP-0060_subscribe", client, service, nodeIdentifier, sub_jid, options, ) if not cont: @@ -1254,7 +1254,7 @@ return subscription def _unsubscribe(self, service, nodeIdentifier, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) return defer.ensureDeferred(self.unsubscribe(client, service, nodeIdentifier)) @@ -1267,7 +1267,7 @@ subscriptionIdentifier: Optional[str] = None, sender: Optional[jid.JID] = None, ) -> None: - if not await self.host.trigger.asyncPoint( + if not await self.host.trigger.async_point( "XEP-0060_unsubscribe", client, service, nodeIdentifier, sub_jid, subscriptionIdentifier, sender ): @@ -1298,7 +1298,7 @@ nodeIdentifier="", profile_key=C.PROF_KEY_NONE ) -> str: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) subs = await self.subscriptions(client, service, nodeIdentifier or None) return data_format.serialise(subs) @@ -1315,7 +1315,7 @@ @param nodeIdentifier(unicode, None): node to check None to get all subscriptions """ - cont, ret = await self.host.trigger.asyncReturnPoint( + cont, ret = await self.host.trigger.async_return_point( "XEP-0060_subscriptions", client, service, node ) if not cont: @@ -1336,7 +1336,7 @@ ## misc tools ## - def 
getNodeURI(self, service, node, item=None): + def get_node_uri(self, service, node, item=None): """Return XMPP URI of a PubSub node @param service(jid.JID): PubSub service @@ -1359,17 +1359,17 @@ # generic # - def getRTResults( + def get_rt_results( self, session_id, on_success=None, on_error=None, profile=C.PROF_KEY_NONE ): - return self.rt_sessions.getResults(session_id, on_success, on_error, profile) + return self.rt_sessions.get_results(session_id, on_success, on_error, profile) - def transItemsData(self, items_data, item_cb=lambda item: item.toXml()): - """Helper method to transform result from [getItems] + def trans_items_data(self, items_data, item_cb=lambda item: item.toXml()): + """Helper method to transform result from [get_items] the items_data must be a tuple(list[domish.Element], dict[unicode, unicode]) - as returned by [getItems]. - @param items_data(tuple): tuple returned by [getItems] + as returned by [get_items]. + @param items_data(tuple): tuple returned by [get_items] @param item_cb(callable): method to transform each item @return (tuple): a serialised form ready to go throught bridge """ @@ -1378,15 +1378,15 @@ return (items, metadata) - def transItemsDataD(self, items_data, item_cb): - """Helper method to transform result from [getItems], deferred version + def trans_items_data_d(self, items_data, item_cb): + """Helper method to transform result from [get_items], deferred version the items_data must be a tuple(list[domish.Element], dict[unicode, unicode]) - as returned by [getItems]. metadata values are then casted to unicode and + as returned by [get_items]. metadata values are then casted to unicode and each item is passed to items_cb. An errback is added to item_cb, and when it is fired the value is filtered from final items - @param items_data(tuple): tuple returned by [getItems] + @param items_data(tuple): tuple returned by [get_items] @param item_cb(callable): method to transform each item (must return a deferred) @return (tuple): a deferred which fire a dict which can be serialised to go throught bridge @@ -1403,7 +1403,7 @@ )) return d - def serDList(self, results, failure_result=None): + def ser_d_list(self, results, failure_result=None): """Serialise a DeferredList result @param results: DeferredList results @@ -1425,19 +1425,19 @@ # subscribe # @utils.ensure_deferred - async def _getNodeSubscriptions( + async def _get_node_subscriptions( self, service: str, node: str, profile_key: str ) -> Dict[str, str]: - client = self.host.getClient(profile_key) - subs = await self.getNodeSubscriptions( + client = self.host.get_client(profile_key) + subs = await self.get_node_subscriptions( client, jid.JID(service) if service else None, node ) return {j.full(): a for j, a in subs.items()} - async def getNodeSubscriptions( + async def get_node_subscriptions( self, client: SatXMPPEntity, service: Optional[jid.JID], @@ -1480,15 +1480,15 @@ ) ) - def _setNodeSubscriptions( + def _set_node_subscriptions( self, service_s, nodeIdentifier, subscriptions, profile_key=C.PROF_KEY_NONE ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) subscriptions = { jid.JID(jid_): subscription for jid_, subscription in subscriptions.items() } - d = self.setNodeSubscriptions( + d = self.set_node_subscriptions( client, jid.JID(service_s) if service_s else None, nodeIdentifier, @@ -1496,7 +1496,7 @@ ) return d - def setNodeSubscriptions(self, client, service, nodeIdentifier, subscriptions): + def set_node_subscriptions(self, client, service, nodeIdentifier, 
subscriptions): """Set or update subscriptions of a node owned by profile @param subscriptions(dict[jid.JID, unicode]): subscriptions to set @@ -1512,7 +1512,7 @@ d = request.send(client.xmlstream) return d - def _manySubscribeRTResult(self, session_id, profile_key=C.PROF_KEY_DEFAULT): + def _many_subscribe_rt_result(self, session_id, profile_key=C.PROF_KEY_DEFAULT): """Get real-time results for subcribeToManu session @param session_id: id of the real-time deferred session @@ -1524,8 +1524,8 @@ - failure(unicode): empty string in case of success, error message else @param profile_key: %(doc_profile_key)s """ - profile = self.host.getClient(profile_key).profile - d = self.rt_sessions.getResults( + profile = self.host.get_client(profile_key).profile + d = self.rt_sessions.get_results( session_id, on_success=lambda result: "", on_error=lambda failure: str(failure.value), @@ -1543,17 +1543,17 @@ ) return d - def _subscribeToMany( + def _subscribe_to_many( self, node_data, subscriber=None, options=None, profile_key=C.PROF_KEY_NONE ): - return self.subscribeToMany( + return self.subscribe_to_many( [(jid.JID(service), str(node)) for service, node in node_data], jid.JID(subscriber), options, profile_key, ) - def subscribeToMany( + def subscribe_to_many( self, node_data, subscriber, options=None, profile_key=C.PROF_KEY_NONE ): """Subscribe to several nodes at once. @@ -1566,7 +1566,7 @@ @param profile_key (str): %(doc_profile_key)s @return (str): RT Deferred session id """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) deferreds = {} for service, node in node_data: deferreds[(service, node)] = defer.ensureDeferred( @@ -1574,7 +1574,7 @@ service, node, subscriber, options=options ) ) - return self.rt_sessions.newSession(deferreds, client.profile) + return self.rt_sessions.new_session(deferreds, client.profile) # found_nodes = yield self.listNodes(service, profile=client.profile) # subscribed_nodes = yield self.listSubscribedNodes(service, profile=client.profile) # d_list = [] @@ -1587,8 +1587,8 @@ # get # - def _getFromManyRTResult(self, session_id, profile_key=C.PROF_KEY_DEFAULT): - """Get real-time results for getFromMany session + def _get_from_many_rt_result(self, session_id, profile_key=C.PROF_KEY_DEFAULT): + """Get real-time results for get_from_many session @param session_id: id of the real-time deferred session @param profile_key: %(doc_profile_key)s @@ -1601,10 +1601,10 @@ - items (list[s]): raw XML of items - metadata(dict): serialised metadata """ - profile = self.host.getClient(profile_key).profile - d = self.rt_sessions.getResults( + profile = self.host.get_client(profile_key).profile + d = self.rt_sessions.get_results( session_id, - on_success=lambda result: ("", self.transItemsData(result)), + on_success=lambda result: ("", self.trans_items_data(result)), on_error=lambda failure: (str(failure.value) or UNSPECIFIED, ([], {})), profile=profile, ) @@ -1621,15 +1621,15 @@ ) return d - def _getFromMany( + def _get_from_many( self, node_data, max_item=10, extra="", profile_key=C.PROF_KEY_NONE ): """ @param max_item(int): maximum number of item to get, C.NO_LIMIT for no limit """ max_item = None if max_item == C.NO_LIMIT else max_item - extra = self.parseExtra(data_format.deserialise(extra)) - return self.getFromMany( + extra = self.parse_extra(data_format.deserialise(extra)) + return self.get_from_many( [(jid.JID(service), str(node)) for service, node in node_data], max_item, extra.rsm_request, @@ -1637,7 +1637,7 @@ profile_key, ) - def 
getFromMany(self, node_data, max_item=None, rsm_request=None, extra=None, + def get_from_many(self, node_data, max_item=None, rsm_request=None, extra=None, profile_key=C.PROF_KEY_NONE): """Get items from many nodes at once @@ -1649,13 +1649,13 @@ @param profile_key (unicode): %(doc_profile_key)s @return (str): RT Deferred session id """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) deferreds = {} for service, node in node_data: - deferreds[(service, node)] = defer.ensureDeferred(self.getItems( + deferreds[(service, node)] = defer.ensureDeferred(self.get_items( client, service, node, max_item, rsm_request=rsm_request, extra=extra )) - return self.rt_sessions.newSession(deferreds, client.profile) + return self.rt_sessions.new_session(deferreds, client.profile) @implementer(disco.IDisco) @@ -1689,13 +1689,13 @@ ) # items must be returned, thus this async point can't stop the workflow (but it # can modify returned items) - await self.host.trigger.asyncPoint( + await self.host.trigger.async_point( "XEP-0060_items", self.parent, service, nodeIdentifier, items, rsm_response, extra ) return items, rsm_response - def _getNodeCallbacks(self, node, event): + def _get_node_callbacks(self, node, event): """Generate callbacks from given node and event @param node(unicode): node used for the item @@ -1712,16 +1712,16 @@ except KeyError: continue - async def _callNodeCallbacks(self, client, event: pubsub.ItemsEvent) -> None: + async def _call_node_callbacks(self, client, event: pubsub.ItemsEvent) -> None: """Call sequencially event callbacks of a node Callbacks are called sequencially and not in parallel to be sure to respect priority (notably for plugin needing to get old items before they are modified or deleted from cache). 
""" - for callback in self._getNodeCallbacks(event.nodeIdentifier, C.PS_ITEMS): + for callback in self._get_node_callbacks(event.nodeIdentifier, C.PS_ITEMS): try: - await utils.asDeferred(callback, client, event) + await utils.as_deferred(callback, client, event) except Exception as e: log.error( f"Error while running items event callback {callback}: {e}" @@ -1730,10 +1730,10 @@ def itemsReceived(self, event): log.debug("Pubsub items received") client = self.parent - defer.ensureDeferred(self._callNodeCallbacks(client, event)) + defer.ensureDeferred(self._call_node_callbacks(client, event)) if (event.sender, event.nodeIdentifier) in client.pubsub_watching: raw_items = [i.toXml() for i in event.items] - self.host.bridge.psEventRaw( + self.host.bridge.ps_event_raw( event.sender.full(), event.nodeIdentifier, C.PS_ITEMS, @@ -1743,27 +1743,27 @@ def deleteReceived(self, event): log.debug(("Publish node deleted")) - for callback in self._getNodeCallbacks(event.nodeIdentifier, C.PS_DELETE): - d = utils.asDeferred(callback, self.parent, event) + for callback in self._get_node_callbacks(event.nodeIdentifier, C.PS_DELETE): + d = utils.as_deferred(callback, self.parent, event) d.addErrback(lambda f: log.error( f"Error while running delete event callback {callback}: {f}" )) client = self.parent if (event.sender, event.nodeIdentifier) in client.pubsub_watching: - self.host.bridge.psEventRaw( + self.host.bridge.ps_event_raw( event.sender.full(), event.nodeIdentifier, C.PS_DELETE, [], client.profile ) def purgeReceived(self, event): log.debug(("Publish node purged")) - for callback in self._getNodeCallbacks(event.nodeIdentifier, C.PS_PURGE): - d = utils.asDeferred(callback, self.parent, event) + for callback in self._get_node_callbacks(event.nodeIdentifier, C.PS_PURGE): + d = utils.as_deferred(callback, self.parent, event) d.addErrback(lambda f: log.error( f"Error while running purge event callback {callback}: {f}" )) client = self.parent if (event.sender, event.nodeIdentifier) in client.pubsub_watching: - self.host.bridge.psEventRaw( + self.host.bridge.ps_event_raw( event.sender.full(), event.nodeIdentifier, C.PS_PURGE, [], client.profile ) @@ -1798,7 +1798,7 @@ return d.addCallback(cb) - def purgeNode(self, service, nodeIdentifier): + def purge_node(self, service, nodeIdentifier): """Purge a node (i.e. delete all items from it) @param service(jid.JID, None): service to send the item to
--- a/sat/plugins/plugin_xep_0065.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0065.py Sat Apr 08 13:54:42 2023 +0200 @@ -186,7 +186,7 @@ self.id = id_ if id_ is not None else str(uuid.uuid4()) if priority_local: self._local_priority = int(priority) - self._priority = self.calculatePriority() + self._priority = self.calculate_priority() else: self._local_priority = 0 self._priority = int(priority) @@ -231,7 +231,7 @@ def __ne__(self, other): return not self.__eq__(other) - def calculatePriority(self): + def calculate_priority(self): """Calculate candidate priority according to XEP-0260 §2.2 @@ -254,7 +254,7 @@ Send activation request as explained in XEP-0065 § 6.3.5 Must only be used with proxy candidates - @param sid(unicode): session id (same as for getSessionHash) + @param sid(unicode): session id (same as for get_session_hash) @param peer_jid(jid.JID): jid of the other peer @return (D(domish.Element)): IQ result (or error) """ @@ -267,15 +267,15 @@ query_elt.addElement("activate", content=peer_jid.full()) return iq_elt.send() - def startTransfer(self, session_hash=None): + def start_transfer(self, session_hash=None): if self.type == XEP_0065.TYPE_PROXY: chunk_size = 4096 # Prosody's proxy reject bigger chunks by default else: chunk_size = None - self.factory.startTransfer(session_hash, chunk_size=chunk_size) + self.factory.start_transfer(session_hash, chunk_size=chunk_size) -def getSessionHash(requester_jid, target_jid, sid): +def get_session_hash(requester_jid, target_jid, sid): """Calculate SHA1 Hash according to XEP-0065 §5.3.2 @param requester_jid(jid.JID): jid of the requester (the one which activate the proxy) @@ -334,12 +334,12 @@ else: return self.factory.getSession() - def _startNegotiation(self): + def _start_negotiation(self): log.debug("starting negotiation (client mode)") self.state = STATE_CLIENT_AUTH self.transport.write(struct.pack("!3B", SOCKS5_VER, 1, AUTHMECH_ANON)) - def _parseNegotiation(self): + def _parse_negotiation(self): try: # Parse out data ver, nmethod = struct.unpack("!BB", self.buf[:2]) @@ -373,7 +373,7 @@ except struct.error: pass - def _parseUserPass(self): + def _parse_user_pass(self): try: # Parse out data ver, ulen = struct.unpack("BB", self.buf[:2]) @@ -383,7 +383,7 @@ # Trim off fron of the buffer self.buf = self.buf[3 + ulen + plen :] # Fire event to authenticate user - if self.authenticateUserPass(uname, password): + if self.authenticate_user_pass(uname, password): # Signal success self.state = STATE_REQUEST self.transport.write(struct.pack("!BB", SOCKS5_VER, 0x00)) @@ -394,7 +394,7 @@ except struct.error: pass - def sendErrorReply(self, errorcode): + def send_error_reply(self, errorcode): # Any other address types are not supported result = struct.pack("!BBBBIH", SOCKS5_VER, errorcode, 0, 1, 0, 0) self.transport.write(result) @@ -407,7 +407,7 @@ # Ensure we actually support the requested address type if self.addressType not in self.supportedAddrs: - self.sendErrorReply(REPLY_ADDR_NOT_SUPPORTED) + self.send_error_reply(REPLY_ADDR_NOT_SUPPORTED) return # Deal with addresses @@ -420,29 +420,29 @@ self.buf = self.buf[7 + len(addr) :] else: # Any other address types are not supported - self.sendErrorReply(REPLY_ADDR_NOT_SUPPORTED) + self.send_error_reply(REPLY_ADDR_NOT_SUPPORTED) return # Ensure command is supported if cmd not in self.enabledCommands: # Send a not supported error - self.sendErrorReply(REPLY_CMD_NOT_SUPPORTED) + self.send_error_reply(REPLY_CMD_NOT_SUPPORTED) return # Process the command if cmd == CMD_CONNECT: - 
self.connectRequested(addr, port) + self.connect_requested(addr, port) elif cmd == CMD_BIND: - self.bindRequested(addr, port) + self.bind_requested(addr, port) else: # Any other command is not supported - self.sendErrorReply(REPLY_CMD_NOT_SUPPORTED) + self.send_error_reply(REPLY_CMD_NOT_SUPPORTED) except struct.error: # The buffer is probably not complete, we need to wait more return None - def _makeRequest(self): + def _make_request(self): hash_ = self._session_hash.encode('utf-8') request = struct.pack( "!5B%dsH" % len(hash_), @@ -457,12 +457,12 @@ self.transport.write(request) self.state = STATE_CLIENT_REQUEST - def _parseRequestReply(self): + def _parse_request_reply(self): try: ver, rep, rsvd, self.addressType = struct.unpack("!BBBB", self.buf[:4]) # Ensure we actually support the requested address type if self.addressType not in self.supportedAddrs: - self.sendErrorReply(REPLY_ADDR_NOT_SUPPORTED) + self.send_error_reply(REPLY_ADDR_NOT_SUPPORTED) return # Deal with addresses @@ -475,7 +475,7 @@ self.buf = self.buf[7 + len(addr) :] else: # Any other address types are not supported - self.sendErrorReply(REPLY_ADDR_NOT_SUPPORTED) + self.send_error_reply(REPLY_ADDR_NOT_SUPPORTED) return # Ensure reply is OK @@ -497,22 +497,22 @@ ) ) if self.state == STATE_CLIENT_INITIAL: - self._startNegotiation() + self._start_negotiation() - def connectRequested(self, addr, port): + def connect_requested(self, addr, port): # Check that this session is expected - if not self.factory.addToSession(addr.decode('utf-8'), self): + if not self.factory.add_to_session(addr.decode('utf-8'), self): log.warning( "Unexpected connection request received from {host}".format( host=self.transport.getPeer().host ) ) - self.sendErrorReply(REPLY_CONN_REFUSED) + self.send_error_reply(REPLY_CONN_REFUSED) return self._session_hash = addr.decode('utf-8') - self.connectCompleted(addr, 0) + self.connect_completed(addr, 0) - def startTransfer(self, chunk_size): + def start_transfer(self, chunk_size): """Callback called when the result iq is received @param chunk_size(None, int): size of the buffer, or None for default @@ -521,14 +521,14 @@ if chunk_size is not None: self.CHUNK_SIZE = chunk_size log.debug("Starting file transfer") - d = self.stream_object.startStream(self.transport) - d.addCallback(self.streamFinished) + d = self.stream_object.start_stream(self.transport) + d.addCallback(self.stream_finished) - def streamFinished(self, d): + def stream_finished(self, d): log.info(_("File transfer completed, closing connection")) self.transport.loseConnection() - def connectCompleted(self, remotehost, remoteport): + def connect_completed(self, remotehost, remoteport): if self.addressType == ADDR_IPV4: result = struct.pack( "!BBBBIH", SOCKS5_VER, REPLY_SUCCESS, 0, 1, remotehost, remoteport @@ -547,10 +547,10 @@ self.transport.write(result) self.state = STATE_READY - def bindRequested(self, addr, port): + def bind_requested(self, addr, port): pass - def authenticateUserPass(self, user, passwd): + def authenticate_user_pass(self, user, passwd): # FIXME: implement authentication and remove the debug printing a password log.debug("User/pass: %s/%s" % (user, passwd)) return True @@ -566,20 +566,20 @@ self.buf = self.buf + buf if self.state == STATE_INITIAL: - self._parseNegotiation() + self._parse_negotiation() if self.state == STATE_AUTH_USERPASS: - self._parseUserPass() + self._parse_user_pass() if self.state == STATE_REQUEST: self._parseRequest() if self.state == STATE_CLIENT_REQUEST: - self._parseRequestReply() + 
self._parse_request_reply() if self.state == STATE_CLIENT_AUTH: ver, method = struct.unpack("!BB", buf) self.buf = self.buf[2:] if ver != SOCKS5_VER or method != AUTHMECH_ANON: self.transport.loseConnection() else: - self._makeRequest() + self._make_request() def connectionLost(self, reason): log.debug("Socks5 connection lost: {}".format(reason.value)) @@ -591,7 +591,7 @@ except AttributeError: log.debug("no session has been received yet") else: - self.factory.removeFromSession(session_hash, self, reason) + self.factory.remove_from_session(session_hash, self, reason) class Socks5ServerFactory(protocol.ServerFactory): @@ -606,7 +606,7 @@ def getSession(self, session_hash): return self.parent.getSession(None, session_hash) - def startTransfer(self, session_hash, chunk_size=None): + def start_transfer(self, session_hash, chunk_size=None): session = self.getSession(session_hash) try: protocol = session["protocols"][0] @@ -614,9 +614,9 @@ log.error("Can't start file transfer, can't find protocol") else: session[TIMER_KEY].cancel() - protocol.startTransfer(chunk_size) + protocol.start_transfer(chunk_size) - def addToSession(self, session_hash, protocol): + def add_to_session(self, session_hash, protocol): """Check is session_hash is valid, and associate protocol with it the session will be associated to the corresponding candidate @@ -633,7 +633,7 @@ session_data.setdefault("protocols", []).append(protocol) return True - def removeFromSession(self, session_hash, protocol, reason): + def remove_from_session(self, session_hash, protocol, reason): """Remove a protocol from session_data There can be several protocol instances while candidates are tried, they @@ -683,9 +683,9 @@ def getSession(self): return self.session - def startTransfer(self, __=None, chunk_size=None): + def start_transfer(self, __=None, chunk_size=None): self.session[TIMER_KEY].cancel() - self._protocol_instance.startTransfer(chunk_size) + self._protocol_instance.start_transfer(chunk_size) def clientConnectionFailed(self, connector, reason): log.debug("Connection failed") @@ -741,19 +741,19 @@ # parameters # XXX: params are not used for now, but they may be used in the futur to force proxy/IP - # host.memory.updateParams(PARAMS) + # host.memory.update_params(PARAMS) - def getHandler(self, client): + def get_handler(self, client): return XEP_0065_handler(self) - def profileConnected(self, client): + def profile_connected(self, client): client.xep_0065_sid_session = {} # key: stream_id, value: session_data(dict) client._s5b_sessions = {} - def getSessionHash(self, from_jid, to_jid, sid): - return getSessionHash(from_jid, to_jid, sid) + def get_session_hash(self, from_jid, to_jid, sid): + return get_session_hash(from_jid, to_jid, sid) - def getSocks5ServerFactory(self): + def get_socks_5_server_factory(self): """Return server factory The server is created if it doesn't exists yet @@ -785,11 +785,11 @@ return self._server_factory @defer.inlineCallbacks - def getProxy(self, client, local_jid): + def get_proxy(self, client, local_jid): """Return the proxy available for this profile cache is used between clients using the same server - @param local_jid(jid.JID): same as for [getCandidates] + @param local_jid(jid.JID): same as for [get_candidates] @return ((D)(ProxyInfos, None)): Found proxy infos, or None if not acceptable proxy is found @raise exceptions.NotFound: no Proxy found @@ -807,7 +807,7 @@ pass try: proxy = ( - yield self.host.findServiceEntities(client, "proxy", "bytestreams") + yield self.host.find_service_entities(client, 
"proxy", "bytestreams") ).pop() except (defer.CancelledError, StopIteration, KeyError): notFound(server) @@ -844,16 +844,16 @@ defer.returnValue(proxy_infos) @defer.inlineCallbacks - def _getNetworkData(self, client): + def _get_network_data(self, client): """Retrieve information about network @param client: %(doc_client)s @return (D(tuple[local_port, external_port, local_ips, external_ip])): network data """ - self.getSocks5ServerFactory() + self.get_socks_5_server_factory() local_port = self._server_factory_port - external_ip = yield self._ip.getExternalIP(client) - local_ips = yield self._ip.getLocalIPs(client) + external_ip = yield self._ip.get_external_ip(client) + local_ips = yield self._ip.get_local_i_ps(client) if external_ip is not None and self._external_port is None: if external_ip != local_ips[0]: @@ -861,7 +861,7 @@ if self._np is None: log.warning("NAT port plugin not available, we can't map port") else: - ext_port = yield self._np.mapPort( + ext_port = yield self._np.map_port( local_port, desc="SaT socks5 stream" ) if ext_port is None: @@ -872,7 +872,7 @@ defer.returnValue((local_port, self._external_port, local_ips, external_ip)) @defer.inlineCallbacks - def getCandidates(self, client, local_jid): + def get_candidates(self, client, local_jid): """Return a list of our stream candidates @param local_jid(jid.JID): jid to use as local jid @@ -881,10 +881,10 @@ client.jid would be file.example.net) @return (D(list[Candidate])): list of candidates, ordered by priority """ - server_factory = yield self.getSocks5ServerFactory() - local_port, ext_port, local_ips, external_ip = yield self._getNetworkData(client) + server_factory = yield self.get_socks_5_server_factory() + local_port, ext_port, local_ips, external_ip = yield self._get_network_data(client) try: - proxy = yield self.getProxy(client, local_jid) + proxy = yield self.get_proxy(client, local_jid) except exceptions.NotFound: proxy = None @@ -950,7 +950,7 @@ candidates.sort(key=lambda c: c.priority, reverse=True) defer.returnValue(candidates) - def _addConnector(self, connector, candidate): + def _add_connector(self, connector, candidate): """Add connector used to connect to candidate, and return client factory's connection Deferred the connector can be used to disconnect the candidate, and returning the factory's connection Deferred allow to wait for connection completion @@ -961,7 +961,7 @@ candidate.factory.connector = connector return candidate.factory.connection - def connectCandidate( + def connect_candidate( self, client, candidate, session_hash, peer_session_hash=None, delay=None ): """Connect to a candidate @@ -975,7 +975,7 @@ None must be used in 2 cases: - when XEP-0065 is used with XEP-0096 - when a peer connect to a proxy *he proposed himself* - in practice, peer_session_hash is only used by tryCandidates + in practice, peer_session_hash is only used by try_candidates @param delay(None, float): optional delay to wait before connection, in seconds @return (D): Deferred launched when TCP connection + Socks5 connection is done """ @@ -990,10 +990,10 @@ else: d = sat_defer.DelayedDeferred(delay, candidate.host) d.addCallback(reactor.connectTCP, candidate.port, factory) - d.addCallback(self._addConnector, candidate) + d.addCallback(self._add_connector, candidate) return d - def tryCandidates( + def try_candidates( self, client, candidates, @@ -1008,7 +1008,7 @@ delay = CANDIDATE_DELAY * len(defers_list) if candidate.type == XEP_0065.TYPE_PROXY: delay += CANDIDATE_DELAY_PROXY - d = self.connectCandidate( + d = 
self.connect_candidate( client, candidate, session_hash, peer_session_hash, delay ) if connection_cb is not None: @@ -1023,7 +1023,7 @@ return defers_list - def getBestCandidate(self, client, candidates, session_hash, peer_session_hash=None): + def get_best_candidate(self, client, candidates, session_hash, peer_session_hash=None): """Get best candidate (according to priority) which can connect @param candidates(iterable[Candidate]): candidates to test @@ -1035,7 +1035,7 @@ """ defer_candidates = None - def connectionCb(client, candidate): + def connection_cb(client, candidate): log.info("Connection of {} successful".format(str(candidate))) for idx, other_candidate in enumerate(candidates): try: @@ -1045,7 +1045,7 @@ except AttributeError: assert other_candidate is None - def connectionEb(failure, client, candidate): + def connection_eb(failure, client, candidate): if failure.check(defer.CancelledError): log.debug("Connection of {} has been cancelled".format(candidate)) else: @@ -1056,37 +1056,37 @@ ) candidates[candidates.index(candidate)] = None - def allTested(__): + def all_tested(__): log.debug("All candidates have been tested") good_candidates = [c for c in candidates if c] return good_candidates[0] if good_candidates else None - defer_candidates = self.tryCandidates( + defer_candidates = self.try_candidates( client, candidates, session_hash, peer_session_hash, - connectionCb, - connectionEb, + connection_cb, + connection_eb, ) d_list = defer.DeferredList(defer_candidates) - d_list.addCallback(allTested) + d_list.addCallback(all_tested) return d_list - def _timeOut(self, session_hash, client): + def _time_out(self, session_hash, client): """Called when stream was not started quickly enough - @param session_hash(str): hash as returned by getSessionHash + @param session_hash(str): hash as returned by get_session_hash @param client: %(doc_client)s """ log.info("Socks5 Bytestream: TimeOut reached") session = self.getSession(client, session_hash) session[DEFER_KEY].errback(exceptions.TimeOutError()) - def killSession(self, failure_, session_hash, sid, client): + def kill_session(self, failure_, session_hash, sid, client): """Clean the current session - @param session_hash(str): hash as returned by getSessionHash + @param session_hash(str): hash as returned by get_session_hash @param sid(None, unicode): session id or None if self.xep_0065_sid_session was not used @param client: %(doc_client)s @@ -1128,23 +1128,23 @@ return failure_ - def startStream(self, client, stream_object, local_jid, to_jid, sid): + def start_stream(self, client, stream_object, local_jid, to_jid, sid): """Launch the stream workflow @param streamProducer: stream_object to use - @param local_jid(jid.JID): same as for [getCandidates] + @param local_jid(jid.JID): same as for [get_candidates] @param to_jid: JID of the recipient @param sid: Stream session id @param successCb: method to call when stream successfuly finished @param failureCb: method to call when something goes wrong @return (D): Deferred fired when session is finished """ - session_data = self._createSession( + session_data = self._create_session( client, stream_object, local_jid, to_jid, sid, True) session_data[client] = client - def gotCandidates(candidates): + def got_candidates(candidates): session_data["candidates"] = candidates iq_elt = client.IQ() iq_elt["from"] = local_jid.full() @@ -1162,12 +1162,12 @@ d = iq_elt.send() args = [client, session_data, local_jid] - d.addCallbacks(self._IQNegotiationCb, self._IQNegotiationEb, args, None, args) + 
d.addCallbacks(self._iq_negotiation_cb, self._iq_negotiation_eb, args, None, args) - self.getCandidates(client, local_jid).addCallback(gotCandidates) + self.get_candidates(client, local_jid).addCallback(got_candidates) return session_data[DEFER_KEY] - def _IQNegotiationCb(self, iq_elt, client, session_data, local_jid): + def _iq_negotiation_cb(self, iq_elt, client, session_data, local_jid): """Called when the result of open iq is received @param session_data(dict): data of the session @@ -1197,33 +1197,33 @@ if candidate.type == XEP_0065.TYPE_PROXY: log.info("A Socks5 proxy is used") - d = self.connectCandidate(client, candidate, session_data["hash"]) + d = self.connect_candidate(client, candidate, session_data["hash"]) d.addCallback( lambda __: candidate.activate( client, session_data["id"], session_data["peer_jid"], local_jid ) ) - d.addErrback(self._activationEb) + d.addErrback(self._activation_eb) else: d = defer.succeed(None) - d.addCallback(lambda __: candidate.startTransfer(session_data["hash"])) + d.addCallback(lambda __: candidate.start_transfer(session_data["hash"])) - def _activationEb(self, failure): + def _activation_eb(self, failure): log.warning("Proxy activation error: {}".format(failure.value)) - def _IQNegotiationEb(self, stanza_err, client, session_data, local_jid): + def _iq_negotiation_eb(self, stanza_err, client, session_data, local_jid): log.warning("Socks5 transfer failed: {}".format(stanza_err.value)) # FIXME: must clean session - def createSession(self, *args, **kwargs): - """like [_createSession] but return the session deferred instead of the whole session + def create_session(self, *args, **kwargs): + """like [_create_session] but return the session deferred instead of the whole session session deferred is fired when transfer is finished """ - return self._createSession(*args, **kwargs)[DEFER_KEY] + return self._create_session(*args, **kwargs)[DEFER_KEY] - def _createSession(self, client, stream_object, local_jid, to_jid, sid, + def _create_session(self, client, stream_object, local_jid, to_jid, sid, requester=False): """Called when a bytestream is imminent @@ -1237,16 +1237,16 @@ if sid in client.xep_0065_sid_session: raise exceptions.ConflictError("A session with this id already exists !") if requester: - session_hash = getSessionHash(local_jid, to_jid, sid) - session_data = self._registerHash(client, session_hash, stream_object) + session_hash = get_session_hash(local_jid, to_jid, sid) + session_data = self._register_hash(client, session_hash, stream_object) else: - session_hash = getSessionHash(to_jid, local_jid, sid) + session_hash = get_session_hash(to_jid, local_jid, sid) session_d = defer.Deferred() - session_d.addBoth(self.killSession, session_hash, sid, client) + session_d.addBoth(self.kill_session, session_hash, sid, client) session_data = client._s5b_sessions[session_hash] = { DEFER_KEY: session_d, TIMER_KEY: reactor.callLater( - TIMEOUT, self._timeOut, session_hash, client + TIMEOUT, self._time_out, session_hash, client ), } client.xep_0065_sid_session[sid] = session_data @@ -1283,13 +1283,13 @@ raise e return client._s5b_sessions[session_hash] - def registerHash(self, *args, **kwargs): - """like [_registerHash] but return the session deferred instead of the whole session + def register_hash(self, *args, **kwargs): + """like [_register_hash] but return the session deferred instead of the whole session session deferred is fired when transfer is finished """ - return self._registerHash(*args, **kwargs)[DEFER_KEY] + return self._register_hash(*args, 
**kwargs)[DEFER_KEY] - def _registerHash(self, client, session_hash, stream_object): + def _register_hash(self, client, session_hash, stream_object): """Create a session_data associated to hash @param session_hash(str): hash of the session @@ -1299,10 +1299,10 @@ """ assert session_hash not in client._s5b_sessions session_d = defer.Deferred() - session_d.addBoth(self.killSession, session_hash, None, client) + session_d.addBoth(self.kill_session, session_hash, None, client) session_data = client._s5b_sessions[session_hash] = { DEFER_KEY: session_d, - TIMER_KEY: reactor.callLater(TIMEOUT, self._timeOut, session_hash, client), + TIMER_KEY: reactor.callLater(TIMEOUT, self._time_out, session_hash, client), } if stream_object is not None: @@ -1313,13 +1313,13 @@ return session_data - def associateStreamObject(self, client, session_hash, stream_object): + def associate_stream_object(self, client, session_hash, stream_object): """Associate a stream object with a session""" session_data = self.getSession(client, session_hash) assert "stream_object" not in session_data session_data["stream_object"] = stream_object - def streamQuery(self, iq_elt, client): + def stream_query(self, iq_elt, client): log.debug("BS stream query") iq_elt.handled = True @@ -1361,10 +1361,10 @@ for candidate in candidates: log.info("Candidate proposed: {}".format(candidate)) - d = self.getBestCandidate(client, candidates, session_data["hash"]) - d.addCallback(self._ackStream, iq_elt, session_data, client) + d = self.get_best_candidate(client, candidates, session_data["hash"]) + d.addCallback(self._ack_stream, iq_elt, session_data, client) - def _ackStream(self, candidate, iq_elt, session_data, client): + def _ack_stream(self, candidate, iq_elt, session_data, client): if candidate is None: log.info("No streamhost candidate worked, we have to end negotiation") return client.sendError(iq_elt, "item-not-found") @@ -1386,7 +1386,7 @@ def connectionInitialized(self): self.xmlstream.addObserver( - BS_REQUEST, self.plugin_parent.streamQuery, client=self.parent + BS_REQUEST, self.plugin_parent.stream_query, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""):
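
The renames in plugin_xep_0065.py leave the SOCKS5 logic untouched; as a reference point, here is a minimal standalone sketch of the DST.ADDR hash described in XEP-0065 §5.3.2, with plain-string JIDs and example values instead of the plugin's jid.JID objects (an illustration, not the plugin's get_session_hash)::

    import hashlib


    def session_hash(requester_jid: str, target_jid: str, sid: str) -> str:
        """Return the SHA-1 hex digest of SID + requester JID + target JID."""
        return hashlib.sha1(
            (sid + requester_jid + target_jid).encode("utf-8")
        ).hexdigest()


    print(session_hash("louise@example.org/sat", "pierre@example.net/sat", "stream-1"))
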
--- a/sat/plugins/plugin_xep_0070.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0070.py Sat Apr 08 13:54:42 2023 +0200 @@ -67,28 +67,28 @@ self.host = host self._dictRequest = dict() - def getHandler(self, client): + def get_handler(self, client): return XEP_0070_handler(self, client.profile) - def onHttpAuthRequestIQ(self, iq_elt, client): + def on_http_auth_request_iq(self, iq_elt, client): """This method is called on confirmation request received (XEP-0070 #4.5) @param iq_elt: IQ element @param client: %(doc_client)s """ log.info(_("XEP-0070 Verifying HTTP Requests via XMPP (iq)")) - self._treatHttpAuthRequest(iq_elt, IQ, client) + self._treat_http_auth_request(iq_elt, IQ, client) - def onHttpAuthRequestMsg(self, msg_elt, client): + def on_http_auth_request_msg(self, msg_elt, client): """This method is called on confirmation request received (XEP-0070 #4.5) @param msg_elt: message element @param client: %(doc_client)s """ log.info(_("XEP-0070 Verifying HTTP Requests via XMPP (message)")) - self._treatHttpAuthRequest(msg_elt, MSG, client) + self._treat_http_auth_request(msg_elt, MSG, client) - def _treatHttpAuthRequest(self, elt, stanzaType, client): + def _treat_http_auth_request(self, elt, stanzaType, client): elt.handled = True auth_elt = next(elt.elements(NS_HTTP_AUTH, "confirm")) auth_id = auth_elt["id"] @@ -100,11 +100,11 @@ "Validation code : {auth_id}\n\n" "Please check that this code is the same as on {auth_url}" ).format(auth_url=auth_url, auth_id=auth_id) - d = xml_tools.deferConfirm(self.host, message=message, title=title, + d = xml_tools.defer_confirm(self.host, message=message, title=title, profile=client.profile) - d.addCallback(self._authRequestCallback, client) + d.addCallback(self._auth_request_callback, client) - def _authRequestCallback(self, authorized, client): + def _auth_request_callback(self, authorized, client): try: auth_id, auth_method, auth_url, stanzaType, elt = self._dictRequest.pop( client) @@ -140,12 +140,12 @@ def connectionInitialized(self): self.xmlstream.addObserver( IQ_HTTP_AUTH_REQUEST, - self.plugin_parent.onHttpAuthRequestIQ, + self.plugin_parent.on_http_auth_request_iq, client=self.parent, ) self.xmlstream.addObserver( MSG_HTTP_AUTH_REQUEST, - self.plugin_parent.onHttpAuthRequestMsg, + self.plugin_parent.on_http_auth_request_msg, client=self.parent, )
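
To make the confirmation flow above concrete, a simplified sketch of the XEP-0070 <confirm/> payload and the id comparison shown to the user; the element and attribute names follow the XEP, while the surrounding logic and example values are assumptions, not the plugin's actual flow::

    import xml.etree.ElementTree as ET

    NS_HTTP_AUTH = "http://jabber.org/protocol/http-auth"

    received = ET.fromstring(
        f'<confirm xmlns="{NS_HTTP_AUTH}" id="a7374jnjlalasdf82" '
        'method="GET" url="https://files.example.org/missive.html"/>'
    )

    # The user is asked to check that this id matches the one displayed on the web page.
    pending_transaction_id = "a7374jnjlalasdf82"
    authorized = received.get("id") == pending_transaction_id
    print("authorized" if authorized else "denied")
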
--- a/sat/plugins/plugin_xep_0071.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0071.py Sat Apr 08 13:54:42 2023 +0200 @@ -94,19 +94,19 @@ log.info(_("XHTML-IM plugin initialization")) self.host = host self._s = self.host.plugins["TEXT_SYNTAXES"] - self._s.addSyntax( + self._s.add_syntax( self.SYNTAX_XHTML_IM, lambda xhtml: xhtml, self.XHTML2XHTML_IM, [self._s.OPT_HIDDEN], ) - host.trigger.add("messageReceived", self.messageReceivedTrigger) - host.trigger.add("sendMessage", self.sendMessageTrigger) + host.trigger.add("messageReceived", self.message_received_trigger) + host.trigger.add("sendMessage", self.send_message_trigger) - def getHandler(self, client): + def get_handler(self, client): return XEP_0071_handler(self) - def _messagePostTreat(self, data, message_elt, body_elts, client): + def _message_post_treat(self, data, message_elt, body_elts, client): """Callback which manage the post treatment of the message in case of XHTML-IM found @param data: data send by messageReceived trigger through post_treat deferred @@ -155,7 +155,7 @@ d.addCallback(self._fill_body_text, data, lang) defers.append(d) - def _sendMessageAddRich(self, data, client): + def _send_message_add_rich(self, data, client): """ Construct XHTML-IM node and add it XML element @param data: message data as sended by sendMessage callback @@ -174,18 +174,18 @@ data["extra"]["xhtml"] = xhtml_im body_elt.addRawXml(xhtml_im) - syntax = self._s.getCurrentSyntax(client.profile) + syntax = self._s.get_current_syntax(client.profile) defers = [] if "xhtml" in data["extra"]: # we have directly XHTML - for lang, xhtml in data_format.getSubDict("xhtml", data["extra"]): + for lang, xhtml in data_format.get_sub_dict("xhtml", data["extra"]): self._check_body_text(data, lang, xhtml, self._s.SYNTAX_XHTML, defers) d = self._s.convert(xhtml, self._s.SYNTAX_XHTML, self.SYNTAX_XHTML_IM) d.addCallback(syntax_converted, lang) defers.append(d) elif "rich" in data["extra"]: # we have rich syntax to convert - for lang, rich_data in data_format.getSubDict("rich", data["extra"]): + for lang, rich_data in data_format.get_sub_dict("rich", data["extra"]): self._check_body_text(data, lang, rich_data, syntax, defers) d = self._s.convert(rich_data, syntax, self.SYNTAX_XHTML_IM) d.addCallback(syntax_converted, lang) @@ -196,7 +196,7 @@ d_list.addCallback(lambda __: data) return d_list - def messageReceivedTrigger(self, client, message, post_treat): + def message_received_trigger(self, client, message, post_treat): """ Check presence of XHTML-IM in message """ try: @@ -206,10 +206,10 @@ pass else: body_elts = html_elt.elements(NS_XHTML, "body") - post_treat.addCallback(self._messagePostTreat, message, body_elts, client) + post_treat.addCallback(self._message_post_treat, message, body_elts, client) return True - def sendMessageTrigger(self, client, data, pre_xml_treatments, post_xml_treatments): + def send_message_trigger(self, client, data, pre_xml_treatments, post_xml_treatments): """ Check presence of rich text in extra """ rich = {} xhtml = {} @@ -227,10 +227,10 @@ data["rich"] = rich else: data["xhtml"] = xhtml - post_xml_treatments.addCallback(self._sendMessageAddRich, client) + post_xml_treatments.addCallback(self._send_message_add_rich, client) return True - def _purgeStyle(self, styles_raw): + def _purge_style(self, styles_raw): """ Remove unauthorised styles according to the XEP-0071 @param styles_raw: raw styles (value of the style attribute) """ @@ -277,7 +277,7 @@ for att in att_to_remove: del (attrib[att]) if "style" in attrib: - 
attrib["style"] = self._purgeStyle(attrib["style"]) + attrib["style"] = self._purge_style(attrib["style"]) for elem in to_strip: if elem.tag in blacklist:
--- a/sat/plugins/plugin_xep_0077.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0077.py Sat Apr 08 13:54:42 2023 +0200 @@ -77,15 +77,15 @@ def register(self, xmlstream): log.debug(_("Stream started with {server}, now registering" .format(server=self.jid.host))) - iq = XEP_0077.buildRegisterIQ(self.xmlstream, self.jid, self.password, self.email) - d = iq.send(self.jid.host).addCallbacks(self.registrationCb, self.registrationEb) + iq = XEP_0077.build_register_iq(self.xmlstream, self.jid, self.password, self.email) + d = iq.send(self.jid.host).addCallbacks(self.registration_cb, self.registration_eb) d.chainDeferred(self.registered) - def registrationCb(self, answer): + def registration_cb(self, answer): log.debug(_("Registration answer: {}").format(answer.toXml())) self.xmlstream.sendFooter() - def registrationEb(self, failure_): + def registration_eb(self, failure_): log.info(_("Registration failure: {}").format(str(failure_.value))) self.xmlstream.sendFooter() raise failure_ @@ -115,41 +115,41 @@ def __init__(self, host): log.info(_("Plugin XEP_0077 initialization")) self.host = host - host.bridge.addMethod( - "inBandRegister", + host.bridge.add_method( + "in_band_register", ".plugin", in_sign="ss", out_sign="", - method=self._inBandRegister, + method=self._in_band_register, async_=True, ) - host.bridge.addMethod( - "inBandAccountNew", + host.bridge.add_method( + "in_band_account_new", ".plugin", in_sign="ssssi", out_sign="", - method=self._registerNewAccount, + method=self._register_new_account, async_=True, ) - host.bridge.addMethod( - "inBandUnregister", + host.bridge.add_method( + "in_band_unregister", ".plugin", in_sign="ss", out_sign="", method=self._unregister, async_=True, ) - host.bridge.addMethod( - "inBandPasswordChange", + host.bridge.add_method( + "in_band_password_change", ".plugin", in_sign="ss", out_sign="", - method=self._changePassword, + method=self._change_password, async_=True, ) @staticmethod - def buildRegisterIQ(xmlstream_, jid_, password, email=None): + def build_register_iq(xmlstream_, jid_, password, email=None): iq_elt = xmlstream.IQ(xmlstream_, "set") iq_elt["to"] = jid_.host query_elt = iq_elt.addElement(("jabber:iq:register", "query")) @@ -162,7 +162,7 @@ email_elt.addContent(email) return iq_elt - def _regCb(self, answer, client, post_treat_cb): + def _reg_cb(self, answer, client, post_treat_cb): """Called after the first get IQ""" try: query_elt = next(answer.elements(NS_REG, "query")) @@ -178,8 +178,8 @@ _("This gateway can't be managed by SàT, sorry :(") ) - def submitForm(data, profile): - form_elt = xml_tools.XMLUIResultToElt(data) + def submit_form(data, profile): + form_elt = xml_tools.xmlui_result_to_elt(data) iq_elt = client.IQ() iq_elt["id"] = answer["id"] @@ -187,28 +187,28 @@ query_elt = iq_elt.addElement("query", NS_REG) query_elt.addChild(form_elt) d = iq_elt.send() - d.addCallback(self._regSuccess, client, post_treat_cb) - d.addErrback(self._regFailure, client) + d.addCallback(self._reg_success, client, post_treat_cb) + d.addErrback(self._reg_failure, client) return d form = data_form.Form.fromElement(x_elem) - submit_reg_id = self.host.registerCallback( - submitForm, with_data=True, one_shot=True + submit_reg_id = self.host.register_callback( + submit_form, with_data=True, one_shot=True ) - return xml_tools.dataForm2XMLUI(form, submit_reg_id) + return xml_tools.data_form_2_xmlui(form, submit_reg_id) - def _regEb(self, failure, client): + def _reg_eb(self, failure, client): """Called when something is wrong with registration""" 
log.info(_("Registration failure: %s") % str(failure.value)) raise failure - def _regSuccess(self, answer, client, post_treat_cb): + def _reg_success(self, answer, client, post_treat_cb): log.debug(_("registration answer: %s") % answer.toXml()) if post_treat_cb is not None: post_treat_cb(jid.JID(answer["from"]), client.profile) return {} - def _regFailure(self, failure, client): + def _reg_failure(self, failure, client): log.info(_("Registration failure: %s") % str(failure.value)) if failure.value.condition == "conflict": raise exceptions.ConflictError( @@ -216,30 +216,30 @@ ) raise failure - def _inBandRegister(self, to_jid_s, profile_key=C.PROF_KEY_NONE): - return self.inBandRegister, jid.JID(to_jid_s, profile_key) + def _in_band_register(self, to_jid_s, profile_key=C.PROF_KEY_NONE): + return self.in_band_register, jid.JID(to_jid_s, profile_key) - def inBandRegister(self, to_jid, post_treat_cb=None, profile_key=C.PROF_KEY_NONE): + def in_band_register(self, to_jid, post_treat_cb=None, profile_key=C.PROF_KEY_NONE): """register to a service @param to_jid(jid.JID): jid of the service to register to """ # FIXME: this post_treat_cb arguments seems wrong, check it - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) log.debug(_("Asking registration for {}").format(to_jid.full())) reg_request = client.IQ("get") reg_request["from"] = client.jid.full() reg_request["to"] = to_jid.full() reg_request.addElement("query", NS_REG) d = reg_request.send(to_jid.full()).addCallbacks( - self._regCb, - self._regEb, + self._reg_cb, + self._reg_eb, callbackArgs=[client, post_treat_cb], errbackArgs=[client], ) return d - def _registerNewAccount(self, jid_, password, email, host, port): + def _register_new_account(self, jid_, password, email, host, port): kwargs = {} if email: kwargs["email"] = email @@ -247,9 +247,9 @@ kwargs["host"] = host if port: kwargs["port"] = port - return self.registerNewAccount(jid.JID(jid_), password, **kwargs) + return self.register_new_account(jid.JID(jid_), password, **kwargs) - def registerNewAccount( + def register_new_account( self, jid_, password, email=None, host=None, port=C.XMPP_C2S_PORT ): """register a new account on a XMPP server @@ -261,7 +261,7 @@ @param port(int): port of the server to register to """ if host is None: - host = self.host.memory.getConfig("", "xmpp_domain", "127.0.0.1") + host = self.host.memory.config_get("", "xmpp_domain", "127.0.0.1") check_certificate = host != "127.0.0.1" authenticator = RegisteringAuthenticator( jid_, password, email, check_certificate=check_certificate) @@ -270,22 +270,22 @@ reactor.connectTCP(host, port, server_register) return registered_d - def _changePassword(self, new_password, profile_key): - client = self.host.getClient(profile_key) - return self.changePassword(client, new_password) + def _change_password(self, new_password, profile_key): + client = self.host.get_client(profile_key) + return self.change_password(client, new_password) - def changePassword(self, client, new_password): - iq_elt = self.buildRegisterIQ(client.xmlstream, client.jid, new_password) + def change_password(self, client, new_password): + iq_elt = self.build_register_iq(client.xmlstream, client.jid, new_password) d = iq_elt.send(client.jid.host) d.addCallback( - lambda __: self.host.memory.setParam( + lambda __: self.host.memory.param_set( "Password", new_password, "Connection", profile_key=client.profile ) ) return d def _unregister(self, to_jid_s, profile_key): - client = self.host.getClient(profile_key) + client = 
self.host.get_client(profile_key) return self.unregister(client, jid.JID(to_jid_s)) def unregister( @@ -307,6 +307,6 @@ query_elt.addElement("remove") d = iq_elt.send() if not to_jid or to_jid == jid.JID(client.jid.host): - d.addCallback(lambda __: client.entityDisconnect()) + d.addCallback(lambda __: client.entity_disconnect()) return d
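
The build_register_iq helper renamed above assembles a jabber:iq:register payload; here is a standalone sketch of that payload built with xml.etree instead of Twisted's xmlstream.IQ (illustration only, with example credentials)::

    from typing import Optional
    import xml.etree.ElementTree as ET

    NS_REG = "jabber:iq:register"
    ET.register_namespace("", NS_REG)


    def register_query(username: str, password: str,
                       email: Optional[str] = None) -> ET.Element:
        """Build the <query/> child of an in-band registration IQ."""
        query = ET.Element(f"{{{NS_REG}}}query")
        ET.SubElement(query, f"{{{NS_REG}}}username").text = username
        ET.SubElement(query, f"{{{NS_REG}}}password").text = password
        if email:
            ET.SubElement(query, f"{{{NS_REG}}}email").text = email
        return query


    print(ET.tostring(register_query("louise", "secret", "louise@example.org"),
                      encoding="unicode"))
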
--- a/sat/plugins/plugin_xep_0080.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0080.py Sat Apr 08 13:54:42 2023 +0200 @@ -72,7 +72,7 @@ def __init__(self, host): log.info(_("XEP-0080 (User Location) plugin initialization")) - host.registerNamespace("geoloc", NS_GEOLOC) + host.register_namespace("geoloc", NS_GEOLOC) def get_geoloc_elt( self,
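
For context on the geoloc namespace registered above, a hedged sketch of a minimal XEP-0080 <geoloc/> payload, built with xml.etree and example coordinates (not the plugin's get_geoloc_elt)::

    import xml.etree.ElementTree as ET

    NS_GEOLOC = "http://jabber.org/protocol/geoloc"
    ET.register_namespace("", NS_GEOLOC)

    geoloc = ET.Element(f"{{{NS_GEOLOC}}}geoloc")
    ET.SubElement(geoloc, f"{{{NS_GEOLOC}}}lat").text = "48.8566"
    ET.SubElement(geoloc, f"{{{NS_GEOLOC}}}lon").text = "2.3522"
    print(ET.tostring(geoloc, encoding="unicode"))
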
--- a/sat/plugins/plugin_xep_0084.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0084.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,39 +61,39 @@ def __init__(self, host): log.info(_("XEP-0084 (User Avatar) plugin initialization")) - host.registerNamespace("avatar_metadata", NS_AVATAR_METADATA) - host.registerNamespace("avatar_data", NS_AVATAR_DATA) + host.register_namespace("avatar_metadata", NS_AVATAR_METADATA) + host.register_namespace("avatar_data", NS_AVATAR_DATA) self.host = host self._p = host.plugins["XEP-0060"] self._i = host.plugins['IDENTITY'] self._i.register( IMPORT_NAME, "avatar", - self.getAvatar, - self.setAvatar, + self.get_avatar, + self.set_avatar, priority=2000 ) - host.plugins["XEP-0163"].addPEPEvent( - None, NS_AVATAR_METADATA, self._onMetadataUpdate + host.plugins["XEP-0163"].add_pep_event( + None, NS_AVATAR_METADATA, self._on_metadata_update ) - def getHandler(self, client): + def get_handler(self, client): return XEP_0084_Handler() - def _onMetadataUpdate(self, itemsEvent, profile): - client = self.host.getClient(profile) - defer.ensureDeferred(self.onMetadataUpdate(client, itemsEvent)) + def _on_metadata_update(self, itemsEvent, profile): + client = self.host.get_client(profile) + defer.ensureDeferred(self.on_metadata_update(client, itemsEvent)) - async def onMetadataUpdate( + async def on_metadata_update( self, client: SatXMPPEntity, itemsEvent: pubsub.ItemsEvent ) -> None: entity = client.jid.userhostJID() - avatar_metadata = await self.getAvatar(client, entity) + avatar_metadata = await self.get_avatar(client, entity) await self._i.update(client, IMPORT_NAME, "avatar", avatar_metadata, entity) - async def getAvatar( + async def get_avatar( self, client: SatXMPPEntity, entity_jid: jid.JID @@ -106,7 +106,7 @@ service = entity_jid.userhostJID() # metadata try: - items, __ = await self._p.getItems( + items, __ = await self._p.get_items( client, service, NS_AVATAR_METADATA, @@ -146,10 +146,10 @@ # (https://xmpp.org/extensions/xep-0084.html#pub-disable) return None - cache_data = self.host.common_cache.getMetadata(avatar_id) + cache_data = self.host.common_cache.get_metadata(avatar_id) if not cache_data: try: - data_items, __ = await self._p.getItems( + data_items, __ = await self._p.get_items( client, service, NS_AVATAR_DATA, @@ -172,7 +172,7 @@ f"{avatar_id!r}: {e}\n{data_item_elt.toXml()}" ) return None - with self.host.common_cache.cacheData( + with self.host.common_cache.cache_data( IMPORT_NAME, avatar_id, metadata["media_type"] @@ -183,11 +183,11 @@ "mime_type": metadata["media_type"] } - return self._i.avatarBuildMetadata( + return self._i.avatar_build_metadata( cache_data['path'], cache_data['mime_type'], avatar_id ) - def buildItemDataElt(self, avatar_data: Dict[str, Any]) -> domish.Element: + def build_item_data_elt(self, avatar_data: Dict[str, Any]) -> domish.Element: """Generate the item for the data node @param avatar_data: data as build by identity plugin (need to be filled with @@ -197,7 +197,7 @@ data_elt.addContent(avatar_data["base64"]) return pubsub.Item(id=avatar_data["cache_uid"], payload=data_elt) - def buildItemMetadataElt(self, avatar_data: Dict[str, Any]) -> domish.Element: + def build_item_metadata_elt(self, avatar_data: Dict[str, Any]) -> domish.Element: """Generate the item for the metadata node @param avatar_data: data as build by identity plugin (need to be filled with @@ -212,7 +212,7 @@ info_elt["bytes"] = str(avatar_data["path"].stat().st_size) return pubsub.Item(id=self._p.ID_SINGLETON, payload=metadata_elt) - async def 
setAvatar( + async def set_avatar( self, client: SatXMPPEntity, avatar_data: Dict[str, Any], @@ -227,7 +227,7 @@ service = entity.userhostJID() # Data - await self._p.createIfNewNode( + await self._p.create_if_new_node( client, service, NS_AVATAR_DATA, @@ -237,11 +237,11 @@ self._p.OPT_MAX_ITEMS: 1, } ) - item_data_elt = self.buildItemDataElt(avatar_data) - await self._p.sendItems(client, service, NS_AVATAR_DATA, [item_data_elt]) + item_data_elt = self.build_item_data_elt(avatar_data) + await self._p.send_items(client, service, NS_AVATAR_DATA, [item_data_elt]) # Metadata - await self._p.createIfNewNode( + await self._p.create_if_new_node( client, service, NS_AVATAR_METADATA, @@ -251,8 +251,8 @@ self._p.OPT_MAX_ITEMS: 1, } ) - item_metadata_elt = self.buildItemMetadataElt(avatar_data) - await self._p.sendItems(client, service, NS_AVATAR_METADATA, [item_metadata_elt]) + item_metadata_elt = self.build_item_metadata_elt(avatar_data) + await self._p.send_items(client, service, NS_AVATAR_METADATA, [item_metadata_elt]) @implementer(iwokkel.IDisco)
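
The avatar publication methods renamed above rely on two XEP-0084 conventions: the data-node item id is the SHA-1 of the image bytes, and the payload is their base64 form. A hypothetical standalone illustration with placeholder image bytes (not the plugin's API)::

    import base64
    import hashlib

    NS_AVATAR_DATA = "urn:xmpp:avatar:data"

    avatar_bytes = b"\x89PNG..."          # placeholder image content
    item_id = hashlib.sha1(avatar_bytes).hexdigest()
    payload_b64 = base64.b64encode(avatar_bytes).decode()

    data_item = (
        f'<item id="{item_id}">'
        f'<data xmlns="{NS_AVATAR_DATA}">{payload_b64}</data>'
        "</item>"
    )
    print(data_item)
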
--- a/sat/plugins/plugin_xep_0085.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0085.py Sat Apr 08 13:54:42 2023 +0200 @@ -103,29 +103,29 @@ self.map = {} # FIXME: would be better to use client instead of mapping profile to data # parameter value is retrieved before each use - host.memory.updateParams(self.params) + host.memory.update_params(self.params) # triggers from core - host.trigger.add("messageReceived", self.messageReceivedTrigger) - host.trigger.add("sendMessage", self.sendMessageTrigger) - host.trigger.add("paramUpdateTrigger", self.paramUpdateTrigger) + host.trigger.add("messageReceived", self.message_received_trigger) + host.trigger.add("sendMessage", self.send_message_trigger) + host.trigger.add("param_update_trigger", self.param_update_trigger) # args: to_s (jid as string), profile - host.bridge.addMethod( - "chatStateComposing", + host.bridge.add_method( + "chat_state_composing", ".plugin", in_sign="ss", out_sign="", - method=self.chatStateComposing, + method=self.chat_state_composing, ) # args: from (jid as string), state in CHAT_STATES, profile - host.bridge.addSignal("chatStateReceived", ".plugin", signature="sss") + host.bridge.add_signal("chat_state_received", ".plugin", signature="sss") - def getHandler(self, client): + def get_handler(self, client): return XEP_0085_handler(self, client.profile) - def profileDisconnected(self, client): + def profile_disconnected(self, client): """Eventually send a 'gone' state to all one2one contacts.""" profile = client.profile if profile not in self.map: @@ -137,7 +137,7 @@ self.map[profile][to_jid]._onEvent("gone") del self.map[profile] - def updateCache(self, entity_jid, value, profile): + def update_cache(self, entity_jid, value, profile): """Update the entity data of the given profile for one or all contacts. Reset the chat state(s) display if the notification has been disabled. @@ -145,18 +145,18 @@ @param value: True, False or DELETE_VALUE to delete the entity data @param profile: current profile """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) if value == DELETE_VALUE: - self.host.memory.delEntityDatum(client, entity_jid, ENTITY_KEY) + self.host.memory.del_entity_datum(client, entity_jid, ENTITY_KEY) else: - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, entity_jid, ENTITY_KEY, value ) if not value or value == DELETE_VALUE: # reinit chat state UI for this or these contact(s) - self.host.bridge.chatStateReceived(entity_jid.full(), "", profile) + self.host.bridge.chat_state_received(entity_jid.full(), "", profile) - def paramUpdateTrigger(self, name, value, category, type_, profile): + def param_update_trigger(self, name, value, category, type_, profile): """Reset all the existing chat state entity data associated with this profile after a parameter modification. @param name: parameter name @@ -165,23 +165,23 @@ @param type_: parameter type """ if (category, name) == (PARAM_KEY, PARAM_NAME): - self.updateCache( + self.update_cache( C.ENTITY_ALL, True if C.bool(value) else DELETE_VALUE, profile=profile ) return False return True - def messageReceivedTrigger(self, client, message, post_treat): + def message_received_trigger(self, client, message, post_treat): """ Update the entity cache when we receive a message with body. Check for a chat state in the message and signal frontends. 
""" profile = client.profile - if not self.host.memory.getParamA(PARAM_NAME, PARAM_KEY, profile_key=profile): + if not self.host.memory.param_get_a(PARAM_NAME, PARAM_KEY, profile_key=profile): return True from_jid = JID(message.getAttribute("from")) - if self._isMUC(from_jid, profile): + if self._is_muc(from_jid, profile): from_jid = from_jid.userhostJID() else: # update entity data for one2one chat # assert from_jid.resource # FIXME: assert doesn't work on normal message from server (e.g. server announce), because there is no resource @@ -190,17 +190,17 @@ try: next(domish.generateElementsNamed(message.elements(), name="active")) # contact enabled Chat State Notifications - self.updateCache(from_jid, True, profile=profile) + self.update_cache(from_jid, True, profile=profile) except StopIteration: if message.getAttribute("type") == "chat": # contact didn't enable Chat State Notifications - self.updateCache(from_jid, False, profile=profile) + self.update_cache(from_jid, False, profile=profile) return True except StopIteration: pass # send our next "composing" states to any MUC and to the contacts who enabled the feature - self._chatStateInit(from_jid, message.getAttribute("type"), profile) + self._chat_state_init(from_jid, message.getAttribute("type"), profile) state_list = [ child.name @@ -212,13 +212,13 @@ for state in state_list: # there must be only one state according to the XEP if state != "gone" or message.getAttribute("type") != "groupchat": - self.host.bridge.chatStateReceived( + self.host.bridge.chat_state_received( message.getAttribute("from"), state, profile ) break return True - def sendMessageTrigger( + def send_message_trigger( self, client, mess_data, pre_xml_treatments, post_xml_treatments ): """ @@ -230,16 +230,16 @@ def treatment(mess_data): message = mess_data["xml"] to_jid = JID(message.getAttribute("to")) - if not self._checkActivation(to_jid, forceEntityData=True, profile=profile): + if not self._check_activation(to_jid, forceEntityData=True, profile=profile): return mess_data try: # message with a body always mean active state next(domish.generateElementsNamed(message.elements(), name="body")) message.addElement("active", NS_CHAT_STATES) # launch the chat state machine (init the timer) - if self._isMUC(to_jid, profile): + if self._is_muc(to_jid, profile): to_jid = to_jid.userhostJID() - self._chatStateActive(to_jid, mess_data["type"], profile) + self._chat_state_active(to_jid, mess_data["type"], profile) except StopIteration: if "chat_state" in mess_data["extra"]: state = mess_data["extra"].pop("chat_state") @@ -250,16 +250,16 @@ post_xml_treatments.addCallback(treatment) return True - def _isMUC(self, to_jid, profile): + def _is_muc(self, to_jid, profile): """Tell if that JID is a MUC or not @param to_jid (JID): full or bare JID to check @param profile (str): %(doc_profile)s @return: bool """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) try: - type_ = self.host.memory.getEntityDatum( + type_ = self.host.memory.get_entity_datum( client, to_jid.userhostJID(), C.ENTITY_TYPE) if type_ == C.ENTITY_TYPE_MUC: return True @@ -267,7 +267,7 @@ pass return False - def _checkActivation(self, to_jid, forceEntityData, profile): + def _check_activation(self, to_jid, forceEntityData, profile): """ @param to_jid: the contact's full JID (or bare if you know it's a MUC) @param forceEntityData: if set to True, a non-existing @@ -275,26 +275,26 @@ @param: current profile @return: True if the notifications should be sent to this JID. 
""" - client = self.host.getClient(profile) + client = self.host.get_client(profile) # check if the parameter is active - if not self.host.memory.getParamA(PARAM_NAME, PARAM_KEY, profile_key=profile): + if not self.host.memory.param_get_a(PARAM_NAME, PARAM_KEY, profile_key=profile): return False # check if notifications should be sent to this contact - if self._isMUC(to_jid, profile): + if self._is_muc(to_jid, profile): return True # FIXME: this assertion crash when we want to send a message to an online bare jid - # assert to_jid.resource or not self.host.memory.isEntityAvailable(to_jid, profile) # must either have a resource, or talk to an offline contact + # assert to_jid.resource or not self.host.memory.is_entity_available(to_jid, profile) # must either have a resource, or talk to an offline contact try: - return self.host.memory.getEntityDatum(client, to_jid, ENTITY_KEY) + return self.host.memory.get_entity_datum(client, to_jid, ENTITY_KEY) except (exceptions.UnknownEntityError, KeyError): if forceEntityData: # enable it for the first time - self.updateCache(to_jid, True, profile=profile) + self.update_cache(to_jid, True, profile=profile) return True # wait for the first message before sending states return False - def _chatStateInit(self, to_jid, mess_type, profile): + def _chat_state_init(self, to_jid, mess_type, profile): """ Data initialization for the chat state machine. @@ -309,7 +309,7 @@ machine = ChatStateMachine(self.host, to_jid, mess_type, profile) self.map[profile][to_jid] = machine - def _chatStateActive(self, to_jid, mess_type, profile_key): + def _chat_state_active(self, to_jid, mess_type, profile_key): """ Launch the chat state machine on "active" state. @@ -317,13 +317,13 @@ @param mess_type (str): "one2one" or "groupchat" @param profile (str): %(doc_profile)s """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if profile is None: raise exceptions.ProfileUnknownError - self._chatStateInit(to_jid, mess_type, profile) + self._chat_state_init(to_jid, mess_type, profile) self.map[profile][to_jid]._onEvent("active") - def chatStateComposing(self, to_jid_s, profile_key): + def chat_state_composing(self, to_jid_s, profile_key): """Move to the "composing" state when required. Since this method is called from the front-end, it needs to check the @@ -334,13 +334,13 @@ @param profile_key (str): %(doc_profile_key)s """ # TODO: try to optimize this method which is called often - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) to_jid = JID(to_jid_s) - if self._isMUC(to_jid, client.profile): + if self._is_muc(to_jid, client.profile): to_jid = to_jid.userhostJID() elif not to_jid.resource: - to_jid.resource = self.host.memory.getMainResource(client, to_jid) - if not self._checkActivation( + to_jid.resource = self.host.memory.main_resource_get(client, to_jid) + if not self._check_activation( to_jid, forceEntityData=False, profile=client.profile ): return @@ -392,7 +392,7 @@ state=state, jid=self.to_jid.full() ) ) - client = self.host.getClient(self.profile) + client = self.host.get_client(self.profile) mess_data = { "from": client.jid, "to": self.to_jid, @@ -402,7 +402,7 @@ "subject": {}, "extra": {}, } - client.generateMessageXML(mess_data) + client.generate_message_xml(mess_data) mess_data["xml"].addElement(state, NS_CHAT_STATES) client.send(mess_data["xml"])
--- a/sat/plugins/plugin_xep_0092.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0092.py Sat Apr 08 13:54:42 2023 +0200 @@ -47,29 +47,29 @@ def __init__(self, host): log.info(_("Plugin XEP_0092 initialization")) self.host = host - host.bridge.addMethod( - "getSoftwareVersion", + host.bridge.add_method( + "software_version_get", ".plugin", in_sign="ss", out_sign="(sss)", - method=self._getVersion, + method=self._get_version, async_=True, ) try: - self.host.plugins[C.TEXT_CMDS].addWhoIsCb(self._whois, 50) + self.host.plugins[C.TEXT_CMDS].add_who_is_cb(self._whois, 50) except KeyError: log.info(_("Text commands not available")) - def _getVersion(self, entity_jid_s, profile_key): - def prepareForBridge(data): + def _get_version(self, entity_jid_s, profile_key): + def prepare_for_bridge(data): name, version, os = data return (name or "", version or "", os or "") - d = self.getVersion(jid.JID(entity_jid_s), profile_key) - d.addCallback(prepareForBridge) + d = self.version_get(jid.JID(entity_jid_s), profile_key) + d.addCallback(prepare_for_bridge) return d - def getVersion(self, jid_, profile_key=C.PROF_KEY_NONE): + def version_get(self, jid_, profile_key=C.PROF_KEY_NONE): """ Ask version of the client that jid_ is running @param jid_: jid from who we want to know client's version @param profile_key: %(doc_profile_key)s @@ -78,24 +78,24 @@ - version: specific version of the software - os: operating system of the queried entity """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) - def getVersion(__): + def version_get(__): iq_elt = compat.IQ(client.xmlstream, "get") iq_elt["to"] = jid_.full() iq_elt.addElement("query", NS_VERSION) d = iq_elt.send() - d.addCallback(self._gotVersion) + d.addCallback(self._got_version) return d - d = self.host.checkFeature(client, NS_VERSION, jid_) - d.addCallback(getVersion) + d = self.host.check_feature(client, NS_VERSION, jid_) + d.addCallback(version_get) reactor.callLater( TIMEOUT, d.cancel ) # XXX: timeout needed because some clients don't answer the IQ return d - def _gotVersion(self, iq_elt): + def _got_version(self, iq_elt): try: query_elt = next(iq_elt.elements(NS_VERSION, "query")) except StopIteration: @@ -113,7 +113,7 @@ def _whois(self, client, whois_msg, mess_data, target_jid): """ Add software/OS information to whois """ - def versionCb(version_data): + def version_cb(version_data): name, version, os = version_data if name: whois_msg.append(_("Client name: %s") % name) @@ -122,13 +122,13 @@ if os: whois_msg.append(_("Operating system: %s") % os) - def versionEb(failure): + def version_eb(failure): failure.trap(exceptions.FeatureNotFound, defer.CancelledError) if failure.check(failure, exceptions.FeatureNotFound): whois_msg.append(_("Software version not available")) else: whois_msg.append(_("Client software version request timeout")) - d = self.getVersion(target_jid, client.profile) - d.addCallbacks(versionCb, versionEb) + d = self.version_get(target_jid, client.profile) + d.addCallbacks(version_cb, version_eb) return d
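
The _got_version callback renamed above parses a jabber:iq:version reply; a standalone parsing sketch with xml.etree (the plugin itself works on domish elements, and the reply content here is example data)::

    import xml.etree.ElementTree as ET

    NS_VERSION = "jabber:iq:version"

    reply = ET.fromstring(
        '<query xmlns="jabber:iq:version">'
        "<name>Libervia</name><version>0.9</version><os>GNU/Linux</os></query>"
    )


    def child_text(elt: ET.Element, name: str) -> str:
        """Return the text of a namespaced child, or an empty string."""
        found = elt.find(f"{{{NS_VERSION}}}{name}")
        return (found.text or "") if found is not None else ""


    print(tuple(child_text(reply, n) for n in ("name", "version", "os")))
    # ('Libervia', '0.9', 'GNU/Linux')
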
--- a/sat/plugins/plugin_xep_0095.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0095.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,10 +55,10 @@ self.host = host self.si_profiles = {} # key: SI profile, value: callback - def getHandler(self, client): + def get_handler(self, client): return XEP_0095_handler(self) - def registerSIProfile(self, si_profile, callback): + def register_si_profile(self, si_profile, callback): """Add a callback for a SI Profile @param si_profile(unicode): SI profile name (e.g. file-transfer) @@ -66,7 +66,7 @@ """ self.si_profiles[si_profile] = callback - def unregisterSIProfile(self, si_profile): + def unregister_si_profile(self, si_profile): try: del self.si_profiles[si_profile] except KeyError: @@ -76,7 +76,7 @@ ) ) - def streamInit(self, iq_elt, client): + def stream_init(self, iq_elt, client): """This method is called on stream initiation (XEP-0095 #3.2) @param iq_elt: IQ element @@ -117,7 +117,7 @@ client.send(iq_error_elt) - def acceptStream(self, client, iq_elt, feature_elt, misc_elts=None): + def accept_stream(self, client, iq_elt, feature_elt, misc_elts=None): """Send the accept stream initiation answer @param iq_elt(domish.Element): initial SI request @@ -134,7 +134,7 @@ si_elt.addChild(elt) client.send(result_elt) - def _parseOfferResult(self, iq_elt): + def _parse_offer_result(self, iq_elt): try: si_elt = next(iq_elt.elements(NS_SI, "si")) except StopIteration: @@ -142,7 +142,7 @@ raise exceptions.DataError return (iq_elt, si_elt) - def proposeStream( + def propose_stream( self, client, to_jid, @@ -178,7 +178,7 @@ si.addChild(feature_elt) offer_d = offer.send() - offer_d.addCallback(self._parseOfferResult) + offer_d.addCallback(self._parse_offer_result) return sid, offer_d @@ -191,7 +191,7 @@ def connectionInitialized(self): self.xmlstream.addObserver( - SI_REQUEST, self.plugin_parent.streamInit, client=self.parent + SI_REQUEST, self.plugin_parent.stream_init, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""):
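With the new names, a plugin handling its own Stream Initiation profile
registers it once and receives the parsed request in its callback, exactly as
the file transfer plugin below does. A skeleton (the profile name and method
names are made up for illustration)::

    # in the plugin's __init__
    self._si = host.plugins["XEP-0095"]
    self._si.register_si_profile("urn:example:si-profile", self._on_si_request)

    def unload(self):
        self._si.unregister_si_profile("urn:example:si-profile")

    def _on_si_request(self, client, iq_elt, si_id, si_mime_type, si_elt):
        # called for each incoming stream initiation using this profile;
        # a real handler negotiates a stream method, then answers with
        # self._si.accept_stream(client, iq_elt, feature_elt)
        ...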
--- a/sat/plugins/plugin_xep_0096.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0096.py Sat Apr 08 13:54:42 2023 +0200 @@ -64,18 +64,18 @@ self._f = self.host.plugins["FILE"] self._f.register(self) self._si = self.host.plugins["XEP-0095"] - self._si.registerSIProfile(SI_PROFILE_NAME, self._transferRequest) - host.bridge.addMethod( - "siSendFile", ".plugin", in_sign="sssss", out_sign="s", method=self._fileSend + self._si.register_si_profile(SI_PROFILE_NAME, self._transfer_request) + host.bridge.add_method( + "si_file_send", ".plugin", in_sign="sssss", out_sign="s", method=self._file_send ) - async def canHandleFileSend(self, client, peer_jid, filepath): + async def can_handle_file_send(self, client, peer_jid, filepath): return await self.host.hasFeature(client, NS_SI_FT, peer_jid) def unload(self): - self._si.unregisterSIProfile(SI_PROFILE_NAME) + self._si.unregister_si_profile(SI_PROFILE_NAME) - def _badRequest(self, client, iq_elt, message=None): + def _bad_request(self, client, iq_elt, message=None): """Send a bad-request error @param iq_elt(domish.Element): initial <IQ> element of the SI request @@ -85,7 +85,7 @@ log.warning(message) self._si.sendError(client, iq_elt, "bad-request") - def _parseRange(self, parent_elt, file_size): + def _parse_range(self, parent_elt, file_size): """find and parse <range/> element @param parent_elt(domish.Element): direct parent of the <range/> element @@ -118,7 +118,7 @@ return range_, range_offset, range_length - def _transferRequest(self, client, iq_elt, si_id, si_mime_type, si_elt): + def _transfer_request(self, client, iq_elt, si_id, si_mime_type, si_elt): """Called when a file transfer is requested @param iq_elt(domish.Element): initial <IQ> element of the SI request @@ -132,14 +132,14 @@ try: file_elt = next(si_elt.elements(NS_SI_FT, "file")) except StopIteration: - return self._badRequest( + return self._bad_request( client, iq_elt, "No <file/> element found in SI File Transfer request" ) try: - feature_elt = self.host.plugins["XEP-0020"].getFeatureElt(si_elt) + feature_elt = self.host.plugins["XEP-0020"].get_feature_elt(si_elt) except exceptions.NotFound: - return self._badRequest( + return self._bad_request( client, iq_elt, "No <feature/> element found in SI File Transfer request" ) @@ -147,7 +147,7 @@ filename = file_elt["name"] file_size = int(file_elt["size"]) except (KeyError, ValueError): - return self._badRequest(client, iq_elt, "Malformed SI File Transfer request") + return self._bad_request(client, iq_elt, "Malformed SI File Transfer request") file_date = file_elt.getAttribute("date") file_hash = file_elt.getAttribute("hash") @@ -164,16 +164,16 @@ file_desc = "" try: - range_, range_offset, range_length = self._parseRange(file_elt, file_size) + range_, range_offset, range_length = self._parse_range(file_elt, file_size) except ValueError: - return self._badRequest(client, iq_elt, "Malformed SI File Transfer request") + return self._bad_request(client, iq_elt, "Malformed SI File Transfer request") try: stream_method = self.host.plugins["XEP-0020"].negotiate( feature_elt, "stream-method", self.managed_stream_m, namespace=None ) except KeyError: - return self._badRequest(client, iq_elt, "No stream method found") + return self._bad_request(client, iq_elt, "No stream method found") if stream_method: if stream_method == self.host.plugins["XEP-0065"].NAMESPACE: @@ -207,11 +207,11 @@ } d = defer.ensureDeferred( - self._f.getDestDir(client, peer_jid, data, data, stream_object=True) + self._f.get_dest_dir(client, peer_jid, data, 
data, stream_object=True) ) - d.addCallback(self.confirmationCb, client, iq_elt, data) + d.addCallback(self.confirmation_cb, client, iq_elt, data) - def confirmationCb(self, accepted, client, iq_elt, data): + def confirmation_cb(self, accepted, client, iq_elt, data): """Called on confirmation answer @param accepted(bool): True if file transfer is accepted @@ -244,14 +244,14 @@ # file_obj = self._getFileObject(dest_path, can_range) # range_offset = file_obj.tell() - d = data["stream_plugin"].createSession( + d = data["stream_plugin"].create_session( client, data["stream_object"], client.jid, data["peer_jid"], data["si_id"] ) - d.addCallback(self._transferCb, client, data) - d.addErrback(self._transferEb, client, data) + d.addCallback(self._transfer_cb, client, data) + d.addErrback(self._transfer_eb, client, data) # we can send the iq result - feature_elt = self.host.plugins["XEP-0020"].chooseOption( + feature_elt = self.host.plugins["XEP-0020"].choose_option( {"stream-method": data["stream_method"]}, namespace=None ) misc_elts = [] @@ -261,9 +261,9 @@ # range_elt['offset'] = str(range_offset) # #TODO: manage range length # misc_elts.append(range_elt) - self._si.acceptStream(client, iq_elt, feature_elt, misc_elts) + self._si.accept_stream(client, iq_elt, feature_elt, misc_elts) - def _transferCb(self, __, client, data): + def _transfer_cb(self, __, client, data): """Called by the stream method when transfer successfuly finished @param data: session data @@ -272,7 +272,7 @@ data["stream_object"].close() log.info("Transfer {si_id} successfuly finished".format(**data)) - def _transferEb(self, failure, client, data): + def _transfer_eb(self, failure, client, data): """Called when something went wrong with the transfer @param id: stream id @@ -285,13 +285,13 @@ ) data["stream_object"].close() - def _fileSend(self, peer_jid_s, filepath, name, desc, profile=C.PROF_KEY_NONE): - client = self.host.getClient(profile) - return self.fileSend( + def _file_send(self, peer_jid_s, filepath, name, desc, profile=C.PROF_KEY_NONE): + client = self.host.get_client(profile) + return self.file_send( client, jid.JID(peer_jid_s), filepath, name or None, desc or None ) - def fileSend(self, client, peer_jid, filepath, name=None, desc=None, extra=None): + def file_send(self, client, peer_jid, filepath, name=None, desc=None, extra=None): """Send a file using XEP-0096 @param peer_jid(jid.JID): recipient @@ -302,7 +302,7 @@ @param extra: not used here @return: an unique id to identify the transfer """ - feature_elt = self.host.plugins["XEP-0020"].proposeFeatures( + feature_elt = self.host.plugins["XEP-0020"].propose_features( {"stream-method": self.managed_stream_m}, namespace=None ) @@ -320,23 +320,23 @@ file_transfer_elts.append(domish.Element((None, "range"))) - sid, offer_d = self._si.proposeStream( + sid, offer_d = self._si.propose_stream( client, peer_jid, SI_PROFILE, feature_elt, file_transfer_elts ) args = [filepath, sid, size, client] - offer_d.addCallbacks(self._fileCb, self._fileEb, args, None, args) + offer_d.addCallbacks(self._file_cb, self._file_eb, args, None, args) return sid - def _fileCb(self, result_tuple, filepath, sid, size, client): + def _file_cb(self, result_tuple, filepath, sid, size, client): iq_elt, si_elt = result_tuple try: - feature_elt = self.host.plugins["XEP-0020"].getFeatureElt(si_elt) + feature_elt = self.host.plugins["XEP-0020"].get_feature_elt(si_elt) except exceptions.NotFound: log.warning("No <feature/> element found in result while expected") return - choosed_options = 
self.host.plugins["XEP-0020"].getChoosedOptions( + choosed_options = self.host.plugins["XEP-0020"].get_choosed_options( feature_elt, namespace=None ) try: @@ -350,7 +350,7 @@ except StopIteration: pass else: - range_, range_offset, range_length = self._parseRange(file_elt, size) + range_, range_offset, range_length = self._parse_range(file_elt, size) if stream_method == self.host.plugins["XEP-0065"].NAMESPACE: plugin = self.host.plugins["XEP-0065"] @@ -363,12 +363,12 @@ stream_object = stream.FileStreamObject( self.host, client, filepath, uid=sid, size=size ) - d = plugin.startStream(client, stream_object, client.jid, + d = plugin.start_stream(client, stream_object, client.jid, jid.JID(iq_elt["from"]), sid) - d.addCallback(self._sendCb, client, sid, stream_object) - d.addErrback(self._sendEb, client, sid, stream_object) + d.addCallback(self._send_cb, client, sid, stream_object) + d.addErrback(self._send_eb, client, sid, stream_object) - def _fileEb(self, failure, filepath, sid, size, client): + def _file_eb(self, failure, filepath, sid, size, client): if failure.check(error.StanzaError): stanza_err = failure.value if stanza_err.code == "403" and stanza_err.condition == "forbidden": @@ -376,20 +376,20 @@ log.info("File transfer refused by {}".format(from_s)) msg = D_("The contact {} has refused your file").format(from_s) title = D_("File refused") - xml_tools.quickNote(self.host, client, msg, title, C.XMLUI_DATA_LVL_INFO) + xml_tools.quick_note(self.host, client, msg, title, C.XMLUI_DATA_LVL_INFO) else: log.warning(_("Error during file transfer")) msg = D_( "Something went wrong during the file transfer session initialisation: {reason}" ).format(reason=str(stanza_err)) title = D_("File transfer error") - xml_tools.quickNote(self.host, client, msg, title, C.XMLUI_DATA_LVL_ERROR) + xml_tools.quick_note(self.host, client, msg, title, C.XMLUI_DATA_LVL_ERROR) elif failure.check(exceptions.DataError): log.warning("Invalid stanza received") else: log.error("Error while proposing stream: {}".format(failure)) - def _sendCb(self, __, client, sid, stream_object): + def _send_cb(self, __, client, sid, stream_object): log.info( _("transfer {sid} successfuly finished [{profile}]").format( sid=sid, profile=client.profile @@ -397,7 +397,7 @@ ) stream_object.close() - def _sendEb(self, failure, client, sid, stream_object): + def _send_eb(self, failure, client, sid, stream_object): log.warning( _("transfer {sid} failed [{profile}]: {reason}").format( sid=sid, profile=client.profile, reason=str(failure.value)
--- a/sat/plugins/plugin_xep_0100.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0100.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,54 +61,54 @@ def __init__(self, host): log.info(_("Gateways plugin initialization")) self.host = host - self.__gateways = {} # dict used to construct the answer to findGateways. Key = target jid - host.bridge.addMethod( - "findGateways", + self.__gateways = {} # dict used to construct the answer to gateways_find. Key = target jid + host.bridge.add_method( + "gateways_find", ".plugin", in_sign="ss", out_sign="s", - method=self._findGateways, + method=self._find_gateways, ) - host.bridge.addMethod( - "gatewayRegister", + host.bridge.add_method( + "gateway_register", ".plugin", in_sign="ss", out_sign="s", - method=self._gatewayRegister, + method=self._gateway_register, ) - self.__menu_id = host.registerCallback(self._gatewaysMenu, with_data=True) - self.__selected_id = host.registerCallback( - self._gatewaySelectedCb, with_data=True + self.__menu_id = host.register_callback(self._gateways_menu, with_data=True) + self.__selected_id = host.register_callback( + self._gateway_selected_cb, with_data=True ) - host.importMenu( + host.import_menu( (D_("Service"), D_("Gateways")), - self._gatewaysMenu, + self._gateways_menu, security_limit=1, help_string=D_("Find gateways"), ) - def _gatewaysMenu(self, data, profile): + def _gateways_menu(self, data, profile): """ XMLUI activated by menu: return Gateways UI @param profile: %(doc_profile)s """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) try: jid_ = jid.JID( - data.get(xml_tools.formEscape("external_jid"), client.jid.host) + data.get(xml_tools.form_escape("external_jid"), client.jid.host) ) except RuntimeError: raise exceptions.DataError(_("Invalid JID")) - d = self.findGateways(jid_, profile) - d.addCallback(self._gatewaysResult2XMLUI, jid_) + d = self.gateways_find(jid_, profile) + d.addCallback(self._gateways_result_2_xmlui, jid_) d.addCallback(lambda xmlui: {"xmlui": xmlui.toXml()}) return d - def _gatewaysResult2XMLUI(self, result, entity): + def _gateways_result_2_xmlui(self, result, entity): xmlui = xml_tools.XMLUI(title=_("Gateways manager (%s)") % entity.full()) xmlui.addText(_(WARNING_MSG)) xmlui.addDivider("dash") - adv_list = xmlui.changeContainer( + adv_list = xmlui.change_container( "advanced_list", columns=3, selectable="single", @@ -124,30 +124,30 @@ jid_, data = gateway_data for datum in data: identity, name = datum - adv_list.setRowIndex(jid_.full()) + adv_list.set_row_index(jid_.full()) xmlui.addJid(jid_) xmlui.addText(name) - xmlui.addText(self._getIdentityDesc(identity)) + xmlui.addText(self._get_identity_desc(identity)) adv_list.end() xmlui.addDivider("blank") - xmlui.changeContainer("advanced_list", columns=3) + xmlui.change_container("advanced_list", columns=3) xmlui.addLabel(_("Use external XMPP server")) xmlui.addString("external_jid") xmlui.addButton(self.__menu_id, _("Go !"), fields_back=("external_jid",)) return xmlui - def _gatewaySelectedCb(self, data, profile): + def _gateway_selected_cb(self, data, profile): try: target_jid = jid.JID(data["index"]) except (KeyError, RuntimeError): log.warning(_("No gateway index selected")) return {} - d = self.gatewayRegister(target_jid, profile) + d = self.gateway_register(target_jid, profile) d.addCallback(lambda xmlui: {"xmlui": xmlui.toXml()}) return d - def _getIdentityDesc(self, identity): + def _get_identity_desc(self, identity): """ Return a human readable description of identity @param identity: tuple 
as returned by Disco identities (category, type) @@ -165,27 +165,27 @@ except KeyError: return _("Unknown IM") - def _registrationSuccessful(self, jid_, profile): + def _registration_successful(self, jid_, profile): """Called when in_band registration is ok, we must now follow the rest of procedure""" log.debug(_("Registration successful, doing the rest")) - self.host.addContact(jid_, profile_key=profile) - self.host.setPresence(jid_, profile_key=profile) + self.host.contact_add(jid_, profile_key=profile) + self.host.presence_set(jid_, profile_key=profile) - def _gatewayRegister(self, target_jid_s, profile_key=C.PROF_KEY_NONE): - d = self.gatewayRegister(jid.JID(target_jid_s), profile_key) + def _gateway_register(self, target_jid_s, profile_key=C.PROF_KEY_NONE): + d = self.gateway_register(jid.JID(target_jid_s), profile_key) d.addCallback(lambda xmlui: xmlui.toXml()) return d - def gatewayRegister(self, target_jid, profile_key=C.PROF_KEY_NONE): + def gateway_register(self, target_jid, profile_key=C.PROF_KEY_NONE): """Register gateway using in-band registration, then log-in to gateway""" - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) assert profile - d = self.host.plugins["XEP-0077"].inBandRegister( - target_jid, self._registrationSuccessful, profile + d = self.host.plugins["XEP-0077"].in_band_register( + target_jid, self._registration_successful, profile ) return d - def _infosReceived(self, dl_result, items, target, client): + def _infos_received(self, dl_result, items, target, client): """Find disco infos about entity, to check if it is a gateway""" ret = [] @@ -224,7 +224,7 @@ ) return ret - def _itemsReceived(self, disco, target, client): + def _items_received(self, disco, target, client): """Look for items with disco protocol, and ask infos for each one""" if len(disco._items) == 0: @@ -237,29 +237,29 @@ _defers.append(client.disco.requestInfo(item.entity)) dl = defer.DeferredList(_defers) dl.addCallback( - self._infosReceived, items=disco._items, target=target, client=client + self._infos_received, items=disco._items, target=target, client=client ) reactor.callLater(GATEWAY_TIMEOUT, dl.cancel) return dl - def _findGateways(self, target_jid_s, profile_key): + def _find_gateways(self, target_jid_s, profile_key): target_jid = jid.JID(target_jid_s) - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: raise exceptions.ProfileUnknownError - d = self.findGateways(target_jid, profile) - d.addCallback(self._gatewaysResult2XMLUI, target_jid) + d = self.gateways_find(target_jid, profile) + d.addCallback(self._gateways_result_2_xmlui, target_jid) d.addCallback(lambda xmlui: xmlui.toXml()) return d - def findGateways(self, target, profile): + def gateways_find(self, target, profile): """Find gateways in the target JID, using discovery protocol """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) log.debug( _("find gateways (target = %(target)s, profile = %(profile)s)") % {"target": target.full(), "profile": profile} ) d = client.disco.requestItems(target) - d.addCallback(self._itemsReceived, target=target, client=client) + d.addCallback(self._items_received, target=target, client=client) return d
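Gateway discovery and registration keep their semantics under the new names;
registering to a gateway from backend code could look like the following
sketch (the gateway JID and the ``"XEP-0100"`` key are placeholders; the
deferred fires with the registration form as an XMLUI object)::

    from twisted.words.protocols.jabber import jid

    def register_to_irc_gateway(self, client):
        gateways = self.host.plugins["XEP-0100"]
        d = gateways.gateway_register(jid.JID("irc.example.org"), client.profile)
        d.addCallback(lambda xmlui: xmlui.toXml())
        return d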
--- a/sat/plugins/plugin_xep_0103.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0103.py Sat Apr 08 13:54:42 2023 +0200 @@ -46,7 +46,7 @@ def __init__(self, host): log.info(_("XEP-0103 (URL Address Information) plugin initialization")) - host.registerNamespace("url-data", NS_URL_DATA) + host.register_namespace("url-data", NS_URL_DATA) def get_url_data_elt( self,
--- a/sat/plugins/plugin_xep_0106.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0106.py Sat Apr 08 13:54:42 2023 +0200 @@ -58,7 +58,7 @@ def __init__(self, host): self.reverse_map = {v:k for k,v in ESCAPE_MAP.items()} - def getHandler(self, client): + def get_handler(self, client): return XEP_0106_handler() def escape(self, text):
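For reference, the mapping behind ``escape``/``reverse_map`` is the XEP-0106
table turning characters forbidden in a JID local part into ``\XX`` sequences.
A simplified standalone sketch (a full implementation must also handle the
corner cases XEP-0106 defines around leading/trailing spaces and backslash
sequences)::

    XEP_0106_MAP = {
        " ": "\\20", '"': "\\22", "&": "\\26", "'": "\\27", "/": "\\2f",
        ":": "\\3a", "<": "\\3c", ">": "\\3e", "@": "\\40", "\\": "\\5c",
    }

    def escape(text: str) -> str:
        # naive character-by-character escaping
        return "".join(XEP_0106_MAP.get(c, c) for c in text)

    assert escape("juliet@capulet.lit") == "juliet\\40capulet.lit"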
--- a/sat/plugins/plugin_xep_0115.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0115.py Sat Apr 08 13:54:42 2023 +0200 @@ -57,13 +57,13 @@ def __init__(self, host): log.info(_("Plugin XEP_0115 initialization")) self.host = host - host.trigger.add("Presence send", self._presenceTrigger) + host.trigger.add("Presence send", self._presence_trigger) - def getHandler(self, client): + def get_handler(self, client): return XEP_0115_handler(self) @defer.inlineCallbacks - def _prepareCaps(self, client): + def _prepare_caps(self, client): # we have to calculate hash for client # because disco infos/identities may change between clients @@ -80,7 +80,7 @@ disco_infos = disco.DiscoInfo() for item in _infos: disco_infos.append(item) - cap_hash = client._caps_hash = self.host.memory.disco.generateHash(disco_infos) + cap_hash = client._caps_hash = self.host.memory.disco.generate_hash(disco_infos) log.info( "Our capability hash has been generated: [{cap_hash}]".format( cap_hash=cap_hash @@ -96,22 +96,22 @@ client._caps_sent = False if cap_hash not in self.host.memory.disco.hashes: self.host.memory.disco.hashes[cap_hash] = disco_infos - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( client, client.jid, C.ENTITY_CAP_HASH, cap_hash ) - def _presenceAddElt(self, client, obj): + def _presence_add_elt(self, client, obj): if client._caps_optimize: if client._caps_sent: return client.caps_sent = True obj.addChild(client._caps_elt) - def _presenceTrigger(self, client, obj, presence_d): + def _presence_trigger(self, client, obj, presence_d): if not hasattr(client, "_caps_optimize"): - presence_d.addCallback(lambda __: self._prepareCaps(client)) + presence_d.addCallback(lambda __: self._prepare_caps(client)) - presence_d.addCallback(lambda __: self._presenceAddElt(client, obj)) + presence_d.addCallback(lambda __: self._presence_add_elt(client, obj)) return True @@ -160,7 +160,7 @@ "hash [%(hash)s] already in cache, updating entity [%(jid)s]" % {"hash": c_ver, "jid": from_jid.full()} ) - self.host.memory.updateEntityData( + self.host.memory.update_entity_data( self.client, from_jid, C.ENTITY_CAP_HASH, c_ver ) return @@ -175,7 +175,7 @@ ) def cb(__): - computed_hash = self.host.memory.getEntityDatum( + computed_hash = self.host.memory.get_entity_datum( self.client, from_jid, C.ENTITY_CAP_HASH ) if computed_hash != c_ver: @@ -207,6 +207,6 @@ ) ) - d = self.host.getDiscoInfos(self.parent, from_jid) + d = self.host.get_disco_infos(self.parent, from_jid) d.addCallbacks(cb, eb) # TODO: me must manage the full algorithm described at XEP-0115 #5.4 part 3
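``generate_hash`` (used in ``_prepare_caps`` above) computes the XEP-0115
verification string from the client's disco identities and features.
Conceptually, and ignoring extended disco forms, the algorithm is this
standalone sketch (function name and argument shapes are illustrative)::

    import base64
    import hashlib

    def caps_ver(identities, features):
        """identities: iterable of (category, type, lang, name) tuples,
        features: iterable of feature namespaces."""
        s = ""
        for category, type_, lang, name in sorted(identities):
            s += f"{category}/{type_}/{lang}/{name}<"
        for feature in sorted(features):
            s += f"{feature}<"
        return base64.b64encode(hashlib.sha1(s.encode("utf-8")).digest()).decode()

    # the result can be checked against the generation example in XEP-0115 §5.2
    print(caps_ver(
        [("client", "pc", "", "Exodus 0.9.1")],
        [
            "http://jabber.org/protocol/caps",
            "http://jabber.org/protocol/disco#info",
            "http://jabber.org/protocol/disco#items",
            "http://jabber.org/protocol/muc",
        ],
    ))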
--- a/sat/plugins/plugin_xep_0163.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0163.py Sat Apr 08 13:54:42 2023 +0200 @@ -53,18 +53,18 @@ self.host = host self.pep_events = set() self.pep_out_cb = {} - host.trigger.add("PubSub Disco Info", self.disoInfoTrigger) - host.bridge.addMethod( - "PEPSend", + host.trigger.add("PubSub Disco Info", self.diso_info_trigger) + host.bridge.add_method( + "pep_send", ".plugin", in_sign="sa{ss}s", out_sign="", - method=self.PEPSend, + method=self.pep_send, async_=True, ) # args: type(MOOD, TUNE, etc), data, profile_key; - self.addPEPEvent("MOOD", NS_USER_MOOD, self.userMoodCB, self.sendMood) + self.add_pep_event("MOOD", NS_USER_MOOD, self.user_mood_cb, self.send_mood) - def disoInfoTrigger(self, disco_info, profile): + def diso_info_trigger(self, disco_info, profile): """Add info from managed PEP @param disco_info: list of disco feature as returned by PubSub, @@ -74,7 +74,7 @@ disco_info.extend(list(map(disco.DiscoFeature, self.pep_events))) return True - def addPEPEvent( + def add_pep_event( self, event_type: Optional[str], node: str, @@ -93,7 +93,7 @@ the callable will be called with (itemsEvent, profile) as arguments @param out_callback: method to call when we want to publish this event (must return a deferred) - the callable will be called when sendPEPEvent is called + the callable will be called when send_pep_event is called @param notify: add autosubscribe (+notify) if True """ if event_type and out_callback: @@ -107,7 +107,7 @@ if notify: self.pep_events.add(node + "+notify") - def filterPEPEvent(client, itemsEvent): + def filter_pep_event(client, itemsEvent): """Ignore messages which are not coming from PEP (i.e. a bare jid) @param itemsEvent(pubsub.ItemsEvent): pubsub event @@ -121,20 +121,20 @@ return in_callback(itemsEvent, client.profile) - self.host.plugins["XEP-0060"].addManagedNode(node, items_cb=filterPEPEvent) + self.host.plugins["XEP-0060"].add_managed_node(node, items_cb=filter_pep_event) - def sendPEPEvent(self, node, data, profile): + def send_pep_event(self, node, data, profile): """Publish the event data @param node(unicode): node namespace @param data: domish.Element to use as payload @param profile: profile which send the data """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) item = pubsub.Item(payload=data) return self.host.plugins["XEP-0060"].publish(client, None, node, [item]) - def PEPSend(self, event_type, data, profile_key=C.PROF_KEY_NONE): + def pep_send(self, event_type, data, profile_key=C.PROF_KEY_NONE): """Send personal event after checking the data is alright @param event_type: type of event (eg: MOOD, TUNE), @@ -142,7 +142,7 @@ @param data: dict of {string:string} of event_type dependant data @param profile_key: profile who send the event """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if not profile: log.error( _("Trying to send personal event with an unknown profile key [%s]") @@ -154,7 +154,7 @@ raise exceptions.DataError("Type unknown") return self.pep_out_cb[event_type](data, profile) - def userMoodCB(self, itemsEvent, profile): + def user_mood_cb(self, itemsEvent, profile): if not itemsEvent.items: log.debug(_("No item found")) return @@ -169,7 +169,7 @@ if not mood: log.debug(_("No mood found")) return - self.host.bridge.psEvent( + self.host.bridge.ps_event( C.PS_PEP, itemsEvent.sender.full(), itemsEvent.nodeIdentifier, @@ -178,7 +178,7 @@ profile, ) - def sendMood(self, data, profile): + 
def send_mood(self, data, profile): """Send XEP-0107's User Mood @param data: must include mood and text @@ -189,7 +189,7 @@ except KeyError: raise exceptions.DataError("Mood data must contain at least 'mood' key") mood = UserMood(value, text) - return self.sendPEPEvent(NS_USER_MOOD, mood, profile) + return self.send_pep_event(NS_USER_MOOD, mood, profile) class UserMood(Mood, domish.Element):
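The mood support above is itself wired through ``add_pep_event``; any other
PEP payload can be registered the same way. A sketch for user tune (the
``"TUNE"`` event type and the two callbacks are hypothetical, with the
signatures documented in ``add_pep_event``)::

    NS_USER_TUNE = "http://jabber.org/protocol/tune"

    # in a plugin's __init__, assuming the PEP plugin is loaded as "XEP-0163"
    host.plugins["XEP-0163"].add_pep_event(
        "TUNE", NS_USER_TUNE, self.user_tune_cb, self.send_tune
    )

    # a frontend can then publish through the renamed bridge method:
    #   pep_send("TUNE", {"artist": "...", "title": "..."}, profile)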
--- a/sat/plugins/plugin_xep_0166.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0166.py Sat Apr 08 13:54:42 2023 +0200 @@ -110,13 +110,13 @@ XEP_0166.TRANSPORT_STREAMING: [], } - def profileConnected(self, client): + def profile_connected(self, client): client.jingle_sessions = {} # key = sid, value = session_data - def getHandler(self, client): + def get_handler(self, client): return XEP_0166_handler(self) - def _delSession(self, client, sid): + def _del_session(self, client, sid): try: del client.jingle_sessions[sid] except KeyError: @@ -128,7 +128,7 @@ ## helpers methods to build stanzas ## - def _buildJingleElt(self, client, session, action): + def _build_jingle_elt(self, client, session, action): iq_elt = client.IQ("set") iq_elt["from"] = session['local_jid'].full() iq_elt["to"] = session["peer_jid"].full() @@ -149,7 +149,7 @@ if jingle_condition is not None: iq_elt.error.addElement((NS_JINGLE_ERROR, jingle_condition)) if error.STANZA_CONDITIONS[error_condition]["type"] == "cancel" and sid: - self._delSession(client, sid) + self._del_session(client, sid) log.warning( "Error while managing jingle session, cancelling: {condition}".format( condition=error_condition @@ -157,7 +157,7 @@ ) return client.send(iq_elt) - def _terminateEb(self, failure_): + def _terminate_eb(self, failure_): log.warning(_("Error while terminating session: {msg}").format(msg=failure_)) def terminate(self, client, reason, session, text=None): @@ -168,7 +168,7 @@ if a list of element, add them as children of the <reason/> element @param session(dict): data of the session """ - iq_elt, jingle_elt = self._buildJingleElt( + iq_elt, jingle_elt = self._build_jingle_elt( client, session, XEP_0166.A_SESSION_TERMINATE ) reason_elt = jingle_elt.addElement("reason") @@ -179,14 +179,14 @@ reason_elt.addChild(elt) if text is not None: reason_elt.addElement("text", content=text) - self._delSession(client, session["id"]) + self._del_session(client, session["id"]) d = iq_elt.send() - d.addErrback(self._terminateEb) + d.addErrback(self._terminate_eb) return d ## errors which doesn't imply a stanza sending ## - def _iqError(self, failure_, sid, client): + def _iq_error(self, failure_, sid, client): """Called when we got an <iq/> error @param failure_(failure.Failure): the exceptions raised @@ -197,9 +197,9 @@ failure_=failure_.value ) ) - self._delSession(client, sid) + self._del_session(client, sid) - def _jingleErrorCb(self, failure_, session, request, client): + def _jingle_error_cb(self, failure_, session, request, client): """Called when something is going wrong while parsing jingle request The error condition depend of the exceptions raised: @@ -224,7 +224,7 @@ ## methods used by other plugins ## - def registerApplication(self, namespace, handler): + def register_application(self, namespace, handler): """Register an application plugin @param namespace(unicode): application namespace managed by the plugin @@ -235,10 +235,10 @@ - if it return True the session is accepted, else rejected. 
A Deferred can be returned - if not present, a generic accept dialog will be used - - jingleSessionInit(client, self, session, content_name[, *args, **kwargs]): must return the domish.Element used for initial content - - jingleHandler(client, self, action, session, content_name, transport_elt): + - jingle_session_init(client, self, session, content_name[, *args, **kwargs]): must return the domish.Element used for initial content + - jingle_handler(client, self, action, session, content_name, transport_elt): called on several action to negociate the application or transport - - jingleTerminate: called on session terminate, with reason_elt + - jingle_terminate: called on session terminate, with reason_elt May be used to clean session """ if namespace in self._applications: @@ -250,15 +250,15 @@ ) log.debug("new jingle application registered") - def registerTransport(self, namespace, transport_type, handler, priority=0): + def register_transport(self, namespace, transport_type, handler, priority=0): """Register a transport plugin @param namespace(unicode): the XML namespace used for this transport @param transport_type(unicode): type of transport to use (see XEP-0166 §8) @param handler(object): instance of a class which manage the application. Must have the following methods: - - jingleSessionInit(client, self, session, content_name[, *args, **kwargs]): must return the domish.Element used for initial content - - jingleHandler(client, self, action, session, content_name, transport_elt): + - jingle_session_init(client, self, session, content_name[, *args, **kwargs]): must return the domish.Element used for initial content + - jingle_handler(client, self, action, session, content_name, transport_elt): called on several action to negociate the application or transport @param priority(int): priority of this transport """ @@ -281,7 +281,7 @@ log.debug("new jingle transport registered") @defer.inlineCallbacks - def transportReplace(self, client, transport_ns, session, content_name): + def transport_replace(self, client, transport_ns, session, content_name): """Replace a transport @param transport_ns(unicode): namespace of the new transport to use @@ -297,35 +297,35 @@ transport = self._transports[transport_ns] except KeyError: raise exceptions.InternalError("Unkown transport") - yield content_data["transport"].handler.jingleHandler( + yield content_data["transport"].handler.jingle_handler( client, XEP_0166.A_DESTROY, session, content_name, None ) content_data["transport"] = transport transport_data.clear() - iq_elt, jingle_elt = self._buildJingleElt( + iq_elt, jingle_elt = self._build_jingle_elt( client, session, XEP_0166.A_TRANSPORT_REPLACE ) content_elt = jingle_elt.addElement("content") content_elt["name"] = content_name content_elt["creator"] = content_data["creator"] - transport_elt = transport.handler.jingleSessionInit(client, session, content_name) + transport_elt = transport.handler.jingle_session_init(client, session, content_name) content_elt.addChild(transport_elt) iq_elt.send() - def buildAction(self, client, action, session, content_name): + def build_action(self, client, action, session, content_name): """Build an element according to requested action @param action(unicode): a jingle action (see XEP-0166 §7.2), session-* actions are not managed here - transport-replace is managed in the dedicated [transportReplace] method + transport-replace is managed in the dedicated [transport_replace] method @param session(dict): jingle session data @param content_name(unicode): name of the content 
@return (tuple[domish.Element, domish.Element]): parent <iq> element, <transport> or <description> element, according to action """ # we first build iq, jingle and content element which are the same in every cases - iq_elt, jingle_elt = self._buildJingleElt(client, session, action) + iq_elt, jingle_elt = self._build_jingle_elt(client, session, action) # FIXME: XEP-0260 § 2.3 Ex 5 has an initiator attribute, but it should not according to XEP-0166 §7.1 table 1, must be checked content_data = session["contents"][content_name] content_elt = jingle_elt.addElement("content") @@ -342,13 +342,13 @@ return iq_elt, context_elt - def buildSessionInfo(self, client, session): + def build_session_info(self, client, session): """Build a session-info action @param session(dict): jingle session data @return (tuple[domish.Element, domish.Element]): parent <iq> element, <jingle> element """ - return self._buildJingleElt(client, session, XEP_0166.A_SESSION_INFO) + return self._build_jingle_elt(client, session, XEP_0166.A_SESSION_INFO) def getApplication(self, namespace: str) -> object: """Retreive application corresponding to a namespace @@ -362,7 +362,7 @@ f"No application registered for {namespace}" ) - def getContentData(self, content: dict) -> Tuple[object, list, dict, str]: + def get_content_data(self, content: dict) -> Tuple[object, list, dict, str]: """"Retrieve application and its argument from content""" app_ns = content["app_ns"] try: @@ -414,13 +414,13 @@ "contents": {}, } - if not await self.host.trigger.asyncPoint( + if not await self.host.trigger.async_point( "XEP-0166_initiate", client, session, contents ): return - iq_elt, jingle_elt = self._buildJingleElt( + iq_elt, jingle_elt = self._build_jingle_elt( client, session, XEP_0166.A_SESSION_INITIATE ) jingle_elt["initiator"] = initiator.full() @@ -429,7 +429,7 @@ for content in contents: # we get the application plugin - application, app_args, app_kwargs, content_name = self.getContentData(content) + application, app_args, app_kwargs, content_name = self.get_content_data(content) # and the transport plugin transport_type = content.get("transport_type", XEP_0166.TRANSPORT_STREAMING) @@ -465,20 +465,20 @@ pass # then the description element - desc_elt = await utils.asDeferred( - application.handler.jingleSessionInit, + desc_elt = await utils.as_deferred( + application.handler.jingle_session_init, client, session, content_name, *app_args, **app_kwargs ) content_elt.addChild(desc_elt) # and the transport one - transport_elt = await utils.asDeferred( - transport.handler.jingleSessionInit, + transport_elt = await utils.as_deferred( + transport.handler.jingle_session_init, client, session, content_name ) content_elt.addChild(transport_elt) - if not await self.host.trigger.asyncPoint( + if not await self.host.trigger.async_point( "XEP-0166_initiate_elt_built", client, session, iq_elt, jingle_elt ): @@ -494,17 +494,17 @@ await iq_elt.send() except Exception as e: failure_ = failure.Failure(e) - self._iqError(failure_, sid, client) + self._iq_error(failure_, sid, client) raise failure_ - def delayedContentTerminate(self, *args, **kwargs): - """Put contentTerminate in queue but don't execute immediately + def delayed_content_terminate(self, *args, **kwargs): + """Put content_terminate in queue but don't execute immediately This is used to terminate a content inside a handler, to avoid modifying contents """ - reactor.callLater(0, self.contentTerminate, *args, **kwargs) + reactor.callLater(0, self.content_terminate, *args, **kwargs) - def 
contentTerminate(self, client, session, content_name, reason=REASON_SUCCESS): + def content_terminate(self, client, session, content_name, reason=REASON_SUCCESS): """Terminate and remove a content if there is no more content, then session is terminated @@ -519,12 +519,12 @@ ## defaults methods called when plugin doesn't have them ## - def jingleRequestConfirmationDefault( + def jingle_request_confirmation_default( self, client, action, session, content_name, desc_elt ): """This method request confirmation for a jingle session""" log.debug("Using generic jingle confirmation method") - return xml_tools.deferConfirm( + return xml_tools.defer_confirm( self.host, _(CONFIRM_TXT).format(entity=session["peer_jid"].full()), _("Confirm Jingle session"), @@ -620,27 +620,27 @@ raise exceptions.InternalError if action == XEP_0166.A_SESSION_INITIATE: - await self.onSessionInitiate(client, request, jingle_elt, session) + await self.on_session_initiate(client, request, jingle_elt, session) elif action == XEP_0166.A_SESSION_TERMINATE: - self.onSessionTerminate(client, request, jingle_elt, session) + self.on_session_terminate(client, request, jingle_elt, session) elif action == XEP_0166.A_SESSION_ACCEPT: - self.onSessionAccept(client, request, jingle_elt, session) + self.on_session_accept(client, request, jingle_elt, session) elif action == XEP_0166.A_SESSION_INFO: - self.onSessionInfo(client, request, jingle_elt, session) + self.on_session_info(client, request, jingle_elt, session) elif action == XEP_0166.A_TRANSPORT_INFO: - self.onTransportInfo(client, request, jingle_elt, session) + self.on_transport_info(client, request, jingle_elt, session) elif action == XEP_0166.A_TRANSPORT_REPLACE: - self.onTransportReplace(client, request, jingle_elt, session) + self.on_transport_replace(client, request, jingle_elt, session) elif action == XEP_0166.A_TRANSPORT_ACCEPT: - self.onTransportAccept(client, request, jingle_elt, session) + self.on_transport_accept(client, request, jingle_elt, session) elif action == XEP_0166.A_TRANSPORT_REJECT: - self.onTransportReject(client, request, jingle_elt, session) + self.on_transport_reject(client, request, jingle_elt, session) else: raise exceptions.InternalError("Unknown action {}".format(action)) ## Actions callbacks ## - def _parseElements( + def _parse_elements( self, jingle_elt, session, @@ -759,14 +759,14 @@ content_data["transport_elt"] = transport_elt def _ignore(self, client, action, session, content_name, elt): - """Dummy method used when not exception must be raised if a method is not implemented in _callPlugins + """Dummy method used when not exception must be raised if a method is not implemented in _call_plugins must be used as app_default_cb and/or transp_default_cb """ return elt - def _callPlugins(self, client, action, session, app_method_name="jingleHandler", - transp_method_name="jingleHandler", app_default_cb=None, + def _call_plugins(self, client, action, session, app_method_name="jingle_handler", + transp_method_name="jingle_handler", app_default_cb=None, transp_default_cb=None, delete=True, elements=True, force_element=None): """Call application and transport plugin methods for all contents @@ -784,7 +784,7 @@ @param delete(bool): if True, remove desc_elt and transport_elt from session ignored if elements is False @param elements(bool): True if elements(desc_elt and tranport_elt) must be managed - must be True if _callPlugins is used in a request, and False if it used after a request + must be True if _call_plugins is used in a request, and False if it used 
after a request (i.e. on <iq> result or error) @param force_element(None, domish.Element, object): if elements is False, it is used as element parameter else it is ignored @@ -815,14 +815,14 @@ elt = content_data.pop(elt_name) if delete else content_data[elt_name] else: elt = force_element - d = utils.asDeferred( + d = utils.as_deferred( method, client, action, session, content_name, elt ) defers_list.append(d) return defers_list - async def onSessionInitiate( + async def on_session_initiate( self, client: SatXMPPEntity, request: domish.Element, @@ -831,8 +831,8 @@ ) -> None: """Called on session-initiate action - The "jingleRequestConfirmation" method of each application will be called - (or self.jingleRequestConfirmationDefault if the former doesn't exist). + The "jingle_request_confirmation" method of each application will be called + (or self.jingle_request_confirmation_default if the former doesn't exist). The session is only accepted if all application are confirmed. The application must manage itself multiple contents scenari (e.g. audio/video). @param client: %(doc_client)s @@ -847,7 +847,7 @@ session["contents"] = contents_dict = {} try: - self._parseElements( + self._parse_elements( jingle_elt, session, request, client, True, XEP_0166.ROLE_INITIATOR ) except exceptions.CancelError: @@ -862,7 +862,7 @@ client.send(xmlstream.toResponse(request, "result")) - if not await self.host.trigger.asyncPoint( + if not await self.host.trigger.async_point( "XEP-0166_on_session_initiate", client, session, request, jingle_elt ): @@ -870,21 +870,21 @@ # we now request each application plugin confirmation # and if all are accepted, we can accept the session - confirm_defers = self._callPlugins( + confirm_defers = self._call_plugins( client, XEP_0166.A_SESSION_INITIATE, session, - "jingleRequestConfirmation", + "jingle_request_confirmation", None, - self.jingleRequestConfirmationDefault, + self.jingle_request_confirmation_default, delete=False, ) confirm_dlist = defer.gatherResults(confirm_defers) - confirm_dlist.addCallback(self._confirmationCb, session, jingle_elt, client) - confirm_dlist.addErrback(self._jingleErrorCb, session, request, client) + confirm_dlist.addCallback(self._confirmation_cb, session, jingle_elt, client) + confirm_dlist.addErrback(self._jingle_error_cb, session, request, client) - def _confirmationCb(self, confirm_results, session, jingle_elt, client): + def _confirmation_cb(self, confirm_results, session, jingle_elt, client): """Method called when confirmation from user has been received This method is only called for the responder @@ -897,7 +897,7 @@ if not confirmed: return self.terminate(client, XEP_0166.REASON_DECLINE, session) - iq_elt, jingle_elt = self._buildJingleElt( + iq_elt, jingle_elt = self._build_jingle_elt( client, session, XEP_0166.A_SESSION_ACCEPT ) jingle_elt["responder"] = session['local_jid'].full() @@ -915,9 +915,9 @@ content_elt["name"] = content_name application = content_data["application"] - app_session_accept_cb = application.handler.jingleHandler + app_session_accept_cb = application.handler.jingle_handler - app_d = utils.asDeferred( + app_d = utils.as_deferred( app_session_accept_cb, client, XEP_0166.A_SESSION_INITIATE, @@ -929,9 +929,9 @@ defers_list.append(app_d) transport = content_data["transport"] - transport_session_accept_cb = transport.handler.jingleHandler + transport_session_accept_cb = transport.handler.jingle_handler - transport_d = utils.asDeferred( + transport_d = utils.as_deferred( transport_session_accept_cb, client, 
XEP_0166.A_SESSION_INITIATE, @@ -944,7 +944,7 @@ d_list = defer.DeferredList(defers_list) d_list.addCallback( - lambda __: self._callPlugins( + lambda __: self._call_plugins( client, XEP_0166.A_PREPARE_RESPONDER, session, @@ -954,19 +954,19 @@ ) d_list.addCallback(lambda __: iq_elt.send()) - def changeState(__, session): + def change_state(__, session): session["state"] = STATE_ACTIVE - d_list.addCallback(changeState, session) + d_list.addCallback(change_state, session) d_list.addCallback( - lambda __: self._callPlugins( + lambda __: self._call_plugins( client, XEP_0166.A_ACCEPTED_ACK, session, elements=False ) ) - d_list.addErrback(self._iqError, session["id"], client) + d_list.addErrback(self._iq_error, session["id"], client) return d_list - def onSessionTerminate(self, client, request, jingle_elt, session): + def on_session_terminate(self, client, request, jingle_elt, session): # TODO: check reason, display a message to user if needed log.debug("Jingle Session {} terminated".format(session["id"])) try: @@ -975,12 +975,12 @@ log.warning("No reason given for session termination") reason_elt = jingle_elt.addElement("reason") - terminate_defers = self._callPlugins( + terminate_defers = self._call_plugins( client, XEP_0166.A_SESSION_TERMINATE, session, - "jingleTerminate", - "jingleTerminate", + "jingle_terminate", + "jingle_terminate", self._ignore, self._ignore, elements=False, @@ -988,10 +988,10 @@ ) terminate_dlist = defer.DeferredList(terminate_defers) - terminate_dlist.addCallback(lambda __: self._delSession(client, session["id"])) + terminate_dlist.addCallback(lambda __: self._del_session(client, session["id"])) client.send(xmlstream.toResponse(request, "result")) - def onSessionAccept(self, client, request, jingle_elt, session): + def on_session_accept(self, client, request, jingle_elt, session): """Method called once session is accepted This method is only called for initiator @@ -1003,7 +1003,7 @@ log.debug("Jingle session {} has been accepted".format(session["id"])) try: - self._parseElements(jingle_elt, session, request, client) + self._parse_elements(jingle_elt, session, request, client) except exceptions.CancelError: return @@ -1013,28 +1013,28 @@ session["state"] = STATE_ACTIVE negociate_defers = [] - negociate_defers = self._callPlugins(client, XEP_0166.A_SESSION_ACCEPT, session) + negociate_defers = self._call_plugins(client, XEP_0166.A_SESSION_ACCEPT, session) negociate_dlist = defer.gatherResults(negociate_defers) # after negociations we start the transfer negociate_dlist.addCallback( - lambda __: self._callPlugins( + lambda __: self._call_plugins( client, XEP_0166.A_START, session, app_method_name=None, elements=False ) ) - def _onSessionCb(self, result, client, request, jingle_elt, session): + def _on_session_cb(self, result, client, request, jingle_elt, session): client.send(xmlstream.toResponse(request, "result")) - def _onSessionEb(self, failure_, client, request, jingle_elt, session): - log.error("Error while handling onSessionInfo: {}".format(failure_.value)) + def _on_session_eb(self, failure_, client, request, jingle_elt, session): + log.error("Error while handling on_session_info: {}".format(failure_.value)) # XXX: only error managed so far, maybe some applications/transports need more self.sendError( client, "feature-not-implemented", None, request, "unsupported-info" ) - def onSessionInfo(self, client, request, jingle_elt, session): + def on_session_info(self, client, request, jingle_elt, session): """Method called when a session-info action is received from 
other peer This method is only called for initiator @@ -1051,28 +1051,28 @@ try: # XXX: session-info is most likely only used for application, so we don't call transport plugins # if a future transport use it, this behaviour must be adapted - defers = self._callPlugins( + defers = self._call_plugins( client, XEP_0166.A_SESSION_INFO, session, - "jingleSessionInfo", + "jingle_session_info", None, elements=False, force_element=jingle_elt, ) except exceptions.NotFound as e: - self._onSessionEb(failure.Failure(e), client, request, jingle_elt, session) + self._on_session_eb(failure.Failure(e), client, request, jingle_elt, session) return dlist = defer.DeferredList(defers, fireOnOneErrback=True) - dlist.addCallback(self._onSessionCb, client, request, jingle_elt, session) - dlist.addErrback(self._onSessionCb, client, request, jingle_elt, session) + dlist.addCallback(self._on_session_cb, client, request, jingle_elt, session) + dlist.addErrback(self._on_session_cb, client, request, jingle_elt, session) @defer.inlineCallbacks - def onTransportReplace(self, client, request, jingle_elt, session): + def on_transport_replace(self, client, request, jingle_elt, session): """A transport change is requested - The request is parsed, and jingleHandler is called on concerned transport plugin(s) + The request is parsed, and jingle_handler is called on concerned transport plugin(s) @param client: %(doc_client)s @param request(domish.Element): full <iq> request @param jingle_elt(domish.Element): the <jingle> element @@ -1080,7 +1080,7 @@ """ log.debug("Other peer wants to replace the transport") try: - self._parseElements( + self._parse_elements( jingle_elt, session, request, client, with_application=False ) except exceptions.CancelError: @@ -1111,7 +1111,7 @@ if content_name is None: # wa can't accept the replacement - iq_elt, reject_jingle_elt = self._buildJingleElt( + iq_elt, reject_jingle_elt = self._build_jingle_elt( client, session, XEP_0166.A_TRANSPORT_REJECT ) for child in jingle_elt.children: @@ -1122,12 +1122,12 @@ # at this point, everything is alright and we can replace the transport(s) # this is similar to an session-accept action, but for transports only - iq_elt, accept_jingle_elt = self._buildJingleElt( + iq_elt, accept_jingle_elt = self._build_jingle_elt( client, session, XEP_0166.A_TRANSPORT_ACCEPT ) for content_name, content_data, transport, transport_elt in to_replace: # we can now actually replace the transport - yield content_data["transport"].handler.jingleHandler( + yield content_data["transport"].handler.jingle_handler( client, XEP_0166.A_DESTROY, session, content_name, None ) content_data["transport"] = transport @@ -1137,18 +1137,18 @@ content_elt["name"] = content_name content_elt["creator"] = content_data["creator"] # we notify the transport and insert its <transport/> in the answer - accept_transport_elt = yield transport.handler.jingleHandler( + accept_transport_elt = yield transport.handler.jingle_handler( client, XEP_0166.A_TRANSPORT_REPLACE, session, content_name, transport_elt ) content_elt.addChild(accept_transport_elt) # there is no confirmation needed here, so we can directly prepare it - yield transport.handler.jingleHandler( + yield transport.handler.jingle_handler( client, XEP_0166.A_PREPARE_RESPONDER, session, content_name, None ) iq_elt.send() - def onTransportAccept(self, client, request, jingle_elt, session): + def on_transport_accept(self, client, request, jingle_elt, session): """Method called once transport replacement is accepted @param client: %(doc_client)s @@ 
-1159,7 +1159,7 @@ log.debug("new transport has been accepted") try: - self._parseElements( + self._parse_elements( jingle_elt, session, request, client, with_application=False ) except exceptions.CancelError: @@ -1169,7 +1169,7 @@ client.send(xmlstream.toResponse(request, "result")) negociate_defers = [] - negociate_defers = self._callPlugins( + negociate_defers = self._call_plugins( client, XEP_0166.A_TRANSPORT_ACCEPT, session, app_method_name=None ) @@ -1177,12 +1177,12 @@ # after negociations we start the transfer negociate_dlist.addCallback( - lambda __: self._callPlugins( + lambda __: self._call_plugins( client, XEP_0166.A_START, session, app_method_name=None, elements=False ) ) - def onTransportReject(self, client, request, jingle_elt, session): + def on_transport_reject(self, client, request, jingle_elt, session): """Method called when a transport replacement is refused @param client: %(doc_client)s @@ -1194,10 +1194,10 @@ # this behaviour may change in the future self.terminate(client, "failed-transport", session) - def onTransportInfo(self, client, request, jingle_elt, session): + def on_transport_info(self, client, request, jingle_elt, session): """Method called when a transport-info action is received from other peer - The request is parsed, and jingleHandler is called on concerned transport plugin(s) + The request is parsed, and jingle_handler is called on concerned transport plugin(s) @param client: %(doc_client)s @param request(domish.Element): full <iq> request @param jingle_elt(domish.Element): the <jingle> element @@ -1206,7 +1206,7 @@ log.debug("Jingle session {} has been accepted".format(session["id"])) try: - self._parseElements( + self._parse_elements( jingle_elt, session, request, client, with_application=False ) except exceptions.CancelError: @@ -1221,7 +1221,7 @@ except KeyError: continue else: - content_data["transport"].handler.jingleHandler( + content_data["transport"].handler.jingle_handler( client, XEP_0166.A_TRANSPORT_INFO, session,
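A Jingle application plugin now registers itself with ``register_application``
and exposes the snake_case handler methods listed in its docstring. A skeleton
(the namespace, the class and the stub bodies are illustrative only)::

    class MyJingleApplication:
        """Minimal shape of a Jingle application handler."""

        def __init__(self, host):
            self.host = host
            host.plugins["XEP-0166"].register_application(
                "urn:example:jingle:demo:0", self
            )

        def jingle_session_init(self, client, session, content_name):
            # must return the domish.Element used as initial <description/>
            ...

        def jingle_handler(self, client, action, session, content_name, elt):
            # called for the negotiation actions; may return the (possibly
            # modified) element or a Deferred firing with it
            return elt

        def jingle_terminate(self, client, action, session, content_name, reason_elt):
            # called on session-terminate, may be used to clean the session
            pass

    # a ``jingle_request_confirmation`` method may also be defined; otherwise
    # ``jingle_request_confirmation_default`` shows a generic accept dialog.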
--- a/sat/plugins/plugin_xep_0184.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0184.py Sat Apr 08 13:54:42 2023 +0200 @@ -102,17 +102,17 @@ self._dictRequest = dict() # parameter value is retrieved before each use - host.memory.updateParams(self.params) + host.memory.update_params(self.params) - host.trigger.add("sendMessage", self.sendMessageTrigger) - host.bridge.addSignal( - "messageState", ".plugin", signature="sss" + host.trigger.add("sendMessage", self.send_message_trigger) + host.bridge.add_signal( + "message_state", ".plugin", signature="sss" ) # message_uid, status, profile - def getHandler(self, client): + def get_handler(self, client): return XEP_0184_handler(self, client.profile) - def sendMessageTrigger( + def send_message_trigger( self, client, mess_data, pre_xml_treatments, post_xml_treatments ): """Install SendMessage command hook """ @@ -121,7 +121,7 @@ message = mess_data["xml"] message_type = message.getAttribute("type") - if self._isActif(client.profile) and ( + if self._is_actif(client.profile) and ( message_type == "chat" or message_type == "normal" ): message.addElement("request", NS_MESSAGE_DELIVERY_RECEIPTS) @@ -129,7 +129,7 @@ msg_id = message.getAttribute("id") self._dictRequest[msg_id] = uid reactor.callLater( - TEMPO_DELETE_WAITING_ACK_S, self._clearDictRequest, msg_id + TEMPO_DELETE_WAITING_ACK_S, self._clear_dict_request, msg_id ) log.debug( _( @@ -144,13 +144,13 @@ post_xml_treatments.addCallback(treatment) return True - def onMessageDeliveryReceiptsRequest(self, msg_elt, client): + def on_message_delivery_receipts_request(self, msg_elt, client): """This method is called on message delivery receipts **request** (XEP-0184 #7) @param msg_elt: message element @param client: %(doc_client)s""" from_jid = jid.JID(msg_elt["from"]) - if self._isActif(client.profile) and client.roster.isSubscribedFrom(from_jid): + if self._is_actif(client.profile) and client.roster.is_subscribed_from(from_jid): received_elt_ret = domish.Element((NS_MESSAGE_DELIVERY_RECEIPTS, "received")) try: received_elt_ret["id"] = msg_elt["id"] @@ -162,7 +162,7 @@ msg_result_elt.addChild(received_elt_ret) client.send(msg_result_elt) - def onMessageDeliveryReceiptsReceived(self, msg_elt, client): + def on_message_delivery_receipts_received(self, msg_elt, client): """This method is called on message delivery receipts **received** (XEP-0184 #7) @param msg_elt: message element @param client: %(doc_client)s""" @@ -173,7 +173,7 @@ try: uid = self._dictRequest[msg_id] del self._dictRequest[msg_id] - self.host.bridge.messageState( + self.host.bridge.message_state( uid, STATUS_MESSAGE_DELIVERY_RECEIVED, client.profile ) log.debug( @@ -182,7 +182,7 @@ except KeyError: pass - def _clearDictRequest(self, msg_id): + def _clear_dict_request(self, msg_id): try: del self._dictRequest[msg_id] log.debug( @@ -195,8 +195,8 @@ except KeyError: pass - def _isActif(self, profile): - return self.host.memory.getParamA(PARAM_NAME, PARAM_KEY, profile_key=profile) + def _is_actif(self, profile): + return self.host.memory.param_get_a(PARAM_NAME, PARAM_KEY, profile_key=profile) @implementer(iwokkel.IDisco) @@ -210,23 +210,23 @@ def connectionInitialized(self): self.xmlstream.addObserver( MSG_CHAT_MESSAGE_DELIVERY_RECEIPTS_REQUEST, - self.plugin_parent.onMessageDeliveryReceiptsRequest, + self.plugin_parent.on_message_delivery_receipts_request, client=self.parent, ) self.xmlstream.addObserver( MSG_CHAT_MESSAGE_DELIVERY_RECEIPTS_RECEIVED, - self.plugin_parent.onMessageDeliveryReceiptsReceived, + 
self.plugin_parent.on_message_delivery_receipts_received, client=self.parent, ) self.xmlstream.addObserver( MSG_NORMAL_MESSAGE_DELIVERY_RECEIPTS_REQUEST, - self.plugin_parent.onMessageDeliveryReceiptsRequest, + self.plugin_parent.on_message_delivery_receipts_request, client=self.parent, ) self.xmlstream.addObserver( MSG_NORMAL_MESSAGE_DELIVERY_RECEIPTS_RECEIVED, - self.plugin_parent.onMessageDeliveryReceiptsReceived, + self.plugin_parent.on_message_delivery_receipts_received, client=self.parent, )
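The receipt built by ``on_message_delivery_receipts_request`` boils down to
echoing the incoming message id in a ``<received/>`` element. As a standalone
illustration with twisted's ``domish`` (the JID and the id are placeholders)::

    from twisted.words.xish import domish

    NS_RECEIPTS = "urn:xmpp:receipts"  # XEP-0184 namespace

    msg_elt = domish.Element((None, "message"))
    msg_elt["to"] = "requester@example.org"
    received_elt = msg_elt.addElement("received", NS_RECEIPTS)
    received_elt["id"] = "original-message-id"
    # msg_elt.toXml() ->
    # <message to='requester@example.org'>
    #   <received xmlns='urn:xmpp:receipts' id='original-message-id'/></message>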
--- a/sat/plugins/plugin_xep_0191.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0191.py Sat Apr 08 13:54:42 2023 +0200 @@ -53,26 +53,26 @@ def __init__(self, host): log.info(_("Blocking Command initialization")) - host.registerNamespace("blocking", NS_BLOCKING) + host.register_namespace("blocking", NS_BLOCKING) self.host = host - host.bridge.addMethod( - "blockingList", + host.bridge.add_method( + "blocking_list", ".plugin", in_sign="s", out_sign="as", - method=self._blockList, + method=self._block_list, async_=True, ) - host.bridge.addMethod( - "blockingBlock", + host.bridge.add_method( + "blocking_block", ".plugin", in_sign="ass", out_sign="", method=self._block, async_=True, ) - host.bridge.addMethod( - "blockingUnblock", + host.bridge.add_method( + "blocking_unblock", ".plugin", in_sign="ass", out_sign="", @@ -80,20 +80,20 @@ async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return XEP_0191_Handler(self) @ensure_deferred - async def _blockList( + async def _block_list( self, profile_key=C.PROF_KEY_NONE ) -> List[str]: - client = self.host.getClient(profile_key) - blocked_jids = await self.blockList(client) + client = self.host.get_client(profile_key) + blocked_jids = await self.block_list(client) return [j.full() for j in blocked_jids] - async def blockList(self, client: SatXMPPEntity) -> Set[jid.JID]: - await self.host.checkFeature(client, NS_BLOCKING) + async def block_list(self, client: SatXMPPEntity) -> Set[jid.JID]: + await self.host.check_feature(client, NS_BLOCKING) iq_elt = client.IQ("get") iq_elt.addElement((NS_BLOCKING, "blocklist")) iq_result_elt = await iq_elt.send() @@ -118,13 +118,13 @@ entities: List[str], profile_key: str = C.PROF_KEY_NONE ) -> str: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return defer.ensureDeferred( self.block(client, [jid.JID(entity) for entity in entities]) ) async def block(self, client: SatXMPPEntity, entities: List[jid.JID]) -> None: - await self.host.checkFeature(client, NS_BLOCKING) + await self.host.check_feature(client, NS_BLOCKING) iq_elt = client.IQ("set") block_elt = iq_elt.addElement((NS_BLOCKING, "block")) for entity in entities: @@ -137,13 +137,13 @@ entities: List[str], profile_key: str = C.PROF_KEY_NONE ) -> None: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return defer.ensureDeferred( self.unblock(client, [jid.JID(e) for e in entities]) ) async def unblock(self, client: SatXMPPEntity, entities: List[jid.JID]) -> None: - await self.host.checkFeature(client, NS_BLOCKING) + await self.host.check_feature(client, NS_BLOCKING) iq_elt = client.IQ("set") unblock_elt = iq_elt.addElement((NS_BLOCKING, "unblock")) for entity in entities: @@ -151,7 +151,7 @@ item_elt["jid"] = entity.full() await iq_elt.send() - def onBlockPush(self, iq_elt: domish.Element, client: SatXMPPEntity) -> None: + def on_block_push(self, iq_elt: domish.Element, client: SatXMPPEntity) -> None: # TODO: send notification to user iq_elt.handled = True for item_elt in iq_elt.block.elements(NS_BLOCKING, "item"): @@ -164,7 +164,7 @@ iq_result_elt = xmlstream.toResponse(iq_elt, "result") client.send(iq_result_elt) - def onUnblockPush(self, iq_elt: domish.Element, client: SatXMPPEntity) -> None: + def on_unblock_push(self, iq_elt: domish.Element, client: SatXMPPEntity) -> None: # TODO: send notification to user iq_elt.handled = True items = list(iq_elt.unblock.elements(NS_BLOCKING, "item")) @@ -193,13 +193,13 @@ def 
connectionInitialized(self): self.xmlstream.addObserver( IQ_BLOCK_PUSH, - self.plugin_parent.onBlockPush, + self.plugin_parent.on_block_push, client=self.parent ) self.xmlstream.addObserver( IQ_UNBLOCK_PUSH, - self.plugin_parent.onUnblockPush, + self.plugin_parent.on_unblock_push, client=self.parent )
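As the hunks above show, the blocking commands are now exposed on the bridge as ``blocking_list``, ``blocking_block`` and ``blocking_unblock``, while ``block()`` itself just wraps the XEP-0191 ``<block/>`` payload in an IQ set. A minimal sketch of that payload with ``domish``, assuming the standard ``urn:xmpp:blocking`` value for the namespace registered above (the plugin builds the full IQ through ``client.IQ`` instead)::

    from twisted.words.xish import domish

    NS_BLOCKING = "urn:xmpp:blocking"  # assumed value of the "blocking" namespace

    iq_elt = domish.Element((None, "iq"))
    iq_elt["type"] = "set"
    block_elt = iq_elt.addElement((NS_BLOCKING, "block"))
    for entity in ("spammer@example.net", "troll@example.org"):
        item_elt = block_elt.addElement("item")
        item_elt["jid"] = entity
    print(iq_elt.toXml())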
--- a/sat/plugins/plugin_xep_0198.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0198.py Sat Apr 08 13:54:42 2023 +0200 @@ -123,7 +123,7 @@ self.req_timer.cancel() self.req_timer = None - def getBufferCopy(self): + def get_buffer_copy(self): return list(self.buffer) @@ -133,13 +133,13 @@ def __init__(self, host): log.info(_("Plugin Stream Management initialization")) self.host = host - host.registerNamespace('sm', NS_SM) - host.trigger.add("stream_hooks", self.addHooks) - host.trigger.add("xml_init", self._XMLInitTrigger) - host.trigger.add("disconnecting", self._disconnectingTrigger) - host.trigger.add("disconnected", self._disconnectedTrigger) + host.register_namespace('sm', NS_SM) + host.trigger.add("stream_hooks", self.add_hooks) + host.trigger.add("xml_init", self._xml_init_trigger) + host.trigger.add("disconnecting", self._disconnecting_trigger) + host.trigger.add("disconnected", self._disconnected_trigger) try: - self._ack_timeout = int(host.memory.getConfig("", "ack_timeout", ACK_TIMEOUT)) + self._ack_timeout = int(host.memory.config_get("", "ack_timeout", ACK_TIMEOUT)) except ValueError: log.error(_("Invalid ack_timeout value, please check your configuration")) self._ack_timeout = ACK_TIMEOUT @@ -149,20 +149,20 @@ log.info(_("Ack timeout set to {timeout}s").format( timeout=self._ack_timeout)) - def profileConnecting(self, client): - client._xep_0198_session = ProfileSessionData(callback=self.checkAcks, + def profile_connecting(self, client): + client._xep_0198_session = ProfileSessionData(callback=self.check_acks, client=client) - def getHandler(self, client): + def get_handler(self, client): return XEP_0198_handler(self) - def addHooks(self, client, receive_hooks, send_hooks): + def add_hooks(self, client, receive_hooks, send_hooks): """Add hooks to handle in/out stanzas counters""" - receive_hooks.append(partial(self.onReceive, client=client)) - send_hooks.append(partial(self.onSend, client=client)) + receive_hooks.append(partial(self.on_receive, client=client)) + send_hooks.append(partial(self.on_send, client=client)) return True - def _XMLInitTrigger(self, client): + def _xml_init_trigger(self, client): """Enable or resume a stream mangement""" if not (NS_SM, 'sm') in client.xmlstream.features: log.warning(_( @@ -201,16 +201,16 @@ session.enabled = True return True - def _disconnectingTrigger(self, client): + def _disconnecting_trigger(self, client): session = client._xep_0198_session if session.enabled: - self.sendAck(client) + self.send_ack(client) # This is a requested disconnection, so we can reset the session # to disable resuming and close normally the stream session.reset() return True - def _disconnectedTrigger(self, client, reason): + def _disconnected_trigger(self, client, reason): if client.is_component: return True session = client._xep_0198_session @@ -218,29 +218,29 @@ if session.resume_enabled: session.disconnected_time = time.time() session.disconnect_timer = reactor.callLater(session.session_max, - client.disconnectProfile, + client.disconnect_profile, reason) - # disconnectProfile must not be called at this point + # disconnect_profile must not be called at this point # because session can be resumed return False else: return True - def checkAcks(self, client): + def check_acks(self, client): """Request ack if needed""" session = client._xep_0198_session - # log.debug("checkAcks (in_counter={}, out_counter={}, buf len={}, buf idx={})" + # log.debug("check_acks (in_counter={}, out_counter={}, buf len={}, buf idx={})" # 
.format(session.in_counter, session.out_counter, len(session.buffer), # session.buffer_idx)) if session.ack_requested or not session.buffer: return if (session.out_counter - session.buffer_idx >= MAX_STANZA_ACK_R or time.time() - session.last_ack_r >= MAX_DELAY_ACK_R): - self.requestAck(client) + self.request_ack(client) session.ack_requested = True session.last_ack_r = time.time() - def updateBuffer(self, session, server_acked): + def update_buffer(self, session, server_acked): """Update buffer and buffer_index""" if server_acked > session.buffer_idx: diff = server_acked - session.buffer_idx @@ -257,7 +257,7 @@ buffer_id=session.buffer_idx)) session.buffer_idx += diff - def replayBuffer(self, client, buffer_, discard_results=False): + def replay_buffer(self, client, buffer_, discard_results=False): """Resend all stanza in buffer @param buffer_(collection.deque, list): buffer to replay @@ -276,13 +276,13 @@ continue client.send(stanza) - def sendAck(self, client): + def send_ack(self, client): """Send an answer element with current IN counter""" a_elt = domish.Element((NS_SM, 'a')) a_elt['h'] = str(client._xep_0198_session.in_counter) client.send(a_elt) - def requestAck(self, client): + def request_ack(self, client): """Send a request element""" session = client._xep_0198_session r_elt = domish.Element((NS_SM, 'r')) @@ -290,7 +290,7 @@ if session.req_timer is not None: raise exceptions.InternalError("req_timer should not be set") if self._ack_timeout: - session.req_timer = reactor.callLater(self._ack_timeout, self.onAckTimeOut, + session.req_timer = reactor.callLater(self._ack_timeout, self.on_ack_time_out, client) def _connectionFailed(self, failure_, connector): @@ -306,7 +306,7 @@ del connector.connectionFailed_ori return connector.connectionFailed(failure_) - def onEnabled(self, enabled_elt, client): + def on_enabled(self, enabled_elt, client): session = client._xep_0198_session session.in_counter = 0 @@ -367,25 +367,25 @@ .format(res_m = max_s/60))) session.session_max = max_s - def onResumed(self, enabled_elt, client): + def on_resumed(self, enabled_elt, client): session = client._xep_0198_session assert not session.enabled del session.resuming server_acked = int(enabled_elt['h']) - self.updateBuffer(session, server_acked) + self.update_buffer(session, server_acked) resend_count = len(session.buffer) # we resend all stanza which have not been received properly - self.replayBuffer(client, session.buffer) + self.replay_buffer(client, session.buffer) # now we can continue the session session.enabled = True d_time = time.time() - session.disconnected_time log.info(_("Stream session resumed (disconnected for {d_time} s, {count} " "stanza(s) resent)").format(d_time=int(d_time), count=resend_count)) - def onFailed(self, failed_elt, client): + def on_failed(self, failed_elt, client): session = client._xep_0198_session condition_elt = failed_elt.firstChildElement() - buffer_ = session.getBufferCopy() + buffer_ = session.get_buffer_copy() session.reset() try: @@ -429,7 +429,7 @@ if plg_0045 is not None: # we have to remove joined rooms - muc_join_args = plg_0045.popRooms(client) + muc_join_args = plg_0045.pop_rooms(client) # we need to recreate roster client.handlers.remove(client.roster) client.roster = client.roster.__class__(self.host) @@ -441,9 +441,9 @@ # we set the jid, which may have changed d.addCallback(lambda __: setattr(client.factory.authenticator, "jid", client.jid)) # we call the trigger who will send the <enable/> element - d.addCallback(lambda __: self._XMLInitTrigger(client)) 
+ d.addCallback(lambda __: self._xml_init_trigger(client)) # then we have to re-request the roster, as changes may have occured - d.addCallback(lambda __: client.roster.requestRoster()) + d.addCallback(lambda __: client.roster.request_roster()) # we add got_roster to be sure to have roster before sending initial presence d.addCallback(lambda __: client.roster.got_roster) if plg_0313 is not None: @@ -460,16 +460,16 @@ d.addCallback(lambda __: muc_d_list) # at the end we replay the buffer, as those stanzas have probably not # been received - d.addCallback(lambda __: self.replayBuffer(client, buffer_, + d.addCallback(lambda __: self.replay_buffer(client, buffer_, discard_results=True)) - def onReceive(self, element, client): + def on_receive(self, element, client): if not client.is_component: session = client._xep_0198_session if session.enabled and element.name.lower() in C.STANZA_NAMES: session.in_counter += 1 % MAX_COUNTER - def onSend(self, obj, client): + def on_send(self, obj, client): if not client.is_component: session = client._xep_0198_session if (session.enabled @@ -477,12 +477,12 @@ and obj.name.lower() in C.STANZA_NAMES): session.out_counter += 1 % MAX_COUNTER session.buffer.appendleft(obj) - self.checkAcks(client) + self.check_acks(client) - def onAckRequest(self, r_elt, client): - self.sendAck(client) + def on_ack_request(self, r_elt, client): + self.send_ack(client) - def onAckAnswer(self, a_elt, client): + def on_ack_answer(self, a_elt, client): session = client._xep_0198_session session.ack_requested = False if self._ack_timeout: @@ -505,10 +505,10 @@ session.reset() return - self.updateBuffer(session, server_acked) - self.checkAcks(client) + self.update_buffer(session, server_acked) + self.check_acks(client) - def onAckTimeOut(self, client): + def on_ack_time_out(self, client): """Called when a requested ACK has not been received in time""" log.info(_("Ack was not received in time, aborting connection")) try: @@ -533,19 +533,19 @@ def connectionInitialized(self): self.xmlstream.addObserver( - SM_ENABLED, self.plugin_parent.onEnabled, client=self.parent + SM_ENABLED, self.plugin_parent.on_enabled, client=self.parent ) self.xmlstream.addObserver( - SM_RESUMED, self.plugin_parent.onResumed, client=self.parent + SM_RESUMED, self.plugin_parent.on_resumed, client=self.parent ) self.xmlstream.addObserver( - SM_FAILED, self.plugin_parent.onFailed, client=self.parent + SM_FAILED, self.plugin_parent.on_failed, client=self.parent ) self.xmlstream.addObserver( - SM_R_REQUEST, self.plugin_parent.onAckRequest, client=self.parent + SM_R_REQUEST, self.plugin_parent.on_ack_request, client=self.parent ) self.xmlstream.addObserver( - SM_A_REQUEST, self.plugin_parent.onAckAnswer, client=self.parent + SM_A_REQUEST, self.plugin_parent.on_ack_answer, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""):
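``request_ack`` and ``send_ack`` above exchange two tiny elements: an ``<r/>`` request, and an ``<a/>`` answer carrying the inbound stanza counter in its ``h`` attribute. A standalone sketch, assuming the usual ``urn:xmpp:sm:3`` value for ``NS_SM``::

    from twisted.words.xish import domish

    NS_SM = "urn:xmpp:sm:3"  # assumed value of the "sm" namespace registered above

    r_elt = domish.Element((NS_SM, "r"))   # what request_ack sends
    a_elt = domish.Element((NS_SM, "a"))   # what send_ack answers with
    a_elt["h"] = str(42)                   # number of stanzas handled so far
    print(r_elt.toXml(), a_elt.toXml())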
--- a/sat/plugins/plugin_xep_0199.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0199.py Sat Apr 08 13:54:42 2023 +0200 @@ -48,36 +48,36 @@ def __init__(self, host): log.info(_("XMPP Ping plugin initialization")) self.host = host - host.bridge.addMethod( + host.bridge.add_method( "ping", ".plugin", in_sign='ss', out_sign='d', method=self._ping, async_=True) try: self.text_cmds = self.host.plugins[C.TEXT_CMDS] except KeyError: log.info(_("Text commands not available")) else: - self.text_cmds.registerTextCommands(self) + self.text_cmds.register_text_commands(self) - def getHandler(self, client): + def get_handler(self, client): return XEP_0199_handler(self) - def _pingRaiseIfFailure(self, pong): + def _ping_raise_if_failure(self, pong): """If ping didn't succeed, raise the failure, else return pong delay""" if pong[0] != "PONG": raise pong[0] return pong[1] def _ping(self, jid_s, profile): - client = self.host.getClient(profile) + client = self.host.get_client(profile) entity_jid = jid.JID(jid_s) d = self.ping(client, entity_jid) - d.addCallback(self._pingRaiseIfFailure) + d.addCallback(self._ping_raise_if_failure) return d - def _pingCb(self, iq_result, send_time): + def _ping_cb(self, iq_result, send_time): receive_time = time.time() return ("PONG", receive_time - send_time) - def _pingEb(self, failure_, send_time): + def _ping_eb(self, failure_, send_time): receive_time = time.time() return (failure_.value, receive_time - send_time) @@ -94,8 +94,8 @@ iq_elt.addElement((NS_PING, "ping")) d = iq_elt.send() send_time = time.time() - d.addCallback(self._pingCb, send_time) - d.addErrback(self._pingEb, send_time) + d.addCallback(self._ping_cb, send_time) + d.addErrback(self._ping_eb, send_time) return d def _cmd_ping_fb(self, pong, client, mess_data): @@ -103,9 +103,9 @@ txt_cmd = self.host.plugins[C.TEXT_CMDS] if pong[0] == "PONG": - txt_cmd.feedBack(client, "PONG ({time} s)".format(time=pong[1]), mess_data) + txt_cmd.feed_back(client, "PONG ({time} s)".format(time=pong[1]), mess_data) else: - txt_cmd.feedBack( + txt_cmd.feed_back( client, _("ping error ({err_msg}). Response time: {time} s") .format(err_msg=pong[0], time=pong[1]), mess_data) @@ -120,7 +120,7 @@ entity_jid = jid.JID(mess_data["unparsed"].strip()) except RuntimeError: txt_cmd = self.host.plugins[C.TEXT_CMDS] - txt_cmd.feedBack(client, _('Invalid jid: "{entity_jid}"').format( + txt_cmd.feed_back(client, _('Invalid jid: "{entity_jid}"').format( entity_jid=mess_data["unparsed"].strip()), mess_data) return False else: @@ -130,7 +130,7 @@ return False - def onPingRequest(self, iq_elt, client): + def on_ping_request(self, iq_elt, client): log.info(_("XMPP PING received from {from_jid} [{profile}]").format( from_jid=iq_elt["from"], profile=client.profile)) iq_elt.handled = True @@ -146,7 +146,7 @@ def connectionInitialized(self): self.xmlstream.addObserver( - PING_REQUEST, self.plugin_parent.onPingRequest, client=self.parent + PING_REQUEST, self.plugin_parent.on_ping_request, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""):
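``ping`` above measures a round trip by storing ``time.time()`` right after sending the IQ and subtracting it in ``_ping_cb``/``_ping_eb``. A self-contained sketch of the stanza and the timing, assuming the standard ``urn:xmpp:ping`` namespace::

    import time
    from twisted.words.xish import domish

    NS_PING = "urn:xmpp:ping"  # assumed value

    iq_elt = domish.Element((None, "iq"))
    iq_elt["type"] = "get"
    iq_elt["to"] = "example.org"
    iq_elt.addElement((NS_PING, "ping"))

    send_time = time.time()
    # ... the IQ would be sent here; once the result comes back:
    receive_time = time.time()
    print(iq_elt.toXml(), "delay:", receive_time - send_time)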
--- a/sat/plugins/plugin_xep_0203.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0203.py Sat Apr 08 13:54:42 2023 +0200 @@ -51,7 +51,7 @@ log.info(_("Delayed Delivery plugin initialization")) self.host = host - def getHandler(self, client): + def get_handler(self, client): return XEP_0203_handler(self, client.profile) def delay(self, stamp, sender=None, desc="", parent=None):
--- a/sat/plugins/plugin_xep_0215.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0215.py Sat Apr 08 13:54:42 2023 +0200 @@ -58,7 +58,7 @@ def __init__(self, host): log.info(_("External Service Discovery plugin initialization")) self.host = host - host.bridge.addMethod( + host.bridge.add_method( "external_disco_get", ".plugin", in_sign="ss", @@ -66,7 +66,7 @@ method=self._external_disco_get, async_=True, ) - host.bridge.addMethod( + host.bridge.add_method( "external_disco_credentials_get", ".plugin", in_sign="ssis", @@ -75,10 +75,10 @@ async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return XEP_0215_handler(self) - async def profileConnecting(self, client: SatXMPPEntity) -> None: + async def profile_connecting(self, client: SatXMPPEntity) -> None: client._xep_0215_services = {} def parse_services( @@ -147,13 +147,13 @@ for x_elt in service_elt.elements(data_form.NS_X_DATA, "x"): form = data_form.Form.fromElement(x_elt) extended = service.setdefault("extended", []) - extended.append(xml_tools.dataForm2dataDict(form)) + extended.append(xml_tools.data_form_2_data_dict(form)) services.append(service) return services def _external_disco_get(self, entity: str, profile_key: str) -> defer.Deferred: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) d = defer.ensureDeferred( self.get_external_services(client, jid.JID(entity) if entity else None) ) @@ -204,7 +204,7 @@ port: int = 0, profile_key=C.PROF_KEY_NONE, ) -> defer.Deferred: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) d = defer.ensureDeferred( self.request_credentials( client, host, type_, port or None, jid.JID(entity) if entity else None
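XEP-0215 discovery is an IQ get carrying an empty ``<services/>`` element, whose ``<service/>`` children ``parse_services`` above turns into plain dicts (data forms going through ``data_form_2_data_dict``). A sketch of the request, assuming ``urn:xmpp:extdisco:2`` as the namespace value::

    from twisted.words.xish import domish

    NS_EXTDISCO = "urn:xmpp:extdisco:2"  # assumed value

    iq_elt = domish.Element((None, "iq"))
    iq_elt["type"] = "get"
    iq_elt["to"] = "example.org"
    iq_elt.addElement((NS_EXTDISCO, "services"))
    print(iq_elt.toXml())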
--- a/sat/plugins/plugin_xep_0231.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0231.py Sat Apr 08 13:54:42 2023 +0200 @@ -58,18 +58,18 @@ def __init__(self, host): log.info(_("plugin Bits of Binary initialization")) self.host = host - host.registerNamespace("bob", NS_BOB) - host.trigger.add("xhtml_post_treat", self.XHTMLTrigger) - host.bridge.addMethod( - "bobGetFile", + host.register_namespace("bob", NS_BOB) + host.trigger.add("xhtml_post_treat", self.xhtml_trigger) + host.bridge.add_method( + "bob_get_file", ".plugin", in_sign="sss", out_sign="s", - method=self._getFile, + method=self._get_file, async_=True, ) - def dumpData(self, cache, data_elt, cid): + def dump_data(self, cache, data_elt, cid): """save file encoded in data_elt to cache @param cache(memory.cache.Cache): cache to use to store the data @@ -87,7 +87,7 @@ log.warning("invalid max-age found") max_age = None - with cache.cacheData( + with cache.cache_data( PLUGIN_INFO[C.PI_IMPORT_NAME], cid, data_elt.getAttribute("type"), max_age ) as f: @@ -96,13 +96,13 @@ return file_path - def getHandler(self, client): + def get_handler(self, client): return XEP_0231_handler(self) - def _requestCb(self, iq_elt, cache, cid): + def _request_cb(self, iq_elt, cache, cid): for data_elt in iq_elt.elements(NS_BOB, "data"): if data_elt.getAttribute("cid") == cid: - file_path = self.dumpData(cache, data_elt, cid) + file_path = self.dump_data(cache, data_elt, cid) return file_path log.warning( @@ -112,12 +112,12 @@ ) raise failure.Failure(exceptions.DataError("missing data")) - def _requestEb(self, failure_): + def _request_eb(self, failure_): """Log the error and continue errback chain""" log.warning("Can't get requested data:\n{reason}".format(reason=failure_)) return failure_ - def requestData(self, client, to_jid, cid, cache=None): + def request_data(self, client, to_jid, cid, cache=None): """Request data if we don't have it in cache @param to_jid(jid.JID): jid to request the data to @@ -133,19 +133,19 @@ data_elt = iq_elt.addElement((NS_BOB, "data")) data_elt["cid"] = cid d = iq_elt.send() - d.addCallback(self._requestCb, cache, cid) - d.addErrback(self._requestEb) + d.addCallback(self._request_cb, cache, cid) + d.addErrback(self._request_eb) return d - def _setImgEltSrc(self, path, img_elt): + def _set_img_elt_src(self, path, img_elt): img_elt["src"] = "file://{}".format(path) - def XHTMLTrigger(self, client, message_elt, body_elt, lang, treat_d): - for img_elt in xml_tools.findAll(body_elt, C.NS_XHTML, "img"): + def xhtml_trigger(self, client, message_elt, body_elt, lang, treat_d): + for img_elt in xml_tools.find_all(body_elt, C.NS_XHTML, "img"): source = img_elt.getAttribute("src", "") if source.startswith("cid:"): cid = source[4:] - file_path = client.cache.getFilePath(cid) + file_path = client.cache.get_file_path(cid) if file_path is not None: # image is in cache, we change the url img_elt["src"] = "file://{}".format(file_path) @@ -154,17 +154,17 @@ # image is not in cache, is it given locally? 
for data_elt in message_elt.elements(NS_BOB, "data"): if data_elt.getAttribute("cid") == cid: - file_path = self.dumpData(client.cache, data_elt, cid) + file_path = self.dump_data(client.cache, data_elt, cid) img_elt["src"] = "file://{}".format(file_path) break else: # cid not found locally, we need to request it # so we use the deferred - d = self.requestData(client, jid.JID(message_elt["from"]), cid) - d.addCallback(partial(self._setImgEltSrc, img_elt=img_elt)) + d = self.request_data(client, jid.JID(message_elt["from"]), cid) + d.addCallback(partial(self._set_img_elt_src, img_elt=img_elt)) treat_d.addCallback(lambda __: d) - def onComponentRequest(self, iq_elt, client): + def on_component_request(self, iq_elt, client): """cache data is retrieve from common cache for components""" # FIXME: this is a security/privacy issue as no access check is done # but this is mitigated by the fact that the cid must be known. @@ -179,7 +179,7 @@ client.send(error_elt) return - metadata = self.host.common_cache.getMetadata(cid) + metadata = self.host.common_cache.get_metadata(cid) if metadata is None: error_elt = jabber_error.StanzaError("item-not-found").toResponse(iq_elt) client.send(error_elt) @@ -196,15 +196,15 @@ data_elt["max-age"] = str(int(max(0, metadata["eol"] - time.time()))) client.send(result_elt) - def _getFile(self, peer_jid_s, cid, profile): + def _get_file(self, peer_jid_s, cid, profile): peer_jid = jid.JID(peer_jid_s) assert cid - client = self.host.getClient(profile) - d = self.getFile(client, peer_jid, cid) + client = self.host.get_client(profile) + d = self.get_file(client, peer_jid, cid) d.addCallback(lambda path: str(path)) return d - def getFile(self, client, peer_jid, cid, parent_elt=None): + def get_file(self, client, peer_jid, cid, parent_elt=None): """Retrieve a file from it's content-id @param peer_jid(jid.JID): jid of the entity offering the data @@ -214,7 +214,7 @@ None to ignore @return D(Path): path to cached data """ - file_path = client.cache.getFilePath(cid) + file_path = client.cache.get_file_path(cid) if file_path is not None: # file is in cache return defer.succeed(file_path) @@ -223,11 +223,11 @@ if parent_elt is not None: for data_elt in parent_elt.elements(NS_BOB, "data"): if data_elt.getAttribute("cid") == cid: - return defer.succeed(self.dumpData(client.cache, data_elt, cid)) + return defer.succeed(self.dump_data(client.cache, data_elt, cid)) # cid not found locally, we need to request it # so we use the deferred - return self.requestData(client, peer_jid, cid) + return self.request_data(client, peer_jid, cid) @implementer(iwokkel.IDisco) @@ -240,7 +240,7 @@ def connectionInitialized(self): if self.parent.is_component: self.xmlstream.addObserver( - IQ_BOB_REQUEST, self.plugin_parent.onComponentRequest, client=self.parent + IQ_BOB_REQUEST, self.plugin_parent.on_component_request, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""):
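``dump_data`` and ``on_component_request`` above move Bits of Binary payloads between ``<data/>`` elements and the cache, keyed by their content-id. A sketch of building such an element, assuming ``urn:xmpp:bob`` for the namespace and the ``sha1+<hash>@bob.xmpp.org`` content-id scheme of XEP-0231::

    import base64
    import hashlib
    from twisted.words.xish import domish

    NS_BOB = "urn:xmpp:bob"  # assumed value of the "bob" namespace registered above

    payload = b"example binary payload"
    cid = "sha1+{}@bob.xmpp.org".format(hashlib.sha1(payload).hexdigest())

    data_elt = domish.Element((NS_BOB, "data"))
    data_elt["cid"] = cid
    data_elt["type"] = "application/octet-stream"
    data_elt["max-age"] = "86400"
    data_elt.addContent(base64.b64encode(payload).decode())
    print(data_elt.toXml())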
--- a/sat/plugins/plugin_xep_0234.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0234.py Sat Apr 08 13:54:42 2023 +0200 @@ -71,33 +71,33 @@ def __init__(self, host): log.info(_("plugin Jingle File Transfer initialization")) self.host = host - host.registerNamespace("jingle-ft", NS_JINGLE_FT) + host.register_namespace("jingle-ft", NS_JINGLE_FT) self._j = host.plugins["XEP-0166"] # shortcut to access jingle - self._j.registerApplication(NS_JINGLE_FT, self) + self._j.register_application(NS_JINGLE_FT, self) self._f = host.plugins["FILE"] self._f.register(self, priority=10000) self._hash = self.host.plugins["XEP-0300"] - host.bridge.addMethod( - "fileJingleSend", + host.bridge.add_method( + "file_jingle_send", ".plugin", in_sign="ssssa{ss}s", out_sign="", - method=self._fileSend, + method=self._file_send, async_=True, ) - host.bridge.addMethod( - "fileJingleRequest", + host.bridge.add_method( + "file_jingle_request", ".plugin", in_sign="sssssa{ss}s", out_sign="s", - method=self._fileJingleRequest, + method=self._file_jingle_request, async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return XEP_0234_handler() - def getProgressId(self, session, content_name): + def get_progress_id(self, session, content_name): """Return a unique progress ID @param session(dict): jingle session @@ -106,7 +106,7 @@ """ return "{}_{}".format(session["id"], content_name) - async def canHandleFileSend(self, client, peer_jid, filepath): + async def can_handle_file_send(self, client, peer_jid, filepath): if peer_jid.resource: return await self.host.hasFeature(client, NS_JINGLE_FT, peer_jid) else: @@ -115,7 +115,7 @@ # generic methods - def buildFileElement( + def build_file_element( self, client, name=None, file_hash=None, hash_algo=None, size=None, mime_type=None, desc=None, modified=None, transfer_range=None, path=None, namespace=None, file_elt=None, **kwargs): @@ -168,11 +168,11 @@ range_elt["length"] = transfer_range.length if file_hash is not None: if not file_hash: - file_elt.addChild(self._hash.buildHashUsedElt()) + file_elt.addChild(self._hash.build_hash_used_elt()) else: - file_elt.addChild(self._hash.buildHashElt(file_hash, hash_algo)) + file_elt.addChild(self._hash.build_hash_elt(file_hash, hash_algo)) elif hash_algo is not None: - file_elt.addChild(self._hash.buildHashUsedElt(hash_algo)) + file_elt.addChild(self._hash.build_hash_used_elt(hash_algo)) self.host.trigger.point( "XEP-0234_buildFileElement", client, file_elt, extra_args=kwargs) if kwargs: @@ -180,8 +180,8 @@ log.debug("ignored keyword: {}".format(kw)) return file_elt - def buildFileElementFromDict(self, client, file_data, **kwargs): - """like buildFileElement but get values from a file_data dict + def build_file_element_from_dict(self, client, file_data, **kwargs): + """like build_file_element but get values from a file_data dict @param file_data(dict): metadata to use @param **kwargs: data to override @@ -195,9 +195,9 @@ ) except KeyError: pass - return self.buildFileElement(client, **file_data) + return self.build_file_element(client, **file_data) - async def parseFileElement( + async def parse_file_element( self, client, file_elt, file_data=None, given=False, parent_elt=None, keep_empty_range=False): """Parse a <file> element and file dictionary accordingly @@ -248,7 +248,7 @@ name = "--" file_data["name"] = name elif name is not None and ("/" in name or "\\" in name): - file_data["name"] = regex.pathEscape(name) + file_data["name"] = regex.path_escape(name) try: file_data["mime_type"] = str( @@ 
-284,7 +284,7 @@ prefix = "given_" if given else "" hash_algo_key, hash_key = "hash_algo", prefix + "file_hash" try: - file_data[hash_algo_key], file_data[hash_key] = self._hash.parseHashElt( + file_data[hash_algo_key], file_data[hash_key] = self._hash.parse_hash_elt( file_elt ) except exceptions.NotFound: @@ -296,7 +296,7 @@ # bridge methods - def _fileSend( + def _file_send( self, peer_jid, filepath, @@ -305,8 +305,8 @@ extra=None, profile=C.PROF_KEY_NONE, ): - client = self.host.getClient(profile) - return defer.ensureDeferred(self.fileSend( + client = self.host.get_client(profile) + return defer.ensureDeferred(self.file_send( client, jid.JID(peer_jid), filepath, @@ -315,7 +315,7 @@ extra or None, )) - async def fileSend( + async def file_send( self, client, peer_jid, filepath, name, file_desc=None, extra=None ): """Send a file using jingle file transfer @@ -351,11 +351,11 @@ ) return await progress_id_d - def _fileJingleRequest( + def _file_jingle_request( self, peer_jid, filepath, name="", file_hash="", hash_algo="", extra=None, profile=C.PROF_KEY_NONE): - client = self.host.getClient(profile) - return defer.ensureDeferred(self.fileJingleRequest( + client = self.host.get_client(profile) + return defer.ensureDeferred(self.file_jingle_request( client, jid.JID(peer_jid), filepath, @@ -365,7 +365,7 @@ extra or None, )) - async def fileJingleRequest( + async def file_jingle_request( self, client, peer_jid, filepath, name=None, file_hash=None, hash_algo=None, extra=None): """Request a file using jingle file transfer @@ -407,12 +407,12 @@ # jingle callbacks - def jingleDescriptionElt( + def jingle_description_elt( self, client, session, content_name, filepath, name, extra, progress_id_d ): return domish.Element((NS_JINGLE_FT, "description")) - def jingleSessionInit( + def jingle_session_init( self, client, session, content_name, filepath, name, extra, progress_id_d ): if extra is None: @@ -424,13 +424,13 @@ keys=", ".join(EXTRA_ALLOWED) ) ) - progress_id_d.callback(self.getProgressId(session, content_name)) + progress_id_d.callback(self.get_progress_id(session, content_name)) content_data = session["contents"][content_name] application_data = content_data["application_data"] assert "file_path" not in application_data application_data["file_path"] = filepath file_data = application_data["file_data"] = {} - desc_elt = self.jingleDescriptionElt( + desc_elt = self.jingle_description_elt( client, session, content_name, filepath, name, extra, progress_id_d) file_elt = desc_elt.addElement("file") @@ -449,7 +449,7 @@ file_data["namespace"] = extra["namespace"] if "path" in extra: file_data["path"] = extra["path"] - self.buildFileElementFromDict( + self.build_file_element_from_dict( client, file_data, file_elt=file_elt, file_hash="") else: # we request a file @@ -462,16 +462,16 @@ file_data["file_hash"] = file_hash file_data["hash_algo"] = extra["hash_algo"] else: - file_data["hash_algo"] = self._hash.getDefaultAlgo() + file_data["hash_algo"] = self._hash.get_default_algo() if "namespace" in extra: file_data["namespace"] = extra["namespace"] if "path" in extra: file_data["path"] = extra["path"] - self.buildFileElementFromDict(client, file_data, file_elt=file_elt) + self.build_file_element_from_dict(client, file_data, file_elt=file_elt) return desc_elt - async def jingleRequestConfirmation( + async def jingle_request_confirmation( self, client, action, session, content_name, desc_elt ): """This method request confirmation for a jingle session""" @@ -485,11 +485,11 @@ file_elt = 
next(desc_elt.elements(NS_JINGLE_FT, "file")) except StopIteration: raise failure.Failure(exceptions.DataError) - file_data = {"progress_id": self.getProgressId(session, content_name)} + file_data = {"progress_id": self.get_progress_id(session, content_name)} if senders == self._j.ROLE_RESPONDER: # we send the file - return await self._fileSendingRequestConf( + return await self._file_sending_request_conf( client, session, content_data, content_name, file_data, file_elt ) else: @@ -498,16 +498,16 @@ client, session, content_data, content_name, file_data, file_elt ) - async def _fileSendingRequestConf( + async def _file_sending_request_conf( self, client, session, content_data, content_name, file_data, file_elt ): """parse file_elt, and handle file retrieving/permission checking""" - await self.parseFileElement(client, file_elt, file_data) + await self.parse_file_element(client, file_elt, file_data) content_data["application_data"]["file_data"] = file_data finished_d = content_data["finished_d"] = defer.Deferred() # confirmed_d is a deferred returning confimed value (only used if cont is False) - cont, confirmed_d = self.host.trigger.returnPoint( + cont, confirmed_d = self.host.trigger.return_point( "XEP-0234_fileSendingRequest", client, session, @@ -521,7 +521,7 @@ if confirmed: args = [client, session, content_name, content_data] finished_d.addCallbacks( - self._finishedCb, self._finishedEb, args, None, args + self._finished_cb, self._finished_eb, args, None, args ) return confirmed @@ -532,18 +532,18 @@ self, client, session, content_data, content_name, file_data, file_elt ): """parse file_elt, and handle user permission/file opening""" - await self.parseFileElement(client, file_elt, file_data, given=True) + await self.parse_file_element(client, file_elt, file_data, given=True) try: - hash_algo, file_data["given_file_hash"] = self._hash.parseHashElt(file_elt) + hash_algo, file_data["given_file_hash"] = self._hash.parse_hash_elt(file_elt) except exceptions.NotFound: try: - hash_algo = self._hash.parseHashUsedElt(file_elt) + hash_algo = self._hash.parse_hash_used_elt(file_elt) except exceptions.NotFound: raise failure.Failure(exceptions.DataError) if hash_algo is not None: file_data["hash_algo"] = hash_algo - file_data["hash_hasher"] = hasher = self._hash.getHasher(hash_algo) + file_data["hash_hasher"] = hasher = self._hash.get_hasher(hash_algo) file_data["data_cb"] = lambda data: hasher.update(data) try: @@ -564,21 +564,21 @@ # deferred to track end of transfer finished_d = content_data["finished_d"] = defer.Deferred() - confirmed = await self._f.getDestDir( + confirmed = await self._f.get_dest_dir( client, session["peer_jid"], content_data, file_data, stream_object=True ) if confirmed: - await self.host.trigger.asyncPoint( + await self.host.trigger.async_point( "XEP-0234_file_receiving_request_conf", client, session, content_data, file_elt ) args = [client, session, content_name, content_data] finished_d.addCallbacks( - self._finishedCb, self._finishedEb, args, None, args + self._finished_cb, self._finished_eb, args, None, args ) return confirmed - async def jingleHandler(self, client, action, session, content_name, desc_elt): + async def jingle_handler(self, client, action, session, content_name, desc_elt): content_data = session["contents"][content_name] application_data = content_data["application_data"] if action in (self._j.A_ACCEPTED_ACK,): @@ -608,8 +608,8 @@ size = None # XXX: hash security is not critical here, so we just take the higher # mandatory one - hasher = 
file_data["hash_hasher"] = self._hash.getHasher() - progress_id = self.getProgressId(session, content_name) + hasher = file_data["hash_hasher"] = self._hash.get_hasher() + progress_id = self.get_progress_id(session, content_name) try: content_data["stream_object"] = stream.FileStreamObject( self.host, @@ -621,7 +621,7 @@ data_cb=lambda data: hasher.update(data), ) except Exception as e: - self.host.bridge.progressError( + self.host.bridge.progress_error( progress_id, C.PROGRESS_ERROR_FAILED, client.profile ) await self._j.terminate( @@ -632,19 +632,19 @@ size = file_data["size"] # XXX: hash security is not critical here, so we just take the higher # mandatory one - hasher = file_data["hash_hasher"] = self._hash.getHasher() + hasher = file_data["hash_hasher"] = self._hash.get_hasher() content_data["stream_object"] = stream.FileStreamObject( self.host, client, file_path, - uid=self.getProgressId(session, content_name), + uid=self.get_progress_id(session, content_name), size=size, data_cb=lambda data: hasher.update(data), ) finished_d = content_data["finished_d"] = defer.Deferred() args = [client, session, content_name, content_data] - finished_d.addCallbacks(self._finishedCb, self._finishedEb, args, None, args) - await self.host.trigger.asyncPoint( + finished_d.addCallbacks(self._finished_cb, self._finished_eb, args, None, args) + await self.host.trigger.async_point( "XEP-0234_jingle_handler", client, session, content_data, desc_elt ) @@ -652,7 +652,7 @@ log.warning("FIXME: unmanaged action {}".format(action)) return desc_elt - def jingleSessionInfo(self, client, action, session, content_name, jingle_elt): + def jingle_session_info(self, client, action, session, content_name, jingle_elt): """Called on session-info action manage checksum, and ignore <received/> element @@ -681,7 +681,7 @@ file_elt = next(elt.elements(NS_JINGLE_FT, "file")) except StopIteration: raise exceptions.DataError - algo, file_data["given_file_hash"] = self._hash.parseHashElt(file_elt) + algo, file_data["given_file_hash"] = self._hash.parse_hash_elt(file_elt) if algo != file_data.get("hash_algo"): log.warning( "Hash algorithm used in given hash ({peer_algo}) doesn't correspond to the one we have used ({our_algo}) [{profile}]".format( @@ -691,21 +691,21 @@ ) ) else: - self._receiverTryTerminate( + self._receiver_try_terminate( client, session, content_name, content_data ) else: raise NotImplementedError - def jingleTerminate(self, client, action, session, content_name, jingle_elt): + def jingle_terminate(self, client, action, session, content_name, jingle_elt): if jingle_elt.decline: # progress is the only way to tell to frontends that session has been declined - progress_id = self.getProgressId(session, content_name) - self.host.bridge.progressError( + progress_id = self.get_progress_id(session, content_name) + self.host.bridge.progress_error( progress_id, C.PROGRESS_ERROR_DECLINED, client.profile ) elif not jingle_elt.success: - progress_id = self.getProgressId(session, content_name) + progress_id = self.get_progress_id(session, content_name) first_child = jingle_elt.firstChildElement() if first_child is not None: reason = first_child.name @@ -713,25 +713,25 @@ reason = f"{reason} - {jingle_elt.text}" else: reason = C.PROGRESS_ERROR_FAILED - self.host.bridge.progressError( + self.host.bridge.progress_error( progress_id, reason, client.profile ) - def _sendCheckSum(self, client, session, content_name, content_data): + def _send_check_sum(self, client, session, content_name, content_data): """Send the session-info with 
the hash checksum""" file_data = content_data["application_data"]["file_data"] hasher = file_data["hash_hasher"] hash_ = hasher.hexdigest() log.debug("Calculated hash: {}".format(hash_)) - iq_elt, jingle_elt = self._j.buildSessionInfo(client, session) + iq_elt, jingle_elt = self._j.build_session_info(client, session) checksum_elt = jingle_elt.addElement((NS_JINGLE_FT, "checksum")) checksum_elt["creator"] = content_data["creator"] checksum_elt["name"] = content_name file_elt = checksum_elt.addElement("file") - file_elt.addChild(self._hash.buildHashElt(hash_)) + file_elt.addChild(self._hash.build_hash_elt(hash_)) iq_elt.send() - def _receiverTryTerminate( + def _receiver_try_terminate( self, client, session, content_name, content_data, last_try=False ): """Try to terminate the session @@ -753,7 +753,7 @@ profile=client.profile ) ) - self._j.delayedContentTerminate(client, session, content_name) + self._j.delayed_content_terminate(client, session, content_name) content_data["stream_object"].close() return True return False @@ -775,7 +775,7 @@ algo=file_data["hash_algo"], given=given_hash, our=hash_ ) - self._j.delayedContentTerminate(client, session, content_name) + self._j.delayed_content_terminate(client, session, content_name) content_data["stream_object"].close(progress_metadata, error) # we may have the last_try timer still active, so we try to cancel it try: @@ -784,19 +784,19 @@ pass return True - def _finishedCb(self, __, client, session, content_name, content_data): + def _finished_cb(self, __, client, session, content_name, content_data): log.info("File transfer terminated") if content_data["senders"] != session["role"]: # we terminate the session only if we are the receiver, # as recommanded in XEP-0234 §2 (after example 6) content_data["transfer_finished"] = True - if not self._receiverTryTerminate( + if not self._receiver_try_terminate( client, session, content_name, content_data ): # we have not received the hash yet, we wait 5 more seconds content_data["last_try_timer"] = reactor.callLater( 5, - self._receiverTryTerminate, + self._receiver_try_terminate, client, session, content_name, @@ -805,13 +805,13 @@ ) else: # we are the sender, we send the checksum - self._sendCheckSum(client, session, content_name, content_data) + self._send_check_sum(client, session, content_name, content_data) content_data["stream_object"].close() - def _finishedEb(self, failure, client, session, content_name, content_data): + def _finished_eb(self, failure, client, session, content_name, content_data): log.warning("Error while streaming file: {}".format(failure)) content_data["stream_object"].close() - self._j.contentTerminate( + self._j.content_terminate( client, session, content_name, reason=self._j.REASON_FAILED_TRANSPORT )
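``build_file_element`` assembles the ``<file/>`` metadata (name, size, description, hash from the XEP-0300 plugin, optional range) that both ``file_send`` and ``file_jingle_request`` put inside the jingle description. A rough standalone equivalent, assuming ``urn:xmpp:jingle:apps:file-transfer:5`` and ``urn:xmpp:hashes:2`` as namespace values::

    import base64
    import hashlib
    from twisted.words.xish import domish

    NS_JINGLE_FT = "urn:xmpp:jingle:apps:file-transfer:5"  # assumed
    NS_HASHES = "urn:xmpp:hashes:2"                        # assumed

    desc_elt = domish.Element((NS_JINGLE_FT, "description"))
    file_elt = desc_elt.addElement("file")
    file_elt.addElement("name", content="holidays.webm")
    file_elt.addElement("size", content=str(12345678))
    file_elt.addElement("desc", content="last summer by the lake")
    digest = hashlib.sha256(b"example file content").digest()
    hash_elt = file_elt.addElement((NS_HASHES, "hash"),
                                   content=base64.b64encode(digest).decode())
    hash_elt["algo"] = "sha-256"
    print(desc_elt.toXml())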
--- a/sat/plugins/plugin_xep_0249.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0249.py Sat Apr 08 13:54:42 2023 +0200 @@ -86,18 +86,18 @@ def __init__(self, host): log.info(_("Plugin XEP_0249 initialization")) self.host = host - host.memory.updateParams(self.params) - host.bridge.addMethod( - "inviteMUC", ".plugin", in_sign="ssa{ss}s", out_sign="", method=self._invite + host.memory.update_params(self.params) + host.bridge.add_method( + "muc_invite", ".plugin", in_sign="ssa{ss}s", out_sign="", method=self._invite ) try: - self.host.plugins[C.TEXT_CMDS].registerTextCommands(self) + self.host.plugins[C.TEXT_CMDS].register_text_commands(self) except KeyError: log.info(_("Text commands not available")) - host.registerNamespace('x-conference', NS_X_CONFERENCE) - host.trigger.add("messageReceived", self._messageReceivedTrigger) + host.register_namespace('x-conference', NS_X_CONFERENCE) + host.trigger.add("messageReceived", self._message_received_trigger) - def getHandler(self, client): + def get_handler(self, client): return XEP_0249_handler() def _invite(self, guest_jid_s, room_jid_s, options, profile_key): @@ -109,7 +109,7 @@ @param profile_key: %(doc_profile_key)s """ # TODO: check parameters validity - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) self.invite(client, jid.JID(guest_jid_s), jid.JID(room_jid_s, options)) def invite(self, client, guest, room, options={}): @@ -136,7 +136,7 @@ @param room (jid.JID): JID of the room """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) log.info( _("Invitation accepted for room %(room)s [%(profile)s]") % {"room": room_jid.userhost(), "profile": client.profile} @@ -146,7 +146,7 @@ ) return d - def _messageReceivedTrigger(self, client, message_elt, post_treat): + def _message_received_trigger(self, client, message_elt, post_treat): """Check if a direct invitation is in the message, and handle it""" x_elt = next(message_elt.elements(NS_X_CONFERENCE, 'x'), None) if x_elt is None: @@ -165,7 +165,7 @@ from_jid_s = message_elt["from"] room_jid = jid.JID(room_jid_s) try: - self.host.plugins["XEP-0045"].checkRoomJoined(client, room_jid) + self.host.plugins["XEP-0045"].check_room_joined(client, room_jid) except exceptions.NotFound: pass else: @@ -174,7 +174,7 @@ ) return - autojoin = self.host.memory.getParamA( + autojoin = self.host.memory.param_get_a( AUTOJOIN_NAME, AUTOJOIN_KEY, profile_key=client.profile ) @@ -186,14 +186,14 @@ "declined according to your personal settings." ) % {"user": from_jid_s, "room": room_jid_s} title = D_("MUC invitation") - xml_tools.quickNote(self.host, client, msg, title, C.XMLUI_DATA_LVL_INFO) + xml_tools.quick_note(self.host, client, msg, title, C.XMLUI_DATA_LVL_INFO) else: # leave the default value here confirm_msg = D_( "You have been invited by %(user)s to join the room %(room)s. " "Do you accept?" ) % {"user": from_jid_s, "room": room_jid_s} confirm_title = D_("MUC invitation") - d = xml_tools.deferConfirm( + d = xml_tools.defer_confirm( self.host, confirm_msg, confirm_title, profile=client.profile ) @@ -219,7 +219,7 @@ "You must provide a valid JID to invite, like in '/invite " "contact@{host}'" ).format(host=my_host) - self.host.plugins[C.TEXT_CMDS].feedBack(client, feedback, mess_data) + self.host.plugins[C.TEXT_CMDS].feed_back(client, feedback, mess_data) return False if not contact_jid.user: contact_jid.user, contact_jid.host = contact_jid.host, my_host
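The trigger above looks for the direct-invitation ``<x/>`` element in incoming messages; ``invite`` builds the mirror image of it. A minimal sketch, assuming ``jabber:x:conference`` as the value of ``NS_X_CONFERENCE``::

    from twisted.words.xish import domish

    NS_X_CONFERENCE = "jabber:x:conference"  # assumed value of the namespace registered above

    message_elt = domish.Element((None, "message"))
    message_elt["to"] = "guest@example.net"
    x_elt = message_elt.addElement((NS_X_CONFERENCE, "x"))
    x_elt["jid"] = "room@chat.example.org"  # room the guest is invited to
    x_elt["reason"] = "planning meeting"    # optional, as is "password"
    print(message_elt.toXml())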
--- a/sat/plugins/plugin_xep_0260.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0260.py Sat Apr 08 13:54:42 2023 +0200 @@ -69,12 +69,12 @@ self._jingle_ibb = host.plugins["XEP-0261"] except KeyError: self._jingle_ibb = None - self._j.registerTransport(NS_JINGLE_S5B, self._j.TRANSPORT_STREAMING, self, 100) + self._j.register_transport(NS_JINGLE_S5B, self._j.TRANSPORT_STREAMING, self, 100) - def getHandler(self, client): + def get_handler(self, client): return XEP_0260_handler() - def _parseCandidates(self, transport_elt): + def _parse_candidates(self, transport_elt): """Parse <candidate> elements @param transport_elt(domish.Element): parent <transport> element @@ -96,7 +96,7 @@ # self._s5b.registerCandidate(candidate) return candidates - def _buildCandidates(self, session, candidates, sid, session_hash, client, mode=None): + def _build_candidates(self, session, candidates, sid, session_hash, client, mode=None): """Build <transport> element with candidates @param session(dict): jingle session data @@ -135,47 +135,47 @@ return transport_elt @defer.inlineCallbacks - def jingleSessionInit(self, client, session, content_name): + def jingle_session_init(self, client, session, content_name): content_data = session["contents"][content_name] transport_data = content_data["transport_data"] sid = transport_data["sid"] = str(uuid.uuid4()) - session_hash = transport_data["session_hash"] = self._s5b.getSessionHash( + session_hash = transport_data["session_hash"] = self._s5b.get_session_hash( session["local_jid"], session["peer_jid"], sid ) - transport_data["peer_session_hash"] = self._s5b.getSessionHash( + transport_data["peer_session_hash"] = self._s5b.get_session_hash( session["peer_jid"], session["local_jid"], sid ) # requester and target are inversed for peer candidates - transport_data["stream_d"] = self._s5b.registerHash(client, session_hash, None) - candidates = transport_data["candidates"] = yield self._s5b.getCandidates( + transport_data["stream_d"] = self._s5b.register_hash(client, session_hash, None) + candidates = transport_data["candidates"] = yield self._s5b.get_candidates( client, session["local_jid"]) mode = "tcp" # XXX: we only manage tcp for now - transport_elt = self._buildCandidates( + transport_elt = self._build_candidates( session, candidates, sid, session_hash, client, mode ) defer.returnValue(transport_elt) - def _proxyActivatedCb(self, iq_result_elt, client, candidate, session, content_name): + def _proxy_activated_cb(self, iq_result_elt, client, candidate, session, content_name): """Called when activation confirmation has been received from proxy cf XEP-0260 § 2.4 """ # now that the proxy is activated, we have to inform other peer - iq_elt, transport_elt = self._j.buildAction( + iq_elt, transport_elt = self._j.build_action( client, self._j.A_TRANSPORT_INFO, session, content_name ) activated_elt = transport_elt.addElement("activated") activated_elt["cid"] = candidate.id iq_elt.send() - def _proxyActivatedEb(self, stanza_error, client, candidate, session, content_name): + def _proxy_activated_eb(self, stanza_error, client, candidate, session, content_name): """Called when activation error has been received from proxy cf XEP-0260 § 2.4 """ # TODO: fallback to IBB # now that the proxy is activated, we have to inform other peer - iq_elt, transport_elt = self._j.buildAction( + iq_elt, transport_elt = self._j.build_action( client, self._j.A_TRANSPORT_INFO, session, content_name ) transport_elt.addElement("proxy-error") @@ -185,9 +185,9 @@ 
reason=stanza_error.value.condition ) ) - self.doFallback(session, content_name, client) + self.do_fallback(session, content_name, client) - def _foundPeerCandidate( + def _found_peer_candidate( self, candidate, session, transport_data, content_name, client ): """Called when the best candidate from other peer is found @@ -207,7 +207,7 @@ continue c.discard() del transport_data["peer_candidates"] - iq_elt, transport_elt = self._j.buildAction( + iq_elt, transport_elt = self._j.build_action( client, self._j.A_TRANSPORT_INFO, session, content_name ) if candidate is None: @@ -218,9 +218,9 @@ candidate_elt = transport_elt.addElement("candidate-used") candidate_elt["cid"] = candidate.id iq_elt.send() # TODO: check result stanza - self._checkCandidates(session, content_name, transport_data, client) + self._check_candidates(session, content_name, transport_data, client) - def _checkCandidates(self, session, content_name, transport_data, client): + def _check_candidates(self, session, content_name, transport_data, client): """Called when a candidate has been choosed if we have both candidates, we select one, or fallback to an other transport @@ -261,7 +261,7 @@ if choosed_candidate is None: log.warning("Socks5 negociation failed, we need to fallback to IBB") - self.doFallback(session, content_name, client) + self.do_fallback(session, content_name, client) else: if choosed_candidate == peer_best_candidate: # peer_best_candidate was choosed from the candidates we have sent @@ -288,7 +288,7 @@ # the stream transfer need to wait for proxy activation # (see XEP-0260 § 2.4) if our_candidate: - d = self._s5b.connectCandidate( + d = self._s5b.connect_candidate( client, choosed_candidate, transport_data["session_hash"] ) d.addCallback( @@ -298,7 +298,7 @@ ) args = [client, choosed_candidate, session, content_name] d.addCallbacks( - self._proxyActivatedCb, self._proxyActivatedEb, args, None, args + self._proxy_activated_cb, self._proxy_activated_eb, args, None, args ) else: # this Deferred will be called when we'll receive activation confirmation from other peer @@ -309,13 +309,13 @@ if content_data["senders"] == session["role"]: # we can now start the stream transfer (or start it after proxy activation) d.addCallback( - lambda __: choosed_candidate.startTransfer( + lambda __: choosed_candidate.start_transfer( transport_data["session_hash"] ) ) - d.addErrback(self._startEb, session, content_name, client) + d.addErrback(self._start_eb, session, content_name, client) - def _startEb(self, fail, session, content_name, client): + def _start_eb(self, fail, session, content_name, client): """Called when it's not possible to start the transfer Will try to fallback to IBB @@ -325,9 +325,9 @@ except AttributeError: reason = str(fail) log.warning("Cant start transfert, we'll try fallback method: {}".format(reason)) - self.doFallback(session, content_name, client) + self.do_fallback(session, content_name, client) - def _candidateInfo( + def _candidate_info( self, candidate_elt, session, content_name, transport_data, client ): """Called when best candidate has been received from peer (or if none is working) @@ -368,9 +368,9 @@ log.info("Other peer best candidate: {}".format(candidate)) del transport_data["candidates"] - self._checkCandidates(session, content_name, transport_data, client) + self._check_candidates(session, content_name, transport_data, client) - def _proxyActivationInfo( + def _proxy_activation_info( self, proxy_elt, session, content_name, transport_data, client ): """Called when proxy has been activated 
(or has sent an error) @@ -393,7 +393,7 @@ activation_d.errback(ProxyError()) @defer.inlineCallbacks - def jingleHandler(self, client, action, session, content_name, transport_elt): + def jingle_handler(self, client, action, session, content_name, transport_elt): content_data = session["contents"][content_name] transport_data = content_data["transport_data"] @@ -403,21 +403,21 @@ elif action == self._j.A_SESSION_ACCEPT: # initiator side, we select a candidate in the ones sent by responder assert "peer_candidates" not in transport_data - transport_data["peer_candidates"] = self._parseCandidates(transport_elt) + transport_data["peer_candidates"] = self._parse_candidates(transport_elt) elif action == self._j.A_START: session_hash = transport_data["session_hash"] peer_candidates = transport_data["peer_candidates"] stream_object = content_data["stream_object"] - self._s5b.associateStreamObject(client, session_hash, stream_object) + self._s5b.associate_stream_object(client, session_hash, stream_object) stream_d = transport_data.pop("stream_d") stream_d.chainDeferred(content_data["finished_d"]) peer_session_hash = transport_data["peer_session_hash"] - d = self._s5b.getBestCandidate( + d = self._s5b.get_best_candidate( client, peer_candidates, session_hash, peer_session_hash ) d.addCallback( - self._foundPeerCandidate, session, transport_data, content_name, client + self._found_peer_candidate, session, transport_data, content_name, client ) elif action == self._j.A_SESSION_INITIATE: @@ -425,27 +425,27 @@ # and we give our candidates assert "peer_candidates" not in transport_data sid = transport_data["sid"] = transport_elt["sid"] - session_hash = transport_data["session_hash"] = self._s5b.getSessionHash( + session_hash = transport_data["session_hash"] = self._s5b.get_session_hash( session["local_jid"], session["peer_jid"], sid ) peer_session_hash = transport_data[ "peer_session_hash" - ] = self._s5b.getSessionHash( + ] = self._s5b.get_session_hash( session["peer_jid"], session["local_jid"], sid ) # requester and target are inversed for peer candidates - peer_candidates = transport_data["peer_candidates"] = self._parseCandidates( + peer_candidates = transport_data["peer_candidates"] = self._parse_candidates( transport_elt ) stream_object = content_data["stream_object"] - stream_d = self._s5b.registerHash(client, session_hash, stream_object) + stream_d = self._s5b.register_hash(client, session_hash, stream_object) stream_d.chainDeferred(content_data["finished_d"]) - d = self._s5b.getBestCandidate( + d = self._s5b.get_best_candidate( client, peer_candidates, session_hash, peer_session_hash ) d.addCallback( - self._foundPeerCandidate, session, transport_data, content_name, client + self._found_peer_candidate, session, transport_data, content_name, client ) - candidates = yield self._s5b.getCandidates(client, session["local_jid"]) + candidates = yield self._s5b.get_candidates(client, session["local_jid"]) # we remove duplicate candidates candidates = [ candidate for candidate in candidates if candidate not in peer_candidates @@ -453,7 +453,7 @@ transport_data["candidates"] = candidates # we can now build a new <transport> element with our candidates - transport_elt = self._buildCandidates( + transport_elt = self._build_candidates( session, candidates, sid, session_hash, client ) @@ -462,8 +462,8 @@ candidate_elt = None for method, names in ( - (self._candidateInfo, ("candidate-used", "candidate-error")), - (self._proxyActivationInfo, ("activated", "proxy-error")), + (self._candidate_info, 
("candidate-used", "candidate-error")), + (self._proxy_activation_info, ("activated", "proxy-error")), ): for name in names: try: @@ -483,21 +483,21 @@ elif action == self._j.A_DESTROY: # the transport is replaced (fallback ?), We need mainly to kill XEP-0065 session. # note that sid argument is not necessary for sessions created by this plugin - self._s5b.killSession(None, transport_data["session_hash"], None, client) + self._s5b.kill_session(None, transport_data["session_hash"], None, client) else: log.warning("FIXME: unmanaged action {}".format(action)) defer.returnValue(transport_elt) - def jingleTerminate(self, client, action, session, content_name, reason_elt): + def jingle_terminate(self, client, action, session, content_name, reason_elt): if reason_elt.decline: log.debug("Session declined, deleting S5B session") # we just need to clean the S5B session if it is declined content_data = session["contents"][content_name] transport_data = content_data["transport_data"] - self._s5b.killSession(None, transport_data["session_hash"], None, client) + self._s5b.kill_session(None, transport_data["session_hash"], None, client) - def _doFallback(self, feature_checked, session, content_name, client): + def _do_fallback(self, feature_checked, session, content_name, client): """Do the fallback, method called once feature is checked @param feature_checked(bool): True if other peer can do IBB @@ -508,11 +508,11 @@ ) self._j.terminate(client, self._j.REASON_CONNECTIVITY_ERROR, session) else: - self._j.transportReplace( + self._j.transport_replace( client, self._jingle_ibb.NAMESPACE, session, content_name ) - def doFallback(self, session, content_name, client): + def do_fallback(self, session, content_name, client): """Fallback to IBB transport, used in last resort @param session(dict): session data @@ -531,7 +531,7 @@ d = self.host.hasFeature( client, self._jingle_ibb.NAMESPACE, session["peer_jid"] ) - d.addCallback(self._doFallback, session, content_name, client) + d.addCallback(self._do_fallback, session, content_name, client) return d
--- a/sat/plugins/plugin_xep_0261.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0261.py Sat Apr 08 13:54:42 2023 +0200 @@ -56,14 +56,14 @@ self.host = host self._j = host.plugins["XEP-0166"] # shortcut to access jingle self._ibb = host.plugins["XEP-0047"] # and in-band bytestream - self._j.registerTransport( + self._j.register_transport( NS_JINGLE_IBB, self._j.TRANSPORT_STREAMING, self, -10000 ) # must be the lowest priority - def getHandler(self, client): + def get_handler(self, client): return XEP_0261_handler() - def jingleSessionInit(self, client, session, content_name): + def jingle_session_init(self, client, session, content_name): transport_elt = domish.Element((NS_JINGLE_IBB, "transport")) content_data = session["contents"][content_name] transport_data = content_data["transport_data"] @@ -72,7 +72,7 @@ transport_elt["sid"] = transport_data["sid"] = str(uuid.uuid4()) return transport_elt - def jingleHandler(self, client, action, session, content_name, transport_elt): + def jingle_handler(self, client, action, session, content_name, transport_elt): content_data = session["contents"][content_name] transport_data = content_data["transport_data"] if action in ( @@ -90,12 +90,12 @@ stream_object = content_data["stream_object"] if action == self._j.A_START: block_size = transport_data["block_size"] - d = self._ibb.startStream( + d = self._ibb.start_stream( client, stream_object, local_jid, peer_jid, sid, block_size ) d.chainDeferred(content_data["finished_d"]) else: - d = self._ibb.createSession( + d = self._ibb.create_session( client, stream_object, local_jid, peer_jid, sid) d.chainDeferred(content_data["finished_d"]) else:
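The IBB transport negotiated above is described by a single ``<transport/>`` element carrying the ``sid`` generated in ``jingle_session_init`` and the block size used by ``start_stream``/``create_session``. A sketch, assuming ``urn:xmpp:jingle:transports:ibb:1`` for ``NS_JINGLE_IBB`` and a ``block-size`` attribute as in XEP-0261::

    import uuid
    from twisted.words.xish import domish

    NS_JINGLE_IBB = "urn:xmpp:jingle:transports:ibb:1"  # assumed value

    transport_elt = domish.Element((NS_JINGLE_IBB, "transport"))
    transport_elt["sid"] = str(uuid.uuid4())  # same sid generation as jingle_session_init
    transport_elt["block-size"] = "4096"      # assumed block size
    print(transport_elt.toXml())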
--- a/sat/plugins/plugin_xep_0264.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0264.py Sat Apr 08 13:54:42 2023 +0200 @@ -81,15 +81,15 @@ def __init__(self, host): log.info(_("Plugin XEP_0264 initialization")) self.host = host - host.trigger.add("XEP-0234_buildFileElement", self._addFileThumbnails) - host.trigger.add("XEP-0234_parseFileElement", self._getFileThumbnails) + host.trigger.add("XEP-0234_buildFileElement", self._add_file_thumbnails) + host.trigger.add("XEP-0234_parseFileElement", self._get_file_thumbnails) - def getHandler(self, client): + def get_handler(self, client): return XEP_0264_handler() ## triggers ## - def _addFileThumbnails(self, client, file_elt, extra_args): + def _add_file_thumbnails(self, client, file_elt, extra_args): try: thumbnails = extra_args["extra"][C.KEY_THUMBNAILS] except KeyError: @@ -103,7 +103,7 @@ thumbnail_elt["height"] = str(height) return True - def _getFileThumbnails(self, client, file_elt, file_data): + def _get_file_thumbnails(self, client, file_elt, file_data): thumbnails = [] for thumbnail_elt in file_elt.elements(NS_THUMBS, "thumbnail"): uri = thumbnail_elt["uri"] @@ -130,7 +130,7 @@ ## thumbnails generation ## - def getThumbId(self, image_uid, size): + def get_thumb_id(self, image_uid, size): """return an ID unique for image/size combination @param image_uid(unicode): unique id of the image @@ -140,13 +140,13 @@ """ return hashlib.sha256(repr((image_uid, size)).encode()).hexdigest() - def _blockingGenThumb( + def _blocking_gen_thumb( self, source_path, size=None, max_age=None, image_uid=None, fix_orientation=True): """Generate a thumbnail for image This is a blocking method and must be executed in a thread - params are the same as for [generateThumbnail] + params are the same as for [generate_thumbnail] """ if size is None: size = self.SIZE_SMALL @@ -159,9 +159,9 @@ if fix_orientation: img = ImageOps.exif_transpose(img) - uid = self.getThumbId(image_uid or source_path, size) + uid = self.get_thumb_id(image_uid or source_path, size) - with self.host.common_cache.cacheData( + with self.host.common_cache.cache_data( PLUGIN_INFO[C.PI_IMPORT_NAME], uid, MIME_TYPE, max_age ) as f: img.save(f, SAVE_FORMAT) @@ -170,7 +170,7 @@ return img.size, uid - def generateThumbnail( + def generate_thumbnail( self, source_path, size=None, max_age=None, image_uid=None, fix_orientation=True): """Generate a thumbnail of image @@ -178,7 +178,7 @@ @param size(int, None): max size of the thumbnail can be one of self.SIZE_* None to use default value (i.e. self.SIZE_SMALL) - @param max_age(int, None): same as for [memory.cache.Cache.cacheData]) + @param max_age(int, None): same as for [memory.cache.Cache.cache_data]) @param image_uid(unicode, None): unique ID to identify the image use hash whenever possible if None, source_path will be used @@ -188,7 +188,7 @@ - unique Id of the thumbnail """ d = threads.deferToThread( - self._blockingGenThumb, source_path, size, max_age, image_uid=image_uid, + self._blocking_gen_thumb, source_path, size, max_age, image_uid=image_uid, fix_orientation=fix_orientation ) d.addErrback(
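``_blocking_gen_thumb`` above is plain Pillow work pushed to a thread: open the image, fix its EXIF orientation, shrink it, and store it in the common cache under the ``get_thumb_id`` hash. A simplified, cache-free sketch of the same steps (the ``(320, 320)`` size and PNG output are assumptions; the real size constants and save format live in the plugin)::

    import hashlib
    from PIL import Image, ImageOps

    def gen_thumbnail(source_path: str, dest_path: str, size=(320, 320)) -> str:
        """Blocking thumbnail generation, same idea as _blocking_gen_thumb above."""
        img = Image.open(source_path)
        img = ImageOps.exif_transpose(img)  # what fix_orientation=True does
        img.thumbnail(size)                 # in-place resize, keeps aspect ratio
        img.save(dest_path, "PNG")
        # unique id for the image/size pair, mirroring get_thumb_id
        return hashlib.sha256(repr((source_path, size)).encode()).hexdigest()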
--- a/sat/plugins/plugin_xep_0277.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0277.py Sat Apr 08 13:54:42 2023 +0200 @@ -84,13 +84,13 @@ def __init__(self, host): log.info(_("Microblogging plugin initialization")) self.host = host - host.registerNamespace("microblog", NS_MICROBLOG) + host.register_namespace("microblog", NS_MICROBLOG) self._p = self.host.plugins[ "XEP-0060" ] # this facilitate the access to pubsub plugin ps_cache = self.host.plugins.get("PUBSUB_CACHE") if ps_cache is not None: - ps_cache.registerAnalyser( + ps_cache.register_analyser( { "name": "XEP-0277", "node": NS_MICROBLOG, @@ -98,119 +98,119 @@ "type": "blog", "to_sync": True, "parser": self.item_2_mb_data, - "match_cb": self._cacheNodeMatchCb, + "match_cb": self._cache_node_match_cb, } ) self.rt_sessions = sat_defer.RTDeferredSessions() - self.host.plugins["XEP-0060"].addManagedNode( - NS_MICROBLOG, items_cb=self._itemsReceived + self.host.plugins["XEP-0060"].add_managed_node( + NS_MICROBLOG, items_cb=self._items_received ) - host.bridge.addMethod( - "mbSend", + host.bridge.add_method( + "mb_send", ".plugin", in_sign="ssss", out_sign="s", - method=self._mbSend, + method=self._mb_send, async_=True, ) - host.bridge.addMethod( - "mbRepeat", + host.bridge.add_method( + "mb_repeat", ".plugin", in_sign="sssss", out_sign="s", - method=self._mbRepeat, + method=self._mb_repeat, async_=True, ) - host.bridge.addMethod( - "mbPreview", + host.bridge.add_method( + "mb_preview", ".plugin", in_sign="ssss", out_sign="s", - method=self._mbPreview, + method=self._mb_preview, async_=True, ) - host.bridge.addMethod( - "mbRetract", + host.bridge.add_method( + "mb_retract", ".plugin", in_sign="ssss", out_sign="", - method=self._mbRetract, + method=self._mb_retract, async_=True, ) - host.bridge.addMethod( - "mbGet", + host.bridge.add_method( + "mb_get", ".plugin", in_sign="ssiasss", out_sign="s", - method=self._mbGet, + method=self._mb_get, async_=True, ) - host.bridge.addMethod( - "mbRename", + host.bridge.add_method( + "mb_rename", ".plugin", in_sign="sssss", out_sign="", - method=self._mbRename, + method=self._mb_rename, async_=True, ) - host.bridge.addMethod( - "mbSetAccess", + host.bridge.add_method( + "mb_access_set", ".plugin", in_sign="ss", out_sign="", - method=self.mbSetAccess, + method=self.mb_access_set, async_=True, ) - host.bridge.addMethod( - "mbSubscribeToMany", + host.bridge.add_method( + "mb_subscribe_to_many", ".plugin", in_sign="sass", out_sign="s", - method=self._mbSubscribeToMany, + method=self._mb_subscribe_to_many, ) - host.bridge.addMethod( - "mbGetFromManyRTResult", + host.bridge.add_method( + "mb_get_from_many_rt_result", ".plugin", in_sign="ss", out_sign="(ua(sssasa{ss}))", - method=self._mbGetFromManyRTResult, + method=self._mb_get_from_many_rt_result, async_=True, ) - host.bridge.addMethod( - "mbGetFromMany", + host.bridge.add_method( + "mb_get_from_many", ".plugin", in_sign="sasia{ss}s", out_sign="s", - method=self._mbGetFromMany, + method=self._mb_get_from_many, ) - host.bridge.addMethod( - "mbGetFromManyWithCommentsRTResult", + host.bridge.add_method( + "mb_get_from_many_with_comments_rt_result", ".plugin", in_sign="ss", out_sign="(ua(sssa(sa(sssasa{ss}))a{ss}))", - method=self._mbGetFromManyWithCommentsRTResult, + method=self._mb_get_from_many_with_comments_rt_result, async_=True, ) - host.bridge.addMethod( - "mbGetFromManyWithComments", + host.bridge.add_method( + "mb_get_from_many_with_comments", ".plugin", in_sign="sasiia{ss}a{ss}s", out_sign="s", - 
method=self._mbGetFromManyWithComments, + method=self._mb_get_from_many_with_comments, ) - host.bridge.addMethod( - "mbIsCommentNode", + host.bridge.add_method( + "mb_is_comment_node", ".plugin", in_sign="s", out_sign="b", - method=self.isCommentNode, + method=self.is_comment_node, ) - def getHandler(self, client): + def get_handler(self, client): return XEP_0277_handler() - def _cacheNodeMatchCb( + def _cache_node_match_cb( self, client: SatXMPPEntity, analyse: dict, @@ -219,25 +219,25 @@ if analyse["node"].startswith(NS_COMMENT_PREFIX): analyse["subtype"] = "comment" - def _checkFeaturesCb(self, available): + def _check_features_cb(self, available): return {"available": C.BOOL_TRUE} - def _checkFeaturesEb(self, fail): + def _check_features_eb(self, fail): return {"available": C.BOOL_FALSE} - def getFeatures(self, profile): - client = self.host.getClient(profile) - d = self.host.checkFeatures(client, [], identity=("pubsub", "pep")) - d.addCallbacks(self._checkFeaturesCb, self._checkFeaturesEb) + def features_get(self, profile): + client = self.host.get_client(profile) + d = self.host.check_features(client, [], identity=("pubsub", "pep")) + d.addCallbacks(self._check_features_cb, self._check_features_eb) return d ## plugin management methods ## - def _itemsReceived(self, client, itemsEvent): + def _items_received(self, client, itemsEvent): """Callback which manage items notifications (publish + retract)""" - def manageItem(data, event): - self.host.bridge.psEvent( + def manage_item(data, event): + self.host.bridge.ps_event( C.PS_MICROBLOG, itemsEvent.sender.full(), itemsEvent.nodeIdentifier, @@ -250,10 +250,10 @@ if item.name == C.PS_ITEM: # FIXME: service and node should be used here self.item_2_mb_data(client, item, None, None).addCallbacks( - manageItem, lambda failure: None, (C.PS_PUBLISH,) + manage_item, lambda failure: None, (C.PS_PUBLISH,) ) elif item.name == C.PS_RETRACT: - manageItem({"id": item["id"]}, C.PS_RETRACT) + manage_item({"id": item["id"]}, C.PS_RETRACT) else: raise exceptions.InternalError("Invalid event value") @@ -334,7 +334,7 @@ ) key = check_conflict("{}_xhtml".format(elem.name)) data = data_elt.toXml() - microblog_data[key] = yield self.host.plugins["TEXT_SYNTAXES"].cleanXHTML( + microblog_data[key] = yield self.host.plugins["TEXT_SYNTAXES"].clean_xhtml( data ) else: @@ -360,7 +360,7 @@ # FIXME: node should alway be set in the future, check FIXME in method signature if node is not None: microblog_data["node"] = node - microblog_data['uri'] = xmpp_uri.buildXMPPUri( + microblog_data['uri'] = xmpp_uri.build_xmpp_uri( "pubsub", path=service.full(), node=node, @@ -466,7 +466,7 @@ "uri": uri, } try: - comment_service, comment_node = self.parseCommentUrl(uri) + comment_service, comment_node = self.parse_comment_url(uri) except Exception as e: log.warning(f"Can't parse comments url: {e}") continue @@ -596,7 +596,7 @@ microblog_data["author_jid"] = publisher microblog_data["author_jid_verified"] = True else: - iq_elt = xml_tools.findAncestor(item_elt, "iq", C.NS_STREAM) + iq_elt = xml_tools.find_ancestor(item_elt, "iq", C.NS_STREAM) microblog_data["author_jid"] = iq_elt["from"] microblog_data["author_jid_verified"] = False @@ -644,7 +644,7 @@ if type_: if type_ == "_rich": # convert input from current syntax to XHTML xml_content = await synt.convert( - mb_data[attr], synt.getCurrentSyntax(client.profile), "XHTML" + mb_data[attr], synt.get_current_syntax(client.profile), "XHTML" ) if f"{elem_name}_xhtml" in mb_data: raise failure.Failure( @@ -724,7 +724,7 @@ 
log.warning(f"non HTTP URL in attachment, ignoring: {attachment}") continue link_elt = entry_elt.addElement("link") - # XXX: "uri" is set in self._manageComments if not already existing + # XXX: "uri" is set in self._manage_comments if not already existing link_elt["href"] = url if attachment.get("external", False): # this is a link to an external data such as a website @@ -779,7 +779,7 @@ ## id ## entry_id = mb_data.get( "id", - xmpp_uri.buildXMPPUri( + xmpp_uri.build_xmpp_uri( "pubsub", path=service.full() if service is not None else client.jid.userhost(), node=node, @@ -791,7 +791,7 @@ ## comments ## for comments_data in mb_data.get('comments', []): link_elt = entry_elt.addElement("link") - # XXX: "uri" is set in self._manageComments if not already existing + # XXX: "uri" is set in self._manage_comments if not already existing link_elt["href"] = comments_data["uri"] link_elt["rel"] = "replies" link_elt["title"] = "comments" @@ -820,20 +820,20 @@ ## publish/preview ## - def isCommentNode(self, node: str) -> bool: + def is_comment_node(self, node: str) -> bool: """Indicate if the node is prefixed with comments namespace""" return node.startswith(NS_COMMENT_PREFIX) - def getParentItem(self, item_id: str) -> str: + def get_parent_item(self, item_id: str) -> str: """Return parent of a comment node @param item_id: a comment node """ - if not self.isCommentNode(item_id): + if not self.is_comment_node(item_id): raise ValueError("This node is not a comment node") return item_id[len(NS_COMMENT_PREFIX):] - def getCommentsNode(self, item_id): + def get_comments_node(self, item_id): """Generate comment node @param item_id(unicode): id of the parent item @@ -841,7 +841,7 @@ """ return f"{NS_COMMENT_PREFIX}{item_id}" - def getCommentsService(self, client, parent_service=None): + def get_comments_service(self, client, parent_service=None): """Get prefered PubSub service to create comment node @param pubsub_service(jid.JID, None): PubSub service of the parent item @@ -855,7 +855,7 @@ pass else: # other server, let's try to find a non PEP service there - d = self.host.findServiceEntity( + d = self.host.find_service_entity( client, "pubsub", "service", parent_service ) d.addCallback(lambda entity: entity or parent_service) @@ -867,7 +867,7 @@ client.pubsub_service if client.pubsub_service is not None else parent_service ) - async def _manageComments(self, client, mb_data, service, node, item_id, access=None): + async def _manage_comments(self, client, mb_data, service, node, item_id, access=None): """Check comments keys in mb_data and create comments node if necessary if a comments node metadata is set in the mb_data['comments'] list, it is used @@ -931,7 +931,7 @@ comments_service = None if uri: - uri_service, uri_node = self.parseCommentUrl(uri) + uri_service, uri_node = self.parse_comment_url(uri) if ((comments_node is not None and comments_node!=uri_node) or (comments_service is not None and comments_service!=uri_service)): raise ValueError( @@ -941,15 +941,15 @@ comments_data['node'] = comments_node = uri_node else: if not comments_node: - comments_node = self.getCommentsNode(item_id) + comments_node = self.get_comments_node(item_id) comments_data['node'] = comments_node if comments_service is None: - comments_service = await self.getCommentsService(client, service) + comments_service = await self.get_comments_service(client, service) if comments_service is None: comments_service = client.jid.userhostJID() comments_data['service'] = comments_service - comments_data['uri'] = xmpp_uri.buildXMPPUri( + 
comments_data['uri'] = xmpp_uri.build_xmpp_uri( "pubsub", path=comments_service.full(), node=comments_node, @@ -969,7 +969,7 @@ else: if access == self._p.ACCESS_WHITELIST: # for whitelist access we need to copy affiliations from parent item - comments_affiliations = await self._p.getNodeAffiliations( + comments_affiliations = await self._p.get_node_affiliations( client, service, node ) # …except for "member", that we transform to publisher @@ -978,14 +978,14 @@ if affiliation == "member": comments_affiliations[jid_] == "publisher" - await self._p.setNodeAffiliations( + await self._p.set_node_affiliations( client, comments_service, comments_node, comments_affiliations ) - def friendlyId(self, data): + def friendly_id(self, data): """Generate a user friendly id from title or content""" # TODO: rich content should be converted to plain text - id_base = regex.urlFriendlyText( + id_base = regex.url_friendly_text( data.get('title') or data.get('title_rich') or data.get('content') @@ -994,10 +994,10 @@ ) return f"{id_base}-{token_urlsafe(3)}" - def _mbSend(self, service, node, data, profile_key): + def _mb_send(self, service, node, data, profile_key): service = jid.JID(service) if service else None node = node if node else NS_MICROBLOG - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) data = data_format.deserialise(data) return defer.ensureDeferred(self.send(client, data, service, node)) @@ -1028,17 +1028,17 @@ item_id = data.get("id") if item_id is None: if data.get("user_friendly_id", True): - item_id = self.friendlyId(data) + item_id = self.friendly_id(data) else: item_id = str(shortuuid.uuid()) try: - await self._manageComments(client, data, service, node, item_id, access=None) + await self._manage_comments(client, data, service, node, item_id, access=None) except error.StanzaError: log.warning("Can't create comments node for item {}".format(item_id)) item = await self.mb_data_2_entry_elt(client, data, item_id, service, node) - if not await self.host.trigger.asyncPoint( + if not await self.host.trigger.async_point( "XEP-0277_send", client, service, node, item, data ): return None @@ -1052,7 +1052,7 @@ await self._p.publish(client, service, node, [item], extra=extra) return item_id - def _mbRepeat( + def _mb_repeat( self, service_s: str, node: str, @@ -1062,7 +1062,7 @@ ) -> defer.Deferred: service = jid.JID(service_s) if service_s else None node = node if node else NS_MICROBLOG - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) extra = data_format.deserialise(extra_s) d = defer.ensureDeferred( self.repeat(client, item, service, node, extra) @@ -1088,7 +1088,7 @@ service = client.jid.userhostJID() # we first get the post to repeat - items, __ = await self._p.getItems( + items, __ = await self._p.get_items( client, service, node, @@ -1121,27 +1121,27 @@ next(author_elt.elements(NS_ATOM, "uri")) except StopIteration: entry_elt.addElement( - "uri", content=xmpp_uri.buildXMPPUri(None, path=service.full()) + "uri", content=xmpp_uri.build_xmpp_uri(None, path=service.full()) ) # we add the link indicating that it's a repeated post link_elt = entry_elt.addElement("link") link_elt["rel"] = "via" - link_elt["href"] = xmpp_uri.buildXMPPUri( + link_elt["href"] = xmpp_uri.build_xmpp_uri( "pubsub", path=service.full(), node=node, item=item ) - return await self._p.sendItem( + return await self._p.send_item( client, client.jid.userhostJID(), NS_MICROBLOG, entry_elt ) - def _mbPreview(self, service, node, data, profile_key): + def 
_mb_preview(self, service, node, data, profile_key): service = jid.JID(service) if service else None node = node if node else NS_MICROBLOG - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) data = data_format.deserialise(data) d = defer.ensureDeferred(self.preview(client, data, service, node)) d.addCallback(data_format.serialise) @@ -1172,9 +1172,9 @@ ## retract ## - def _mbRetract(self, service_jid_s, nodeIdentifier, itemIdentifier, profile_key): - """Call self._p._retractItem, but use default node if node is empty""" - return self._p._retractItem( + def _mb_retract(self, service_jid_s, nodeIdentifier, itemIdentifier, profile_key): + """Call self._p._retract_item, but use default node if node is empty""" + return self._p._retract_item( service_jid_s, nodeIdentifier or NS_MICROBLOG, itemIdentifier, @@ -1184,29 +1184,29 @@ ## get ## - def _mbGetSerialise(self, data): + def _mb_get_serialise(self, data): items, metadata = data metadata['items'] = items return data_format.serialise(metadata) - def _mbGet(self, service="", node="", max_items=10, item_ids=None, extra="", + def _mb_get(self, service="", node="", max_items=10, item_ids=None, extra="", profile_key=C.PROF_KEY_NONE): """ @param max_items(int): maximum number of item to get, C.NO_LIMIT for no limit @param item_ids (list[unicode]): list of item IDs """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else None max_items = None if max_items == C.NO_LIMIT else max_items - extra = self._p.parseExtra(data_format.deserialise(extra)) + extra = self._p.parse_extra(data_format.deserialise(extra)) d = defer.ensureDeferred( - self.mbGet(client, service, node or None, max_items, item_ids, + self.mb_get(client, service, node or None, max_items, item_ids, extra.rsm_request, extra.extra) ) - d.addCallback(self._mbGetSerialise) + d.addCallback(self._mb_get_serialise) return d - async def mbGet( + async def mb_get( self, client: SatXMPPEntity, service: Optional[jid.JID] = None, @@ -1233,7 +1233,7 @@ node = NS_MICROBLOG if rsm_request: max_items = None - items_data = await self._p.getItems( + items_data = await self._p.get_items( client, service, node, @@ -1242,7 +1242,7 @@ rsm_request=rsm_request, extra=extra, ) - mb_data_list, metadata = await self._p.transItemsDataD( + mb_data_list, metadata = await self._p.trans_items_data_d( items_data, partial(self.item_2_mb_data, client, service=service, node=node)) encrypted = metadata.pop("encrypted", None) if encrypted is not None: @@ -1253,16 +1253,16 @@ pass return (mb_data_list, metadata) - def _mbRename(self, service, node, item_id, new_id, profile_key): - return defer.ensureDeferred(self.mbRename( - self.host.getClient(profile_key), + def _mb_rename(self, service, node, item_id, new_id, profile_key): + return defer.ensureDeferred(self.mb_rename( + self.host.get_client(profile_key), jid.JID(service) if service else None, node or None, item_id, new_id )) - async def mbRename( + async def mb_rename( self, client: SatXMPPEntity, service: Optional[jid.JID], @@ -1272,9 +1272,9 @@ ) -> None: if not node: node = NS_MICROBLOG - await self._p.renameItem(client, service, node, item_id, new_id) + await self._p.rename_item(client, service, node, item_id, new_id) - def parseCommentUrl(self, node_url): + def parse_comment_url(self, node_url): """Parse a XMPP URI Determine the fields comments_service and comments_node of a microblog data @@ -1284,7 +1284,7 @@ @return (tuple[jid.JID, unicode]): service and 
node """ try: - parsed_url = xmpp_uri.parseXMPPUri(node_url) + parsed_url = xmpp_uri.parse_xmpp_uri(node_url) service = jid.JID(parsed_url["path"]) node = parsed_url["node"] except Exception as e: @@ -1294,7 +1294,7 @@ ## configure ## - def mbSetAccess(self, access="presence", profile_key=C.PROF_KEY_NONE): + def mb_access_set(self, access="presence", profile_key=C.PROF_KEY_NONE): """Create a microblog node on PEP with given access If the node already exists, it change options @@ -1302,7 +1302,7 @@ @param profile_key: profile key """ # FIXME: check if this mehtod is need, deprecate it if not - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) _options = { self._p.OPT_ACCESS_MODEL: access, @@ -1351,7 +1351,7 @@ # common - def _getClientAndNodeData(self, publishers_type, publishers, profile_key): + def _get_client_and_node_data(self, publishers_type, publishers, profile_key): """Helper method to construct node_data from publishers_type/publishers @param publishers_type: type of the list of publishers, one of: @@ -1362,15 +1362,15 @@ list of groups or list of jids) @param profile_key: %(doc_profile_key)s """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) if publishers_type == C.JID: jids_set = set(publishers) else: - jids_set = client.roster.getJidsSet(publishers_type, publishers) + jids_set = client.roster.get_jids_set(publishers_type, publishers) if publishers_type == C.ALL: try: # display messages from salut-a-toi@libervia.org or other PEP services - services = self.host.plugins["EXTRA-PEP"].getFollowedEntities( + services = self.host.plugins["EXTRA-PEP"].get_followed_entities( profile_key ) except KeyError: @@ -1388,7 +1388,7 @@ node_data.append((jid_, NS_MICROBLOG)) return client, node_data - def _checkPublishers(self, publishers_type, publishers): + def _check_publishers(self, publishers_type, publishers): """Helper method to deserialise publishers coming from bridge publishers_type(unicode): type of the list of publishers, one of: @@ -1410,15 +1410,15 @@ # subscribe # - def _mbSubscribeToMany(self, publishers_type, publishers, profile_key): + def _mb_subscribe_to_many(self, publishers_type, publishers, profile_key): """ @return (str): session id: Use pubsub.getSubscribeRTResult to get the results """ - publishers_type, publishers = self._checkPublishers(publishers_type, publishers) - return self.mbSubscribeToMany(publishers_type, publishers, profile_key) + publishers_type, publishers = self._check_publishers(publishers_type, publishers) + return self.mb_subscribe_to_many(publishers_type, publishers, profile_key) - def mbSubscribeToMany(self, publishers_type, publishers, profile_key): + def mb_subscribe_to_many(self, publishers_type, publishers, profile_key): """Subscribe microblogs for a list of groups or jids @param publishers_type: type of the list of publishers, one of: @@ -1430,17 +1430,17 @@ @param profile: %(doc_profile)s @return (str): session id """ - client, node_data = self._getClientAndNodeData( + client, node_data = self._get_client_and_node_data( publishers_type, publishers, profile_key ) - return self._p.subscribeToMany( + return self._p.subscribe_to_many( node_data, client.jid.userhostJID(), profile_key=profile_key ) # get # - def _mbGetFromManyRTResult(self, session_id, profile_key=C.PROF_KEY_DEFAULT): - """Get real-time results for mbGetFromMany session + def _mb_get_from_many_rt_result(self, session_id, profile_key=C.PROF_KEY_DEFAULT): + """Get real-time results for mb_get_from_many session 
@param session_id: id of the real-time deferred session @param return (tuple): (remaining, results) where: @@ -1449,16 +1449,16 @@ - service (unicode): pubsub service - node (unicode): pubsub node - failure (unicode): empty string in case of success, error message else - - items_data(list): data as returned by [mbGet] - - items_metadata(dict): metadata as returned by [mbGet] + - items_data(list): data as returned by [mb_get] + - items_metadata(dict): metadata as returned by [mb_get] @param profile_key: %(doc_profile_key)s """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) def onSuccess(items_data): """convert items elements to list of microblog data in items_data""" - d = self._p.transItemsDataD( + d = self._p.trans_items_data_d( items_data, # FIXME: service and node should be used here partial(self.item_2_mb_data, client), @@ -1467,7 +1467,7 @@ d.addCallback(lambda serialised: ("", serialised)) return d - d = self._p.getRTResults( + d = self._p.get_rt_results( session_id, on_success=onSuccess, on_error=lambda failure: (str(failure.value), ([], {})), @@ -1486,15 +1486,15 @@ ) return d - def _mbGetFromMany(self, publishers_type, publishers, max_items=10, extra_dict=None, + def _mb_get_from_many(self, publishers_type, publishers, max_items=10, extra_dict=None, profile_key=C.PROF_KEY_NONE): """ @param max_items(int): maximum number of item to get, C.NO_LIMIT for no limit """ max_items = None if max_items == C.NO_LIMIT else max_items - publishers_type, publishers = self._checkPublishers(publishers_type, publishers) - extra = self._p.parseExtra(extra_dict) - return self.mbGetFromMany( + publishers_type, publishers = self._check_publishers(publishers_type, publishers) + extra = self._p.parse_extra(extra_dict) + return self.mb_get_from_many( publishers_type, publishers, max_items, @@ -1503,7 +1503,7 @@ profile_key, ) - def mbGetFromMany(self, publishers_type, publishers, max_items=None, rsm_request=None, + def mb_get_from_many(self, publishers_type, publishers, max_items=None, rsm_request=None, extra=None, profile_key=C.PROF_KEY_NONE): """Get the published microblogs for a list of groups or jids @@ -1518,21 +1518,21 @@ @return (str): RT Deferred session id """ # XXX: extra is unused here so far - client, node_data = self._getClientAndNodeData( + client, node_data = self._get_client_and_node_data( publishers_type, publishers, profile_key ) - return self._p.getFromMany( + return self._p.get_from_many( node_data, max_items, rsm_request, profile_key=profile_key ) # comments # - def _mbGetFromManyWithCommentsRTResultSerialise(self, data): + def _mb_get_from_many_with_comments_rt_result_serialise(self, data): """Serialisation of result This is probably the longest method name of whole SàT ecosystem ^^ @param data(dict): data as received by rt_sessions - @return (tuple): see [_mbGetFromManyWithCommentsRTResult] + @return (tuple): see [_mb_get_from_many_with_comments_rt_result] """ ret = [] data_iter = iter(data[1].items()) @@ -1550,9 +1550,9 @@ return data[0], ret - def _mbGetFromManyWithCommentsRTResult(self, session_id, + def _mb_get_from_many_with_comments_rt_result(self, session_id, profile_key=C.PROF_KEY_DEFAULT): - """Get real-time results for [mbGetFromManyWithComments] session + """Get real-time results for [mb_get_from_many_with_comments] session @param session_id: id of the real-time deferred session @param return (tuple): (remaining, results) where: @@ -1572,12 +1572,12 @@ - metadata(dict): original node metadata @param profile_key: 
%(doc_profile_key)s """ - profile = self.host.getClient(profile_key).profile - d = self.rt_sessions.getResults(session_id, profile=profile) - d.addCallback(self._mbGetFromManyWithCommentsRTResultSerialise) + profile = self.host.get_client(profile_key).profile + d = self.rt_sessions.get_results(session_id, profile=profile) + d.addCallback(self._mb_get_from_many_with_comments_rt_result_serialise) return d - def _mbGetFromManyWithComments(self, publishers_type, publishers, max_items=10, + def _mb_get_from_many_with_comments(self, publishers_type, publishers, max_items=10, max_comments=C.NO_LIMIT, extra_dict=None, extra_comments_dict=None, profile_key=C.PROF_KEY_NONE): """ @@ -1587,10 +1587,10 @@ """ max_items = None if max_items == C.NO_LIMIT else max_items max_comments = None if max_comments == C.NO_LIMIT else max_comments - publishers_type, publishers = self._checkPublishers(publishers_type, publishers) - extra = self._p.parseExtra(extra_dict) - extra_comments = self._p.parseExtra(extra_comments_dict) - return self.mbGetFromManyWithComments( + publishers_type, publishers = self._check_publishers(publishers_type, publishers) + extra = self._p.parse_extra(extra_dict) + extra_comments = self._p.parse_extra(extra_comments_dict) + return self.mb_get_from_many_with_comments( publishers_type, publishers, max_items, @@ -1602,7 +1602,7 @@ profile_key, ) - def mbGetFromManyWithComments(self, publishers_type, publishers, max_items=None, + def mb_get_from_many_with_comments(self, publishers_type, publishers, max_items=None, max_comments=None, rsm_request=None, extra=None, rsm_comments=None, extra_comments=None, profile_key=C.PROF_KEY_NONE): @@ -1625,11 +1625,11 @@ # to serialise and associate the data, but it make life in frontends side # a lot easier - client, node_data = self._getClientAndNodeData( + client, node_data = self._get_client_and_node_data( publishers_type, publishers, profile_key ) - def getComments(items_data): + def get_comments(items_data): """Retrieve comments and add them to the items_data @param items_data: serialised items data @@ -1649,7 +1649,7 @@ node = item["{}{}".format(prefix, "_node")] # time to get the comments d = defer.ensureDeferred( - self._p.getItems( + self._p.get_items( client, service, node, @@ -1660,7 +1660,7 @@ ) # then serialise d.addCallback( - lambda items_data: self._p.transItemsDataD( + lambda items_data: self._p.trans_items_data_d( items_data, partial( self.item_2_mb_data, client, service=service, node=node @@ -1698,20 +1698,20 @@ deferreds = {} for service, node in node_data: - d = deferreds[(service, node)] = defer.ensureDeferred(self._p.getItems( + d = deferreds[(service, node)] = defer.ensureDeferred(self._p.get_items( client, service, node, max_items, rsm_request=rsm_request, extra=extra )) d.addCallback( - lambda items_data: self._p.transItemsDataD( + lambda items_data: self._p.trans_items_data_d( items_data, partial(self.item_2_mb_data, client, service=service, node=node), ) ) - d.addCallback(getComments) + d.addCallback(get_comments) d.addCallback(lambda items_comments_data: ("", items_comments_data)) d.addErrback(lambda failure: (str(failure.value), ([], {}))) - return self.rt_sessions.newSession(deferreds, client.profile) + return self.rt_sessions.new_session(deferreds, client.profile) @implementer(iwokkel.IDisco)
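Seen from other plugins the microblog API is unchanged apart from the names: for instance mbGet becomes mb_get and still resolves to an (items, metadata) tuple. A hedged caller sketch; the wrapping coroutine and the item count are illustrative::

    async def latest_posts(host, client):
        xep_0277 = host.plugins["XEP-0277"]
        # mbGet -> mb_get; items are microblog data dicts, metadata comes from
        # the underlying pubsub request
        items, metadata = await xep_0277.mb_get(client, max_items=5)
        return items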
--- a/sat/plugins/plugin_xep_0280.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0280.py Sat Apr 08 13:54:42 2023 +0200 @@ -74,13 +74,13 @@ def __init__(self, host): log.info(_("Plugin XEP_0280 initialization")) self.host = host - host.memory.updateParams(self.params) - host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=200000) + host.memory.update_params(self.params) + host.trigger.add("messageReceived", self.message_received_trigger, priority=200000) - def getHandler(self, client): + def get_handler(self, client): return XEP_0280_handler() - def setPrivate(self, message_elt): + def set_private(self, message_elt): """Add a <private/> element to a message this method is intented to be called on final domish.Element by other plugins @@ -93,16 +93,16 @@ message_elt.addElement((NS_CARBONS, "private")) @defer.inlineCallbacks - def profileConnected(self, client): + def profile_connected(self, client): """activate message carbons on connection if possible and activated in config""" - activate = self.host.memory.getParamA( + activate = self.host.memory.param_get_a( PARAM_NAME, PARAM_CATEGORY, profile_key=client.profile ) if not activate: log.info(_("Not activating message carbons as requested in params")) return try: - yield self.host.checkFeatures(client, (NS_CARBONS,)) + yield self.host.check_features(client, (NS_CARBONS,)) except exceptions.FeatureNotFound: log.warning(_("server doesn't handle message carbons")) else: @@ -116,7 +116,7 @@ else: log.info(_("message carbons activated")) - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): """get message and handle it if carbons namespace is present""" carbons_elt = None for e in message_elt.elements():
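The part of this plugin that other code calls directly is the renamed set_private, which marks an outgoing message so the server does not copy it to other carbons-enabled resources. A minimal sketch, assuming a host backend instance; the message built here is purely illustrative::

    from twisted.words.xish import domish


    def build_private_message(host, body):
        message_elt = domish.Element((None, "message"))
        message_elt["type"] = "chat"
        message_elt.addElement("body", content=body)
        # setPrivate -> set_private: adds a <private/> child in the carbons namespace
        host.plugins["XEP-0280"].set_private(message_elt)
        return message_elt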
--- a/sat/plugins/plugin_xep_0292.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0292.py Sat Apr 08 13:54:42 2023 +0200 @@ -71,7 +71,7 @@ # it is not done here. It is expected that this dedicated protocol will be removed # from a future version of the XEP. log.info(_("vCard4 Over XMPP initialization")) - host.registerNamespace("vcard4", NS_VCARD4) + host.register_namespace("vcard4", NS_VCARD4) self.host = host self._p = host.plugins["XEP-0060"] self._i = host.plugins['IDENTITY'] @@ -79,21 +79,21 @@ IMPORT_NAME, 'nicknames', partial(self.getValue, field="nicknames"), - partial(self.setValue, field="nicknames"), + partial(self.set_value, field="nicknames"), priority=1000 ) self._i.register( IMPORT_NAME, 'description', partial(self.getValue, field="description"), - partial(self.setValue, field="description"), + partial(self.set_value, field="description"), priority=1000 ) - def getHandler(self, client): + def get_handler(self, client): return XEP_0292_Handler() - def vcard2Dict(self, vcard_elt: domish.Element) -> Dict[str, Any]: + def vcard_2_dict(self, vcard_elt: domish.Element) -> Dict[str, Any]: """Convert vcard element to equivalent identity metadata""" vcard: Dict[str, Any] = {} @@ -102,7 +102,7 @@ for source_field, dest_field in text_fields.items(): if metadata_elt.name == source_field: if metadata_elt.text is not None: - dest_type = self._i.getFieldType(dest_field) + dest_type = self._i.get_field_type(dest_field) value = str(metadata_elt.text) if dest_type is str: if dest_field in vcard: @@ -122,7 +122,7 @@ ) return vcard - def dict2VCard(self, vcard: dict[str, Any]) -> domish.Element: + def dict_2_v_card(self, vcard: dict[str, Any]) -> domish.Element: """Convert vcard metadata to vCard4 element""" vcard_elt = domish.Element((NS_VCARD4, "vcard")) for field, elt_name in text_fields_inv.items(): @@ -142,9 +142,9 @@ return vcard_elt @async_lru(5) - async def getCard(self, client: SatXMPPEntity, entity: jid.JID) -> dict: + async def get_card(self, client: SatXMPPEntity, entity: jid.JID) -> dict: try: - items, metadata = await self._p.getItems( + items, metadata = await self._p.get_items( client, entity, VCARD4_NODE, item_ids=["current"] ) except exceptions.NotFound: @@ -157,9 +157,9 @@ log.info(f"vCard element is not present for {entity}") return {} - return self.vcard2Dict(vcard_elt) + return self.vcard_2_dict(vcard_elt) - async def updateVCardElt( + async def update_vcard_elt( self, client: SatXMPPEntity, vcard_elt: domish.Element, @@ -176,12 +176,12 @@ self._p.OPT_ACCESS_MODEL: self._p.ACCESS_OPEN, self._p.OPT_PUBLISH_MODEL: self._p.PUBLISH_MODEL_PUBLISHERS } - await self._p.createIfNewNode(client, service, VCARD4_NODE, node_options) - await self._p.sendItem( + await self._p.create_if_new_node(client, service, VCARD4_NODE, node_options) + await self._p.send_item( client, service, VCARD4_NODE, vcard_elt, item_id=self._p.ID_SINGLETON ) - async def updateVCard( + async def update_v_card( self, client: SatXMPPEntity, vcard: Dict[str, Any], @@ -198,11 +198,11 @@ """ service = entity or client.jid.userhostJID() if update: - current_vcard = await self.getCard(client, service) + current_vcard = await self.get_card(client, service) current_vcard.update(vcard) vcard = current_vcard - vcard_elt = self.dict2VCard(vcard) - await self.updateVCardElt(client, vcard_elt, service) + vcard_elt = self.dict_2_v_card(vcard) + await self.update_vcard_elt(client, vcard_elt, service) async def getValue( self, @@ -217,10 +217,10 @@ This has to be a string field @return request value """ - 
vcard_data = await self.getCard(client, entity) + vcard_data = await self.get_card(client, entity) return vcard_data.get(field) - async def setValue( + async def set_value( self, client: SatXMPPEntity, value: Union[str, List[str]], @@ -233,7 +233,7 @@ @param field: name of the field to get This has to be a string field """ - await self.updateVCard(client, {field: value}, entity) + await self.update_v_card(client, {field: value}, entity) @implementer(iwokkel.IDisco)
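Callers go through get_card and update_v_card (formerly getCard and updateVCard), with the field names being those registered with the identity plugin (nicknames, description). A hedged sketch; the helper coroutine and the nickname value are illustrative::

    async def set_nickname(host, client, nick):
        vcard4 = host.plugins["XEP-0292"]
        # updateVCard -> update_v_card; entity=None targets our own profile
        await vcard4.update_v_card(client, {"nicknames": [nick]}, None)
        # getCard -> get_card; the card is returned as a plain dict (and LRU-cached)
        return await vcard4.get_card(client, client.jid.userhostJID())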
--- a/sat/plugins/plugin_xep_0297.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0297.py Sat Apr 08 13:54:42 2023 +0200 @@ -54,11 +54,11 @@ log.info(_("Stanza Forwarding plugin initialization")) self.host = host - def getHandler(self, client): + def get_handler(self, client): return XEP_0297_handler(self, client.profile) @classmethod - def updateUri(cls, element, uri): + def update_uri(cls, element, uri): """Update recursively the element URI. @param element (domish.Element): element to update @@ -70,7 +70,7 @@ element.defaultUri = uri for child in element.children: if isinstance(child, domish.Element) and not child.uri: - XEP_0297.updateUri(child, uri) + XEP_0297.update_uri(child, uri) def forward(self, stanza, to_jid, stamp, body="", profile_key=C.PROF_KEY_NONE): """Forward a message to the given JID. @@ -82,7 +82,7 @@ @param profile_key (unicode): %(doc_profile_key)s @return: a Deferred when the message has been sent """ - # FIXME: this method is not used and doesn't use mess_data which should be used for client.sendMessageData + # FIXME: this method is not used and doesn't use mess_data which should be used for client.send_message_data # should it be deprecated? A method constructing the element without sending it seems more natural log.warning( "THIS METHOD IS DEPRECATED" @@ -99,14 +99,14 @@ delay_elt = self.host.plugins["XEP-0203"].delay(stamp) forwarded_elt.addChild(delay_elt) if not stanza.uri: # None or '' - XEP_0297.updateUri(stanza, "jabber:client") + XEP_0297.update_uri(stanza, "jabber:client") forwarded_elt.addChild(stanza) msg.addChild(body_elt) msg.addChild(forwarded_elt) - client = self.host.getClient(profile_key) - return defer.ensureDeferred(client.sendMessageData({"xml": msg})) + client = self.host.get_client(profile_key) + return defer.ensureDeferred(client.send_message_data({"xml": msg})) @implementer(iwokkel.IDisco)
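update_uri (ex updateUri) stays a classmethod, so it can be called on any domish element that lacks a default URI before embedding it in a <forwarded/> element, exactly as the plugin does above. A small sketch; the import path follows this repository's layout and the helper function is illustrative::

    from sat.plugins.plugin_xep_0297 import XEP_0297


    def prepare_for_forwarding(stanza):
        """Give a stanza the jabber:client default URI if it has none."""
        if not stanza.uri:  # None or ''
            # updateUri -> update_uri: recursively sets the default URI where missing
            XEP_0297.update_uri(stanza, "jabber:client")
        return stanza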
--- a/sat/plugins/plugin_xep_0300.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0300.py Sat Apr 08 13:54:42 2023 +0200 @@ -68,12 +68,12 @@ def __init__(self, host): log.info(_("plugin Hashes initialization")) - host.registerNamespace("hashes", NS_HASHES) + host.register_namespace("hashes", NS_HASHES) - def getHandler(self, client): + def get_handler(self, client): return XEP_0300_handler() - def getHasher(self, algo=ALGO_DEFAULT): + def get_hasher(self, algo=ALGO_DEFAULT): """Return hasher instance @param algo(unicode): one of the XEP_300.ALGOS keys @@ -83,11 +83,11 @@ """ return self.ALGOS[algo]() - def getDefaultAlgo(self): + def get_default_algo(self): return ALGO_DEFAULT @defer.inlineCallbacks - def getBestPeerAlgo(self, to_jid, profile): + def get_best_peer_algo(self, to_jid, profile): """Return the best available hashing algorith of other peer @param to_jid(jid.JID): peer jid @@ -95,7 +95,7 @@ @return (D(unicode, None)): best available algorithm, or None if hashing is not possible """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) for algo in reversed(XEP_0300.ALGOS): has_feature = yield self.host.hasFeature( client, NS_HASHES_FUNCTIONS.format(algo), to_jid @@ -108,10 +108,10 @@ ) defer.returnValue(algo) - def _calculateHashBlocking(self, file_obj, hasher): + def _calculate_hash_blocking(self, file_obj, hasher): """Calculate hash in a blocking way - /!\\ blocking method, please use calculateHash instead + /!\\ blocking method, please use calculate_hash instead @param file_obj(file): a file-like object @param hasher(hash object): the method to call to initialise hash object @return (str): the hex digest of the hash @@ -123,10 +123,10 @@ hasher.update(buf) return hasher.hexdigest() - def calculateHash(self, file_obj, hasher): - return threads.deferToThread(self._calculateHashBlocking, file_obj, hasher) + def calculate_hash(self, file_obj, hasher): + return threads.deferToThread(self._calculate_hash_blocking, file_obj, hasher) - def calculateHashElt(self, file_obj=None, algo=ALGO_DEFAULT): + def calculate_hash_elt(self, file_obj=None, algo=ALGO_DEFAULT): """Compute hash and build hash element @param file_obj(file, None): file-like object to use to calculate the hash @@ -134,20 +134,20 @@ @return (D(domish.Element)): hash element """ - def hashCalculated(hash_): - return self.buildHashElt(hash_, algo) + def hash_calculated(hash_): + return self.build_hash_elt(hash_, algo) - hasher = self.getHasher(algo) - hash_d = self.calculateHash(file_obj, hasher) - hash_d.addCallback(hashCalculated) + hasher = self.get_hasher(algo) + hash_d = self.calculate_hash(file_obj, hasher) + hash_d.addCallback(hash_calculated) return hash_d - def buildHashUsedElt(self, algo=ALGO_DEFAULT): + def build_hash_used_elt(self, algo=ALGO_DEFAULT): hash_used_elt = domish.Element((NS_HASHES, "hash-used")) hash_used_elt["algo"] = algo return hash_used_elt - def parseHashUsedElt(self, parent): + def parse_hash_used_elt(self, parent): """Find and parse a hash-used element @param (domish.Element): parent of <hash/> element @@ -164,7 +164,7 @@ raise exceptions.DataError return algo - def buildHashElt(self, hash_, algo=ALGO_DEFAULT): + def build_hash_elt(self, hash_, algo=ALGO_DEFAULT): """Compute hash and build hash element @param hash_(str): hash to use @@ -180,7 +180,7 @@ hash_elt["algo"] = algo return hash_elt - def parseHashElt(self, parent: domish.Element) -> Tuple[str, bytes]: + def parse_hash_elt(self, parent: domish.Element) -> Tuple[str, bytes]: """Find and parse a 
hash element if multiple elements are found, the strongest managed one is returned
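The hashing helpers keep their behaviour: calculate_hash_elt (ex calculateHashElt) reads a file-like object in a thread and resolves to a ready-to-attach <hash/> element, using ALGO_DEFAULT unless told otherwise. A hedged sketch; the wrapping coroutine and the path argument are assumptions::

    async def hash_local_file(host, path):
        hashes = host.plugins["XEP-0300"]
        with open(path, "rb") as f:
            # calculateHashElt -> calculate_hash_elt; hashing runs in a thread
            hash_elt = await hashes.calculate_hash_elt(f)
        return hash_elt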
--- a/sat/plugins/plugin_xep_0313.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0313.py Sat Apr 08 13:54:42 2023 +0200 @@ -63,27 +63,27 @@ def __init__(self, host): log.info(_("Message Archive Management plugin initialization")) self.host = host - self.host.registerNamespace("mam", mam.NS_MAM) - host.registerNamespace("fulltextmam", NS_FTS) + self.host.register_namespace("mam", mam.NS_MAM) + host.register_namespace("fulltextmam", NS_FTS) self._rsm = host.plugins["XEP-0059"] self._sid = host.plugins["XEP-0359"] # Deferred used to store last stanza id in order of reception self._last_stanza_id_d = defer.Deferred() self._last_stanza_id_d.callback(None) - host.bridge.addMethod( - "MAMGet", ".plugin", in_sign='sss', - out_sign='(a(sdssa{ss}a{ss}ss)ss)', method=self._getArchives, + host.bridge.add_method( + "mam_get", ".plugin", in_sign='sss', + out_sign='(a(sdssa{ss}a{ss}ss)ss)', method=self._get_archives, async_=True) async def resume(self, client): """Retrieve one2one messages received since the last we have in local storage""" - stanza_id_data = await self.host.memory.storage.getPrivates( + stanza_id_data = await self.host.memory.storage.get_privates( mam.NS_MAM, [KEY_LAST_STANZA_ID], profile=client.profile) stanza_id = stanza_id_data.get(KEY_LAST_STANZA_ID) rsm_req = None if stanza_id is None: log.info("can't retrieve last stanza ID, checking history") - last_mess = await self.host.memory.historyGet( + last_mess = await self.host.memory.history_get( None, None, limit=1, filters={'not_types': C.MESS_TYPE_GROUPCHAT, 'last_stanza_id': True}, profile=client.profile) @@ -99,7 +99,7 @@ complete = False count = 0 while not complete: - mam_data = await self.getArchives(client, mam_req, + mam_data = await self.get_archives(client, mam_req, service=client.jid.userhostJID()) elt_list, rsm_response, mam_response = mam_data complete = mam_response["complete"] @@ -114,7 +114,7 @@ for mess_elt in elt_list: try: - fwd_message_elt = self.getMessageFromResult( + fwd_message_elt = self.get_message_from_result( client, mess_elt, mam_req) except exceptions.DataError: continue @@ -142,9 +142,9 @@ from_jid=from_jid.full(), xml=mess_elt.toXml())) continue # adding message to history - mess_data = client.messageProt.parseMessage(fwd_message_elt) + mess_data = client.messageProt.parse_message(fwd_message_elt) try: - await client.messageProt.addToHistory(mess_data) + await client.messageProt.add_to_history(mess_data) except exceptions.CancelError as e: log.warning( "message has not been added to history: {e}".format(e=e)) @@ -159,14 +159,14 @@ log.info(_("We have received {num_mess} message(s) while offline.") .format(num_mess=count)) - def profileConnected(self, client): + def profile_connected(self, client): defer.ensureDeferred(self.resume(client)) - def getHandler(self, client): + def get_handler(self, client): mam_client = client._mam = SatMAMClient(self) return mam_client - def parseExtra(self, extra, with_rsm=True): + def parse_extra(self, extra, with_rsm=True): """Parse extra dictionnary to retrieve MAM arguments @param extra(dict): data for parse @@ -208,7 +208,7 @@ continue if with_rsm: - rsm_request = self._rsm.parseExtra(extra) + rsm_request = self._rsm.parse_extra(extra) if rsm_request is not None: mam_args["rsm_"] = rsm_request @@ -224,7 +224,7 @@ return mam.MAMRequest(**mam_args) if mam_args else None - def getMessageFromResult(self, client, mess_elt, mam_req, service=None): + def get_message_from_result(self, client, mess_elt, mam_req, service=None): """Extract usable <message/> from 
MAM query result The message will be validated, and stanza-id/delay will be added if necessary. @@ -267,7 +267,7 @@ log.error("Unexpected query id (was expecting {query_id}): {xml}" .format(query_id=mam.query_id, xml=mess_elt.toXml())) raise exceptions.DataError("Invalid element") - stanza_id = self._sid.getStanzaId(fwd_message_elt, + stanza_id = self._sid.get_stanza_id(fwd_message_elt, service_jid) if stanza_id is None: # not stanza-id element is present, we add one so message @@ -279,7 +279,7 @@ log.warning('Invalid MAM result: missing "id" attribute: {xml}' .format(xml=result_elt.toXml())) raise exceptions.DataError("Invalid element") - self._sid.addStanzaId(client, fwd_message_elt, stanza_id, by=service_jid) + self._sid.add_stanza_id(client, fwd_message_elt, stanza_id, by=service_jid) if delay_elt is not None: fwd_message_elt.addChild(delay_elt) @@ -304,14 +304,14 @@ """ return client._mam.queryArchive(mam_req, service) - def _appendMessage(self, elt_list, message_cb, message_elt): + def _append_message(self, elt_list, message_cb, message_elt): if message_cb is not None: elt_list.append(message_cb(message_elt)) else: elt_list.append(message_elt) - def _queryFinished(self, iq_result, client, elt_list, event): - client.xmlstream.removeObserver(event, self._appendMessage) + def _query_finished(self, iq_result, client, elt_list, event): + client.xmlstream.removeObserver(event, self._append_message) try: fin_elt = next(iq_result.elements(mam.NS_MAM, "fin")) except StopIteration: @@ -327,39 +327,39 @@ return (elt_list, rsm_response, mam_response) - def serializeArchiveResult(self, data, client, mam_req, service): + def serialize_archive_result(self, data, client, mam_req, service): elt_list, rsm_response, mam_response = data mess_list = [] for elt in elt_list: - fwd_message_elt = self.getMessageFromResult(client, elt, mam_req, + fwd_message_elt = self.get_message_from_result(client, elt, mam_req, service=service) - mess_data = client.messageProt.parseMessage(fwd_message_elt) - mess_list.append(client.messageGetBridgeArgs(mess_data)) + mess_data = client.messageProt.parse_message(fwd_message_elt) + mess_list.append(client.message_get_bridge_args(mess_data)) metadata = { 'rsm': self._rsm.response2dict(rsm_response), 'mam': mam_response } return mess_list, data_format.serialise(metadata), client.profile - def _getArchives(self, service, extra_ser, profile_key): + def _get_archives(self, service, extra_ser, profile_key): """ @return: tuple with: - - list of message with same data as in bridge.messageNew + - list of message with same data as in bridge.message_new - response metadata with: - rsm data (first, last, count, index) - mam data (complete, stable) - profile """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else None extra = data_format.deserialise(extra_ser, {}) - mam_req = self.parseExtra(extra) + mam_req = self.parse_extra(extra) - d = self.getArchives(client, mam_req, service=service) - d.addCallback(self.serializeArchiveResult, client, mam_req, service) + d = self.get_archives(client, mam_req, service=service) + d.addCallback(self.serialize_archive_result, client, mam_req, service) return d - def getArchives(self, client, query, service=None, message_cb=None): + def get_archives(self, client, query, service=None, message_cb=None): """Query archive and gather page result @param query(mam.MAMRequest): MAM request @@ -378,12 +378,12 @@ query.query_id = str(uuid.uuid4()) elt_list = [] event = 
MESSAGE_RESULT.format(mam_ns=mam.NS_MAM, query_id=query.query_id) - client.xmlstream.addObserver(event, self._appendMessage, 0, elt_list, message_cb) + client.xmlstream.addObserver(event, self._append_message, 0, elt_list, message_cb) d = self.queryArchive(client, query, service) - d.addCallback(self._queryFinished, client, elt_list, event) + d.addCallback(self._query_finished, client, elt_list, event) return d - def getPrefs(self, client, service=None): + def get_prefs(self, client, service=None): """Retrieve the current user preferences. @param service: entity offering the MAM service (None for user archives) @@ -392,7 +392,7 @@ # http://xmpp.org/extensions/xep-0313.html#prefs return client._mam.queryPrefs(service) - def _setPrefs(self, service_s=None, default="roster", always=None, never=None, + def _set_prefs(self, service_s=None, default="roster", always=None, never=None, profile_key=C.PROF_KEY_NONE): service = jid.JID(service_s) if service_s else None always_jid = [jid.JID(entity) for entity in always] @@ -413,7 +413,7 @@ # http://xmpp.org/extensions/xep-0313.html#prefs return client._mam.setPrefs(service, default, always, never) - def onMessageStanzaId(self, message_elt, client): + def on_message_stanza_id(self, message_elt, client): """Called when a message with a stanza-id is received the messages' stanza ids are stored when received, so the last one can be used @@ -421,14 +421,14 @@ @param message_elt(domish.Element): <message> with a stanza-id """ service_jid = client.jid.userhostJID() - stanza_id = self._sid.getStanzaId(message_elt, service_jid) + stanza_id = self._sid.get_stanza_id(message_elt, service_jid) if stanza_id is None: log.debug("Ignoring <message>, stanza id is not from our server") else: # we use self._last_stanza_id_d do be sure that last_stanza_id is stored in # the order of reception self._last_stanza_id_d.addCallback( - lambda __: self.host.memory.storage.setPrivateValue( + lambda __: self.host.memory.storage.set_private_value( namespace=mam.NS_MAM, key=KEY_LAST_STANZA_ID, value=stanza_id, @@ -449,7 +449,7 @@ observer_xpath = MESSAGE_STANZA_ID.format( ns_stanza_id=self.host.ns_map['stanza_id']) self.xmlstream.addObserver( - observer_xpath, self.plugin_parent.onMessageStanzaId, client=self.parent + observer_xpath, self.plugin_parent.on_message_stanza_id, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""):
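On the caller side the MAM helpers also only change case, e.g. getPrefs becomes get_prefs and still returns the Deferred from wokkel's queryPrefs. A minimal sketch; the wrapping coroutine is illustrative::

    async def show_archiving_prefs(host, client):
        xep_0313 = host.plugins["XEP-0313"]
        # getPrefs -> get_prefs; service=None queries our own user archive
        return await xep_0313.get_prefs(client, service=None)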
--- a/sat/plugins/plugin_xep_0329.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0329.py Sat Apr 08 13:54:42 2023 +0200 @@ -92,7 +92,7 @@ if name == "..": name = "--" else: - name = regex.pathEscape(name) + name = regex.path_escape(name) self.name = name self.children = {} self.type = type_ @@ -128,7 +128,7 @@ def values(self): return self.children.values() - def getOrCreate(self, name, type_=TYPE_VIRTUAL, access=None): + def get_or_create(self, name, type_=TYPE_VIRTUAL, access=None): """Get a node or create a virtual node and return it""" if access is None: access = {C.ACCESS_PERM_READ: {KEY_TYPE: C.ACCESS_TYPE_PUBLIC}} @@ -144,7 +144,7 @@ node.parent = self self.children[node.name] = node - def removeFromParent(self): + def remove_from_parent(self): try: del self.parent.children[self.name] except TypeError: @@ -155,7 +155,7 @@ raise exceptions.InternalError("node not found in parent's children") self.parent = None - def _checkNodePermission(self, client, node, perms, peer_jid): + def _check_node_permission(self, client, node, perms, peer_jid): """Check access to this node for peer_jid @param node(SharedNode): node to check access @@ -165,13 +165,13 @@ """ file_data = {"access": self.access, "owner": client.jid.userhostJID()} try: - self.host.memory.checkFilePermission(file_data, peer_jid, perms) + self.host.memory.check_file_permission(file_data, peer_jid, perms) except exceptions.PermissionError: return False else: return True - def checkPermissions( + def check_permissions( self, client, peer_jid, perms=(C.ACCESS_PERM_READ,), check_parents=True ): """Check that peer_jid can access this node and all its parents @@ -188,7 +188,7 @@ parent = self while parent != None: - if not self._checkNodePermission(client, parent, perms, peer_jid): + if not self._check_node_permission(client, parent, perms, peer_jid): return False parent = parent.parent @@ -225,27 +225,27 @@ elif node.type == TYPE_PATH: break - if not node.checkPermissions(client, peer_jid, perms=perms): + if not node.check_permissions(client, peer_jid, perms=perms): raise exceptions.PermissionError("permission denied") return node, "/".join(path_elts) - def findByLocalPath(self, path): + def find_by_local_path(self, path): """retrieve nodes linking to local path @return (list[ShareNode]): found nodes associated to path @raise exceptions.NotFound: no node has been found with this path """ - shared_paths = self.getSharedPaths() + shared_paths = self.get_shared_paths() try: return shared_paths[path] except KeyError: raise exceptions.NotFound - def _getSharedPaths(self, node, paths): + def _get_shared_paths(self, node, paths): if node.type == TYPE_VIRTUAL: for node in node.values(): - self._getSharedPaths(node, paths) + self._get_shared_paths(node, paths) elif node.type == TYPE_PATH: paths.setdefault(node.path, []).append(node) else: @@ -253,7 +253,7 @@ "unknown node type: {type}".format(type=node.type) ) - def getSharedPaths(self): + def get_shared_paths(self): """retrieve nodes by shared path this method will retrieve recursively shared path in children of this node @@ -261,10 +261,10 @@ """ if self.type == TYPE_PATH: raise exceptions.InternalError( - "getSharedPaths must be used on a virtual node" + "get_shared_paths must be used on a virtual node" ) paths = {} - self._getSharedPaths(self, paths) + self._get_shared_paths(self, paths) return paths @@ -276,86 +276,86 @@ self._b = host.plugins["XEP-0231"] self._h = host.plugins["XEP-0300"] self._jf = host.plugins["XEP-0234"] - host.bridge.addMethod( - "FISList", + 
host.bridge.add_method( + "fis_list", ".plugin", in_sign="ssa{ss}s", out_sign="aa{ss}", - method=self._listFiles, + method=self._list_files, async_=True, ) - host.bridge.addMethod( - "FISLocalSharesGet", + host.bridge.add_method( + "fis_local_shares_get", ".plugin", in_sign="s", out_sign="as", - method=self._localSharesGet, + method=self._local_shares_get, ) - host.bridge.addMethod( - "FISSharePath", + host.bridge.add_method( + "fis_share_path", ".plugin", in_sign="ssss", out_sign="s", - method=self._sharePath, + method=self._share_path, ) - host.bridge.addMethod( - "FISUnsharePath", + host.bridge.add_method( + "fis_unshare_path", ".plugin", in_sign="ss", out_sign="", - method=self._unsharePath, + method=self._unshare_path, ) - host.bridge.addMethod( - "FISAffiliationsGet", + host.bridge.add_method( + "fis_affiliations_get", ".plugin", in_sign="ssss", out_sign="a{ss}", - method=self._affiliationsGet, + method=self._affiliations_get, async_=True, ) - host.bridge.addMethod( - "FISAffiliationsSet", + host.bridge.add_method( + "fis_affiliations_set", ".plugin", in_sign="sssa{ss}s", out_sign="", - method=self._affiliationsSet, + method=self._affiliations_set, async_=True, ) - host.bridge.addMethod( - "FISConfigurationGet", + host.bridge.add_method( + "fis_configuration_get", ".plugin", in_sign="ssss", out_sign="a{ss}", - method=self._configurationGet, + method=self._configuration_get, async_=True, ) - host.bridge.addMethod( - "FISConfigurationSet", + host.bridge.add_method( + "fis_configuration_set", ".plugin", in_sign="sssa{ss}s", out_sign="", - method=self._configurationSet, + method=self._configuration_set, async_=True, ) - host.bridge.addMethod( - "FISCreateDir", + host.bridge.add_method( + "fis_create_dir", ".plugin", in_sign="sssa{ss}s", out_sign="", - method=self._createDir, + method=self._create_dir, async_=True, ) - host.bridge.addSignal("FISSharedPathNew", ".plugin", signature="sss") - host.bridge.addSignal("FISSharedPathRemoved", ".plugin", signature="ss") - host.trigger.add("XEP-0234_fileSendingRequest", self._fileSendingRequestTrigger) - host.registerNamespace("fis", NS_FIS) + host.bridge.add_signal("fis_shared_path_new", ".plugin", signature="sss") + host.bridge.add_signal("fis_shared_path_removed", ".plugin", signature="ss") + host.trigger.add("XEP-0234_fileSendingRequest", self._file_sending_request_trigger) + host.register_namespace("fis", NS_FIS) - def getHandler(self, client): + def get_handler(self, client): return XEP_0329_handler(self) - def profileConnected(self, client): + def profile_connected(self, client): if client.is_component: - client._file_sharing_allowed_hosts = self.host.memory.getConfig( + client._file_sharing_allowed_hosts = self.host.memory.config_get( 'component file_sharing', 'http_upload_allowed_hosts_list') or [client.host] else: @@ -367,7 +367,7 @@ ) client._XEP_0329_names_data = {} # name to share map - def _fileSendingRequestTrigger( + def _file_sending_request_trigger( self, client, session, content_data, content_name, file_data, file_elt ): """This trigger check that a requested file is available, and fill suitable data @@ -425,7 +425,7 @@ else: return True, None parent_node = shared_file["parent"] - if not parent_node.checkPermissions(client, session["peer_jid"]): + if not parent_node.check_permissions(client, session["peer_jid"]): log.warning( _( "{peer_jid} requested a file (s)he can't access [{profile}]" @@ -436,14 +436,14 @@ file_data["size"] = size file_elt.addElement("size", content=str(size)) - hash_algo = file_data["hash_algo"] = 
self._h.getDefaultAlgo() - hasher = file_data["hash_hasher"] = self._h.getHasher(hash_algo) - file_elt.addChild(self._h.buildHashUsedElt(hash_algo)) + hash_algo = file_data["hash_algo"] = self._h.get_default_algo() + hasher = file_data["hash_hasher"] = self._h.get_hasher(hash_algo) + file_elt.addChild(self._h.build_hash_used_elt(hash_algo)) content_data["stream_object"] = stream.FileStreamObject( self.host, client, path, - uid=self._jf.getProgressId(session, content_name), + uid=self._jf.get_progress_id(session, content_name), size=size, data_cb=lambda data: hasher.update(data), ) @@ -451,32 +451,32 @@ # common methods - def _requestHandler(self, client, iq_elt, root_nodes_cb, files_from_node_cb): + def _request_handler(self, client, iq_elt, root_nodes_cb, files_from_node_cb): iq_elt.handled = True node = iq_elt.query.getAttribute("node") if not node: - d = utils.asDeferred(root_nodes_cb, client, iq_elt) + d = utils.as_deferred(root_nodes_cb, client, iq_elt) else: - d = utils.asDeferred(files_from_node_cb, client, iq_elt, node) + d = utils.as_deferred(files_from_node_cb, client, iq_elt, node) d.addErrback( lambda failure_: log.error( _("error while retrieving files: {msg}").format(msg=failure_) ) ) - def _iqError(self, client, iq_elt, condition="item-not-found"): + def _iq_error(self, client, iq_elt, condition="item-not-found"): error_elt = jabber_error.StanzaError(condition).toResponse(iq_elt) client.send(error_elt) # client - def _addPathData(self, client, query_elt, path, parent_node): + def _add_path_data(self, client, query_elt, path, parent_node): """Fill query_elt with files/directories found in path""" name = os.path.basename(path) if os.path.isfile(path): size = os.path.getsize(path) mime_type = mimetypes.guess_type(path, strict=False)[0] - file_elt = self._jf.buildFileElement( + file_elt = self._jf.build_file_element( client=client, name=name, size=size, mime_type=mime_type, modified=os.path.getmtime(path) ) @@ -497,15 +497,15 @@ directory_elt = query_elt.addElement("directory") directory_elt["name"] = name - def _pathNodeHandler(self, client, iq_elt, query_elt, node, path): + def _path_node_handler(self, client, iq_elt, query_elt, node, path): """Fill query_elt for path nodes, i.e. 
physical directories""" path = os.path.join(node.path, path) if not os.path.exists(path): # path may have been moved since it has been shared - return self._iqError(client, iq_elt) + return self._iq_error(client, iq_elt) elif os.path.isfile(path): - self._addPathData(client, query_elt, path, node) + self._add_path_data(client, query_elt, path, node) else: for name in sorted(os.listdir(path.encode("utf-8")), key=lambda n: n.lower()): try: @@ -518,44 +518,44 @@ ) continue full_path = os.path.join(path, name) - self._addPathData(client, query_elt, full_path, node) + self._add_path_data(client, query_elt, full_path, node) - def _virtualNodeHandler(self, client, peer_jid, iq_elt, query_elt, node): + def _virtual_node_handler(self, client, peer_jid, iq_elt, query_elt, node): """Fill query_elt for virtual nodes""" for name, child_node in node.items(): - if not child_node.checkPermissions(client, peer_jid, check_parents=False): + if not child_node.check_permissions(client, peer_jid, check_parents=False): continue node_type = child_node.type if node_type == TYPE_VIRTUAL: directory_elt = query_elt.addElement("directory") directory_elt["name"] = name elif node_type == TYPE_PATH: - self._addPathData(client, query_elt, child_node.path, child_node) + self._add_path_data(client, query_elt, child_node.path, child_node) else: raise exceptions.InternalError( _("unexpected type: {type}").format(type=node_type) ) - def _getRootNodesCb(self, client, iq_elt): + def _get_root_nodes_cb(self, client, iq_elt): peer_jid = jid.JID(iq_elt["from"]) iq_result_elt = xmlstream.toResponse(iq_elt, "result") query_elt = iq_result_elt.addElement((NS_FIS, "query")) for name, node in client._XEP_0329_root_node.items(): - if not node.checkPermissions(client, peer_jid, check_parents=False): + if not node.check_permissions(client, peer_jid, check_parents=False): continue directory_elt = query_elt.addElement("directory") directory_elt["name"] = name client.send(iq_result_elt) - def _getFilesFromNodeCb(self, client, iq_elt, node_path): + def _get_files_from_node_cb(self, client, iq_elt, node_path): """Main method to retrieve files/directories from a node_path""" peer_jid = jid.JID(iq_elt["from"]) try: node, path = ShareNode.find(client, node_path, peer_jid) except (exceptions.PermissionError, exceptions.NotFound): - return self._iqError(client, iq_elt) + return self._iq_error(client, iq_elt) except exceptions.DataError: - return self._iqError(client, iq_elt, condition="not-acceptable") + return self._iq_error(client, iq_elt, condition="not-acceptable") node_type = node.type peer_jid = jid.JID(iq_elt["from"]) @@ -566,10 +566,10 @@ # we now fill query_elt according to node_type if node_type == TYPE_PATH: # it's a physical path - self._pathNodeHandler(client, iq_elt, query_elt, node, path) + self._path_node_handler(client, iq_elt, query_elt, node, path) elif node_type == TYPE_VIRTUAL: assert not path - self._virtualNodeHandler(client, peer_jid, iq_elt, query_elt, node) + self._virtual_node_handler(client, peer_jid, iq_elt, query_elt, node) else: raise exceptions.InternalError( _("unknown node type: {type}").format(type=node_type) @@ -577,23 +577,23 @@ client.send(iq_result_elt) - def onRequest(self, iq_elt, client): - return self._requestHandler( - client, iq_elt, self._getRootNodesCb, self._getFilesFromNodeCb + def on_request(self, iq_elt, client): + return self._request_handler( + client, iq_elt, self._get_root_nodes_cb, self._get_files_from_node_cb ) # Component - def _compParseJids(self, client, iq_elt): + def 
_comp_parse_jids(self, client, iq_elt): """Retrieve peer_jid and owner to use from IQ stanza @param iq_elt(domish.Element): IQ stanza of the FIS request @return (tuple[jid.JID, jid.JID]): peer_jid and owner """ - async def _compGetRootNodesCb(self, client, iq_elt): - peer_jid, owner = client.getOwnerAndPeer(iq_elt) - files_data = await self.host.memory.getFiles( + async def _comp_get_root_nodes_cb(self, client, iq_elt): + peer_jid, owner = client.get_owner_and_peer(iq_elt) + files_data = await self.host.memory.get_files( client, peer_jid=peer_jid, parent="", @@ -608,7 +608,7 @@ directory_elt["name"] = name client.send(iq_result_elt) - async def _compGetFilesFromNodeCb(self, client, iq_elt, node_path): + async def _comp_get_files_from_node_cb(self, client, iq_elt, node_path): """Retrieve files from local files repository according to permissions result stanza is then built and sent to requestor @@ -616,21 +616,21 @@ files_data): can be used to add data/elements """ - peer_jid, owner = client.getOwnerAndPeer(iq_elt) + peer_jid, owner = client.get_owner_and_peer(iq_elt) try: - files_data = await self.host.memory.getFiles( + files_data = await self.host.memory.get_files( client, peer_jid=peer_jid, path=node_path, owner=owner ) except exceptions.NotFound: - self._iqError(client, iq_elt) + self._iq_error(client, iq_elt) return except exceptions.PermissionError: - self._iqError(client, iq_elt, condition='not-allowed') + self._iq_error(client, iq_elt, condition='not-allowed') return except Exception as e: tb = traceback.format_tb(e.__traceback__) log.error(f"internal server error: {e}\n{''.join(tb)}") - self._iqError(client, iq_elt, condition='internal-server-error') + self._iq_error(client, iq_elt, condition='internal-server-error') return iq_result_elt = xmlstream.toResponse(iq_elt, "result") query_elt = iq_result_elt.addElement((NS_FIS, "query")) @@ -658,7 +658,7 @@ node_path, ) else: - file_elt = self._jf.buildFileElementFromDict( + file_elt = self._jf.build_file_element_from_dict( client, file_data, modified=file_data.get("modified", file_data["created"]) @@ -666,12 +666,12 @@ query_elt.addChild(file_elt) client.send(iq_result_elt) - def onComponentRequest(self, iq_elt, client): - return self._requestHandler( - client, iq_elt, self._compGetRootNodesCb, self._compGetFilesFromNodeCb + def on_component_request(self, iq_elt, client): + return self._request_handler( + client, iq_elt, self._comp_get_root_nodes_cb, self._comp_get_files_from_node_cb ) - async def _parseResult(self, client, peer_jid, iq_elt): + async def _parse_result(self, client, peer_jid, iq_elt): query_elt = next(iq_elt.elements(NS_FIS, "query")) files = [] @@ -679,7 +679,7 @@ if elt.name == "file": # we have a file try: - file_data = await self._jf.parseFileElement(client, elt) + file_data = await self._jf.parse_file_element(client, elt) except exceptions.DataError: continue file_data["type"] = C.FILE_TYPE_FILE @@ -691,7 +691,7 @@ for thumb in thumbs: if 'url' not in thumb and "id" in thumb: try: - file_path = await self._b.getFile(client, peer_jid, thumb['id']) + file_path = await self._b.get_file(client, peer_jid, thumb['id']) except Exception as e: log.warning(f"Can't get thumbnail {thumb['id']!r} for {file_data}: {e}") else: @@ -718,14 +718,14 @@ # affiliations # - async def _parseElement(self, client, iq_elt, element, namespace): - peer_jid, owner = client.getOwnerAndPeer(iq_elt) + async def _parse_element(self, client, iq_elt, element, namespace): + peer_jid, owner = client.get_owner_and_peer(iq_elt) elt = 
next(iq_elt.elements(namespace, element)) path = Path("/", elt['path']) if len(path.parts) < 2: raise RootPathException namespace = elt.getAttribute('namespace') - files_data = await self.host.memory.getFiles( + files_data = await self.host.memory.get_files( client, peer_jid=peer_jid, path=str(path.parent), @@ -739,8 +739,8 @@ file_data = files_data[0] return peer_jid, elt, path, namespace, file_data - def _affiliationsGet(self, service_jid_s, namespace, path, profile): - client = self.host.getClient(profile) + def _affiliations_get(self, service_jid_s, namespace, path, profile): + client = self.host.get_client(profile) service = jid.JID(service_jid_s) d = defer.ensureDeferred(self.affiliationsGet( client, service, namespace or None, path)) @@ -782,8 +782,8 @@ return affiliations - def _affiliationsSet(self, service_jid_s, namespace, path, affiliations, profile): - client = self.host.getClient(profile) + def _affiliations_set(self, service_jid_s, namespace, path, affiliations, profile): + client = self.host.get_client(profile) service = jid.JID(service_jid_s) affiliations = {jid.JID(e): a for e, a in affiliations.items()} return defer.ensureDeferred(self.affiliationsSet( @@ -811,20 +811,20 @@ affiliation_elt['affiliation'] = affiliation await iq_elt.send() - def _onComponentAffiliationsGet(self, iq_elt, client): + def _on_component_affiliations_get(self, iq_elt, client): iq_elt.handled = True - defer.ensureDeferred(self.onComponentAffiliationsGet(client, iq_elt)) + defer.ensureDeferred(self.on_component_affiliations_get(client, iq_elt)) - async def onComponentAffiliationsGet(self, client, iq_elt): + async def on_component_affiliations_get(self, client, iq_elt): try: ( from_jid, affiliations_elt, path, namespace, file_data - ) = await self._parseElement(client, iq_elt, "affiliations", NS_FIS_AFFILIATION) + ) = await self._parse_element(client, iq_elt, "affiliations", NS_FIS_AFFILIATION) except exceptions.CancelError: return except RootPathException: # if root path is requested, we only get owner affiliation - peer_jid, owner = client.getOwnerAndPeer(iq_elt) + peer_jid, owner = client.get_owner_and_peer(iq_elt) is_owner = peer_jid.userhostJID() == owner affiliations = {owner: 'owner'} except exceptions.NotFound: @@ -836,7 +836,7 @@ else: from_jid_bare = from_jid.userhostJID() is_owner = from_jid_bare == file_data.get('owner') - affiliations = self.host.memory.getFileAffiliations(file_data) + affiliations = self.host.memory.get_file_affiliations(file_data) iq_result_elt = xmlstream.toResponse(iq_elt, "result") affiliations_elt = iq_result_elt.addElement((NS_FIS_AFFILIATION, 'affiliations')) for entity_jid, affiliation in affiliations.items(): @@ -848,15 +848,15 @@ affiliation_elt['affiliation'] = affiliation client.send(iq_result_elt) - def _onComponentAffiliationsSet(self, iq_elt, client): + def _on_component_affiliations_set(self, iq_elt, client): iq_elt.handled = True - defer.ensureDeferred(self.onComponentAffiliationsSet(client, iq_elt)) + defer.ensureDeferred(self.on_component_affiliations_set(client, iq_elt)) - async def onComponentAffiliationsSet(self, client, iq_elt): + async def on_component_affiliations_set(self, client, iq_elt): try: ( from_jid, affiliations_elt, path, namespace, file_data - ) = await self._parseElement(client, iq_elt, "affiliations", NS_FIS_AFFILIATION) + ) = await self._parse_element(client, iq_elt, "affiliations", NS_FIS_AFFILIATION) except exceptions.CancelError: return except RootPathException: @@ -890,17 +890,17 @@ client.sendError(iq_elt, 
'internal-server-error', f"{e}") return - await self.host.memory.setFileAffiliations(client, file_data, affiliations) + await self.host.memory.set_file_affiliations(client, file_data, affiliations) iq_result_elt = xmlstream.toResponse(iq_elt, "result") client.send(iq_result_elt) # configuration - def _configurationGet(self, service_jid_s, namespace, path, profile): - client = self.host.getClient(profile) + def _configuration_get(self, service_jid_s, namespace, path, profile): + client = self.host.get_client(profile) service = jid.JID(service_jid_s) - d = defer.ensureDeferred(self.configurationGet( + d = defer.ensureDeferred(self.configuration_get( client, service, namespace or None, path)) d.addCallback( lambda configuration: { @@ -909,7 +909,7 @@ ) return d - async def configurationGet( + async def configuration_get( self, client: SatXMPPEntity, service: jid.JID, @@ -935,13 +935,13 @@ return configuration - def _configurationSet(self, service_jid_s, namespace, path, configuration, profile): - client = self.host.getClient(profile) + def _configuration_set(self, service_jid_s, namespace, path, configuration, profile): + client = self.host.get_client(profile) service = jid.JID(service_jid_s) - return defer.ensureDeferred(self.configurationSet( + return defer.ensureDeferred(self.configuration_set( client, service, namespace or None, path, configuration)) - async def configurationSet( + async def configuration_set( self, client: SatXMPPEntity, service: jid.JID, @@ -962,15 +962,15 @@ configuration_elt.addChild(form.toElement()) await iq_elt.send() - def _onComponentConfigurationGet(self, iq_elt, client): + def _on_component_configuration_get(self, iq_elt, client): iq_elt.handled = True - defer.ensureDeferred(self.onComponentConfigurationGet(client, iq_elt)) + defer.ensureDeferred(self.on_component_configuration_get(client, iq_elt)) - async def onComponentConfigurationGet(self, client, iq_elt): + async def on_component_configuration_get(self, client, iq_elt): try: ( from_jid, configuration_elt, path, namespace, file_data - ) = await self._parseElement(client, iq_elt, "configuration", NS_FIS_CONFIGURATION) + ) = await self._parse_element(client, iq_elt, "configuration", NS_FIS_CONFIGURATION) except exceptions.CancelError: return except RootPathException: @@ -990,25 +990,25 @@ configuration_elt.addChild(form.toElement()) client.send(iq_result_elt) - async def _setConfiguration(self, client, configuration_elt, file_data): + async def _set_configuration(self, client, configuration_elt, file_data): form = data_form.findForm(configuration_elt, NS_FIS_CONFIGURATION) for name, value in form.items(): if name == 'access_model': - await self.host.memory.setFileAccessModel(client, file_data, value) + await self.host.memory.set_file_access_model(client, file_data, value) else: # TODO: send a IQ error? 
log.warning( f"Trying to set a not implemented configuration option: {name}") - def _onComponentConfigurationSet(self, iq_elt, client): + def _on_component_configuration_set(self, iq_elt, client): iq_elt.handled = True - defer.ensureDeferred(self.onComponentConfigurationSet(client, iq_elt)) + defer.ensureDeferred(self.on_component_configuration_set(client, iq_elt)) - async def onComponentConfigurationSet(self, client, iq_elt): + async def on_component_configuration_set(self, client, iq_elt): try: ( from_jid, configuration_elt, path, namespace, file_data - ) = await self._parseElement(client, iq_elt, "configuration", NS_FIS_CONFIGURATION) + ) = await self._parse_element(client, iq_elt, "configuration", NS_FIS_CONFIGURATION) except exceptions.CancelError: return except RootPathException: @@ -1025,20 +1025,20 @@ client.sendError(iq_elt, 'forbidden') return - await self._setConfiguration(client, configuration_elt, file_data) + await self._set_configuration(client, configuration_elt, file_data) iq_result_elt = xmlstream.toResponse(iq_elt, "result") client.send(iq_result_elt) # directory creation - def _createDir(self, service_jid_s, namespace, path, configuration, profile): - client = self.host.getClient(profile) + def _create_dir(self, service_jid_s, namespace, path, configuration, profile): + client = self.host.get_client(profile) service = jid.JID(service_jid_s) - return defer.ensureDeferred(self.createDir( + return defer.ensureDeferred(self.create_dir( client, service, namespace or None, path, configuration or None)) - async def createDir( + async def create_dir( self, client: SatXMPPEntity, service: jid.JID, @@ -1061,12 +1061,12 @@ configuration_elt.addChild(form.toElement()) await iq_elt.send() - def _onComponentCreateDir(self, iq_elt, client): + def _on_component_create_dir(self, iq_elt, client): iq_elt.handled = True - defer.ensureDeferred(self.onComponentCreateDir(client, iq_elt)) + defer.ensureDeferred(self.on_component_create_dir(client, iq_elt)) - async def onComponentCreateDir(self, client, iq_elt): - peer_jid, owner = client.getOwnerAndPeer(iq_elt) + async def on_component_create_dir(self, client, iq_elt): + peer_jid, owner = client.get_owner_and_peer(iq_elt) if peer_jid.host not in client._file_sharing_allowed_hosts: client.sendError(iq_elt, 'forbidden') return @@ -1084,8 +1084,8 @@ ) client.sendError(iq_elt, 'forbidden', "You can't create a directory there") return - # when going further into the path, the permissions will be checked by getFiles - files_data = await self.host.memory.getFiles( + # when going further into the path, the permissions will be checked by get_files + files_data = await self.host.memory.get_files( client, peer_jid=peer_jid, path=path.parent, @@ -1107,7 +1107,7 @@ except StopIteration: configuration_elt = None - await self.host.memory.setFile( + await self.host.memory.set_file( client, path.name, path=path.parent, @@ -1118,7 +1118,7 @@ ) if configuration_elt is not None: - file_data = (await self.host.memory.getFiles( + file_data = (await self.host.memory.get_files( client, peer_jid=peer_jid, path=path.parent, @@ -1127,14 +1127,14 @@ owner=owner, ))[0] - await self._setConfiguration(client, configuration_elt, file_data) + await self._set_configuration(client, configuration_elt, file_data) iq_result_elt = xmlstream.toResponse(iq_elt, "result") client.send(iq_result_elt) # file methods # - def _serializeData(self, files_data): + def _serialize_data(self, files_data): for file_data in files_data: for key, value in file_data.items(): file_data[key] = ( @@ 
-1142,14 +1142,14 @@ ) return files_data - def _listFiles(self, target_jid, path, extra, profile): - client = self.host.getClient(profile) + def _list_files(self, target_jid, path, extra, profile): + client = self.host.get_client(profile) target_jid = client.jid if not target_jid else jid.JID(target_jid) - d = defer.ensureDeferred(self.listFiles(client, target_jid, path or None)) - d.addCallback(self._serializeData) + d = defer.ensureDeferred(self.list_files(client, target_jid, path or None)) + d.addCallback(self._serialize_data) return d - async def listFiles(self, client, peer_jid, path=None, extra=None): + async def list_files(self, client, peer_jid, path=None, extra=None): """List file shared by an entity @param peer_jid(jid.JID): jid of the sharing entity @@ -1164,21 +1164,21 @@ if path: query_elt["node"] = path iq_result_elt = await iq_elt.send() - return await self._parseResult(client, peer_jid, iq_result_elt) + return await self._parse_result(client, peer_jid, iq_result_elt) - def _localSharesGet(self, profile): - client = self.host.getClient(profile) - return self.localSharesGet(client) + def _local_shares_get(self, profile): + client = self.host.get_client(profile) + return self.local_shares_get(client) - def localSharesGet(self, client): - return list(client._XEP_0329_root_node.getSharedPaths().keys()) + def local_shares_get(self, client): + return list(client._XEP_0329_root_node.get_shared_paths().keys()) - def _sharePath(self, name, path, access, profile): - client = self.host.getClient(profile) + def _share_path(self, name, path, access, profile): + client = self.host.get_client(profile) access = json.loads(access) - return self.sharePath(client, name or None, path, access) + return self.share_path(client, name or None, path, access) - def sharePath(self, client, name, path, access): + def share_path(self, client, name, path, access): if client.is_component: raise exceptions.ClientTypeError if not os.path.exists(path): @@ -1193,7 +1193,7 @@ if os.path.isfile(path): # we have a single file, the workflow is diferrent as we store all single # files in the same dir - node = node.getOrCreate(SINGLE_FILES_DIR) + node = node.get_or_create(SINGLE_FILES_DIR) if not name: name = os.path.basename(path.rstrip(" /")) @@ -1212,18 +1212,18 @@ "[{profile}]".format( new_name=new_name, profile=client.profile))) ShareNode(name=name, parent=node, type_=node_type, access=access, path=path) - self.host.bridge.FISSharedPathNew(path, name, client.profile) + self.host.bridge.fis_shared_path_new(path, name, client.profile) return name - def _unsharePath(self, path, profile): - client = self.host.getClient(profile) - return self.unsharePath(client, path) + def _unshare_path(self, path, profile): + client = self.host.get_client(profile) + return self.unshare_path(client, path) - def unsharePath(self, client, path): - nodes = client._XEP_0329_root_node.findByLocalPath(path) + def unshare_path(self, client, path): + nodes = client._XEP_0329_root_node.find_by_local_path(path) for node in nodes: - node.removeFromParent() - self.host.bridge.FISSharedPathRemoved(path, client.profile) + node.remove_from_parent() + self.host.bridge.fis_shared_path_removed(path, client.profile) @implementer(iwokkel.IDisco) @@ -1236,36 +1236,36 @@ def connectionInitialized(self): if self.parent.is_component: self.xmlstream.addObserver( - IQ_FIS_REQUEST, self.plugin_parent.onComponentRequest, client=self.parent + IQ_FIS_REQUEST, self.plugin_parent.on_component_request, client=self.parent ) self.xmlstream.addObserver( 
IQ_FIS_AFFILIATION_GET, - self.plugin_parent._onComponentAffiliationsGet, + self.plugin_parent._on_component_affiliations_get, client=self.parent ) self.xmlstream.addObserver( IQ_FIS_AFFILIATION_SET, - self.plugin_parent._onComponentAffiliationsSet, + self.plugin_parent._on_component_affiliations_set, client=self.parent ) self.xmlstream.addObserver( IQ_FIS_CONFIGURATION_GET, - self.plugin_parent._onComponentConfigurationGet, + self.plugin_parent._on_component_configuration_get, client=self.parent ) self.xmlstream.addObserver( IQ_FIS_CONFIGURATION_SET, - self.plugin_parent._onComponentConfigurationSet, + self.plugin_parent._on_component_configuration_set, client=self.parent ) self.xmlstream.addObserver( IQ_FIS_CREATE_DIR, - self.plugin_parent._onComponentCreateDir, + self.plugin_parent._on_component_create_dir, client=self.parent ) else: self.xmlstream.addObserver( - IQ_FIS_REQUEST, self.plugin_parent.onRequest, client=self.parent + IQ_FIS_REQUEST, self.plugin_parent.on_request, client=self.parent ) def getDiscoInfo(self, requestor, target, nodeIdentifier=""):
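With the file sharing plugin renamed to snake_case, dependent code calls it through the new names. A minimal sketch, assuming the plugin is registered under ``host.plugins["XEP-0329"]`` and that ``client`` and ``peer_jid`` are already in scope inside a coroutine (path and share name are placeholders)::

    # hypothetical caller in another plugin
    fis = self.host.plugins["XEP-0329"]
    # share a local directory under the name "documents" (empty access dict)
    fis.share_path(client, "documents", "/home/louise/documents", access={})
    # list what a peer shares with us; list_files is a coroutine
    files = await fis.list_files(client, peer_jid, path=None)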
--- a/sat/plugins/plugin_xep_0334.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0334.py Sat Apr 08 13:54:42 2023 +0200 @@ -48,7 +48,7 @@ D_( """\ Frontends can use HINT_* constants in mess_data['extra'] in a serialized 'hints' dict. - Internal plugins can use directly addHint([HINT_* constant]). + Internal plugins can use directly add_hint([HINT_* constant]). Will set mess_data['extra']['history'] to 'skipped' when no store is requested and message is not saved in history.""" ) ), @@ -67,13 +67,13 @@ def __init__(self, host): log.info(_("Message Processing Hints plugin initialization")) self.host = host - host.trigger.add("sendMessage", self.sendMessageTrigger) - host.trigger.add("messageReceived", self.messageReceivedTrigger, priority=-1000) + host.trigger.add("sendMessage", self.send_message_trigger) + host.trigger.add("messageReceived", self.message_received_trigger, priority=-1000) - def getHandler(self, client): + def get_handler(self, client): return XEP_0334_handler() - def addHint(self, mess_data, hint): + def add_hint(self, mess_data, hint): if hint == self.HINT_NO_COPY and not mess_data["to"].resource: log.error( "{hint} can only be used with full jids! Ignoring it.".format(hint=hint) @@ -85,7 +85,7 @@ else: log.error("Unknown hint: {}".format(hint)) - def addHintElements(self, message_elt: domish.Element, hints: Iterable[str]) -> None: + def add_hint_elements(self, message_elt: domish.Element, hints: Iterable[str]) -> None: """Add hints elements to message stanza @param message_elt: stanza where hints must be added @@ -97,27 +97,27 @@ else: log.debug('Not adding {hint!r} hint: it is already present in <message>') - def _sendPostXmlTreatment(self, mess_data): + def _send_post_xml_treatment(self, mess_data): if "hints" in mess_data: - self.addHintElements(mess_data["xml"], mess_data["hints"]) + self.add_hint_elements(mess_data["xml"], mess_data["hints"]) return mess_data - def sendMessageTrigger( + def send_message_trigger( self, client, mess_data, pre_xml_treatments, post_xml_treatments ): """Add the hints element to the message to be sent""" if "hints" in mess_data["extra"]: for hint in data_format.dict2iter("hints", mess_data["extra"], pop=True): - self.addHint(hint) + self.add_hint(hint) - post_xml_treatments.addCallback(self._sendPostXmlTreatment) + post_xml_treatments.addCallback(self._send_post_xml_treatment) return True - def _receivedSkipHistory(self, mess_data): + def _received_skip_history(self, mess_data): mess_data["history"] = C.HISTORY_SKIP return mess_data - def messageReceivedTrigger(self, client, message_elt, post_treat): + def message_received_trigger(self, client, message_elt, post_treat): """Check for hints in the received message""" for elt in message_elt.elements(): if elt.uri == NS_HINTS and elt.name in ( @@ -125,7 +125,7 @@ self.HINT_NO_STORE, ): log.debug("history will be skipped for this message, as requested") - post_treat.addCallback(self._receivedSkipHistory) + post_treat.addCallback(self._received_skip_history) break return True
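After this rename, internal plugins attach hints through ``add_hint_elements``. A minimal sketch, assuming the plugin is available as ``host.plugins["XEP-0334"]`` and ``message_elt`` is an outgoing ``<message>`` stanza::

    # ask servers not to store this stanza
    hints = self.host.plugins["XEP-0334"]
    hints.add_hint_elements(message_elt, [hints.HINT_NO_STORE])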
--- a/sat/plugins/plugin_xep_0346.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0346.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,81 +61,81 @@ self.host = host self._p = self.host.plugins["XEP-0060"] self._i = self.host.plugins["IDENTITY"] - host.bridge.addMethod( - "psSchemaGet", + host.bridge.add_method( + "ps_schema_get", ".plugin", in_sign="sss", out_sign="s", - method=self._getSchema, + method=self._get_schema, async_=True, ) - host.bridge.addMethod( - "psSchemaSet", + host.bridge.add_method( + "ps_schema_set", ".plugin", in_sign="ssss", out_sign="", - method=self._setSchema, + method=self._set_schema, async_=True, ) - host.bridge.addMethod( - "psSchemaUIGet", + host.bridge.add_method( + "ps_schema_ui_get", ".plugin", in_sign="sss", out_sign="s", - method=lambda service, nodeIdentifier, profile_key: self._getUISchema( + method=lambda service, nodeIdentifier, profile_key: self._get_ui_schema( service, nodeIdentifier, default_node=None, profile_key=profile_key), async_=True, ) - host.bridge.addMethod( - "psSchemaDictGet", + host.bridge.add_method( + "ps_schema_dict_get", ".plugin", in_sign="sss", out_sign="s", - method=self._getSchemaDict, + method=self._get_schema_dict, async_=True, ) - host.bridge.addMethod( - "psSchemaApplicationNSGet", + host.bridge.add_method( + "ps_schema_application_ns_get", ".plugin", in_sign="s", out_sign="s", - method=self.getApplicationNS, + method=self.get_application_ns, ) - host.bridge.addMethod( - "psSchemaTemplateNodeGet", + host.bridge.add_method( + "ps_schema_template_node_get", ".plugin", in_sign="s", out_sign="s", - method=self.getTemplateNS, + method=self.get_template_ns, ) - host.bridge.addMethod( - "psSchemaSubmittedNodeGet", + host.bridge.add_method( + "ps_schema_submitted_node_get", ".plugin", in_sign="s", out_sign="s", - method=self.getSubmittedNS, + method=self.get_submitted_ns, ) - host.bridge.addMethod( - "psItemsFormGet", + host.bridge.add_method( + "ps_items_form_get", ".plugin", in_sign="ssssiassss", out_sign="(asa{ss})", - method=self._getDataFormItems, + method=self._get_data_form_items, async_=True, ) - host.bridge.addMethod( - "psItemFormSend", + host.bridge.add_method( + "ps_item_form_send", ".plugin", in_sign="ssa{sas}ssa{ss}s", out_sign="s", - method=self._sendDataFormItem, + method=self._send_data_form_item, async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return SchemaHandler() - def getApplicationNS(self, namespace): + def get_application_ns(self, namespace): """Retrieve application namespace, i.e. namespace without FDP prefix""" if namespace.startswith(SUBMITTED_PREFIX): namespace = namespace[len(SUBMITTED_PREFIX):] @@ -143,28 +143,28 @@ namespace = namespace[len(TEMPLATE_PREFIX):] return namespace - def getTemplateNS(self, namespace: str) -> str: + def get_template_ns(self, namespace: str) -> str: """Returns node used for data template (i.e. 
schema)""" - app_ns = self.getApplicationNS(namespace) + app_ns = self.get_application_ns(namespace) return f"{TEMPLATE_PREFIX}{app_ns}" - def getSubmittedNS(self, namespace: str) -> str: + def get_submitted_ns(self, namespace: str) -> str: """Returns node to use to submit forms""" - return f"{SUBMITTED_PREFIX}{self.getApplicationNS(namespace)}" + return f"{SUBMITTED_PREFIX}{self.get_application_ns(namespace)}" - def _getSchemaBridgeCb(self, schema_elt): + def _get_schema_bridge_cb(self, schema_elt): if schema_elt is None: return "" return schema_elt.toXml() - def _getSchema(self, service, nodeIdentifier, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + def _get_schema(self, service, nodeIdentifier, profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) - d = defer.ensureDeferred(self.getSchema(client, service, nodeIdentifier)) - d.addCallback(self._getSchemaBridgeCb) + d = defer.ensureDeferred(self.get_schema(client, service, nodeIdentifier)) + d.addCallback(self._get_schema_bridge_cb) return d - async def getSchema(self, client, service, nodeIdentifier): + async def get_schema(self, client, service, nodeIdentifier): """retrieve PubSub node schema @param service(jid.JID, None): jid of PubSub service @@ -173,9 +173,9 @@ @return (domish.Element, None): schema (<x> element) None if no schema has been set on this node """ - app_ns = self.getApplicationNS(nodeIdentifier) + app_ns = self.get_application_ns(nodeIdentifier) node_id = f"{TEMPLATE_PREFIX}{app_ns}" - items_data = await self._p.getItems(client, service, node_id, max_items=1) + items_data = await self._p.get_items(client, service, node_id, max_items=1) try: schema = next(items_data[0][0].elements(data_form.NS_X_DATA, 'x')) except IndexError: @@ -188,7 +188,7 @@ schema = None return schema - async def getSchemaForm(self, client, service, nodeIdentifier, schema=None, + async def get_schema_form(self, client, service, nodeIdentifier, schema=None, form_type="form", copy_form=True): """Get data form from node's schema @@ -200,13 +200,13 @@ if None, it will be retrieved from node (imply one additional XMPP request) @param form_type(unicode): type of the form @param copy_form(bool): if True and if schema is already a data_form.Form, will deep copy it before returning - needed when the form is reused and it will be modified (e.g. in sendDataFormItem) + needed when the form is reused and it will be modified (e.g. 
in send_data_form_item) @return(data_form.Form): data form the form should not be modified if copy_form is not set """ if schema is None: log.debug(_("unspecified schema, we need to request it")) - schema = await self.getSchema(client, service, nodeIdentifier) + schema = await self.get_schema(client, service, nodeIdentifier) if schema is None: raise exceptions.DataError( _( @@ -234,43 +234,43 @@ form.formType = form_type return form - def schema2XMLUI(self, schema_elt): + def schema_2_xmlui(self, schema_elt): form = data_form.Form.fromElement(schema_elt) - xmlui = xml_tools.dataForm2XMLUI(form, "") + xmlui = xml_tools.data_form_2_xmlui(form, "") return xmlui - def _getUISchema(self, service, nodeIdentifier, default_node=None, + def _get_ui_schema(self, service, nodeIdentifier, default_node=None, profile_key=C.PROF_KEY_NONE): if not nodeIdentifier: if not default_node: raise ValueError(_("nodeIndentifier needs to be set")) nodeIdentifier = default_node - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) - d = self.getUISchema(client, service, nodeIdentifier) + d = self.get_ui_schema(client, service, nodeIdentifier) d.addCallback(lambda xmlui: xmlui.toXml()) return d - def getUISchema(self, client, service, nodeIdentifier): - d = defer.ensureDeferred(self.getSchema(client, service, nodeIdentifier)) - d.addCallback(self.schema2XMLUI) + def get_ui_schema(self, client, service, nodeIdentifier): + d = defer.ensureDeferred(self.get_schema(client, service, nodeIdentifier)) + d.addCallback(self.schema_2_xmlui) return d - def _setSchema(self, service, nodeIdentifier, schema, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + def _set_schema(self, service, nodeIdentifier, schema, profile_key=C.PROF_KEY_NONE): + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) schema = generic.parseXml(schema.encode()) return defer.ensureDeferred( - self.setSchema(client, service, nodeIdentifier, schema) + self.set_schema(client, service, nodeIdentifier, schema) ) - async def setSchema(self, client, service, nodeIdentifier, schema): + async def set_schema(self, client, service, nodeIdentifier, schema): """Set or replace PubSub node schema @param schema(domish.Element, None): schema to set None if schema need to be removed """ - node_id = self.getTemplateNS(nodeIdentifier) + node_id = self.get_template_ns(nodeIdentifier) node_options = { self._p.OPT_ACCESS_MODEL: self._p.ACCESS_OPEN, self._p.OPT_PERSIST_ITEMS: 1, @@ -279,17 +279,17 @@ self._p.OPT_SEND_ITEM_SUBSCRIBE: 1, self._p.OPT_PUBLISH_MODEL: self._p.PUBLISH_MODEL_PUBLISHERS, } - await self._p.createIfNewNode(client, service, node_id, node_options) - await self._p.sendItem(client, service, node_id, schema, self._p.ID_SINGLETON) + await self._p.create_if_new_node(client, service, node_id, node_options) + await self._p.send_item(client, service, node_id, schema, self._p.ID_SINGLETON) - def _getSchemaDict(self, service, nodeIdentifier, profile): + def _get_schema_dict(self, service, nodeIdentifier, profile): service = None if not service else jid.JID(service) - client = self.host.getClient(profile) - d = defer.ensureDeferred(self.getSchemaDict(client, service, nodeIdentifier)) + client = self.host.get_client(profile) + d = defer.ensureDeferred(self.get_schema_dict(client, service, nodeIdentifier)) d.addCallback(data_format.serialise) return d - async def getSchemaDict( + async def get_schema_dict( self, client: 
SatXMPPEntity, service: Optional[jid.JID], @@ -298,13 +298,13 @@ The dictionary is made so it can be easily serialisable """ - schema_form = await self.getSchemaForm(client, service, nodeIdentifier) - return xml_tools.dataForm2dataDict(schema_form) + schema_form = await self.get_schema_form(client, service, nodeIdentifier) + return xml_tools.data_form_2_data_dict(schema_form) - def _getDataFormItems(self, form_ns="", service="", node="", schema="", max_items=10, + def _get_data_form_items(self, form_ns="", service="", node="", schema="", max_items=10, item_ids=None, sub_id=None, extra="", profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else None if not node: raise exceptions.DataError(_("empty node is not allowed")) @@ -313,9 +313,9 @@ else: schema = None max_items = None if max_items == C.NO_LIMIT else max_items - extra = self._p.parseExtra(data_format.deserialise(extra)) + extra = self._p.parse_extra(data_format.deserialise(extra)) d = defer.ensureDeferred( - self.getDataFormItems( + self.get_data_form_items( client, service, node, @@ -328,10 +328,10 @@ form_ns=form_ns or None, ) ) - d.addCallback(self._p.transItemsData) + d.addCallback(self._p.trans_items_data) return d - async def getDataFormItems(self, client, service, nodeIdentifier, schema=None, + async def get_data_form_items(self, client, service, nodeIdentifier, schema=None, max_items=None, item_ids=None, sub_id=None, rsm_request=None, extra=None, default_node=None, form_ns=None, filters=None): """Get items known as being data forms, and convert them to XMLUI @@ -341,8 +341,8 @@ @param default_node(unicode): node to use if nodeIdentifier is None or empty @param form_ns (unicode, None): namespace of the form None to accept everything, even if form has no namespace - @param filters(dict, None): same as for xml_tools.dataFormResult2XMLUI - other parameters as the same as for [getItems] + @param filters(dict, None): same as for xml_tools.data_form_result_2_xmlui + other parameters as the same as for [get_items] @return (list[unicode]): XMLUI of the forms if an item is invalid (not corresponding to form_ns or not a data_form) it will be skipped @@ -354,12 +354,12 @@ _("default_node must be set if nodeIdentifier is not set") ) nodeIdentifier = default_node - submitted_ns = self.getSubmittedNS(nodeIdentifier) + submitted_ns = self.get_submitted_ns(nodeIdentifier) # we need the initial form to get options of fields when suitable - schema_form = await self.getSchemaForm( + schema_form = await self.get_schema_form( client, service, nodeIdentifier, schema, form_type="result", copy_form=False ) - items_data = await self._p.getItems( + items_data = await self._p.get_items( client, service, submitted_ns, @@ -391,7 +391,7 @@ pass else: prepend.append(("jid", publisher, "publisher")) - xmlui = xml_tools.dataFormResult2XMLUI( + xmlui = xml_tools.data_form_result_2_xmlui( form, schema_form, # FIXME: conflicts with schema (i.e. 
if "id" or "publisher" already exists) @@ -404,16 +404,16 @@ break return (items_xmlui, metadata) - def _sendDataFormItem(self, service, nodeIdentifier, values, schema=None, + def _send_data_form_item(self, service, nodeIdentifier, values, schema=None, item_id=None, extra=None, profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) if schema: schema = generic.parseXml(schema.encode("utf-8")) else: schema = None d = defer.ensureDeferred( - self.sendDataFormItem( + self.send_data_form_item( client, service, nodeIdentifier, @@ -427,7 +427,7 @@ d.addCallback(lambda ret: ret or "") return d - async def sendDataFormItem( + async def send_data_form_item( self, client, service, nodeIdentifier, values, schema=None, item_id=None, extra=None, deserialise=False): """Publish an item as a dataform when we know that there is a schema @@ -439,13 +439,13 @@ Schema is needed to construct data form to publish @param deserialise(bool): if True, data are list of unicode and must be deserialized according to expected type. - This is done in this method and not directly in _sendDataFormItem because we + This is done in this method and not directly in _send_data_form_item because we need to know the data type which is in the form, not availablable in - _sendDataFormItem - other parameters as the same as for [self._p.sendItem] + _send_data_form_item + other parameters as the same as for [self._p.send_item] @return (unicode): id of the created item """ - form = await self.getSchemaForm( + form = await self.get_schema_form( client, service, nodeIdentifier, schema, form_type="submit" ) @@ -469,7 +469,7 @@ values_list = list( itertools.chain(*[v.splitlines() for v in values_list]) ) - elif xml_tools.isXHTMLField(field): + elif xml_tools.is_xhtml_field(field): values_list = [generic.parseXml(v.encode("utf-8")) for v in values_list] elif "jid" in (field.fieldType or ""): @@ -505,14 +505,14 @@ field.values = values_list - return await self._p.sendItem( + return await self._p.send_item( client, service, nodeIdentifier, form.toElement(), item_id, extra ) ## filters ## # filters useful for data form to XMLUI conversion # - def valueOrPublisherFilter(self, form_xmlui, widget_type, args, kwargs): + def value_or_publisher_filter(self, form_xmlui, widget_type, args, kwargs): """Replace missing value by publisher's user part""" if not args[0]: # value is not filled: we use user part of publisher (if we have it) @@ -524,7 +524,7 @@ args[0] = publisher.user.capitalize() return widget_type, args, kwargs - def textbox2ListFilter(self, form_xmlui, widget_type, args, kwargs): + def textbox_2_list_filter(self, form_xmlui, widget_type, args, kwargs): """Split lines of a textbox in a list main use case is using a textbox for labels @@ -540,7 +540,7 @@ } return widget_type, args, kwargs - def dateFilter(self, form_xmlui, widget_type, args, kwargs): + def date_filter(self, form_xmlui, widget_type, args, kwargs): """Convert a string with a date to a unix timestamp""" if widget_type != "string" or not args[0]: return widget_type, args, kwargs @@ -553,27 +553,27 @@ ## Helper methods ## - def prepareBridgeGet(self, service, node, max_items, sub_id, extra, profile_key): + def prepare_bridge_get(self, service, node, max_items, sub_id, extra, profile_key): """Parse arguments received from bridge *Get methods and return higher level data @return (tuple): (client, service, node, max_items, extra, sub_id) usable for internal methods 
""" - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = jid.JID(service) if service else None if not node: node = None max_items = None if max_items == C.NO_LIMIT else max_items if not sub_id: sub_id = None - extra = self._p.parseExtra(extra) + extra = self._p.parse_extra(extra) return client, service, node, max_items, extra, sub_id def _get(self, service="", node="", max_items=10, item_ids=None, sub_id=None, extra="", default_node=None, form_ns=None, filters=None, profile_key=C.PROF_KEY_NONE): - """Bridge method to retrieve data from node with schema + """bridge method to retrieve data from node with schema this method is a helper so dependant plugins can use it directly when adding *Get methods @@ -588,12 +588,12 @@ # have to modify them if C.bool(extra.get("labels_as_list", C.BOOL_FALSE)): filters = filters.copy() - filters["labels"] = self.textbox2ListFilter - client, service, node, max_items, extra, sub_id = self.prepareBridgeGet( + filters["labels"] = self.textbox_2_list_filter + client, service, node, max_items, extra, sub_id = self.prepare_bridge_get( service, node, max_items, sub_id, extra, profile_key ) d = defer.ensureDeferred( - self.getDataFormItems( + self.get_data_form_items( client, service, node or None, @@ -607,17 +607,17 @@ filters=filters, ) ) - d.addCallback(self._p.transItemsData) + d.addCallback(self._p.trans_items_data) d.addCallback(lambda data: data_format.serialise(data)) return d - def prepareBridgeSet(self, service, node, schema, item_id, extra, profile_key): + def prepare_bridge_set(self, service, node, schema, item_id, extra, profile_key): """Parse arguments received from bridge *Set methods and return higher level data @return (tuple): (client, service, node, schema, item_id, extra) usable for internal methods """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) if schema: schema = generic.parseXml(schema.encode("utf-8")) @@ -626,13 +626,13 @@ extra = data_format.deserialise(extra) return client, service, node or None, schema, item_id or None, extra - async def copyMissingValues(self, client, service, node, item_id, form_ns, values): + async def copy_missing_values(self, client, service, node, item_id, form_ns, values): """Retrieve values existing in original item and missing in update Existing item will be retrieve, and values not already specified in values will be filled - @param service: same as for [XEP_0060.getItems] - @param node: same as for [XEP_0060.getItems] + @param service: same as for [XEP_0060.get_items] + @param node: same as for [XEP_0060.get_items] @param item_id(unicode): id of the item to retrieve @param form_ns (unicode, None): namespace of the form @param values(dict): values to fill @@ -641,7 +641,7 @@ """ try: # we get previous item - items_data = await self._p.getItems( + items_data = await self._p.get_items( client, service, node, item_ids=[item_id] ) item_elt = items_data[0][0] @@ -666,12 +666,12 @@ def _set(self, service, node, values, schema=None, item_id=None, extra=None, default_node=None, form_ns=None, fill_author=True, profile_key=C.PROF_KEY_NONE): - """Bridge method to set item in node with schema + """bridge method to set item in node with schema this method is a helper so dependant plugins can use it directly when adding *Set methods """ - client, service, node, schema, item_id, extra = self.prepareBridgeSet( + client, service, node, schema, item_id, extra = self.prepare_bridge_set( 
service, node, schema, item_id, extra ) d = defer.ensureDeferred(self.set( @@ -701,13 +701,13 @@ 'created' and 'updated' will be forced to current time: - 'created' is set if item_id is None, i.e. if it's a new ticket - 'updated' is set everytime - @param extra(dict, None): same as for [XEP-0060.sendItem] with additional keys: + @param extra(dict, None): same as for [XEP-0060.send_item] with additional keys: - update(bool): if True, get previous item data to merge with current one if True, item_id must be set @param form_ns (unicode, None): namespace of the form needed when an update is done @param default_node(unicode, None): value to use if node is not set - other arguments are same as for [self._s.sendDataFormItem] + other arguments are same as for [self._s.send_data_form_item] @return (unicode): id of the created item """ if extra is None: @@ -716,7 +716,7 @@ if default_node is None: raise ValueError(_("default_node must be set if node is not set")) node = default_node - node = self.getSubmittedNS(node) + node = self.get_submitted_ns(node) now = utils.xmpp_date() if not item_id: values["created"] = now @@ -725,16 +725,16 @@ raise exceptions.DataError( _('if extra["update"] is set, item_id must be set too') ) - await self.copyMissingValues(client, service, node, item_id, form_ns, values) + await self.copy_missing_values(client, service, node, item_id, form_ns, values) values["updated"] = now if fill_author: if not values.get("author"): - id_data = await self._i.getIdentity(client, None, ["nicknames"]) + id_data = await self._i.get_identity(client, None, ["nicknames"]) values["author"] = id_data['nicknames'][0] if not values.get("author_jid"): values["author_jid"] = client.jid.full() - item_id = await self.sendDataFormItem( + item_id = await self.send_data_form_item( client, service, node, values, schema, item_id, extra, deserialise ) return item_id
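The schema helpers keep the same behaviour under their snake_case names. A minimal sketch of reading a node schema and republishing it unchanged, assuming ``host.plugins["XEP-0346"]`` as registry key, ``service`` as a ``jid.JID`` of the pubsub service (or ``None``), and an illustrative node namespace::

    # hypothetical coroutine in a dependent plugin
    s = self.host.plugins["XEP-0346"]
    schema_elt = await s.get_schema(client, service, "org.example.tickets")
    if schema_elt is not None:
        await s.set_schema(client, service, "org.example.tickets", schema_elt)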
--- a/sat/plugins/plugin_xep_0352.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0352.py Sat Apr 08 13:54:42 2023 +0200 @@ -44,21 +44,21 @@ def __init__(self, host): log.info(_("Client State Indication plugin initialization")) self.host = host - host.registerNamespace("csi", NS_CSI) + host.register_namespace("csi", NS_CSI) - def isActive(self, client): + def is_active(self, client): try: if not client._xep_0352_enabled: return True return client._xep_0352_active except AttributeError: - # _xep_0352_active can not be set if isActive is called before - # profileConnected has been called - log.debug("isActive called when XEP-0352 plugin has not yet set the " + # _xep_0352_active can not be set if is_active is called before + # profile_connected has been called + log.debug("is_active called when XEP-0352 plugin has not yet set the " "attributes") return True - def profileConnected(self, client): + def profile_connected(self, client): if (NS_CSI, 'csi') in client.xmlstream.features: log.info(_("Client State Indication is available on this server")) client._xep_0352_enabled = True @@ -68,15 +68,15 @@ " bandwidth optimisations can't be used.")) client._xep_0352_enabled = False - def setInactive(self, client): - if self.isActive(client): + def set_inactive(self, client): + if self.is_active(client): inactive_elt = domish.Element((NS_CSI, 'inactive')) client.send(inactive_elt) client._xep_0352_active = False log.info("inactive state set") - def setActive(self, client): - if not self.isActive(client): + def set_active(self, client): + if not self.is_active(client): active_elt = domish.Element((NS_CSI, 'active')) client.send(active_elt) client._xep_0352_active = True
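Client State Indication is driven by the two renamed setters. A minimal sketch, assuming the plugin is reachable as ``host.plugins["XEP-0352"]``::

    csi = self.host.plugins["XEP-0352"]
    csi.set_inactive(client)  # let the server throttle non-urgent traffic
    # ... later, when the frontend regains focus
    csi.set_active(client)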
--- a/sat/plugins/plugin_xep_0353.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0353.py Sat Apr 08 13:54:42 2023 +0200 @@ -52,19 +52,19 @@ def __init__(self, host): log.info(_("plugin {name} initialization").format(name=PLUGIN_INFO[C.PI_NAME])) self.host = host - host.registerNamespace("jingle-message", NS_JINGLE_MESSAGE) + host.register_namespace("jingle-message", NS_JINGLE_MESSAGE) self._j = host.plugins["XEP-0166"] - host.trigger.add("XEP-0166_initiate", self._onInitiateTrigger) - host.trigger.add("messageReceived", self._onMessageReceived) + host.trigger.add("XEP-0166_initiate", self._on_initiate_trigger) + host.trigger.add("messageReceived", self._on_message_received) - def getHandler(self, client): + def get_handler(self, client): return Handler() - def profileConnecting(self, client): + def profile_connecting(self, client): # mapping from session id to deferred used to wait for destinee answer client._xep_0353_pending_sessions = {} - def buildMessageData(self, client, peer_jid, verb, session_id): + def build_message_data(self, client, peer_jid, verb, session_id): mess_data = { 'from': client.jid, 'to': peer_jid, @@ -74,19 +74,19 @@ 'subject': {}, 'extra': {} } - client.generateMessageXML(mess_data) + client.generate_message_xml(mess_data) verb_elt = mess_data["xml"].addElement((NS_JINGLE_MESSAGE, verb)) verb_elt["id"] = session_id return mess_data - async def _onInitiateTrigger(self, client, session, contents): + async def _on_initiate_trigger(self, client, session, contents): # FIXME: check that at least one resource of the peer_jid can handle the feature peer_jid = session['peer_jid'] if peer_jid.resource: return True try: - infos = await self.host.memory.disco.getInfos(client, peer_jid) + infos = await self.host.memory.disco.get_infos(client, peer_jid) except error.StanzaError as e: if e.condition == "service-unavailable": categories = {} @@ -103,18 +103,18 @@ # according to XEP-0353 §3.1 await client.presence.available(peer_jid) - mess_data = self.buildMessageData(client, peer_jid, "propose", session['id']) + mess_data = self.build_message_data(client, peer_jid, "propose", session['id']) for content in contents: - application, app_args, app_kwargs, content_name = self._j.getContentData( + application, app_args, app_kwargs, content_name = self._j.get_content_data( content) try: - jingleDescriptionElt = application.handler.jingleDescriptionElt + jingle_description_elt = application.handler.jingle_description_elt except AttributeError: - log.debug(f"no jingleDescriptionElt set for {application.handler}") + log.debug(f"no jingle_description_elt set for {application.handler}") description_elt = domish.Element((content["app_ns"], "description")) else: - description_elt = await utils.asDeferred( - jingleDescriptionElt, + description_elt = await utils.as_deferred( + jingle_description_elt, client, session, content_name, *app_args, **app_kwargs ) mess_data["xml"].propose.addChild(description_elt) @@ -122,7 +122,7 @@ # we wait for 2 min before cancelling the session init response_d.addTimeout(2*60, reactor) client._xep_0353_pending_sessions[session['id']] = response_d - await client.sendMessageData(mess_data) + await client.send_message_data(mess_data) try: accepting_jid = await response_d except defer.TimeoutError: @@ -134,25 +134,25 @@ del client._xep_0353_pending_sessions[session['id']] return True - async def _onMessageReceived(self, client, message_elt, post_treat): + async def _on_message_received(self, client, message_elt, post_treat): for elt in 
message_elt.elements(): if elt.uri == NS_JINGLE_MESSAGE: if elt.name == "propose": - return await self._handlePropose(client, message_elt, elt) + return await self._handle_propose(client, message_elt, elt) elif elt.name == "retract": - return self._handleRetract(client, message_elt, elt) + return self._handle_retract(client, message_elt, elt) elif elt.name == "proceed": - return self._handleProceed(client, message_elt, elt) + return self._handle_proceed(client, message_elt, elt) elif elt.name == "accept": - return self._handleAccept(client, message_elt, elt) + return self._handle_accept(client, message_elt, elt) elif elt.name == "reject": - return self._handleAccept(client, message_elt, elt) + return self._handle_accept(client, message_elt, elt) else: log.warning(f"invalid element: {elt.toXml}") return True return True - async def _handlePropose(self, client, message_elt, elt): + async def _handle_propose(self, client, message_elt, elt): peer_jid = jid.JID(message_elt["from"]) session_id = elt["id"] if peer_jid.userhostJID() not in client.roster: @@ -176,7 +176,7 @@ "possibly you IP (internet localisation), do you accept?" ).format(peer_jid=peer_jid, human_name=human_name) confirm_title = D_("Invitation from an unknown contact") - accept = await xml_tools.deferConfirm( + accept = await xml_tools.defer_confirm( self.host, confirm_msg, confirm_title, profile=client.profile, action_extra={ "meta_type": C.META_TYPE_NOT_IN_ROSTER_LEAK, @@ -185,27 +185,27 @@ } ) if not accept: - mess_data = self.buildMessageData( + mess_data = self.build_message_data( client, client.jid.userhostJID(), "reject", session_id) - await client.sendMessageData(mess_data) + await client.send_message_data(mess_data) # we don't sent anything to sender, to avoid leaking presence return False else: await client.presence.available(peer_jid) session_id = elt["id"] - mess_data = self.buildMessageData( + mess_data = self.build_message_data( client, client.jid.userhostJID(), "accept", session_id) - await client.sendMessageData(mess_data) - mess_data = self.buildMessageData( + await client.send_message_data(mess_data) + mess_data = self.build_message_data( client, peer_jid, "proceed", session_id) - await client.sendMessageData(mess_data) + await client.send_message_data(mess_data) return False - def _handleRetract(self, client, message_elt, proceed_elt): + def _handle_retract(self, client, message_elt, proceed_elt): log.warning("retract is not implemented yet") return False - def _handleProceed(self, client, message_elt, proceed_elt): + def _handle_proceed(self, client, message_elt, proceed_elt): try: session_id = proceed_elt["id"] except KeyError: @@ -223,10 +223,10 @@ response_d.callback(jid.JID(message_elt["from"])) return False - def _handleAccept(self, client, message_elt, accept_elt): + def _handle_accept(self, client, message_elt, accept_elt): pass - def _handleReject(self, client, message_elt, accept_elt): + def _handle_reject(self, client, message_elt, accept_elt): pass
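Building and sending a Jingle Message Initiation stanza follows the pattern used in the trigger above; this sketch omits the description payload added there, and ``peer_jid``/``session_id`` are placeholders::

    # hypothetical: announce a session with a bare "propose" message
    mess_data = self.build_message_data(client, peer_jid, "propose", session_id)
    await client.send_message_data(mess_data)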
--- a/sat/plugins/plugin_xep_0359.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0359.py Sat Apr 08 13:54:42 2023 +0200 @@ -50,29 +50,29 @@ def __init__(self, host): log.info(_("Unique and Stable Stanza IDs plugin initialization")) self.host = host - host.registerNamespace("stanza_id", NS_SID) - host.trigger.add("message_parse", self._message_parseTrigger) - host.trigger.add("sendMessageData", self._sendMessageDataTrigger) + host.register_namespace("stanza_id", NS_SID) + host.trigger.add("message_parse", self._message_parse_trigger) + host.trigger.add("send_message_data", self._send_message_data_trigger) - def _message_parseTrigger(self, client, message_elt, mess_data): + def _message_parse_trigger(self, client, message_elt, mess_data): """Check if message has a stanza-id""" - stanza_id = self.getStanzaId(message_elt, client.jid.userhostJID()) + stanza_id = self.get_stanza_id(message_elt, client.jid.userhostJID()) if stanza_id is not None: mess_data['extra']['stanza_id'] = stanza_id - origin_id = self.getOriginId(message_elt) + origin_id = self.get_origin_id(message_elt) if origin_id is not None: mess_data['extra']['origin_id'] = origin_id return True - def _sendMessageDataTrigger(self, client, mess_data): + def _send_message_data_trigger(self, client, mess_data): origin_id = mess_data["extra"].get("origin_id") if not origin_id: origin_id = str(uuid.uuid4()) mess_data["extra"]["origin_id"] = origin_id message_elt = mess_data["xml"] - self.addOriginId(message_elt, origin_id) + self.add_origin_id(message_elt, origin_id) - def getStanzaId(self, element, by): + def get_stanza_id(self, element, by): """Return stanza-id if found in element @param element(domish.Element): element to parse @@ -92,7 +92,7 @@ return stanza_id - def addStanzaId(self, client, element, stanza_id, by=None): + def add_stanza_id(self, client, element, stanza_id, by=None): """Add a <stanza-id/> to a stanza @param element(domish.Element): stanza where the <stanza-id/> must be added @@ -103,7 +103,7 @@ sid_elt["by"] = client.jid.userhost() if by is None else by.userhost() sid_elt["id"] = stanza_id - def getOriginId(self, element: domish.Element) -> Optional[str]: + def get_origin_id(self, element: domish.Element) -> Optional[str]: """Return origin-id if found in element @param element: element to parse @@ -116,7 +116,7 @@ else: return origin_elt.getAttribute("id") - def addOriginId(self, element, origin_id=None): + def add_origin_id(self, element, origin_id=None): """Add a <origin-id/> to a stanza @param element(domish.Element): stanza where the <origin-id/> must be added @@ -129,7 +129,7 @@ sid_elt["id"] = origin_id return origin_id - def getHandler(self, client): + def get_handler(self, client): return XEP_0359_handler()
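Stanza and origin IDs are read and written through the renamed helpers. A minimal sketch, assuming ``host.plugins["XEP-0359"]`` as registry key and ``message_elt`` as a ``<message>`` element::

    sid = self.host.plugins["XEP-0359"]
    origin_id = sid.add_origin_id(message_elt)  # a UUID is generated when none is given
    stanza_id = sid.get_stanza_id(message_elt, client.jid.userhostJID())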
--- a/sat/plugins/plugin_xep_0363.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0363.py Sat Apr 08 13:54:42 2023 +0200 @@ -86,33 +86,33 @@ def __init__(self, host): log.info(_("plugin HTTP File Upload initialization")) self.host = host - host.bridge.addMethod( - "fileHTTPUpload", + host.bridge.add_method( + "file_http_upload", ".plugin", in_sign="sssbs", out_sign="", method=self._file_http_upload, ) - host.bridge.addMethod( - "fileHTTPUploadGetSlot", + host.bridge.add_method( + "file_http_upload_get_slot", ".plugin", in_sign="sisss", out_sign="(ssaa{ss})", - method=self._getSlot, + method=self._get_slot, async_=True, ) host.plugins["UPLOAD"].register( - "HTTP Upload", self.getHTTPUploadEntity, self.file_http_upload + "HTTP Upload", self.get_http_upload_entity, self.file_http_upload ) # list of callbacks used when a request is done to a component self.handlers = [] # XXX: there is not yet official short name, so we use "http_upload" - host.registerNamespace("http_upload", NS_HTTP_UPLOAD) + host.register_namespace("http_upload", NS_HTTP_UPLOAD) - def getHandler(self, client): + def get_handler(self, client): return XEP_0363_handler(self) - def registerHandler(self, callback, priority=0): + def register_handler(self, callback, priority=0): """Register a request handler @param callack: method to call when a request is done @@ -127,13 +127,13 @@ self.handlers.append(req_handler) self.handlers.sort(key=lambda handler: handler.priority, reverse=True) - def getFileTooLargeElt(self, max_size: int) -> domish.Element: + def get_file_too_large_elt(self, max_size: int) -> domish.Element: """Generate <file-too-large> app condition for errors""" file_too_large_elt = domish.Element((NS_HTTP_UPLOAD, "file-too-large")) file_too_large_elt.addElement("max-file-size", str(max_size)) return file_too_large_elt - async def getHTTPUploadEntity(self, client, upload_jid=None): + async def get_http_upload_entity(self, client, upload_jid=None): """Get HTTP upload capable entity upload_jid is checked, then its components @@ -144,7 +144,7 @@ try: entity = client.http_upload_service except AttributeError: - found_entities = await self.host.findFeaturesSet(client, (NS_HTTP_UPLOAD,)) + found_entities = await self.host.find_features_set(client, (NS_HTTP_UPLOAD,)) try: entity = client.http_upload_service = next(iter(found_entities)) except StopIteration: @@ -158,7 +158,7 @@ def _file_http_upload(self, filepath, filename="", upload_jid="", ignore_tls_errors=False, profile=C.PROF_KEY_NONE): assert os.path.isabs(filepath) and os.path.isfile(filepath) - client = self.host.getClient(profile) + client = self.host.get_client(profile) return defer.ensureDeferred(self.file_http_upload( client, filepath, @@ -204,7 +204,7 @@ triggers_no_cancel=True ) try: - slot = await self.getSlot( + slot = await self.get_slot( client, file_metadata["filename"], file_metadata["size"], upload_jid=upload_jid ) @@ -235,7 +235,7 @@ headers[name] = value - await self.host.trigger.asyncPoint( + await self.host.trigger.async_point( "XEP-0363_upload", client, extra, sat_file, file_producer, slot, triggers_no_cancel=True) @@ -259,34 +259,34 @@ """Called once file is successfully uploaded @param sat_file(SatFile): file used for the upload - should be closed, but it is needed to send the progressFinished signal + should be closed, but it is needed to send the progress_finished signal @param slot(Slot): put/get urls """ log.info(f"HTTP upload finished ({slot.get})") - sat_file.progressFinished({"url": slot.get}) + 
sat_file.progress_finished({"url": slot.get}) return slot.get def _upload_eb(self, failure_, sat_file): """Called on unsuccessful upload @param sat_file(SatFile): file used for the upload - should be closed, be is needed to send the progressError signal + should be closed, be is needed to send the progress_error signal """ try: wrapped_fail = failure_.value.reasons[0] except (AttributeError, IndexError) as e: log.warning(_("upload failed: {reason}").format(reason=e)) - sat_file.progressError(str(failure_)) + sat_file.progress_error(str(failure_)) else: if wrapped_fail.check(sat_web.SSLError): msg = "TLS validation error, can't connect to HTTPS server" else: msg = "can't upload file" log.warning(msg + ": " + str(wrapped_fail.value)) - sat_file.progressError(msg) + sat_file.progress_error(msg) raise failure_ - def _getSlot(self, filename, size, content_type, upload_jid, + def _get_slot(self, filename, size, content_type, upload_jid, profile_key=C.PROF_KEY_NONE): """Get an upload slot @@ -297,15 +297,15 @@ @param content_type(unicode, None): MIME type of the content empty string or None to guess automatically """ - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) filename = filename.replace("/", "_") - d = defer.ensureDeferred(self.getSlot( + d = defer.ensureDeferred(self.get_slot( client, filename, size, content_type or None, jid.JID(upload_jid) or None )) d.addCallback(lambda slot: (slot.get, slot.put, slot.headers)) return d - async def getSlot(self, client, filename, size, content_type=None, upload_jid=None): + async def get_slot(self, client, filename, size, content_type=None, upload_jid=None): """Get a slot (i.e. download/upload links) @param filename(unicode): name to use for the upload @@ -327,8 +327,8 @@ try: upload_jid = client.http_upload_service except AttributeError: - found_entity = await self.getHTTPUploadEntity(client) - return await self.getSlot( + found_entity = await self.get_http_upload_entity(client) + return await self.get_slot( client, filename, size, content_type, found_entity) else: if upload_jid is None: @@ -374,11 +374,11 @@ # component - def onComponentRequest(self, iq_elt, client): + def on_component_request(self, iq_elt, client): iq_elt.handled=True - defer.ensureDeferred(self.handleComponentRequest(client, iq_elt)) + defer.ensureDeferred(self.handle_component_request(client, iq_elt)) - async def handleComponentRequest(self, client, iq_elt): + async def handle_component_request(self, client, iq_elt): try: request_elt = next(iq_elt.elements(NS_HTTP_UPLOAD, "request")) request = UploadRequest( @@ -395,7 +395,7 @@ for handler in self.handlers: try: - slot = await utils.asDeferred(handler.callback, client, request) + slot = await utils.as_deferred(handler.callback, client, request) except error.StanzaError as e: log.warning( "a stanza error has been raised while processing HTTP Upload of " @@ -436,7 +436,7 @@ if ((self.parent.is_component and PLUGIN_INFO[C.PI_IMPORT_NAME] in self.parent.enabled_features)): self.xmlstream.addObserver( - IQ_HTTP_UPLOAD_REQUEST, self.plugin_parent.onComponentRequest, + IQ_HTTP_UPLOAD_REQUEST, self.plugin_parent.on_component_request, client=self.parent )
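The upload slot request is now ``get_slot`` (bridge side: ``file_http_upload_get_slot``). A hedged sketch of requesting a slot for a local file, assuming ``host``/``client`` as usual; the ``Slot`` attributes (``put``/``get``) are the ones exposed above:

    import os


    async def get_upload_urls(host, client, filepath: str):
        """Return (put_url, get_url) for a local file via XEP-0363 HTTP Upload."""
        http_upload = host.plugins["XEP-0363"]
        slot = await http_upload.get_slot(
            client,
            filename=os.path.basename(filepath),
            size=os.path.getsize(filepath),
            # content_type and upload_jid are left to their defaults
        )
        return slot.put, slot.get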
--- a/sat/plugins/plugin_xep_0372.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0372.py Sat Apr 08 13:54:42 2023 +0200 @@ -62,23 +62,23 @@ def __init__(self, host): log.info(_("References plugin initialization")) - host.registerNamespace("refs", NS_REFS) + host.register_namespace("refs", NS_REFS) self.host = host self._h = host.plugins["XEP-0334"] - host.trigger.add("messageReceived", self._messageReceivedTrigger) - host.bridge.addMethod( - "referenceSend", + host.trigger.add("messageReceived", self._message_received_trigger) + host.bridge.add_method( + "reference_send", ".plugin", in_sign="sssss", out_sign="", - method=self._sendReference, + method=self._send_reference, async_=False, ) - def getHandler(self, client): + def get_handler(self, client): return XEP_0372_Handler() - def refElementToRefData( + def ref_element_to_ref_data( self, reference_elt: domish.Element ) -> Dict[str, Union[str, int, dict]]: @@ -88,7 +88,7 @@ } if ref_data["uri"].startswith("xmpp:"): - ref_data["parsed_uri"] = xmpp_uri.parseXMPPUri(ref_data["uri"]) + ref_data["parsed_uri"] = xmpp_uri.parse_xmpp_uri(ref_data["uri"]) for attr in ("begin", "end"): try: @@ -100,10 +100,10 @@ if anchor is not None: ref_data["anchor"] = anchor if anchor.startswith("xmpp:"): - ref_data["parsed_anchor"] = xmpp_uri.parseXMPPUri(anchor) + ref_data["parsed_anchor"] = xmpp_uri.parse_xmpp_uri(anchor) return ref_data - async def _messageReceivedTrigger( + async def _message_received_trigger( self, client: SatXMPPEntity, message_elt: domish.Element, @@ -114,18 +114,18 @@ if reference_elt is None: return True try: - ref_data = self.refElementToRefData(reference_elt) + ref_data = self.ref_element_to_ref_data(reference_elt) except KeyError: log.warning("invalid <reference> element: {reference_elt.toXml}") return True - if not await self.host.trigger.asyncPoint( + if not await self.host.trigger.async_point( "XEP-0372_ref_received", client, message_elt, ref_data ): return False return True - def buildRefElement( + def build_ref_element( self, uri: str, type_: str = "mention", @@ -148,7 +148,7 @@ reference_elt["anchor"] = anchor return reference_elt - def _sendReference( + def _send_reference( self, recipient: str, anchor: str, @@ -157,9 +157,9 @@ profile_key: str ) -> defer.Deferred: recipient_jid = jid.JID(recipient) - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) extra: dict = data_format.deserialise(extra_s, default={}) - self.sendReference( + self.send_reference( client, uri=extra.get("uri"), type_=type_, @@ -167,7 +167,7 @@ to_jid=recipient_jid ) - def sendReference( + def send_reference( self, client: "SatXMPPEntity", uri: Optional[str] = None, @@ -200,7 +200,7 @@ raise exceptions.InternalError( '"to_jid" must be set if "uri is None"' ) - uri = xmpp_uri.buildXMPPUri(path=to_jid.full()) + uri = xmpp_uri.build_xmpp_uri(path=to_jid.full()) if message_elt is None: message_elt = domish.Element((None, "message")) @@ -215,8 +215,8 @@ '{message_elt.toXml()}' ) - message_elt.addChild(self.buildRefElement(uri, type_, begin, end, anchor)) - self._h.addHintElements(message_elt, [self._h.HINT_STORE]) + message_elt.addChild(self.build_ref_element(uri, type_, begin, end, anchor)) + self._h.add_hint_elements(message_elt, [self._h.HINT_STORE]) client.send(message_elt)
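``sendReference`` becomes ``send_reference`` (bridge: ``reference_send``). A minimal sketch of sending a mention, assuming ``host``/``client``; when ``uri`` is left unset, the plugin builds it from ``to_jid``, as shown in the hunk above:

    from twisted.words.protocols.jabber import jid


    def mention_user(host, client, mentioned_jid: str, anchor: str) -> None:
        """Send a XEP-0372 mention referencing the message identified by ``anchor``."""
        refs = host.plugins["XEP-0372"]
        refs.send_reference(
            client,
            type_="mention",
            anchor=anchor,
            to_jid=jid.JID(mentioned_jid),
        )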
--- a/sat/plugins/plugin_xep_0373.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0373.py Sat Apr 08 13:54:42 2023 +0200 @@ -197,7 +197,7 @@ @abstractmethod def import_public_key(self, packet: bytes) -> GPGPublicKey: - """Import a public key from a key material packet according to RFC 4880 §5.5. + """import a public key from a key material packet according to RFC 4880 §5.5. OpenPGP's ASCII Armor is not used. @@ -1013,7 +1013,7 @@ # Add configuration option to choose between manual trust and BTBV as the trust # model - host.memory.updateParams(DEFAULT_TRUST_MODEL_PARAM) + host.memory.update_params(DEFAULT_TRUST_MODEL_PARAM) self.__xep_0045 = cast(Optional[XEP_0045], host.plugins.get("XEP-0045")) self.__xep_0060 = cast(XEP_0060, host.plugins["XEP-0060"]) @@ -1021,7 +1021,7 @@ self.__storage: Dict[str, persistent.LazyPersistentBinaryDict] = {} xep_0163 = cast(XEP_0163, host.plugins["XEP-0163"]) - xep_0163.addPEPEvent( + xep_0163.add_pep_event( "OX_PUBLIC_KEYS_LIST", PUBLIC_KEYS_LIST_NODE, lambda items_event, profile: defer.ensureDeferred( @@ -1029,10 +1029,10 @@ ) ) - async def profileConnecting(self, client): + async def profile_connecting(self, client): client.gpg_provider = get_gpg_provider(self.host, client) - async def profileConnected( # pylint: disable=invalid-name + async def profile_connected( # pylint: disable=invalid-name self, client: SatXMPPClient ) -> None: @@ -1061,7 +1061,7 @@ @param profile: The profile this event belongs to. """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) sender = cast(jid.JID, items_event.sender) items = cast(List[domish.Element], items_event.items) @@ -1323,7 +1323,7 @@ encryption_keys: Set[GPGPublicKey] = set() for recipient_jid in recipient_jids: - # Import all keys of the recipient + # import all keys of the recipient all_public_keys = await self.import_all_public_keys(client, recipient_jid) # Filter for keys that can encrypt @@ -1379,7 +1379,7 @@ self.list_secret_keys(client) )) - # Import all keys of the sender + # import all keys of the sender all_public_keys = await self.import_all_public_keys(client, sender_jid) # Filter for keys that can sign @@ -1471,7 +1471,7 @@ pubkey_elt.addElement("data", content=base64.b64encode(packet).decode("ASCII")) try: - await self.__xep_0060.sendItem( + await self.__xep_0060.send_item( client, client.jid.userhostJID(), node, @@ -1495,7 +1495,7 @@ client: SatXMPPClient, entity_jid: jid.JID ) -> Set[GPGPublicKey]: - """Import all public keys of a JID that have not been imported before. + """import all public keys of a JID that have not been imported before. @param client: The client. @param jid: The JID. Can be a bare JID. @@ -1539,7 +1539,7 @@ ) except Exception as e: log.warning( - f"Import of public key {missing_key.fingerprint} owned by" + f"import of public key {missing_key.fingerprint} owned by" f" {entity_jid.userhost()} failed, ignoring: {e}" ) @@ -1551,7 +1551,7 @@ jid: jid.JID, fingerprint: str ) -> GPGPublicKey: - """Import a public key. + """import a public key. @param client: The client. @param jid: The JID owning the public key. Can be a bare JID. 
@@ -1569,7 +1569,7 @@ node = f"urn:xmpp:openpgp:0:public-keys:{fingerprint}" try: - items, __ = await self.__xep_0060.getItems( + items, __ = await self.__xep_0060.get_items( client, jid.userhostJID(), node, @@ -1644,7 +1644,7 @@ pubkey_metadata_elt["date"] = format_datetime(public_key_metadata.timestamp) try: - await self.__xep_0060.sendItem( + await self.__xep_0060.send_item( client, client.jid.userhostJID(), node, @@ -1681,7 +1681,7 @@ node = "urn:xmpp:openpgp:0:public-keys" try: - items, __ = await self.__xep_0060.getItems( + items, __ = await self.__xep_0060.get_items( client, jid.userhostJID(), node, @@ -1741,7 +1741,7 @@ """ try: - infos = cast(DiscoInfo, await self.host.memory.disco.getInfos( + infos = cast(DiscoInfo, await self.host.memory.disco.get_infos( client, client.jid.userhostJID() )) @@ -1769,7 +1769,7 @@ node = "urn:xmpp:openpgp:0:secret-key" try: - items, __ = await self.__xep_0060.getItems( + items, __ = await self.__xep_0060.get_items( client, client.jid.userhostJID(), node, @@ -1834,7 +1834,7 @@ secretkey_elt.addContent(base64.b64encode(ciphertext).decode("ASCII")) try: - await self.__xep_0060.sendItem( + await self.__xep_0060.send_item( client, client.jid.userhostJID(), node, @@ -1888,7 +1888,7 @@ ciphertext: bytes, backup_code: str ) -> Set[GPGSecretKey]: - """Import previously downloaded secret keys. + """import previously downloaded secret keys. The downloading and importing steps are separate since a backup code is required for the import and it should be possible to try multiple backup codes without @@ -1930,7 +1930,7 @@ bare_jids: Set[jid.JID] = set() try: - room = cast(muc.Room, xep_0045.getRoom(client, room_jid)) + room = cast(muc.Room, xep_0045.get_room(client, room_jid)) except exceptions.NotFound as e: raise exceptions.InternalError( "Participant list of unjoined MUC requested." @@ -1988,7 +1988,7 @@ await self.__storage[client.profile].force(key, trust_level.name) - async def getTrustUI( # pylint: disable=invalid-name + async def get_trust_ui( # pylint: disable=invalid-name self, client: SatXMPPClient, entity: jid.JID @@ -2004,7 +2004,7 @@ raise ValueError("A bare JID is expected.") bare_jids: Set[jid.JID] - if self.__xep_0045 is not None and self.__xep_0045.isJoinedRoom(client, entity): + if self.__xep_0045 is not None and self.__xep_0045.is_joined_room(client, entity): bare_jids = self.__get_joined_muc_users(client, self.__xep_0045, entity) else: bare_jids = { entity.userhostJID() } @@ -2031,7 +2031,7 @@ data_form_result = cast( Dict[str, str], - xml_tools.XMLUIResult2DataFormResult(data) + xml_tools.xmlui_result_2_data_form_result(data) ) for key, value in data_form_result.items(): if not key.startswith("trust_"): @@ -2048,7 +2048,7 @@ return {} - submit_id = self.host.registerCallback(callback, with_data=True, one_shot=True) + submit_id = self.host.register_callback(callback, with_data=True, one_shot=True) result = xml_tools.XMLUI( panel_type=C.XMLUI_FORM, @@ -2070,7 +2070,7 @@ own_secret_keys = self.list_secret_keys(client) - trust_ui.changeContainer("label") + trust_ui.change_container("label") for index, secret_key in enumerate(own_secret_keys): trust_ui.addLabel(D_(f"Own secret key {index} fingerprint")) trust_ui.addText(secret_key.public_key.fingerprint)
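On the OX side, the trust management UI is now obtained with ``get_trust_ui``, which expects a bare JID. A sketch, assuming ``host``/``client``; the entity string is a placeholder:

    from twisted.words.protocols.jabber import jid


    async def show_ox_trust_ui(host, client, entity: str = "louise@example.org"):
        """Return the XMLUI used to (un)trust the OpenPGP keys of ``entity``."""
        ox = host.plugins["XEP-0373"]
        # a bare JID is expected, so strip any resource first
        return await ox.get_trust_ui(client, jid.JID(entity).userhostJID())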
--- a/sat/plugins/plugin_xep_0374.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0374.py Sat Apr 08 13:54:42 2023 +0200 @@ -92,9 +92,9 @@ sat.trigger.add("send", self.__send_trigger, priority=0) # Register the encryption plugin - sat.registerEncryptionPlugin(self, "OXIM", NS_OX, 102) + sat.register_encryption_plugin(self, "OXIM", NS_OX, 102) - async def getTrustUI( # pylint: disable=invalid-name + async def get_trust_ui( # pylint: disable=invalid-name self, client: SatXMPPClient, entity: jid.JID @@ -106,7 +106,7 @@ devices belonging to the entity. """ - return await self.__xep_0373.getTrustUI(client, entity) + return await self.__xep_0373.get_trust_ui(client, entity) @staticmethod def __get_joined_muc_users( @@ -126,7 +126,7 @@ bare_jids: Set[jid.JID] = set() try: - room = cast(muc.Room, xep_0045.getRoom(client, room_jid)) + room = cast(muc.Room, xep_0045.get_room(client, room_jid)) except exceptions.NotFound as e: raise exceptions.InternalError( "Participant list of unjoined MUC requested." @@ -176,7 +176,7 @@ room_jid = feedback_jid = sender_jid.userhostJID() try: - room = cast(muc.Room, self.__xep_0045.getRoom(client, room_jid)) + room = cast(muc.Room, self.__xep_0045.get_room(client, room_jid)) except exceptions.NotFound: log.warning( f"Ignoring MUC message from a room that has not been joined:" @@ -272,13 +272,13 @@ # Mark the message as trusted or untrusted. Undecided counts as untrusted here. trust_level = TrustLevel.UNDECIDED # TODO: Load the actual trust level if trust_level is TrustLevel.TRUSTED: - post_treat.addCallback(client.encryption.markAsTrusted) + post_treat.addCallback(client.encryption.mark_as_trusted) else: - post_treat.addCallback(client.encryption.markAsUntrusted) + post_treat.addCallback(client.encryption.mark_as_untrusted) # Mark the message as originally encrypted post_treat.addCallback( - client.encryption.markAsEncrypted, + client.encryption.mark_as_encrypted, namespace=NS_OX ) @@ -326,7 +326,7 @@ ) # Add a store hint if this is a message stanza - self.__xep_0334.addHintElements(stanza, [ "store" ]) + self.__xep_0334.add_hint_elements(stanza, [ "store" ]) # Let the flow continue. return True
--- a/sat/plugins/plugin_xep_0376.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0376.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,16 +49,16 @@ def __init__(self, host): log.info(_("Pubsub Account Management initialization")) self.host = host - host.registerNamespace("pam", NS_PAM) + host.register_namespace("pam", NS_PAM) self._p = self.host.plugins["XEP-0060"] host.trigger.add("XEP-0060_subscribe", self.subscribe) host.trigger.add("XEP-0060_unsubscribe", self.unsubscribe) host.trigger.add("XEP-0060_subscriptions", self.subscriptions) - def getHandler(self, client): + def get_handler(self, client): return XEP_0376_Handler() - async def profileConnected(self, client): + async def profile_connected(self, client): if not self.host.hasFeature(client, NS_PAM): log.warning( "Your server doesn't support Pubsub Account Management, this is used to " @@ -66,7 +66,7 @@ "install it." ) - async def _subRequest( + async def _sub_request( self, client: SatXMPPEntity, service: jid.JID, @@ -109,7 +109,7 @@ if not self.host.hasFeature(client, NS_PAM) or client.is_component: return True, None - await self._subRequest(client, service, nodeIdentifier, sub_jid, options, True) + await self._sub_request(client, service, nodeIdentifier, sub_jid, options, True) # TODO: actual result is sent with <message> stanza, we have to get and use them # to known the actual result. XEP-0376 returns an empty <iq> result, thus we don't @@ -130,7 +130,7 @@ ) -> bool: if not self.host.hasFeature(client, NS_PAM) or client.is_component: return True - await self._subRequest(client, service, nodeIdentifier, sub_jid, None, False) + await self._sub_request(client, service, nodeIdentifier, sub_jid, None, False) return False async def subscriptions(
--- a/sat/plugins/plugin_xep_0380.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0380.py Sat Apr 08 13:54:42 2023 +0200 @@ -47,11 +47,11 @@ def __init__(self, host): self.host = host - host.trigger.add("sendMessage", self._sendMessageTrigger) - host.trigger.add("messageReceived", self._messageReceivedTrigger, priority=100) - host.registerNamespace("eme", NS_EME) + host.trigger.add("sendMessage", self._send_message_trigger) + host.trigger.add("messageReceived", self._message_received_trigger, priority=100) + host.register_namespace("eme", NS_EME) - def _addEMEElement(self, mess_data, namespace, name): + def _add_eme_element(self, mess_data, namespace, name): message_elt = mess_data['xml'] encryption_elt = message_elt.addElement((NS_EME, 'encryption')) encryption_elt['namespace'] = namespace @@ -59,7 +59,7 @@ encryption_elt['name'] = name return mess_data - def _sendMessageTrigger(self, client, mess_data, __, post_xml_treatments): + def _send_message_trigger(self, client, mess_data, __, post_xml_treatments): encryption = mess_data.get(C.MESS_KEY_ENCRYPTION) if encryption is not None: namespace = encryption['plugin'].namespace @@ -68,17 +68,17 @@ else: name = None post_xml_treatments.addCallback( - self._addEMEElement, namespace=namespace, name=name) + self._add_eme_element, namespace=namespace, name=name) return True - def _messageReceivedTrigger(self, client, message_elt, post_treat): + def _message_received_trigger(self, client, message_elt, post_treat): try: encryption_elt = next(message_elt.elements(NS_EME, 'encryption')) except StopIteration: return True namespace = encryption_elt['namespace'] - if namespace in client.encryption.getNamespaces(): + if namespace in client.encryption.get_namespaces(): # message is encrypted and we can decrypt it return True
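For reference, the ``<encryption/>`` marker appended by ``_add_eme_element`` can be built standalone as below. This is an illustrative sketch only; ``NS_EME`` is assumed to be the XEP-0380 namespace (``urn:xmpp:eme:0``):

    from typing import Optional

    from twisted.words.xish import domish

    NS_EME = "urn:xmpp:eme:0"


    def build_eme_elt(namespace: str, name: Optional[str] = None) -> domish.Element:
        """Build an EME <encryption/> element advertising the encryption in use."""
        encryption_elt = domish.Element((NS_EME, "encryption"))
        encryption_elt["namespace"] = namespace
        if name is not None:
            encryption_elt["name"] = name
        return encryption_elt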
--- a/sat/plugins/plugin_xep_0384.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0384.py Sat Apr 08 13:54:42 2023 +0200 @@ -205,7 +205,7 @@ def __init__(self, profile: str, own_bare_jid: str) -> None: """ @param profile: The profile this OMEMO data belongs to. - @param own_bare_jid: The own bare JID, to return by the :meth:`loadOwnData` call. + @param own_bare_jid: The own bare JID, to return by the :meth:`load_own_data` call. """ self.__storage = persistent.LazyPersistentBinaryDict("XEP-0384", profile) @@ -346,7 +346,7 @@ node = f"eu.siacs.conversations.axolotl.bundles:{device_id}" try: - items, __ = await xep_0060.getItems(client, jid.JID(bare_jid), node, max_items=1) + items, __ = await xep_0060.get_items(client, jid.JID(bare_jid), node, max_items=1) except Exception as e: raise omemo.BundleDownloadFailed( f"Bundle download failed for {bare_jid}: {device_id} under namespace" @@ -726,7 +726,7 @@ ) # Finally, encrypt and send the trust message! - message_data = client.generateMessageXML(MessageData({ + message_data = client.generate_message_xml(MessageData({ "from": own_jid, "to": recipient_jid, "uid": str(uuid.uuid4()), @@ -811,7 +811,7 @@ with XMPP interactions and trust handled via the SAT instance. """ - client = sat.getClient(profile) + client = sat.get_client(profile) xep_0060 = cast(XEP_0060, sat.plugins["XEP-0060"]) class SessionManagerImpl(omemo.SessionManager): @@ -827,7 +827,7 @@ node = "urn:xmpp:omemo:2:bundles" try: - await xep_0060.sendItem( + await xep_0060.send_item( client, client.jid.userhostJID(), node, @@ -867,7 +867,7 @@ node = f"eu.siacs.conversations.axolotl.bundles:{bundle.device_id}" try: - await xep_0060.sendItem( + await xep_0060.send_item( client, client.jid.userhostJID(), node, @@ -897,7 +897,7 @@ node = "urn:xmpp:omemo:2:bundles" try: - items, __ = await xep_0060.getItems( + items, __ = await xep_0060.get_items( client, jid.JID(bare_jid), node, @@ -951,7 +951,7 @@ node = "urn:xmpp:omemo:2:bundles" try: - await xep_0060.retractItems( + await xep_0060.retract_items( client, client.jid.userhostJID(), node, @@ -1000,7 +1000,7 @@ raise omemo.UnknownNamespace(f"Unknown namespace: {namespace}") try: - await xep_0060.sendItem( + await xep_0060.send_item( client, client.jid.userhostJID(), node, @@ -1050,7 +1050,7 @@ raise omemo.UnknownNamespace(f"Unknown namespace: {namespace}") try: - items, __ = await xep_0060.getItems(client, jid.JID(bare_jid), node) + items, __ = await xep_0060.get_items(client, jid.JID(bare_jid), node) except exceptions.NotFound: return {} except Exception as e: @@ -1117,7 +1117,7 @@ # on the trust system and phase if trust_level is TrustLevel.BLINDLY_TRUSTED: # Get the name of the active trust system - trust_system = cast(str, sat.memory.getParamA( + trust_system = cast(str, sat.memory.param_get_a( PARAM_NAME, PARAM_CATEGORY, profile_key=profile @@ -1247,7 +1247,7 @@ if element is None: raise omemo.UnknownNamespace(f"Unknown namespace: {message.namespace}") - message_data = client.generateMessageXML(MessageData({ + message_data = client.generate_message_xml(MessageData({ "from": client.jid, "to": jid.JID(bare_jid), "uid": str(uuid.uuid4()), @@ -1282,7 +1282,7 @@ """ # This session manager handles encryption with both twomemo and oldmemo, but - # both are currently registered as different plugins and the `deferXMLUI` + # both are currently registered as different plugins and the `defer_xmlui` # below requires a single namespace identifying the encryption plugin. 
Thus, # get the namespace of the requested encryption method from the encryption # session using the feedback JID. @@ -1313,7 +1313,7 @@ own_device, __ = await self.get_own_device_information() - trust_ui.changeContainer("label") + trust_ui.change_container("label") trust_ui.addLabel(D_("This device ID")) trust_ui.addText(str(own_device.device_id)) trust_ui.addLabel(D_("This device's fingerprint")) @@ -1333,11 +1333,11 @@ trust_ui.addLabel(D_("Fingerprint")) trust_ui.addText(" ".join(self.format_identity_key(device.identity_key))) trust_ui.addLabel(D_("Trust this device?")) - trust_ui.addBool(f"trust_{index}", value=C.boolConst(False)) + trust_ui.addBool(f"trust_{index}", value=C.bool_const(False)) trust_ui.addEmpty() trust_ui.addEmpty() - trust_ui_result = await xml_tools.deferXMLUI( + trust_ui_result = await xml_tools.defer_xmlui( sat, trust_ui, action_extra={ "meta_encryption_trust": namespace }, @@ -1347,7 +1347,7 @@ if C.bool(trust_ui_result.get("cancelled", "false")): raise omemo.TrustDecisionFailed("Trust UI cancelled.") - data_form_result = cast(Dict[str, str], xml_tools.XMLUIResult2DataFormResult( + data_form_result = cast(Dict[str, str], xml_tools.xmlui_result_2_data_form_result( trust_ui_result )) @@ -1375,7 +1375,7 @@ )) # Check whether ATM is enabled and handle everything in case it is - trust_system = cast(str, sat.memory.getParamA( + trust_system = cast(str, sat.memory.param_get_a( PARAM_NAME, PARAM_CATEGORY, profile_key=profile @@ -1434,7 +1434,7 @@ :meth:`~omemo.session_manager.SessionManager.create`. """ - client = sat.getClient(profile) + client = sat.get_client(profile) xep_0060 = cast(XEP_0060, sat.plugins["XEP-0060"]) storage = StorageImpl(profile) @@ -1549,7 +1549,7 @@ # Add configuration option to choose between manual trust and BTBV as the trust # model - sat.memory.updateParams(DEFAULT_TRUST_MODEL_PARAM) + sat.memory.update_params(DEFAULT_TRUST_MODEL_PARAM) # Plugins self.__xep_0045 = cast(Optional[XEP_0045], sat.plugins.get("XEP-0045")) @@ -1585,7 +1585,7 @@ priority=100050 ) sat.trigger.add( - "sendMessageData", + "send_message_data", self.__send_message_data_trigger, priority=100050 ) @@ -1596,18 +1596,18 @@ # including IQs. 
# Give twomemo a (slightly) higher priority than oldmemo - sat.registerEncryptionPlugin(self, "TWOMEMO", twomemo.twomemo.NAMESPACE, 101) - sat.registerEncryptionPlugin(self, "OLDMEMO", oldmemo.oldmemo.NAMESPACE, 100) + sat.register_encryption_plugin(self, "TWOMEMO", twomemo.twomemo.NAMESPACE, 101) + sat.register_encryption_plugin(self, "OLDMEMO", oldmemo.oldmemo.NAMESPACE, 100) xep_0163 = cast(XEP_0163, sat.plugins["XEP-0163"]) - xep_0163.addPEPEvent( + xep_0163.add_pep_event( "TWOMEMO_DEVICES", TWOMEMO_DEVICE_LIST_NODE, lambda items_event, profile: defer.ensureDeferred( self.__on_device_list_update(items_event, profile) ) ) - xep_0163.addPEPEvent( + xep_0163.add_pep_event( "OLDMEMO_DEVICES", OLDMEMO_DEVICE_LIST_NODE, lambda items_event, profile: defer.ensureDeferred( @@ -1620,9 +1620,9 @@ except KeyError: log.info(_("Text commands not available")) else: - self.__text_commands.registerTextCommands(self) - - def profileConnected( # pylint: disable=invalid-name + self.__text_commands.register_text_commands(self) + + def profile_connected( # pylint: disable=invalid-name self, client: SatXMPPClient ) -> None: @@ -1653,12 +1653,12 @@ """ twomemo_requested = \ - client.encryption.isEncryptionRequested(mess_data, twomemo.twomemo.NAMESPACE) + client.encryption.is_encryption_requested(mess_data, twomemo.twomemo.NAMESPACE) oldmemo_requested = \ - client.encryption.isEncryptionRequested(mess_data, oldmemo.oldmemo.NAMESPACE) + client.encryption.is_encryption_requested(mess_data, oldmemo.oldmemo.NAMESPACE) if not (twomemo_requested or oldmemo_requested): - self.__text_commands.feedBack( + self.__text_commands.feed_back( client, _("You need to have OMEMO encryption activated to reset the session"), mess_data @@ -1674,7 +1674,7 @@ log.debug(f"Replacing sessions with device {device}") await session_manager.replace_sessions(device) - self.__text_commands.feedBack( + self.__text_commands.feed_back( client, _("OMEMO session has been reset"), mess_data @@ -1682,7 +1682,7 @@ return False - async def getTrustUI( # pylint: disable=invalid-name + async def get_trust_ui( # pylint: disable=invalid-name self, client: SatXMPPClient, entity: jid.JID @@ -1698,7 +1698,7 @@ raise ValueError("A bare JID is expected.") bare_jids: Set[str] - if self.__xep_0045 is not None and self.__xep_0045.isJoinedRoom(client, entity): + if self.__xep_0045 is not None and self.__xep_0045.is_joined_room(client, entity): bare_jids = self.__get_joined_muc_users(client, self.__xep_0045, entity) else: bare_jids = { entity.userhost() } @@ -1729,7 +1729,7 @@ data_form_result = cast( Dict[str, str], - xml_tools.XMLUIResult2DataFormResult(data) + xml_tools.xmlui_result_2_data_form_result(data) ) trust_updates: Set[TrustUpdate] = set() @@ -1763,7 +1763,7 @@ )) # Check whether ATM is enabled and handle everything in case it is - trust_system = cast(str, self.__sat.memory.getParamA( + trust_system = cast(str, self.__sat.memory.param_get_a( PARAM_NAME, PARAM_CATEGORY, profile_key=profile @@ -1785,7 +1785,7 @@ return {} - submit_id = self.__sat.registerCallback(callback, with_data=True, one_shot=True) + submit_id = self.__sat.register_callback(callback, with_data=True, one_shot=True) result = xml_tools.XMLUI( panel_type=C.XMLUI_FORM, @@ -1809,7 +1809,7 @@ own_device, __ = await session_manager.get_own_device_information() - trust_ui.changeContainer("label") + trust_ui.change_container("label") trust_ui.addLabel(D_("This device ID")) trust_ui.addText(str(own_device.device_id)) trust_ui.addLabel(D_("This device's fingerprint")) @@ -1880,7 +1880,7 @@ 
bare_jids: Set[str] = set() try: - room = cast(muc.Room, xep_0045.getRoom(client, room_jid)) + room = cast(muc.Room, xep_0045.get_room(client, room_jid)) except exceptions.NotFound as e: raise exceptions.InternalError( "Participant list of unjoined MUC requested." @@ -2134,7 +2134,7 @@ room_jid = feedback_jid = sender_jid.userhostJID() try: - room = cast(muc.Room, self.__xep_0045.getRoom(client, room_jid)) + room = cast(muc.Room, self.__xep_0045.get_room(client, room_jid)) except exceptions.NotFound: log.warning( f"Ignoring MUC message from a room that has not been joined:" @@ -2165,7 +2165,7 @@ message_uid: Optional[str] = None if self.__xep_0359 is not None: - message_uid = self.__xep_0359.getOriginId(message_elt) + message_uid = self.__xep_0359.get_origin_id(message_elt) if message_uid is None: message_uid = message_elt.getAttribute("id") if message_uid is not None: @@ -2358,13 +2358,13 @@ await session_manager._evaluate_custom_trust_level(device_information) if trust_level is omemo.TrustLevel.TRUSTED: - post_treat.addCallback(client.encryption.markAsTrusted) + post_treat.addCallback(client.encryption.mark_as_trusted) else: - post_treat.addCallback(client.encryption.markAsUntrusted) + post_treat.addCallback(client.encryption.mark_as_untrusted) # Mark the message as originally encrypted post_treat.addCallback( - client.encryption.markAsEncrypted, + client.encryption.mark_as_encrypted, namespace=message.namespace ) @@ -2432,7 +2432,7 @@ # Add a store hint if this is a message stanza if stanza.name == "message": - self.__xep_0334.addHintElements(stanza, [ "store" ]) + self.__xep_0334.add_hint_elements(stanza, [ "store" ]) # Let the flow continue. return True @@ -2473,7 +2473,7 @@ ) # Add a store hint - self.__xep_0334.addHintElements(stanza, [ "store" ]) + self.__xep_0334.add_hint_elements(stanza, [ "store" ]) async def encrypt( self,
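The decryption post-treatment now uses the snake_case ``client.encryption`` helpers. A condensed sketch of that pattern, assuming ``post_treat`` is the Deferred passed to ``messageReceived`` triggers; the namespace below is a placeholder, not one of the OMEMO namespaces:

    NS_EXAMPLE_E2EE = "urn:example:e2ee:0"  # hypothetical namespace, for illustration


    def mark_decrypted(client, post_treat, trusted: bool) -> None:
        """Flag a decrypted message as (un)trusted and as originally encrypted."""
        if trusted:
            post_treat.addCallback(client.encryption.mark_as_trusted)
        else:
            post_treat.addCallback(client.encryption.mark_as_untrusted)
        post_treat.addCallback(
            client.encryption.mark_as_encrypted, namespace=NS_EXAMPLE_E2EE
        )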
--- a/sat/plugins/plugin_xep_0391.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0391.py Sat Apr 08 13:54:42 2023 +0200 @@ -73,7 +73,7 @@ def __init__(self, host): log.info(_("XEP-0391 (Pubsub Attachments) plugin initialization")) - host.registerNamespace("jet", NS_JET) + host.register_namespace("jet", NS_JET) self.host = host self._o = host.plugins["XEP-0384"] self._j = host.plugins["XEP-0166"] @@ -94,7 +94,7 @@ self._add_encryption_filter ) - def getHandler(self, client): + def get_handler(self, client): return JET_Handler() async def _on_initiate_elt_build(
--- a/sat/plugins/plugin_xep_0422.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0422.py Sat Apr 08 13:54:42 2023 +0200 @@ -60,12 +60,12 @@ def __init__(self, host): log.info(_("XEP-0422 (Message Fastening) plugin initialization")) self.host = host - host.registerNamespace("fasten", NS_FASTEN) + host.register_namespace("fasten", NS_FASTEN) - def getHandler(self, __): + def get_handler(self, __): return XEP_0422_handler() - def applyToElt( + def apply_to_elt( self, message_elt: domish.Element, origin_id: str, @@ -93,9 +93,9 @@ apply_to_elt = message_elt.addElement((NS_FASTEN, "apply-to")) apply_to_elt["id"] = origin_id if clear is not None: - apply_to_elt["clear"] = C.boolConst(clear) + apply_to_elt["clear"] = C.bool_const(clear) if shell is not None: - apply_to_elt["shell"] = C.boolConst(shell) + apply_to_elt["shell"] = C.bool_const(shell) if children is not None: for child in children: apply_to_elt.addChild(child) @@ -111,7 +111,7 @@ return apply_to_elt @async_lru(maxsize=5) - async def getFastenedElts( + async def get_fastened_elts( self, client: SatXMPPEntity, message_elt: domish.Element
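``applyToElt`` becomes ``apply_to_elt``; plugin_xep_0424 below relies on it to target the message being retracted. A sketch of fastening an arbitrary payload to an earlier message, assuming ``host``/``client``; the payload namespace is hypothetical:

    from twisted.words.xish import domish
    from twisted.words.protocols.jabber import jid


    def fasten_payload(host, client, dest_jid: jid.JID, origin_id: str) -> domish.Element:
        """Build a <message/> whose payload is fastened to ``origin_id`` (XEP-0422)."""
        fasten = host.plugins["XEP-0422"]
        message_elt = domish.Element((None, "message"))
        message_elt["from"] = client.jid.full()
        message_elt["to"] = dest_jid.full()
        apply_to_elt = fasten.apply_to_elt(message_elt, origin_id)
        apply_to_elt.addElement(("urn:example:payload:0", "payload"))  # hypothetical payload
        return message_elt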
--- a/sat/plugins/plugin_xep_0424.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0424.py Sat Apr 08 13:54:42 2023 +0200 @@ -70,13 +70,13 @@ def __init__(self, host): log.info(_("XEP-0424 (Message Retraction) plugin initialization")) self.host = host - host.memory.updateParams(PARAMS) + host.memory.update_params(PARAMS) self._h = host.plugins["XEP-0334"] self._f = host.plugins["XEP-0422"] - host.registerNamespace("message-retract", NS_MESSAGE_RETRACT) - host.trigger.add("messageReceived", self._messageReceivedTrigger, 100) - host.bridge.addMethod( - "messageRetract", + host.register_namespace("message-retract", NS_MESSAGE_RETRACT) + host.trigger.add("messageReceived", self._message_received_trigger, 100) + host.bridge.add_method( + "message_retract", ".plugin", in_sign="ss", out_sign="", @@ -84,16 +84,16 @@ async_=True, ) - def getHandler(self, __): + def get_handler(self, __): return XEP_0424_handler() def _retract(self, message_id: str, profile: str) -> None: - client = self.host.getClient(profile) + client = self.host.get_client(profile) return defer.ensureDeferred( self.retract(client, message_id) ) - def retractByOriginId( + def retract_by_origin_id( self, client: SatXMPPEntity, dest_jid: jid.JID, @@ -109,17 +109,17 @@ message_elt = domish.Element((None, "message")) message_elt["from"] = client.jid.full() message_elt["to"] = dest_jid.full() - apply_to_elt = self._f.applyToElt(message_elt, origin_id) + apply_to_elt = self._f.apply_to_elt(message_elt, origin_id) apply_to_elt.addElement((NS_MESSAGE_RETRACT, "retract")) - self.host.plugins["XEP-0428"].addFallbackElt( + self.host.plugins["XEP-0428"].add_fallback_elt( message_elt, "[A message retraction has been requested, but your client doesn't support " "it]" ) - self._h.addHintElements(message_elt, [self._h.HINT_STORE]) + self._h.add_hint_elements(message_elt, [self._h.HINT_STORE]) client.send(message_elt) - async def retractByHistory( + async def retract_by_history( self, client: SatXMPPEntity, history: History @@ -138,8 +138,8 @@ "client is probably not supporting message retraction." 
) else: - self.retractByOriginId(client, history.dest_jid, origin_id) - await self.retractDBHistory(client, history) + self.retract_by_origin_id(client, history.dest_jid, origin_id) + await self.retract_db_history(client, history) async def retract( self, @@ -163,9 +163,9 @@ raise exceptions.NotFound( f"message to retract not found in database ({message_id})" ) - await self.retractByHistory(client, history) + await self.retract_by_history(client, history) - async def retractDBHistory(self, client, history: History) -> None: + async def retract_db_history(self, client, history: History) -> None: """Mark an history instance in database as retracted @param history: history instance @@ -176,7 +176,7 @@ # we assign a new object to be sure to trigger an update history.extra = deepcopy(history.extra) if history.extra else {} history.extra["retracted"] = True - keep_history = self.host.memory.getParamA( + keep_history = self.host.memory.param_get_a( NAME, CATEGORY, profile_key=client.profile ) old_version: Dict[str, Any] = { @@ -194,13 +194,13 @@ session_add=[history] ) - async def _messageReceivedTrigger( + async def _message_received_trigger( self, client: SatXMPPEntity, message_elt: domish.Element, post_treat: defer.Deferred ) -> bool: - fastened_elts = await self._f.getFastenedElts(client, message_elt) + fastened_elts = await self._f.get_fastened_elts(client, message_elt) if fastened_elts is None: return True for elt in fastened_elts.elements: @@ -218,7 +218,7 @@ break else: return True - if not await self.host.trigger.asyncPoint( + if not await self.host.trigger.async_point( "XEP-0424_retractReceived", client, message_elt, elt, fastened_elts ): return False @@ -230,7 +230,7 @@ ) return False log.info(f"[{client.profile}] retracting message {fastened_elts.id!r}") - await self.retractDBHistory(client, fastened_elts.history) + await self.retract_db_history(client, fastened_elts.history) # TODO: send bridge signal return False
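Retraction entry points are now ``retract`` (by Libervia's internal message id, which is what the ``message_retract`` bridge method calls) and ``retract_by_origin_id``. A minimal sketch, assuming ``host``/``client``:

    async def retract_message(host, client, message_id: str) -> None:
        """Ask for retraction of one of our own messages, by internal id."""
        retraction = host.plugins["XEP-0424"]
        await retraction.retract(client, message_id)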
--- a/sat/plugins/plugin_xep_0428.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0428.py Sat Apr 08 13:54:42 2023 +0200 @@ -46,9 +46,9 @@ def __init__(self, host): log.info(_("XEP-0428 (Fallback Indication) plugin initialization")) - host.registerNamespace("fallback", NS_FALLBACK) + host.register_namespace("fallback", NS_FALLBACK) - def addFallbackElt( + def add_fallback_elt( self, message_elt: domish.Element, msg: Optional[str] = None @@ -64,7 +64,7 @@ if msg is not None: message_elt.addElement("body", content=msg) - def hasFallback(self, message_elt: domish.Element) -> bool: + def has_fallback(self, message_elt: domish.Element) -> bool: """Tell if a message has a fallback indication""" try: next(message_elt.elements(NS_FALLBACK, "fallback")) @@ -73,7 +73,7 @@ else: return True - def getHandler(self, __): + def get_handler(self, __): return XEP_0428_handler()
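A short sketch of the renamed fallback helpers, assuming ``host``; the message element and body text are placeholders:

    from twisted.words.xish import domish


    def add_fallback(host, message_elt: domish.Element) -> bool:
        """Add a fallback body to an outgoing message and confirm it is present."""
        fallback = host.plugins["XEP-0428"]
        fallback.add_fallback_elt(
            message_elt,
            "[Your client does not support this feature]",
        )
        return fallback.has_fallback(message_elt)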
--- a/sat/plugins/plugin_xep_0444.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0444.py Sat Apr 08 13:54:42 2023 +0200 @@ -53,23 +53,23 @@ def __init__(self, host): log.info(_("Message Reactions initialization")) - host.registerNamespace("reactions", NS_REACTIONS) + host.register_namespace("reactions", NS_REACTIONS) self.host = host self._h = host.plugins["XEP-0334"] - host.bridge.addMethod( - "messageReactionsSet", + host.bridge.add_method( + "message_reactions_set", ".plugin", in_sign="ssas", out_sign="", - method=self._reactionsSet, + method=self._reactions_set, async_=True, ) - host.trigger.add("messageReceived", self._messageReceivedTrigger) + host.trigger.add("messageReceived", self._message_received_trigger) - def getHandler(self, client): + def get_handler(self, client): return XEP_0444_Handler() - async def _messageReceivedTrigger( + async def _message_received_trigger( self, client: SatXMPPEntity, message_elt: domish.Element, @@ -77,13 +77,13 @@ ) -> bool: return True - def _reactionsSet(self, message_id: str, profile: str, reactions: List[str]) -> None: - client = self.host.getClient(profile) + def _reactions_set(self, message_id: str, profile: str, reactions: List[str]) -> None: + client = self.host.get_client(profile) return defer.ensureDeferred( - self.setReactions(client, message_id) + self.set_reactions(client, message_id) ) - def sendReactions( + def send_reactions( self, client: SatXMPPEntity, dest_jid: jid.JID, @@ -103,10 +103,10 @@ reactions_elt["id"] = message_id for r in set(reactions): reactions_elt.addElement("reaction", content=r) - self._h.addHintElements(message_elt, [self._h.HINT_STORE]) + self._h.add_hint_elements(message_elt, [self._h.HINT_STORE]) client.send(message_elt) - async def addReactionsToHistory( + async def add_reactions_to_history( self, history: History, from_jid: jid.JID, @@ -129,7 +129,7 @@ h_reactions["summary"] = sorted(list(set().union(*by_jid.values()))) await self.host.memory.storage.session_add(history) - async def setReactions( + async def set_reactions( self, client: SatXMPPEntity, message_id: str, @@ -157,8 +157,8 @@ "target message has neither origin-id nor message-id, we can't send a " "reaction" ) - await self.addReactionsToHistory(history, client.jid, reactions) - self.sendReactions(client, history.dest_jid, mess_id, reactions) + await self.add_reactions_to_history(history, client.jid, reactions) + self.send_reactions(client, history.dest_jid, mess_id, reactions) @implementer(iwokkel.IDisco)
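Sending reactions now goes through ``send_reactions``/``set_reactions`` (bridge: ``message_reactions_set``). A sketch using the positional call shape shown above (client, destination JID, referenced message id, reactions); the values are placeholders:

    from twisted.words.protocols.jabber import jid


    def react(host, client, dest: str, message_id: str) -> None:
        """Send a single 👍 reaction to the message identified by ``message_id``."""
        reactions = host.plugins["XEP-0444"]
        reactions.send_reactions(client, jid.JID(dest), message_id, ["👍"])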
--- a/sat/plugins/plugin_xep_0446.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0446.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,7 +49,7 @@ def __init__(self, host): log.info(_("XEP-0446 (File Metadata Element) plugin initialization")) - host.registerNamespace("file-metadata", NS_FILE_METADATA) + host.register_namespace("file-metadata", NS_FILE_METADATA) self._hash = host.plugins["XEP-0300"] def get_file_metadata_elt( @@ -95,7 +95,7 @@ file_elt.addElement(name, content=str(value)) if file_hash is not None: hash_algo, hash_ = file_hash - file_elt.addChild(self._hash.buildHashElt(hash_, hash_algo)) + file_elt.addChild(self._hash.build_hash_elt(hash_, hash_algo)) if date is not None: file_elt.addElement("date", utils.xmpp_date(date)) if thumbnail is not None: @@ -151,12 +151,12 @@ raise exceptions.InternalError try: - algo, hash_ = self._hash.parseHashElt(file_metadata_elt) + algo, hash_ = self._hash.parse_hash_elt(file_metadata_elt) except exceptions.NotFound: pass except exceptions.DataError: - from sat.tools.xml_tools import pFmtElt - log.warning("invalid <hash/> element:\n{pFmtElt(file_metadata_elt)}") + from sat.tools.xml_tools import p_fmt_elt + log.warning("invalid <hash/> element:\n{p_fmt_elt(file_metadata_elt)}") else: data["file_hash"] = (algo, hash_)
--- a/sat/plugins/plugin_xep_0447.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0447.py Sat Apr 08 13:54:42 2023 +0200 @@ -59,7 +59,7 @@ def __init__(self, host): self.host = host log.info(_("XEP-0447 (Stateless File Sharing) plugin initialization")) - host.registerNamespace("sfs", NS_SFS) + host.register_namespace("sfs", NS_SFS) self._sources_handlers = {} self._u = host.plugins["XEP-0103"] self._hints = host.plugins["XEP-0334"] @@ -150,7 +150,7 @@ if self._http_upload is None: return False try: - await self._http_upload.getHTTPUploadEntity(client) + await self._http_upload.get_http_upload_entity(client) except exceptions.NotFound: return False else: @@ -334,7 +334,7 @@ "There should not be more that one attachment at this point" ) await self._attach.upload_files(client, data) - self._hints.addHintElements(data["xml"], [self._hints.HINT_STORE]) + self._hints.add_hint_elements(data["xml"], [self._hints.HINT_STORE]) for attachment in attachments: try: file_hash = (attachment["hash_algo"], attachment["hash"])
--- a/sat/plugins/plugin_xep_0448.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0448.py Sat Apr 08 13:54:42 2023 +0200 @@ -74,7 +74,7 @@ def __init__(self, host): self.host = host log.info(_("XEP_0448 plugin initialization")) - host.registerNamespace("esfs", NS_ESFS) + host.register_namespace("esfs", NS_ESFS) self._u = host.plugins["XEP-0103"] self._h = host.plugins["XEP-0300"] self._hints = host.plugins["XEP-0334"] @@ -92,7 +92,7 @@ host.trigger.add("XEP-0363_upload_pre_slot", self._upload_pre_slot) host.trigger.add("XEP-0363_upload", self._upload_trigger) - def getHandler(self, client): + def get_handler(self, client): return XEP0448Handler() def parse_encrypted_elt(self, encrypted_elt: domish.Element) -> Dict[str, Any]: @@ -123,7 +123,7 @@ "invalid <encrypted/> element: {encrypted_elt.toXml()}" ) try: - hash_algo, hash_value = self._h.parseHashElt(encrypted_elt) + hash_algo, hash_value = self._h.parse_hash_elt(encrypted_elt) except exceptions.NotFound: pass else: @@ -237,7 +237,7 @@ # handle the attachment if it's not activated return False try: - await self._http_upload.getHTTPUploadEntity(client) + await self._http_upload.get_http_upload_entity(client) except exceptions.NotFound: return False else: @@ -279,7 +279,7 @@ "There should not be more that one attachment at this point" ) await self._attach.upload_files(client, data, upload_cb=self._upload_cb) - self._hints.addHintElements(data["xml"], [self._hints.HINT_STORE]) + self._hints.add_hint_elements(data["xml"], [self._hints.HINT_STORE]) for attachment in attachments: encryption_data = attachment.pop("encryption_data") file_hash = (attachment["hash_algo"], attachment["hash"]) @@ -301,7 +301,7 @@ "iv", content=base64.b64encode(encryption_data["iv"]).decode() ) - encrypted_elt.addChild(self._h.buildHashElt( + encrypted_elt.addChild(self._h.build_hash_elt( attachment["encrypted_hash"], attachment["encrypted_hash_algo"] )) @@ -446,9 +446,9 @@ attachment.update({ "hash_algo": self._h.ALGO_DEFAULT, - "hasher": self._h.getHasher(), + "hasher": self._h.get_hasher(), "encrypted_hash_algo": self._h.ALGO_DEFAULT, - "encrypted_hasher": self._h.getHasher(), + "encrypted_hasher": self._h.get_hasher(), }) # with data_cb we encrypt the file on the fly
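Both SFS plugins now use the snake_case XEP-0300 helpers (``get_hasher``, ``build_hash_elt``, ``parse_hash_elt``). A sketch of attaching a ``<hash/>`` element to a file description, assuming ``host``; the algorithm and value are placeholders, and the argument order is the one used above (value first, then algorithm):

    from twisted.words.xish import domish


    def add_hash_elt(host, file_elt: domish.Element, hash_value: str, algo: str = "sha-256") -> None:
        """Append a XEP-0300 <hash/> child describing ``file_elt``'s content."""
        hash_plugin = host.plugins["XEP-0300"]
        file_elt.addChild(hash_plugin.build_hash_elt(hash_value, algo))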
--- a/sat/plugins/plugin_xep_0465.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0465.py Sat Apr 08 13:54:42 2023 +0200 @@ -61,34 +61,34 @@ def __init__(self, host): log.info(_("Pubsub Public Subscriptions initialization")) - host.registerNamespace("pps", NS_PPS) + host.register_namespace("pps", NS_PPS) self.host = host - host.bridge.addMethod( - "psPublicSubscriptionsGet", + host.bridge.add_method( + "ps_public_subscriptions_get", ".plugin", in_sign="sss", out_sign="s", method=self._subscriptions, async_=True, ) - host.bridge.addMethod( - "psPublicSubscriptionsGet", + host.bridge.add_method( + "ps_public_subscriptions_get", ".plugin", in_sign="sss", out_sign="s", method=self._subscriptions, async_=True, ) - host.bridge.addMethod( - "psPublicNodeSubscriptionsGet", + host.bridge.add_method( + "ps_public_node_subscriptions_get", ".plugin", in_sign="sss", out_sign="a{ss}", - method=self._getPublicNodeSubscriptions, + method=self._get_public_node_subscriptions, async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return XEP_0465_Handler() @property @@ -99,7 +99,7 @@ def subscribers_node_prefix(self) -> str: return SUBSCRIBERS_NODE_PREFIX - def buildSubscriptionElt(self, node: str, service: jid.JID) -> domish.Element: + def build_subscription_elt(self, node: str, service: jid.JID) -> domish.Element: """Generate a <subscriptions> element This is the element that a service returns on public subscriptions request @@ -109,7 +109,7 @@ subscription_elt["service"] = service.full() return subscription_elt - def buildSubscriberElt(self, subscriber: jid.JID) -> domish.Element: + def build_subscriber_elt(self, subscriber: jid.JID) -> domish.Element: """Generate a <subscriber> element This is the element that a service returns on node public subscriptions request @@ -125,7 +125,7 @@ nodeIdentifier="", profile_key=C.PROF_KEY_NONE ) -> str: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) service = None if not service else jid.JID(service) subs = await self.subscriptions(client, service, nodeIdentifier or None) return data_format.serialise(subs) @@ -145,7 +145,7 @@ if service is None: service = client.jid.userhostJID() try: - items, __ = await self.host.plugins["XEP-0060"].getItems( + items, __ = await self.host.plugins["XEP-0060"].get_items( client, service, NS_PPS_SUBSCRIPTIONS ) except error.StanzaError as e: @@ -182,23 +182,23 @@ return ret @utils.ensure_deferred - async def _getPublicNodeSubscriptions( + async def _get_public_node_subscriptions( self, service: str, node: str, profile_key: str ) -> Dict[str, str]: - client = self.host.getClient(profile_key) - subs = await self.getPublicNodeSubscriptions( + client = self.host.get_client(profile_key) + subs = await self.get_public_node_subscriptions( client, jid.JID(service) if service else None, node ) return {j.full(): a for j, a in subs.items()} - def getPublicSubscribersNode(self, node: str) -> str: + def get_public_subscribers_node(self, node: str) -> str: """Return prefixed node to retrieve public subscribers""" return f"{NS_PPS_SUBSCRIBERS}/{node}" - async def getPublicNodeSubscriptions( + async def get_public_node_subscriptions( self, client: SatXMPPEntity, service: Optional[jid.JID], @@ -214,10 +214,10 @@ if service is None: service = client.jid.userhostJID() - subscribers_node = self.getPublicSubscribersNode(nodeIdentifier) + subscribers_node = self.get_public_subscribers_node(nodeIdentifier) try: - items, __ = await self.host.plugins["XEP-0060"].getItems( + items, 
__ = await self.host.plugins["XEP-0060"].get_items( client, service, subscribers_node ) except error.StanzaError as e: @@ -243,7 +243,7 @@ continue return ret - def setPublicOpt(self, options: Optional[dict] = None) -> dict: + def set_public_opt(self, options: Optional[dict] = None) -> dict: """Set option to make a subscription public @param options: dict where the option must be set
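Public subscriptions are now queried with ``get_public_node_subscriptions`` (bridge: ``ps_public_node_subscriptions_get``). A sketch, assuming ``host``/``client``; service and node are placeholders:

    from twisted.words.protocols.jabber import jid


    async def list_public_subscribers(host, client, service: str, node: str):
        """Return a mapping of subscriber full JID to subscription state."""
        pps = host.plugins["XEP-0465"]
        subs = await pps.get_public_node_subscriptions(client, jid.JID(service), node)
        return {entity.full(): state for entity, state in subs.items()}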
--- a/sat/plugins/plugin_xep_0470.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0470.py Sat Apr 08 13:54:42 2023 +0200 @@ -30,7 +30,7 @@ from sat.core.core_types import SatXMPPEntity from sat.core import exceptions from sat.tools.common import uri, data_format, date_utils -from sat.tools.utils import asDeferred, xmpp_date +from sat.tools.utils import as_deferred, xmpp_date log = getLogger(__name__) @@ -58,27 +58,27 @@ def __init__(self, host): log.info(_("XEP-0470 (Pubsub Attachments) plugin initialization")) - host.registerNamespace("pubsub-attachments", NS_PUBSUB_ATTACHMENTS) + host.register_namespace("pubsub-attachments", NS_PUBSUB_ATTACHMENTS) self.host = host self._p = host.plugins["XEP-0060"] self.handlers: Dict[Tuple[str, str], dict[str, Any]] = {} - host.trigger.add("XEP-0277_send", self.onMBSend) + host.trigger.add("XEP-0277_send", self.on_mb_send) self.register_attachment_handler( - "noticed", NS_PUBSUB_ATTACHMENTS, self.noticedGet, self.noticedSet + "noticed", NS_PUBSUB_ATTACHMENTS, self.noticed_get, self.noticed_set ) self.register_attachment_handler( - "reactions", NS_PUBSUB_ATTACHMENTS, self.reactionsGet, self.reactionsSet + "reactions", NS_PUBSUB_ATTACHMENTS, self.reactions_get, self.reactions_set ) - host.bridge.addMethod( - "psAttachmentsGet", + host.bridge.add_method( + "ps_attachments_get", ".plugin", in_sign="sssasss", out_sign="(ss)", method=self._get, async_=True, ) - host.bridge.addMethod( - "psAttachmentsSet", + host.bridge.add_method( + "ps_attachments_set", ".plugin", in_sign="ss", out_sign="", @@ -86,7 +86,7 @@ async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return PubsubAttachments_Handler() def register_attachment_handler( @@ -125,9 +125,9 @@ "set": set_cb } - def getAttachmentNodeName(self, service: jid.JID, node: str, item: str) -> str: + def get_attachment_node_name(self, service: jid.JID, node: str, item: str) -> str: """Generate name to use for attachment node""" - target_item_uri = uri.buildXMPPUri( + target_item_uri = uri.build_xmpp_uri( "pubsub", path=service.userhost(), node=node, @@ -135,17 +135,17 @@ ) return f"{NS_PUBSUB_ATTACHMENTS}/{target_item_uri}" - def isAttachmentNode(self, node: str) -> bool: + def is_attachment_node(self, node: str) -> bool: """Return True if node name is an attachment node""" return node.startswith(f"{NS_PUBSUB_ATTACHMENTS}/") - def attachmentNode2Item(self, node: str) -> Tuple[jid.JID, str, str]: + def attachment_node_2_item(self, node: str) -> Tuple[jid.JID, str, str]: """Retrieve service, node and item from attachement node's name""" - if not self.isAttachmentNode(node): + if not self.is_attachment_node(node): raise ValueError("this is not an attachment node!") prefix_len = len(f"{NS_PUBSUB_ATTACHMENTS}/") item_uri = node[prefix_len:] - parsed_uri = uri.parseXMPPUri(item_uri) + parsed_uri = uri.parse_xmpp_uri(item_uri) if parsed_uri["type"] != "pubsub": raise ValueError(f"unexpected URI type, it must be a pubsub URI: {item_uri}") try: @@ -156,7 +156,7 @@ item = parsed_uri["item"] return (service, node, item) - async def onMBSend( + async def on_mb_send( self, client: SatXMPPEntity, service: jid.JID, @@ -203,16 +203,16 @@ node_config.fields["pubsub#publish_model"].value = "open" except KeyError: log.warning("pubsub#publish_model field is missing") - attachment_node = self.getAttachmentNodeName(service, node, item_id) + attachment_node = self.get_attachment_node_name(service, node, item_id) # we use the same options as target node try: - await self._p.createIfNewNode( + 
await self._p.create_if_new_node( client, service, attachment_node, options=dict(node_config) ) except Exception as e: log.warning(f"Can't create attachment node {attachment_node}: {e}") - def items2attachmentData( + def items_2_attachment_data( self, client: SatXMPPEntity, items: List[domish.Element] @@ -267,11 +267,11 @@ extra_s: str, profile_key: str ) -> defer.Deferred: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) extra = data_format.deserialise(extra_s) senders = [jid.JID(s) for s in senders_s] d = defer.ensureDeferred( - self.getAttachments(client, jid.JID(service_s), node, item, senders) + self.get_attachments(client, jid.JID(service_s), node, item, senders) ) d.addCallback( lambda ret: @@ -280,7 +280,7 @@ ) return d - async def getAttachments( + async def get_attachments( self, client: SatXMPPEntity, service: jid.JID, @@ -298,21 +298,21 @@ entities will be retrieved. If None, attachments from all entities will be retrieved @param extra: extra data, will be used as ``extra`` argument when doing - ``getItems`` call. + ``get_items`` call. @return: A tuple with: - the list of attachments data, one item per found sender. The attachments data are dict containing attachment, no ``extra`` field is used here (contrarily to attachments data used with ``set_attachements``). - - metadata returned by the call to ``getItems`` + - metadata returned by the call to ``get_items`` """ if extra is None: extra = {} - attachment_node = self.getAttachmentNodeName(service, node, item) + attachment_node = self.get_attachment_node_name(service, node, item) item_ids = [e.userhost() for e in senders] if senders else None - items, metadata = await self._p.getItems( + items, metadata = await self._p.get_items( client, service, attachment_node, item_ids=item_ids, extra=extra ) - list_data = self.items2attachmentData(client, items) + list_data = self.items_2_attachment_data(client, items) return list_data, metadata @@ -321,7 +321,7 @@ attachments_s: str, profile_key: str ) -> None: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) attachments = data_format.deserialise(attachments_s) or {} return defer.ensureDeferred(self.set_attachements(client, attachments)) @@ -378,7 +378,7 @@ former_elt = next(attachments_elt.elements(namespace, name)) except StopIteration: former_elt = None - new_elt = await asDeferred( + new_elt = await as_deferred( handler["set"], client, attachments_data, former_elt ) if new_elt != former_elt: @@ -417,9 +417,9 @@ raise ValueError( 'data must have "service", "node" and "id" set' ) - attachment_node = self.getAttachmentNodeName(service, node, item) + attachment_node = self.get_attachment_node_name(service, node, item) try: - items, __ = await self._p.getItems( + items, __ = await self._p.get_items( client, service, attachment_node, item_ids=[client.jid.userhost()] ) except exceptions.NotFound: @@ -437,7 +437,7 @@ ) try: - await self._p.sendItems(client, service, attachment_node, [item_elt]) + await self._p.send_items(client, service, attachment_node, [item_elt]) except error.StanzaError as e: if e.condition == "item-not-found": # the node doesn't exist, we can't publish attachments @@ -462,11 +462,11 @@ @param node: node of target item (used to get attachment node's name) @param item: name of target item (used to get attachment node's name) """ - attachment_node = self.getAttachmentNodeName(service, node, item) + attachment_node = self.get_attachment_node_name(service, node, item) await 
self._p.subscribe(client, service, attachment_node) - def setTimestamp(self, attachment_elt: domish.Element, data: dict) -> None: + def set_timestamp(self, attachment_elt: domish.Element, data: dict) -> None: """Check if a ``timestamp`` attribute is set, parse it, and fill data @param attachments_elt: element where the ``timestamp`` attribute may be set @@ -482,7 +482,7 @@ else: data["timestamp"] = timestamp - def noticedGet( + def noticed_get( self, client: SatXMPPEntity, attachments_elt: domish.Element, @@ -498,10 +498,10 @@ noticed_data = { "noticed": True } - self.setTimestamp(noticed_elt, noticed_data) + self.set_timestamp(noticed_elt, noticed_data) data["noticed"] = noticed_data - def noticedSet( + def noticed_set( self, client: SatXMPPEntity, data: Dict[str, Any], @@ -525,7 +525,7 @@ else: return None - def reactionsGet( + def reactions_get( self, client: SatXMPPEntity, attachments_elt: domish.Element, @@ -542,10 +542,10 @@ reactions = reactions_data["reactions"] for reaction_elt in reactions_elt.elements(NS_PUBSUB_ATTACHMENTS, "reaction"): reactions.append(str(reaction_elt)) - self.setTimestamp(reactions_elt, reactions_data) + self.set_timestamp(reactions_elt, reactions_data) data["reactions"] = reactions_data - def reactionsSet( + def reactions_set( self, client: SatXMPPEntity, data: Dict[str, Any],
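Attachments of a pubsub item are now fetched with ``get_attachments`` (bridge: ``ps_attachments_get``). A sketch, assuming ``host``/``client``; ``senders=None`` retrieves attachments from every entity, as documented above:

    from twisted.words.protocols.jabber import jid


    async def get_item_attachments(host, client, service: str, node: str, item: str):
        """Return the attachments (noticed, reactions) published for one item."""
        attachments_plugin = host.plugins["XEP-0470"]
        attachments, __ = await attachments_plugin.get_attachments(
            client, jid.JID(service), node, item, senders=None
        )
        return attachments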
--- a/sat/plugins/plugin_xep_0471.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/plugins/plugin_xep_0471.py Sat Apr 08 13:54:42 2023 +0200 @@ -80,67 +80,67 @@ self._sfs = host.plugins["XEP-0447"] self._a = host.plugins["XEP-0470"] # self._i = host.plugins.get("EMAIL_INVITATION") - host.registerNamespace("events", NS_EVENTS) + host.register_namespace("events", NS_EVENTS) self._a.register_attachment_handler("rsvp", NS_EVENTS, self.rsvp_get, self.rsvp_set) # host.plugins["PUBSUB_INVITATION"].register(NS_EVENTS, self) - host.bridge.addMethod( - "eventsGet", + host.bridge.add_method( + "events_get", ".plugin", in_sign="ssasss", out_sign="s", method=self._events_get, async_=True, ) - host.bridge.addMethod( - "eventCreate", + host.bridge.add_method( + "event_create", ".plugin", in_sign="sssss", out_sign="", method=self._event_create, async_=True, ) - host.bridge.addMethod( - "eventModify", + host.bridge.add_method( + "event_modify", ".plugin", in_sign="sssss", out_sign="", method=self._event_modify, async_=True, ) - host.bridge.addMethod( - "eventInviteeGet", + host.bridge.add_method( + "event_invitee_get", ".plugin", in_sign="sssasss", out_sign="s", method=self._event_invitee_get, async_=True, ) - host.bridge.addMethod( - "eventInviteeSet", + host.bridge.add_method( + "event_invitee_set", ".plugin", in_sign="sssss", out_sign="", method=self._event_invitee_set, async_=True, ) - host.bridge.addMethod( - "eventInviteesList", + host.bridge.add_method( + "event_invitees_list", ".plugin", in_sign="sss", out_sign="a{sa{ss}}", - method=self._eventInviteesList, + method=self._event_invitees_list, async_=True, ), - host.bridge.addMethod( - "eventInvite", + host.bridge.add_method( + "event_invite", ".plugin", in_sign="sssss", out_sign="", method=self._invite, async_=True, ) - host.bridge.addMethod( - "eventInviteByEmail", + host.bridge.add_method( + "event_invite_by_email", ".plugin", in_sign="ssssassssssss", out_sign="", @@ -148,10 +148,10 @@ async_=True, ) - def getHandler(self, client): + def get_handler(self, client): return EventsHandler(self) - def _parseEventElt(self, event_elt): + def _parse_event_elt(self, event_elt): """Helper method to parse event element @param (domish.Element): event_elt @@ -191,7 +191,7 @@ try: elt = next(event_elt.elements(NS_EVENT, uri_type)) uri = data[uri_type + "_uri"] = elt["uri"] - uri_data = xmpp_uri.parseXMPPUri(uri) + uri_data = xmpp_uri.parse_xmpp_uri(uri) if uri_data["type"] != "pubsub": raise ValueError except StopIteration: @@ -349,7 +349,7 @@ if rsvp_form is None: log.warning(f"RSVP form is missing: {rsvp_elt.toXml()}") continue - rsvp_data = xml_tools.dataForm2dataDict(rsvp_form) + rsvp_data = xml_tools.data_form_2_data_dict(rsvp_form) if rsvp_lang: rsvp_data["language"] = rsvp_lang event_data.setdefault("rsvp", []).append(rsvp_data) @@ -419,7 +419,7 @@ def _events_get( self, service: str, node: str, event_ids: List[str], extra: str, profile_key: str ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) d = defer.ensureDeferred( self.events_get( client, @@ -449,7 +449,7 @@ """ if service is None: service = client.jid.userhostJID() - items, __ = await self._p.getItems( + items, __ = await self._p.get_items( client, service, node, item_ids=events_ids, extra=extra ) events = [] @@ -471,7 +471,7 @@ event_id: str = "", profile_key: str = C.PROF_KEY_NONE ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return defer.ensureDeferred( self.event_create( client, @@ -523,7 +523,7 @@ If several 
locations are used, they must have distinct IDs rsvp(list[dict]) RSVP data. The dict is a data dict as used in - sat.tools.xml_tools.dataDict2dataForm with some extra keys. + sat.tools.xml_tools.data_dict_2_data_form with some extra keys. The "attending" key is automatically added if it's not already present, except if the "no_default" key is present. Thus, an empty dict can be used to use default RSVP. @@ -702,7 +702,7 @@ "required": True }) rsvp_data["namespace"] = NS_RSVP - rsvp_form = xml_tools.dataDict2dataForm(rsvp_data) + rsvp_form = xml_tools.data_dict_2_data_form(rsvp_data) rsvp_elt.addChild(rsvp_form.toElement()) for node_type in ("invitees", "comments", "blog", "schedule"): @@ -790,7 +790,7 @@ item_elt = pubsub.Item(id=event_id, payload=event_elt) options = {self._p.OPT_ACCESS_MODEL: self._p.ACCESS_WHITELIST} - await self._p.createIfNewNode( + await self._p.create_if_new_node( client, service, nodeIdentifier=node, options=options ) await self._p.publish(client, service, node, items=[item_elt]) @@ -805,7 +805,7 @@ node: str, profile_key: str = C.PROF_KEY_NONE ) -> None: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) defer.ensureDeferred( self.event_modify( client, @@ -855,7 +855,7 @@ rsvp_form = data_form.findForm(rsvp_elt, NS_RSVP) if rsvp_form is not None: data["rsvp"] = rsvp_data = dict(rsvp_form) - self._a.setTimestamp(rsvp_elt, rsvp_data) + self._a.set_timestamp(rsvp_elt, rsvp_data) def rsvp_set( self, @@ -890,7 +890,7 @@ extra: str, profile_key: str ) -> defer.Deferred: - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) if invitees_s: invitees = [jid.JID(i) for i in invitees_s] else: @@ -923,7 +923,7 @@ @param node: PubSub node of the event @param item: PubSub item of the event @param invitees: if set, only retrieve RSVPs from those guests - @param extra: extra data used to retrieve items as for [getAttachments] + @param extra: extra data used to retrieve items as for [get_attachments] @return: mapping of invitee bare JID to their RSVP an empty dict is returned if nothing has been answered yed """ @@ -931,7 +931,7 @@ service = client.jid.userhostJID() if node is None: node = NS_EVENTS - attachments, metadata = await self._a.getAttachments( + attachments, metadata = await self._a.get_attachments( client, service, node, item, invitees, extra ) ret = {} @@ -952,7 +952,7 @@ rsvp_s: str, profile_key: str ): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return defer.ensureDeferred( self.event_invitee_set( client, @@ -989,15 +989,15 @@ "extra": {"rsvp": rsvp} }) - def _eventInviteesList(self, service, node, profile_key): + def _event_invitees_list(self, service, node, profile_key): service = jid.JID(service) if service else None node = node if node else NS_EVENT - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) return defer.ensureDeferred( - self.eventInviteesList(client, service, node) + self.event_invitees_list(client, service, node) ) - async def eventInviteesList(self, client, service, node): + async def event_invitees_list(self, client, service, node): """Retrieve attendance from event node @param service(unicode, None): PubSub service @@ -1005,7 +1005,7 @@ @return (dict): a dict with current attendance status, an empty dict is returned if nothing has been answered yed """ - items, metadata = await self._p.getItems(client, service, node) + items, metadata = await self._p.get_items(client, service, node) invitees = {} 
for item in items: try: @@ -1025,7 +1025,7 @@ invitees[item["id"]] = data return invitees - async def invitePreflight( + async def invite_preflight( self, client: SatXMPPEntity, invitee_jid: jid.JID, @@ -1048,17 +1048,17 @@ invitees_node = event_data["invitees_node"] blog_service = jid.JID(event_data["blog_service"]) blog_node = event_data["blog_node"] - await self._p.setNodeAffiliations( + await self._p.set_node_affiliations( client, invitees_service, invitees_node, {invitee_jid: "publisher"} ) log.debug( f"affiliation set on invitee node (jid: {invitees_service}, " f"node: {invitees_node!r})" ) - await self._p.setNodeAffiliations( + await self._p.set_node_affiliations( client, blog_service, blog_node, {invitee_jid: "member"} ) - blog_items, __ = await self._b.mbGet(client, blog_service, blog_node, None) + blog_items, __ = await self._b.mb_get(client, blog_service, blog_node, None) for item in blog_items: try: @@ -1071,13 +1071,13 @@ ) ) else: - await self._p.setNodeAffiliations( + await self._p.set_node_affiliations( client, comments_service, comments_node, {invitee_jid: "publisher"} ) log.debug(_("affiliation set on blog and comments nodes")) def _invite(self, invitee_jid, service, node, item_id, profile): - return self.host.plugins["PUBSUB_INVITATION"]._sendPubsubInvitation( + return self.host.plugins["PUBSUB_INVITATION"]._send_pubsub_invitation( invitee_jid, service, node, item_id or NS_EVENT, profile_key=profile ) @@ -1085,7 +1085,7 @@ name="", host_name="", language="", url_template="", message_subject="", message_body="", profile_key=C.PROF_KEY_NONE): - client = self.host.getClient(profile_key) + client = self.host.get_client(profile_key) kwargs = { "profile": client.profile, "emails_extra": [str(e) for e in emails_extra], @@ -1121,7 +1121,7 @@ _('"XEP-0277" (blog) plugin is needed for this feature') ) service = service or client.jid.userhostJID() - event_uri = xmpp_uri.buildXMPPUri( + event_uri = xmpp_uri.build_xmpp_uri( "pubsub", path=service.full(), node=node, item=id_ ) kwargs["extra"] = {"event_uri": event_uri} @@ -1146,7 +1146,7 @@ link_elt["service"] = service.full() link_elt["node"] = node link_elt["item"] = item_id - __, event_data = self._parseEventElt(event_elt) + __, event_data = self._parse_event_elt(event_elt) try: name = event_data["name"] except KeyError:
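
Only the names change here: the bridge registration pattern itself is untouched by the refactoring. A stripped-down sketch of a hypothetical plugin following the renamed API (``ExamplePlugin`` and ``example_get`` are invented for illustration; ``add_method``, ``get_client`` and the keyword arguments are the ones visible in the hunks above)::

    from twisted.internet import defer

    class ExamplePlugin:
        def __init__(self, host):
            self.host = host
            host.bridge.add_method(
                "example_get",        # bridge name, now snake_case
                ".plugin",
                in_sign="ss",
                out_sign="s",
                method=self._example_get,
                async_=True,
            )

        def _example_get(self, arg: str, profile_key: str):
            # the frontend passes a profile key, the plugin resolves the client
            client = self.host.get_client(profile_key)
            return defer.ensureDeferred(self.example_get(client, arg))

        async def example_get(self, client, arg: str) -> str:
            return arg
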
--- a/sat/stdui/ui_contact_list.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/stdui/ui_contact_list.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,72 +29,72 @@ def __init__(self, host): self.host = host - self.__add_id = host.registerCallback(self._addContact, with_data=True) - self.__update_id = host.registerCallback(self._updateContact, with_data=True) - self.__confirm_delete_id = host.registerCallback( - self._getConfirmRemoveXMLUI, with_data=True + self.__add_id = host.register_callback(self._add_contact, with_data=True) + self.__update_id = host.register_callback(self._update_contact, with_data=True) + self.__confirm_delete_id = host.register_callback( + self._get_confirm_remove_xmlui, with_data=True ) - host.importMenu( + host.import_menu( (D_("Contacts"), D_("Add contact")), - self._getAddDialogXMLUI, + self._get_add_dialog_xmlui, security_limit=2, help_string=D_("Add contact"), ) - host.importMenu( + host.import_menu( (D_("Contacts"), D_("Update contact")), - self._getUpdateDialogXMLUI, + self._get_update_dialog_xmlui, security_limit=2, help_string=D_("Update contact"), ) - host.importMenu( + host.import_menu( (D_("Contacts"), D_("Remove contact")), - self._getRemoveDialogXMLUI, + self._get_remove_dialog_xmlui, security_limit=2, help_string=D_("Remove contact"), ) # FIXME: a plugin should not be used here, and current profile's jid host would be better than installation wise host if "MISC-ACCOUNT" in self.host.plugins: - self.default_host = self.host.plugins["MISC-ACCOUNT"].getNewAccountDomain() + self.default_host = self.host.plugins["MISC-ACCOUNT"].account_domain_new_get() else: self.default_host = "example.net" - def getContacts(self, profile): + def contacts_get(self, profile): """Return a sorted list of the contacts for that profile @param profile: %(doc_profile)s @return: list[string] """ - client = self.host.getClient(profile) - ret = [contact.full() for contact in client.roster.getJids()] + client = self.host.get_client(profile) + ret = [contact.full() for contact in client.roster.get_jids()] ret.sort() return ret - def getGroups(self, new_groups=None, profile=C.PROF_KEY_NONE): + def get_groups(self, new_groups=None, profile=C.PROF_KEY_NONE): """Return a sorted list of the groups for that profile @param new_group (list): add these groups to the existing ones @param profile: %(doc_profile)s @return: list[string] """ - client = self.host.getClient(profile) - ret = client.roster.getGroups() + client = self.host.get_client(profile) + ret = client.roster.get_groups() ret.sort() ret.extend([group for group in new_groups if group not in ret]) return ret - def getGroupsOfContact(self, user_jid_s, profile): + def get_groups_of_contact(self, user_jid_s, profile): """Return all the groups of the given contact @param user_jid_s (string) @param profile: %(doc_profile)s @return: list[string] """ - client = self.host.getClient(profile) - return client.roster.getItem(jid.JID(user_jid_s)).groups + client = self.host.get_client(profile) + return client.roster.get_item(jid.JID(user_jid_s)).groups - def getGroupsOfAllContacts(self, profile): + def get_groups_of_all_contacts(self, profile): """Return a mapping between the contacts and their groups @param profile: %(doc_profile)s @@ -102,8 +102,8 @@ - key: the JID userhost - value: list of groups """ - client = self.host.getClient(profile) - return {item.jid.userhost(): item.groups for item in client.roster.getItems()} + client = self.host.get_client(profile) + return {item.jid.userhost(): item.groups for item in client.roster.get_items()} def 
_data2elts(self, data): """Convert a contacts data dict to minidom Elements @@ -122,7 +122,7 @@ elts.append(key_elt) return elts - def getDialogXMLUI(self, options, data, profile): + def get_dialog_xmlui(self, options, data, profile): """Generic method to return the XMLUI dialog for adding or updating a contact @param options (dict): parameters for the dialog, with the keys: @@ -141,14 +141,14 @@ form_ui.addText(options["contact_text"]) if options["id"] == self.__add_id: contact = data.get( - xml_tools.formEscape("contact_jid"), "@%s" % self.default_host + xml_tools.form_escape("contact_jid"), "@%s" % self.default_host ) form_ui.addString("contact_jid", value=contact) elif options["id"] == self.__update_id: - contacts = self.getContacts(profile) + contacts = self.contacts_get(profile) list_ = form_ui.addList("contact_jid", options=contacts, selected=contacts[0]) - elts = self._data2elts(self.getGroupsOfAllContacts(profile)) - list_.setInternalCallback( + elts = self._data2elts(self.get_groups_of_all_contacts(profile)) + list_.set_internal_callback( "groups_of_contact", fields=["contact_jid", "groups_list"], data_elts=elts ) @@ -160,25 +160,25 @@ selected_groups = data["selected_groups"] elif options["id"] == self.__update_id: try: - selected_groups = self.getGroupsOfContact(contacts[0], profile) + selected_groups = self.get_groups_of_contact(contacts[0], profile) except IndexError: pass - groups = self.getGroups(selected_groups, profile) + groups = self.get_groups(selected_groups, profile) form_ui.addList( "groups_list", options=groups, selected=selected_groups, styles=["multi"] ) - adv_list = form_ui.changeContainer("advanced_list", columns=3, selectable="no") + adv_list = form_ui.change_container("advanced_list", columns=3, selectable="no") form_ui.addLabel(D_("Add group")) form_ui.addString("add_group") button = form_ui.addButton("", value=D_("Add")) - button.setInternalCallback("move", fields=["add_group", "groups_list"]) + button.set_internal_callback("move", fields=["add_group", "groups_list"]) adv_list.end() form_ui.addDivider("blank") return {"xmlui": form_ui.toXml()} - def _getAddDialogXMLUI(self, data, profile): + def _get_add_dialog_xmlui(self, data, profile): """Get the dialog for adding contact @param data (dict) @@ -190,16 +190,16 @@ "title": D_("Add contact"), "contact_text": D_("New contact identifier (JID):"), } - return self.getDialogXMLUI(options, {}, profile) + return self.get_dialog_xmlui(options, {}, profile) - def _getUpdateDialogXMLUI(self, data, profile): + def _get_update_dialog_xmlui(self, data, profile): """Get the dialog for updating contact @param data (dict) @param profile: %(doc_profile)s @return dict """ - if not self.getContacts(profile): + if not self.contacts_get(profile): _dialog = xml_tools.XMLUI("popup", title=D_("Nothing to update")) _dialog.addText(_("Your contact list is empty.")) return {"xmlui": _dialog.toXml()} @@ -209,16 +209,16 @@ "title": D_("Update contact"), "contact_text": D_("Which contact do you want to update?"), } - return self.getDialogXMLUI(options, {}, profile) + return self.get_dialog_xmlui(options, {}, profile) - def _getRemoveDialogXMLUI(self, data, profile): + def _get_remove_dialog_xmlui(self, data, profile): """Get the dialog for removing contact @param data (dict) @param profile: %(doc_profile)s @return dict """ - if not self.getContacts(profile): + if not self.contacts_get(profile): _dialog = xml_tools.XMLUI("popup", title=D_("Nothing to delete")) _dialog.addText(_("Your contact list is empty.")) return {"xmlui": 
_dialog.toXml()} @@ -228,10 +228,10 @@ title=D_("Who do you want to remove from your contacts?"), submit_id=self.__confirm_delete_id, ) - form_ui.addList("contact_jid", options=self.getContacts(profile)) + form_ui.addList("contact_jid", options=self.contacts_get(profile)) return {"xmlui": form_ui.toXml()} - def _getConfirmRemoveXMLUI(self, data, profile): + def _get_confirm_remove_xmlui(self, data, profile): """Get the confirmation dialog for removing contact @param data (dict) @@ -240,21 +240,21 @@ """ if C.bool(data.get("cancelled", "false")): return {} - contact = data[xml_tools.formEscape("contact_jid")] + contact = data[xml_tools.form_escape("contact_jid")] def delete_cb(data, profile): if not C.bool(data.get("cancelled", "false")): - self._deleteContact(jid.JID(contact), profile) + self._delete_contact(jid.JID(contact), profile) return {} - delete_id = self.host.registerCallback(delete_cb, with_data=True, one_shot=True) + delete_id = self.host.register_callback(delete_cb, with_data=True, one_shot=True) form_ui = xml_tools.XMLUI("form", title=D_("Delete contact"), submit_id=delete_id) form_ui.addText( D_("Are you sure you want to remove %s from your contact list?") % contact ) return {"xmlui": form_ui.toXml()} - def _addContact(self, data, profile): + def _add_contact(self, data, profile): """Add the selected contact @param data (dict) @@ -263,12 +263,12 @@ """ if C.bool(data.get("cancelled", "false")): return {} - contact_jid_s = data[xml_tools.formEscape("contact_jid")] + contact_jid_s = data[xml_tools.form_escape("contact_jid")] try: contact_jid = jid.JID(contact_jid_s) except (RuntimeError, jid.InvalidFormat, AttributeError): - # TODO: replace '\t' by a constant (see tools.xmlui.XMLUI.onFormSubmitted) - data["selected_groups"] = data[xml_tools.formEscape("groups_list")].split( + # TODO: replace '\t' by a constant (see tools.xmlui.XMLUI.on_form_submitted) + data["selected_groups"] = data[xml_tools.form_escape("groups_list")].split( "\t" ) options = { @@ -277,32 +277,32 @@ "contact_text": D_('Please enter a valid JID (like "contact@%s"):') % self.default_host, } - return self.getDialogXMLUI(options, data, profile) - self.host.addContact(contact_jid, profile_key=profile) - return self._updateContact(data, profile) # after adding, updating + return self.get_dialog_xmlui(options, data, profile) + self.host.contact_add(contact_jid, profile_key=profile) + return self._update_contact(data, profile) # after adding, updating - def _updateContact(self, data, profile): + def _update_contact(self, data, profile): """Update the selected contact @param data (dict) @param profile: %(doc_profile)s @return dict """ - client = self.host.getClient(profile) + client = self.host.get_client(profile) if C.bool(data.get("cancelled", "false")): return {} - contact_jid = jid.JID(data[xml_tools.formEscape("contact_jid")]) - # TODO: replace '\t' by a constant (see tools.xmlui.XMLUI.onFormSubmitted) - groups = data[xml_tools.formEscape("groups_list")].split("\t") - self.host.updateContact(client, contact_jid, name="", groups=groups) + contact_jid = jid.JID(data[xml_tools.form_escape("contact_jid")]) + # TODO: replace '\t' by a constant (see tools.xmlui.XMLUI.on_form_submitted) + groups = data[xml_tools.form_escape("groups_list")].split("\t") + self.host.contact_update(client, contact_jid, name="", groups=groups) return {} - def _deleteContact(self, contact_jid, profile): + def _delete_contact(self, contact_jid, profile): """Delete the selected contact @param contact_jid (JID) @param profile: %(doc_profile)s 
@return dict """ - self.host.delContact(contact_jid, profile_key=profile) + self.host.contact_del(contact_jid, profile_key=profile) return {}
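
The dialogs in this module are all built the same way, so the renames are mechanical: the ``add*`` element methods of XMLUI keep their names in this changeset, while helpers such as ``change_container``, ``set_internal_callback``, ``form_escape`` and ``register_callback`` move to snake_case. A minimal sketch of the pattern (the ``build_remove_dialog`` wrapper is invented; the individual calls are the ones shown above)::

    from sat.tools import xml_tools

    def build_remove_dialog(host, contacts, profile):
        def delete_cb(data, profile):
            # read the submitted value back with the same escaping helper
            contact = data[xml_tools.form_escape("contact_jid")]
            return {}

        delete_id = host.register_callback(delete_cb, with_data=True, one_shot=True)
        form_ui = xml_tools.XMLUI("form", title="Delete contact", submit_id=delete_id)
        form_ui.addList("contact_jid", options=contacts)
        return {"xmlui": form_ui.toXml()}
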
--- a/sat/stdui/ui_profile_manager.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/stdui/ui_profile_manager.py Sat Apr 08 13:54:42 2023 +0200 @@ -38,18 +38,18 @@ self.host = host self.profile_ciphers = {} self._sessions = ProfileSessions() - host.registerCallback( - self._authenticateProfile, force_id=C.AUTHENTICATE_PROFILE_ID, with_data=True + host.register_callback( + self._authenticate_profile, force_id=C.AUTHENTICATE_PROFILE_ID, with_data=True ) - host.registerCallback( - self._changeXMPPPassword, force_id=C.CHANGE_XMPP_PASSWD_ID, with_data=True + host.register_callback( + self._change_xmpp_password, force_id=C.CHANGE_XMPP_PASSWD_ID, with_data=True ) - self.__new_xmpp_passwd_id = host.registerCallback( - self._changeXMPPPasswordCb, with_data=True + self.__new_xmpp_passwd_id = host.register_callback( + self._change_xmpp_password_cb, with_data=True ) - def _startSessionEb(self, fail, first, profile): - """Errback method for startSession during profile authentication + def _start_session_eb(self, fail, first, profile): + """Errback method for start_session during profile authentication @param first(bool): if True, this is the first try and we have tryied empty password in this case we ask for a password to the user. @@ -62,8 +62,8 @@ "form", title=D_("Profile password for {}").format(profile), submit_id="" ) form_ui.addPassword("profile_password", value="") - d = xml_tools.deferredUI(self.host, form_ui, chained=True) - d.addCallback(self._authenticateProfile, profile) + d = xml_tools.deferred_ui(self.host, form_ui, chained=True) + d.addCallback(self._authenticate_profile, profile) return {"xmlui": form_ui.toXml()} assert profile is None @@ -77,13 +77,13 @@ dialog.addText(D_("Internal error: {}".format(fail))) return {"xmlui": dialog.toXml(), "validated": C.BOOL_FALSE} - def _authenticateProfile(self, data, profile): + def _authenticate_profile(self, data, profile): if C.bool(data.get("cancelled", "false")): return {} - if self.host.memory.isSessionStarted(profile): + if self.host.memory.is_session_started(profile): return {"validated": C.BOOL_TRUE} try: - password = data[xml_tools.formEscape("profile_password")] + password = data[xml_tools.form_escape("profile_password")] except KeyError: # first request, we try empty password password = "" @@ -92,30 +92,30 @@ else: first = False eb_profile = None - d = self.host.memory.startSession(password, profile) + d = self.host.memory.start_session(password, profile) d.addCallback(lambda __: {"validated": C.BOOL_TRUE}) - d.addErrback(self._startSessionEb, first, eb_profile) + d.addErrback(self._start_session_eb, first, eb_profile) return d - def _changeXMPPPassword(self, data, profile): - session_data = self._sessions.profileGetUnique(profile) + def _change_xmpp_password(self, data, profile): + session_data = self._sessions.profile_get_unique(profile) if not session_data: - server = self.host.memory.getParamA( + server = self.host.memory.param_get_a( C.FORCE_SERVER_PARAM, "Connection", profile_key=profile ) if not server: server = jid.parse( - self.host.memory.getParamA( + self.host.memory.param_get_a( "JabberID", "Connection", profile_key=profile ) )[1] - session_id, session_data = self._sessions.newSession( + session_id, session_data = self._sessions.new_session( {"count": 0, "server": server}, profile=profile ) if ( session_data["count"] > 2 ): # 3 attempts with a new password after the initial try - self._sessions.profileDelUnique(profile) + self._sessions.profile_del_unique(profile) _dialog = xml_tools.XMLUI("popup", title=D_("Connection error")) 
_dialog.addText( D_("Can't connect to %s. Please check your connection details.") @@ -140,12 +140,12 @@ form_ui.addPassword("xmpp_password", value="") return {"xmlui": form_ui.toXml()} - def _changeXMPPPasswordCb(self, data, profile): - xmpp_password = data[xml_tools.formEscape("xmpp_password")] - d = self.host.memory.setParam( + def _change_xmpp_password_cb(self, data, profile): + xmpp_password = data[xml_tools.form_escape("xmpp_password")] + d = self.host.memory.param_set( "Password", xmpp_password, "Connection", profile_key=profile ) d.addCallback(lambda __: defer.ensureDeferred(self.host.connect(profile))) d.addCallback(lambda __: {}) - d.addErrback(lambda __: self._changeXMPPPassword({}, profile)) + d.addErrback(lambda __: self._change_xmpp_password({}, profile)) return d
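
The authentication flow above is plain Twisted callback/errback chaining: an empty password is tried first, and the user is only prompted when that first attempt fails. A reduced sketch of that control flow, with a stand-in ``start_session`` in place of ``memory.start_session`` and plain dicts in place of the XMLUI dialogs::

    from twisted.internet import defer

    def authenticate(start_session, password=""):
        first = password == ""

        def eb(failure, first):
            if first:
                # empty password rejected: this is where the real code builds
                # the "Profile password" dialog and chains a new attempt
                return {"ask_password": True}
            return {"validated": False}

        d = start_session(password)
        d.addCallback(lambda __: {"validated": True})
        d.addErrback(eb, first)
        return d

    # usage with a session that accepts the empty password:
    d = authenticate(lambda password: defer.succeed(None))
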
--- a/sat/test/helpers.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/helpers.py Sat Apr 08 13:54:42 2023 +0200 @@ -20,7 +20,7 @@ ## logging configuration for tests ## from sat.core import log_config -log_config.satConfigure() +log_config.sat_configure() import logging from sat.core.log import getLogger @@ -52,14 +52,14 @@ return "True" if value else "False" -def muteLogging(): +def mute_logging(): """Temporarily set the logging level to CRITICAL to not pollute the output with expected errors.""" logger = getLogger() logger.original_level = logger.getEffectiveLevel() logger.setLevel(logging.CRITICAL) -def unmuteLogging(): +def unmute_logging(): """Restore the logging level after it has been temporarily disabled.""" logger = getLogger() logger.setLevel(logger.original_level) @@ -97,17 +97,17 @@ self.plugins = {} self.profiles = {} - def delContact(self, to, profile_key): + def contact_del(self, to, profile_key): #TODO pass - def registerCallback(self, callback, *args, **kwargs): + def register_callback(self, callback, *args, **kwargs): pass - def messageSend(self, to_s, msg, subject=None, mess_type='auto', extra={}, profile_key='@NONE@'): - self.sendAndStoreMessage({"to": JID(to_s)}) + def message_send(self, to_s, msg, subject=None, mess_type='auto', extra={}, profile_key='@NONE@'): + self.send_and_store_message({"to": JID(to_s)}) - def _sendMessageToStream(self, mess_data, client): + def _send_message_to_stream(self, mess_data, client): """Save the information to check later to whom messages have been sent. @param mess_data: message data dictionnary @@ -116,7 +116,7 @@ client.xmlstream.send(mess_data['xml']) return mess_data - def _storeMessage(self, mess_data, client): + def _store_message(self, mess_data, client): """Save the information to check later if entries have been added to the history. @param mess_data: message data dictionnary @@ -125,7 +125,7 @@ self.stored_messages.append(mess_data["to"]) return mess_data - def sendMessageToBridge(self, mess_data, client): + def send_message_to_bridge(self, mess_data, client): """Simulate the message being sent to the frontends. @param mess_data: message data dictionnary @@ -133,33 +133,33 @@ """ return mess_data # TODO - def getProfileName(self, profile_key): + def get_profile_name(self, profile_key): """Get the profile name from the profile_key""" return profile_key - def getClient(self, profile_key): + def get_client(self, profile_key): """Convenient method to get client from profile key @return: client or None if it doesn't exist""" - profile = self.memory.getProfileName(profile_key) + profile = self.memory.get_profile_name(profile_key) if not profile: raise exceptions.ProfileKeyUnknown if profile not in self.profiles: self.profiles[profile] = FakeClient(self, profile) return self.profiles[profile] - def getJidNStream(self, profile_key): + def get_jid_n_stream(self, profile_key): """Convenient method to get jid and stream from profile key @return: tuple (jid, xmlstream) from profile, can be None""" return (C.PROFILE_DICT[profile_key], None) - def isConnected(self, profile): + def is_connected(self, profile): return True - def getSentMessages(self, profile_index): + def get_sent_messages(self, profile_index): """Return all the sent messages (in the order they have been sent) and empty the list. Called by tests. FakeClient instances associated to each profile must have been previously initialized with the method - FakeSAT.getClient. + FakeSAT.get_client. @param profile_index: index of the profile to consider (cf. 
C.PROFILE) @return: the sent messages for given profile, or None""" @@ -170,10 +170,10 @@ except IndexError: return None - def getSentMessage(self, profile_index): + def get_sent_message(self, profile_index): """Pop and return the sent message in first position (works like a FIFO). Called by tests. FakeClient instances associated to each profile must have - been previously initialized with the method FakeSAT.getClient. + been previously initialized with the method FakeSAT.get_client. @param profile_index: index of the profile to consider (cf. C.PROFILE) @return: the sent message for given profile, or None""" @@ -182,20 +182,20 @@ except IndexError: return None - def getSentMessageXml(self, profile_index): + def get_sent_message_xml(self, profile_index): """Pop and return the sent message in first position (works like a FIFO). Called by tests. FakeClient instances associated to each profile must have - been previously initialized with the method FakeSAT.getClient. + been previously initialized with the method FakeSAT.get_client. @return: XML representation of the sent message for given profile, or None""" - entry = self.getSentMessage(profile_index) + entry = self.get_sent_message(profile_index) return entry.toXml() if entry else None - def findFeaturesSet(self, features, identity=None, jid_=None, profile=C.PROF_KEY_NONE): - """Call self.addFeature from your tests to change the return value. + def find_features_set(self, features, identity=None, jid_=None, profile=C.PROF_KEY_NONE): + """Call self.add_feature from your tests to change the return value. @return: a set of entities """ - client = self.getClient(profile) + client = self.get_client(profile) if jid_ is None: jid_ = JID(client.jid.host) try: @@ -205,12 +205,12 @@ pass return defer.succeed(set()) - def addFeature(self, jid_, feature, profile_key): + def add_feature(self, jid_, feature, profile_key): """Add a feature to an entity. To be called from your tests when needed. """ - client = self.getClient(profile_key) + client = self.get_client(profile_key) if not hasattr(client, 'features'): client.features = {} if jid_ not in client.features: @@ -224,13 +224,13 @@ def __init__(self): self.expected_calls = {} - def expectCall(self, name, *check_args, **check_kwargs): + def expect_call(self, name, *check_args, **check_kwargs): if hasattr(self, name): # queue this new call as one already exists self.expected_calls.setdefault(name, []) self.expected_calls[name].append((check_args, check_kwargs)) return - def checkCall(*args, **kwargs): + def check_call(*args, **kwargs): if args != check_args or kwargs != check_kwargs: print("\n\n--------------------") print("Args are not equals:") @@ -244,19 +244,19 @@ args, kwargs = self.expected_calls[name].pop(0) if len(self.expected_calls[name]) == 0: del self.expected_calls[name] - self.expectCall(name, *args, **kwargs) + self.expect_call(name, *args, **kwargs) - setattr(self, name, checkCall) + setattr(self, name, check_call) - def addMethod(self, name, int_suffix, in_sign, out_sign, method, async_=False, doc=None): + def add_method(self, name, int_suffix, in_sign, out_sign, method, async_=False, doc=None): pass - def addSignal(self, name, int_suffix, signature): + def add_signal(self, name, int_suffix, signature): pass - def addTestCallback(self, name, method): + def add_test_callback(self, name, method): """This can be used to register callbacks for bridge methods AND signals. 
- Contrary to expectCall, this will not check if the method or signal is + Contrary to expect_call, this will not check if the method or signal is called/sent with the correct arguments, it will instead run the callback of your choice.""" setattr(self, name, method) @@ -271,16 +271,16 @@ Params.__init__(self, host, storage) self.params = {} # naive simulation of values storage - def setParam(self, name, value, category, security_limit=-1, profile_key='@NONE@'): - profile = self.getProfileName(profile_key) + def param_set(self, name, value, category, security_limit=-1, profile_key='@NONE@'): + profile = self.get_profile_name(profile_key) self.params.setdefault(profile, {}) self.params[profile_key][(category, name)] = value - def getParamA(self, name, category, attr="value", profile_key='@NONE@'): - profile = self.getProfileName(profile_key) + def param_get_a(self, name, category, attr="value", profile_key='@NONE@'): + profile = self.get_profile_name(profile_key) return self.params[profile][(category, name)] - def getProfileName(self, profile_key, return_profile_keys=False): + def get_profile_name(self, profile_key, return_profile_keys=False): if profile_key == '@DEFAULT@': return C.PROFILE[0] elif profile_key == '@NONE@': @@ -288,7 +288,7 @@ else: return profile_key - def loadIndParams(self, profile, cache=None): + def load_ind_params(self, profile, cache=None): self.params[profile] = {} return defer.succeed(None) @@ -301,7 +301,7 @@ # manipulating basic stuff, the others should be overwritten when needed self.host = host self.params = FakeParams(host, None) - self.config = tools_config.parseMainConf() + self.config = tools_config.parse_main_conf() self.reinit() def reinit(self): @@ -312,29 +312,29 @@ self.params.frontends_cache = [] self.entities_data = {} - def getProfileName(self, profile_key, return_profile_keys=False): - return self.params.getProfileName(profile_key, return_profile_keys) + def get_profile_name(self, profile_key, return_profile_keys=False): + return self.params.get_profile_name(profile_key, return_profile_keys) - def addToHistory(self, from_jid, to_jid, message, _type='chat', extra=None, timestamp=None, profile="@NONE@"): + def add_to_history(self, from_jid, to_jid, message, _type='chat', extra=None, timestamp=None, profile="@NONE@"): pass - def addContact(self, contact_jid, attributes, groups, profile_key='@DEFAULT@'): + def contact_add(self, contact_jid, attributes, groups, profile_key='@DEFAULT@'): pass - def setPresenceStatus(self, contact_jid, show, priority, statuses, profile_key='@DEFAULT@'): + def set_presence_status(self, contact_jid, show, priority, statuses, profile_key='@DEFAULT@'): pass - def addWaitingSub(self, type_, contact_jid, profile_key): + def add_waiting_sub(self, type_, contact_jid, profile_key): pass - def delWaitingSub(self, contact_jid, profile_key): + def del_waiting_sub(self, contact_jid, profile_key): pass - def updateEntityData(self, entity_jid, key, value, silent=False, profile_key="@NONE@"): + def update_entity_data(self, entity_jid, key, value, silent=False, profile_key="@NONE@"): self.entities_data.setdefault(entity_jid, {}) self.entities_data[entity_jid][key] = value - def getEntityData(self, entity_jid, keys, profile_key): + def entity_data_get(self, entity_jid, keys, profile_key): result = {} for key in keys: result[key] = self.entities_data[entity_jid][key] @@ -358,9 +358,9 @@ SatRosterProtocol.__init__(self, host) self.parent = parent self._jids = {} - self.addItem(parent.jid.userhostJID()) + self.add_item(parent.jid.userhostJID()) 
- def addItem(self, jid, *args, **kwargs): + def add_item(self, jid, *args, **kwargs): if not args and not kwargs: # defaults values setted for the tests only kwargs["subscriptionTo"] = True @@ -369,9 +369,9 @@ attrs = {'to': b2s(roster_item.subscriptionTo), 'from': b2s(roster_item.subscriptionFrom), 'ask': b2s(roster_item.pendingOut)} if roster_item.name: attrs['name'] = roster_item.name - self.host.bridge.expectCall("newContact", jid.full(), attrs, roster_item.groups, self.parent.profile) + self.host.bridge.expect_call("contact_new", jid.full(), attrs, roster_item.groups, self.parent.profile) self._jids[jid] = roster_item - self._registerItem(roster_item) + self._register_item(roster_item) class FakeXmlStream(object): @@ -420,8 +420,8 @@ class SatTestCase(unittest.TestCase): - def assertEqualXML(self, xml, expected, ignore_blank=False): - def equalElt(got_elt, exp_elt): + def assert_equal_xml(self, xml, expected, ignore_blank=False): + def equal_elt(got_elt, exp_elt): if ignore_blank: for elt in got_elt, exp_elt: for attr in ('text', 'tail'): @@ -449,7 +449,7 @@ print("children len: got %d expected: %d" % (len(got_elt), len(exp_elt))) return False for idx, child in enumerate(got_elt): - if not equalElt(child, exp_elt[idx]): + if not equal_elt(child, exp_elt[idx]): return False return True @@ -460,7 +460,7 @@ xml_elt = etree.fromstring(remove_blank(xml) if ignore_blank else xml) expected_elt = etree.fromstring(remove_blank(expected) if ignore_blank else expected) - if not equalElt(xml_elt, expected_elt): + if not equal_elt(xml_elt, expected_elt): print("---") print("XML are not equals:") print("got:\n-\n%s\n-\n\n" % etree.tostring(xml_elt, encoding='utf-8')) @@ -468,7 +468,7 @@ print("---") raise DifferentXMLException - def assertEqualUnsortedList(self, a, b, msg): + def assert_equal_unsorted_list(self, a, b, msg): counter_a = Counter(a) counter_b = Counter(b) if counter_a != counter_b:
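
These helpers are what the unit tests below build on. A hypothetical test case using the renamed assertion helper (assuming the helpers are importable as ``sat.test.helpers``, as the file path suggests; the test body is invented)::

    from sat.test import helpers

    class DummyXMLTest(helpers.SatTestCase):
        def test_identity(self):
            xml = "<message><body>test</body></message>"
            # identical documents must compare equal, blank text ignored
            self.assert_equal_xml(xml, xml, ignore_blank=True)
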
--- a/sat/test/helpers_plugins.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/helpers_plugins.py Sat Apr 08 13:54:42 2023 +0200 @@ -48,7 +48,7 @@ @param profile_key: the profile key of the user joining the room @return: the deferred joined wokkel.muc.Room instance """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) roster = {} # ask the other profiles to fill our roster @@ -101,7 +101,7 @@ @param profile_key: the profile key of the user joining the room @return: a dummy deferred """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) room = self.joined_rooms[roomJID] # remove ourself from the other rosters for i in range(0, len(C.PROFILE)): @@ -134,20 +134,20 @@ @param profile_key: the profile of the user joining the room @return: the deferred joined wokkel.muc.Room instance or None """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if room_jid in self.clients[profile].joined_rooms: return defer.succeed(None) room = self.clients[profile].join(room_jid, nick, profile_key=profile) return room - def joinRoom(self, muc_index, user_index): + def join_room(self, muc_index, user_index): """Called by tests @return: the nickname of the user who joined room""" muc_jid = C.MUC[muc_index] nick = C.JID[user_index].user profile = C.PROFILE[user_index] self.join(muc_jid, nick, profile_key=profile) - return self.getNick(muc_index, user_index) + return self.get_nick(muc_index, user_index) def leave(self, room_jid, profile_key="@DEFAULT@"): """ @@ -155,21 +155,21 @@ @param profile_key: the profile of the user leaving the room @return: a dummy deferred """ - profile = self.host.memory.getProfileName(profile_key) + profile = self.host.memory.get_profile_name(profile_key) if room_jid not in self.clients[profile].joined_rooms: raise plugin_xep_0045.UnknownRoom("This room has not been joined") return self.clients[profile].leave(room_jid, profile) - def leaveRoom(self, muc_index, user_index): + def leave_room(self, muc_index, user_index): """Called by tests @return: the nickname of the user who left the room""" muc_jid = C.MUC[muc_index] - nick = self.getNick(muc_index, user_index) + nick = self.get_nick(muc_index, user_index) profile = C.PROFILE[user_index] self.leave(muc_jid, profile_key=profile) return nick - def getRoom(self, muc_index, user_index): + def get_room(self, muc_index, user_index): """Called by tests @return: a wokkel.muc.Room instance""" profile = C.PROFILE[user_index] @@ -179,13 +179,13 @@ except (AttributeError, KeyError): return None - def getNick(self, muc_index, user_index): + def get_nick(self, muc_index, user_index): try: - return self.getRoomNick(C.MUC[muc_index], C.PROFILE[user_index]) + return self.get_room_nick(C.MUC[muc_index], C.PROFILE[user_index]) except (KeyError, AttributeError): return "" - def getNickOfUser(self, muc_index, user_index, profile_index, secure=True): + def get_nick_of_user(self, muc_index, user_index, profile_index, secure=True): try: room = self.clients[C.PROFILE[profile_index]].joined_rooms[C.MUC[muc_index]] return self.getRoomNickOfUser(room, C.JID[user_index]) @@ -200,7 +200,7 @@ def invite(self, target, room, options={}, profile_key="@DEFAULT@"): """ Invite a user to a room. To accept the invitation from a test, - just call FakeXEP_0045.joinRoom (no need to have a dedicated method). + just call FakeXEP_0045.join_room (no need to have a dedicated method). 
@param target: jid of the user to invite @param room: jid of the room where the user is invited @options: attribute with extra info (reason, password) as in #XEP-0249 @@ -278,13 +278,13 @@ self.__rsm_responses[ext_data["id"]] = RSMResponse(len(items), *args) return defer.succeed(items) - def retractItems(self, service, nodeIdentifier, itemIdentifiers, sender=None): + def retract_items(self, service, nodeIdentifier, itemIdentifiers, sender=None): node = self.__items[nodeIdentifier] for item in [item for item in node if item["id"] in itemIdentifiers]: node.remove(item) return defer.succeed(None) - def getRSMResponse(self, id): + def get_rsm_response(self, id): if id not in self.__rsm_responses: return {} result = self.__rsm_responses[id].toDict() @@ -294,7 +294,7 @@ def subscriptions(self, service, nodeIdentifier, sender=None): return defer.succeed([]) - def service_getDiscoItems(self, service, nodeIdentifier, profile_key=C.PROF_KEY_NONE): + def service_get_disco_items(self, service, nodeIdentifier, profile_key=C.PROF_KEY_NONE): items = DiscoItems() for item in list(self.__items.keys()): items.append(DiscoItem(service, item))
--- a/sat/test/test_core_xmpp.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_core_xmpp.py Sat Apr 08 13:54:42 2023 +0200 @@ -46,14 +46,14 @@ self.message = xmpp.SatMessageProtocol(self.host) self.message.parent = helpers.FakeClient(self.host) - def test_onMessage(self): + def test_on_message(self): xml = """ <message type="chat" from="sender@example.net/house" to="test@example.org/SàT" id="test_1"> <body>test</body> </message> """ stanza = parseXml(xml) - self.host.bridge.expectCall("messageNew", "sender@example.net/house", "test", "chat", "test@example.org/SàT", {}, profile=Const.PROFILE[0]) + self.host.bridge.expect_call("message_new", "sender@example.net/house", "test", "chat", "test@example.org/SàT", {}, profile=Const.PROFILE[0]) self.message.onMessage(stanza) @@ -64,15 +64,15 @@ self.roster = xmpp.SatRosterProtocol(self.host) self.roster.parent = helpers.FakeClient(self.host) - def test__registerItem(self): + def test_register_item(self): roster_item = RosterItem(Const.JID[0]) roster_item.name = "Test Man" roster_item.subscriptionTo = True roster_item.subscriptionFrom = True roster_item.ask = False roster_item.groups = set(["Test Group 1", "Test Group 2", "Test Group 3"]) - self.host.bridge.expectCall("newContact", Const.JID_STR[0], {'to': 'True', 'from': 'True', 'ask': 'False', 'name': 'Test Man'}, set(["Test Group 1", "Test Group 2", "Test Group 3"]), Const.PROFILE[0]) - self.roster._registerItem(roster_item) + self.host.bridge.expect_call("contact_new", Const.JID_STR[0], {'to': 'True', 'from': 'True', 'ask': 'False', 'name': 'Test Man'}, set(["Test Group 1", "Test Group 2", "Test Group 3"]), Const.PROFILE[0]) + self.roster._register_item(roster_item) class SatPresenceProtocolTest(unittest.TestCase): @@ -83,29 +83,29 @@ self.presence.parent = helpers.FakeClient(self.host) def test_availableReceived(self): - self.host.bridge.expectCall("presenceUpdate", Const.JID_STR[0], "xa", 15, {'default': "test status", 'fr': 'statut de test'}, Const.PROFILE[0]) + self.host.bridge.expect_call("presence_update", Const.JID_STR[0], "xa", 15, {'default': "test status", 'fr': 'statut de test'}, Const.PROFILE[0]) self.presence.availableReceived(Const.JID[0], 'xa', {None: "test status", 'fr': 'statut de test'}, 15) - def test_availableReceived_empty_statuses(self): - self.host.bridge.expectCall("presenceUpdate", Const.JID_STR[0], "xa", 15, {}, Const.PROFILE[0]) + def test_available_received_empty_statuses(self): + self.host.bridge.expect_call("presence_update", Const.JID_STR[0], "xa", 15, {}, Const.PROFILE[0]) self.presence.availableReceived(Const.JID[0], 'xa', None, 15) def test_unavailableReceived(self): - self.host.bridge.expectCall("presenceUpdate", Const.JID_STR[0], "unavailable", 0, {}, Const.PROFILE[0]) + self.host.bridge.expect_call("presence_update", Const.JID_STR[0], "unavailable", 0, {}, Const.PROFILE[0]) self.presence.unavailableReceived(Const.JID[0], None) def test_subscribedReceived(self): - self.host.bridge.expectCall("subscribe", "subscribed", Const.JID[0].userhost(), Const.PROFILE[0]) + self.host.bridge.expect_call("subscribe", "subscribed", Const.JID[0].userhost(), Const.PROFILE[0]) self.presence.subscribedReceived(Const.JID[0]) def test_unsubscribedReceived(self): - self.host.bridge.expectCall("subscribe", "unsubscribed", Const.JID[0].userhost(), Const.PROFILE[0]) + self.host.bridge.expect_call("subscribe", "unsubscribed", Const.JID[0].userhost(), Const.PROFILE[0]) self.presence.unsubscribedReceived(Const.JID[0]) def test_subscribeReceived(self): - 
self.host.bridge.expectCall("subscribe", "subscribe", Const.JID[0].userhost(), Const.PROFILE[0]) + self.host.bridge.expect_call("subscribe", "subscribe", Const.JID[0].userhost(), Const.PROFILE[0]) self.presence.subscribeReceived(Const.JID[0]) def test_unsubscribeReceived(self): - self.host.bridge.expectCall("subscribe", "unsubscribe", Const.JID[0].userhost(), Const.PROFILE[0]) + self.host.bridge.expect_call("subscribe", "unsubscribe", Const.JID[0].userhost(), Const.PROFILE[0]) self.presence.unsubscribeReceived(Const.JID[0])
--- a/sat/test/test_helpers_plugins.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_helpers_plugins.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,96 +29,96 @@ self.host = helpers.FakeSAT() self.plugin = helpers_plugins.FakeXEP_0045(self.host) - def test_joinRoom(self): - self.plugin.joinRoom(0, 0) - self.assertEqual("test", self.plugin.getNick(0, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 1, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 2, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 0)) - self.assertEqual("", self.plugin.getNick(0, 1)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 1)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 1, 1)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 2, 1)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 1)) - self.assertEqual("", self.plugin.getNick(0, 2)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 2)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 1, 2)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 2, 2)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 2)) - self.assertEqual("", self.plugin.getNick(0, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 1, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 2, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 3)) - self.plugin.joinRoom(0, 1) - self.assertEqual("test", self.plugin.getNick(0, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 0)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 1, 0)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 2, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 0)) - self.assertEqual("sender", self.plugin.getNick(0, 1)) - self.assertEqual("test", self.plugin.getNickOfUser(0, 0, 1)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 1, 1)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 2, 1)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 1)) - self.assertEqual("", self.plugin.getNick(0, 2)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 2)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 1, 2)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 2, 2)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 2)) - self.assertEqual("", self.plugin.getNick(0, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 1, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 2, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 3)) - self.plugin.joinRoom(0, 2) - self.assertEqual("test", self.plugin.getNick(0, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 0)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 1, 0)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 2, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 0)) - self.assertEqual("sender", self.plugin.getNick(0, 1)) - self.assertEqual("test", self.plugin.getNickOfUser(0, 0, 1)) + def test_join_room(self): + self.plugin.join_room(0, 0) + self.assertEqual("test", self.plugin.get_nick(0, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 1, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 2, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 0)) + self.assertEqual("", 
self.plugin.get_nick(0, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 1, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 2, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 1)) + self.assertEqual("", self.plugin.get_nick(0, 2)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 2)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 1, 2)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 2, 2)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 2)) + self.assertEqual("", self.plugin.get_nick(0, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 1, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 2, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 3)) + self.plugin.join_room(0, 1) + self.assertEqual("test", self.plugin.get_nick(0, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 0)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 1, 0)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 2, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 0)) + self.assertEqual("sender", self.plugin.get_nick(0, 1)) + self.assertEqual("test", self.plugin.get_nick_of_user(0, 0, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 1, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 2, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 1)) + self.assertEqual("", self.plugin.get_nick(0, 2)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 2)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 1, 2)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 2, 2)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 2)) + self.assertEqual("", self.plugin.get_nick(0, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 1, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 2, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 3)) + self.plugin.join_room(0, 2) + self.assertEqual("test", self.plugin.get_nick(0, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 0)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 1, 0)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 2, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 0)) + self.assertEqual("sender", self.plugin.get_nick(0, 1)) + self.assertEqual("test", self.plugin.get_nick_of_user(0, 0, 1)) self.assertEqual( - "sender", self.plugin.getNickOfUser(0, 1, 1) + "sender", self.plugin.get_nick_of_user(0, 1, 1) ) # Const.JID[2] is in the roster for Const.PROFILE[1] - self.assertEqual("sender", self.plugin.getNickOfUser(0, 2, 1)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 1)) - self.assertEqual("sender", self.plugin.getNick(0, 2)) - self.assertEqual("test", self.plugin.getNickOfUser(0, 0, 2)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 1, 2)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 2, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 1)) + self.assertEqual("sender", self.plugin.get_nick(0, 2)) + self.assertEqual("test", self.plugin.get_nick_of_user(0, 0, 2)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 1, 2)) self.assertEqual( - "sender", self.plugin.getNickOfUser(0, 2, 2) + "sender", 
self.plugin.get_nick_of_user(0, 2, 2) ) # Const.JID[1] is in the roster for Const.PROFILE[2] - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 2)) - self.assertEqual("", self.plugin.getNick(0, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 1, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 2, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 3)) - self.plugin.joinRoom(0, 3) - self.assertEqual("test", self.plugin.getNick(0, 0)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 0, 0)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 1, 0)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 2, 0)) - self.assertEqual("sender_", self.plugin.getNickOfUser(0, 3, 0)) - self.assertEqual("sender", self.plugin.getNick(0, 1)) - self.assertEqual("test", self.plugin.getNickOfUser(0, 0, 1)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 2)) + self.assertEqual("", self.plugin.get_nick(0, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 1, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 2, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 3)) + self.plugin.join_room(0, 3) + self.assertEqual("test", self.plugin.get_nick(0, 0)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 0, 0)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 1, 0)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 2, 0)) + self.assertEqual("sender_", self.plugin.get_nick_of_user(0, 3, 0)) + self.assertEqual("sender", self.plugin.get_nick(0, 1)) + self.assertEqual("test", self.plugin.get_nick_of_user(0, 0, 1)) self.assertEqual( - "sender", self.plugin.getNickOfUser(0, 1, 1) + "sender", self.plugin.get_nick_of_user(0, 1, 1) ) # Const.JID[2] is in the roster for Const.PROFILE[1] - self.assertEqual("sender", self.plugin.getNickOfUser(0, 2, 1)) - self.assertEqual("sender_", self.plugin.getNickOfUser(0, 3, 1)) - self.assertEqual("sender", self.plugin.getNick(0, 2)) - self.assertEqual("test", self.plugin.getNickOfUser(0, 0, 2)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 1, 2)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 2, 1)) + self.assertEqual("sender_", self.plugin.get_nick_of_user(0, 3, 1)) + self.assertEqual("sender", self.plugin.get_nick(0, 2)) + self.assertEqual("test", self.plugin.get_nick_of_user(0, 0, 2)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 1, 2)) self.assertEqual( - "sender", self.plugin.getNickOfUser(0, 2, 2) + "sender", self.plugin.get_nick_of_user(0, 2, 2) ) # Const.JID[1] is in the roster for Const.PROFILE[2] - self.assertEqual("sender_", self.plugin.getNickOfUser(0, 3, 2)) - self.assertEqual("sender_", self.plugin.getNick(0, 3)) - self.assertEqual("test", self.plugin.getNickOfUser(0, 0, 3)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 1, 3)) - self.assertEqual("sender", self.plugin.getNickOfUser(0, 2, 3)) - self.assertEqual(None, self.plugin.getNickOfUser(0, 3, 3)) + self.assertEqual("sender_", self.plugin.get_nick_of_user(0, 3, 2)) + self.assertEqual("sender_", self.plugin.get_nick(0, 3)) + self.assertEqual("test", self.plugin.get_nick_of_user(0, 0, 3)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 1, 3)) + self.assertEqual("sender", self.plugin.get_nick_of_user(0, 2, 3)) + self.assertEqual(None, self.plugin.get_nick_of_user(0, 3, 3))
--- a/sat/test/test_memory.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_memory.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,7 +29,7 @@ def setUp(self): self.host = helpers.FakeSAT() - def _getParamXML(self, param="1", security_level=None): + def _get_param_xml(self, param="1", security_level=None): """Generate XML for testing parameters @param param (str): a subset of "123" @@ -37,7 +37,7 @@ @return (str) """ - def getParam(name): + def get_param(name): return """ <param name="%(param_name)s" label="%(param_label)s" value="true" type="bool" %(security)s/> """ % { @@ -50,11 +50,11 @@ params = "" if "1" in param: - params += getParam(Const.ENABLE_UNIBOX_PARAM) + params += get_param(Const.ENABLE_UNIBOX_PARAM) if "2" in param: - params += getParam(Const.PARAM_IN_QUOTES) + params += get_param(Const.PARAM_IN_QUOTES) if "3" in param: - params += getParam("Dummy param") + params += get_param("Dummy param") return """ <params> <individual> @@ -69,7 +69,7 @@ "params": params, } - def _paramExists(self, param="1", src=None): + def _param_exists(self, param="1", src=None): """ @param param (str): a character in "12" @@ -86,7 +86,7 @@ for type_node in src.childNodes: # when src comes self.host.memory.params.dom, we have here # some "individual" or "general" elements, when it comes - # from Memory.getParams we have here a "params" elements + # from Memory.get_params we have here a "params" elements if type_node.nodeName not in ("individual", "general", "params"): continue for cat_node in type_node.childNodes: @@ -100,7 +100,7 @@ return True return False - def assertParam_generic(self, param="1", src=None, exists=True, deferred=False): + def assert_param_generic(self, param="1", src=None, exists=True, deferred=False): """ @param param (str): a character in "12" @param src (DOM element): the top-level element to look in @@ -116,28 +116,28 @@ # in this stack we can see the line where the error came from, # if limit=5, 6 is not enough you can increase the value msg += "\n".join(traceback.format_stack(limit=5 if exists else 6)) - assertion = self._paramExists(param, src) + assertion = self._param_exists(param, src) getattr(self, "assert%s" % exists)(assertion, msg) - def assertParamExists(self, param="1", src=None): - self.assertParam_generic(param, src, True) + def assert_param_exists(self, param="1", src=None): + self.assert_param_generic(param, src, True) - def assertParamNotExists(self, param="1", src=None): - self.assertParam_generic(param, src, False) + def assert_param_not_exists(self, param="1", src=None): + self.assert_param_generic(param, src, False) - def assertParamExists_async(self, src, param="1"): - """@param src: a deferred result from Memory.getParams""" - self.assertParam_generic( + def assert_param_exists_async(self, src, param="1"): + """@param src: a deferred result from Memory.get_params""" + self.assert_param_generic( param, minidom.parseString(src.encode("utf-8")), True, True ) - def assertParamNotExists_async(self, src, param="1"): - """@param src: a deferred result from Memory.getParams""" - self.assertParam_generic( + def assert_param_not_exists_async(self, src, param="1"): + """@param src: a deferred result from Memory.get_params""" + self.assert_param_generic( param, minidom.parseString(src.encode("utf-8")), False, True ) - def _getParams(self, security_limit, app="", profile_key="@NONE@"): + def _get_params(self, security_limit, app="", profile_key="@NONE@"): """Get the parameters accessible with the given security limit and application name. 
@param security_limit (int): the security limit @@ -146,16 +146,16 @@ """ if profile_key == "@NONE@": profile_key = "@DEFAULT@" - return self.host.memory.params.getParams(security_limit, app, profile_key) + return self.host.memory.params.get_params(security_limit, app, profile_key) - def test_updateParams(self): + def test_update_params(self): self.host.memory.reinit() # check if the update works - self.host.memory.updateParams(self._getParamXML()) - self.assertParamExists() + self.host.memory.update_params(self._get_param_xml()) + self.assert_param_exists() previous = self.host.memory.params.dom.cloneNode(True) # now check if it is really updated and not duplicated - self.host.memory.updateParams(self._getParamXML()) + self.host.memory.update_params(self._get_param_xml()) self.assertEqual( previous.toxml().encode("utf-8"), self.host.memory.params.dom.toxml().encode("utf-8"), @@ -163,23 +163,23 @@ self.host.memory.reinit() # check successive updates (without intersection) - self.host.memory.updateParams(self._getParamXML("1")) - self.assertParamExists("1") - self.assertParamNotExists("2") - self.host.memory.updateParams(self._getParamXML("2")) - self.assertParamExists("1") - self.assertParamExists("2") + self.host.memory.update_params(self._get_param_xml("1")) + self.assert_param_exists("1") + self.assert_param_not_exists("2") + self.host.memory.update_params(self._get_param_xml("2")) + self.assert_param_exists("1") + self.assert_param_exists("2") previous = self.host.memory.params.dom.cloneNode(True) # save for later self.host.memory.reinit() # check successive updates (with intersection) - self.host.memory.updateParams(self._getParamXML("1")) - self.assertParamExists("1") - self.assertParamNotExists("2") - self.host.memory.updateParams(self._getParamXML("12")) - self.assertParamExists("1") - self.assertParamExists("2") + self.host.memory.update_params(self._get_param_xml("1")) + self.assert_param_exists("1") + self.assert_param_not_exists("2") + self.host.memory.update_params(self._get_param_xml("12")) + self.assert_param_exists("1") + self.assert_param_exists("2") # successive updates with or without intersection should have the same result self.assertEqual( @@ -189,125 +189,125 @@ self.host.memory.reinit() # one update with two params in a new category - self.host.memory.updateParams(self._getParamXML("12")) - self.assertParamExists("1") - self.assertParamExists("2") + self.host.memory.update_params(self._get_param_xml("12")) + self.assert_param_exists("1") + self.assert_param_exists("2") - def test_getParams(self): + def test_get_params(self): # tests with no security level on the parameter (most secure) - params = self._getParamXML() + params = self._get_param_xml() self.host.memory.reinit() - self.host.memory.updateParams(params) - self._getParams(Const.NO_SECURITY_LIMIT).addCallback(self.assertParamExists_async) - self._getParams(0).addCallback(self.assertParamNotExists_async) - self._getParams(1).addCallback(self.assertParamNotExists_async) + self.host.memory.update_params(params) + self._get_params(Const.NO_SECURITY_LIMIT).addCallback(self.assert_param_exists_async) + self._get_params(0).addCallback(self.assert_param_not_exists_async) + self._get_params(1).addCallback(self.assert_param_not_exists_async) # tests with security level 0 on the parameter (not secure) - params = self._getParamXML(security_level=0) + params = self._get_param_xml(security_level=0) self.host.memory.reinit() - self.host.memory.updateParams(params) - 
self._getParams(Const.NO_SECURITY_LIMIT).addCallback(self.assertParamExists_async) - self._getParams(0).addCallback(self.assertParamExists_async) - self._getParams(1).addCallback(self.assertParamExists_async) + self.host.memory.update_params(params) + self._get_params(Const.NO_SECURITY_LIMIT).addCallback(self.assert_param_exists_async) + self._get_params(0).addCallback(self.assert_param_exists_async) + self._get_params(1).addCallback(self.assert_param_exists_async) # tests with security level 1 on the parameter (more secure) - params = self._getParamXML(security_level=1) + params = self._get_param_xml(security_level=1) self.host.memory.reinit() - self.host.memory.updateParams(params) - self._getParams(Const.NO_SECURITY_LIMIT).addCallback(self.assertParamExists_async) - self._getParams(0).addCallback(self.assertParamNotExists_async) - return self._getParams(1).addCallback(self.assertParamExists_async) + self.host.memory.update_params(params) + self._get_params(Const.NO_SECURITY_LIMIT).addCallback(self.assert_param_exists_async) + self._get_params(0).addCallback(self.assert_param_not_exists_async) + return self._get_params(1).addCallback(self.assert_param_exists_async) - def test_paramsRegisterApp(self): + def test_params_register_app(self): def register(xml, security_limit, app): """ @param xml: XML definition of the parameters to be added @param security_limit: -1 means no security, 0 is the maximum security then the higher the less secure @param app: name of the frontend registering the parameters """ - helpers.muteLogging() - self.host.memory.paramsRegisterApp(xml, security_limit, app) - helpers.unmuteLogging() + helpers.mute_logging() + self.host.memory.params_register_app(xml, security_limit, app) + helpers.unmute_logging() # tests with no security level on the parameter (most secure) - params = self._getParamXML() + params = self._get_param_xml() self.host.memory.reinit() register(params, Const.NO_SECURITY_LIMIT, Const.APP_NAME) - self.assertParamExists() + self.assert_param_exists() self.host.memory.reinit() register(params, 0, Const.APP_NAME) - self.assertParamNotExists() + self.assert_param_not_exists() self.host.memory.reinit() register(params, 1, Const.APP_NAME) - self.assertParamNotExists() + self.assert_param_not_exists() # tests with security level 0 on the parameter (not secure) - params = self._getParamXML(security_level=0) + params = self._get_param_xml(security_level=0) self.host.memory.reinit() register(params, Const.NO_SECURITY_LIMIT, Const.APP_NAME) - self.assertParamExists() + self.assert_param_exists() self.host.memory.reinit() register(params, 0, Const.APP_NAME) - self.assertParamExists() + self.assert_param_exists() self.host.memory.reinit() register(params, 1, Const.APP_NAME) - self.assertParamExists() + self.assert_param_exists() # tests with security level 1 on the parameter (more secure) - params = self._getParamXML(security_level=1) + params = self._get_param_xml(security_level=1) self.host.memory.reinit() register(params, Const.NO_SECURITY_LIMIT, Const.APP_NAME) - self.assertParamExists() + self.assert_param_exists() self.host.memory.reinit() register(params, 0, Const.APP_NAME) - self.assertParamNotExists() + self.assert_param_not_exists() self.host.memory.reinit() register(params, 1, Const.APP_NAME) - self.assertParamExists() + self.assert_param_exists() # tests with security level 1 and several parameters being registered - params = self._getParamXML("12", security_level=1) + params = self._get_param_xml("12", security_level=1) self.host.memory.reinit() 
register(params, Const.NO_SECURITY_LIMIT, Const.APP_NAME) - self.assertParamExists() - self.assertParamExists("2") + self.assert_param_exists() + self.assert_param_exists("2") self.host.memory.reinit() register(params, 0, Const.APP_NAME) - self.assertParamNotExists() - self.assertParamNotExists("2") + self.assert_param_not_exists() + self.assert_param_not_exists("2") self.host.memory.reinit() register(params, 1, Const.APP_NAME) - self.assertParamExists() - self.assertParamExists("2") + self.assert_param_exists() + self.assert_param_exists("2") # tests with several parameters being registered in an existing category self.host.memory.reinit() - self.host.memory.updateParams(self._getParamXML("3")) - register(self._getParamXML("12"), Const.NO_SECURITY_LIMIT, Const.APP_NAME) - self.assertParamExists() - self.assertParamExists("2") + self.host.memory.update_params(self._get_param_xml("3")) + register(self._get_param_xml("12"), Const.NO_SECURITY_LIMIT, Const.APP_NAME) + self.assert_param_exists() + self.assert_param_exists("2") self.host.memory.reinit() - def test_paramsRegisterApp_getParams(self): + def test_params_register_app_get_params(self): # test retrieving the parameter for a specific frontend self.host.memory.reinit() - params = self._getParamXML(security_level=1) - self.host.memory.paramsRegisterApp(params, 1, Const.APP_NAME) - self._getParams(1, "").addCallback(self.assertParamExists_async) - self._getParams(1, Const.APP_NAME).addCallback(self.assertParamExists_async) - self._getParams(1, "another_dummy_frontend").addCallback( - self.assertParamNotExists_async + params = self._get_param_xml(security_level=1) + self.host.memory.params_register_app(params, 1, Const.APP_NAME) + self._get_params(1, "").addCallback(self.assert_param_exists_async) + self._get_params(1, Const.APP_NAME).addCallback(self.assert_param_exists_async) + self._get_params(1, "another_dummy_frontend").addCallback( + self.assert_param_not_exists_async ) # the same with several parameters registered at the same time self.host.memory.reinit() - params = self._getParamXML("12", security_level=0) - self.host.memory.paramsRegisterApp(params, 5, Const.APP_NAME) - self._getParams(5, "").addCallback(self.assertParamExists_async) - self._getParams(5, "").addCallback(self.assertParamExists_async, "2") - self._getParams(5, Const.APP_NAME).addCallback(self.assertParamExists_async) - self._getParams(5, Const.APP_NAME).addCallback(self.assertParamExists_async, "2") - self._getParams(5, "another_dummy_frontend").addCallback( - self.assertParamNotExists_async + params = self._get_param_xml("12", security_level=0) + self.host.memory.params_register_app(params, 5, Const.APP_NAME) + self._get_params(5, "").addCallback(self.assert_param_exists_async) + self._get_params(5, "").addCallback(self.assert_param_exists_async, "2") + self._get_params(5, Const.APP_NAME).addCallback(self.assert_param_exists_async) + self._get_params(5, Const.APP_NAME).addCallback(self.assert_param_exists_async, "2") + self._get_params(5, "another_dummy_frontend").addCallback( + self.assert_param_not_exists_async ) - return self._getParams(5, "another_dummy_frontend").addCallback( - self.assertParamNotExists_async, "2" + return self._get_params(5, "another_dummy_frontend").addCallback( + self.assert_param_not_exists_async, "2" )
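Every change in the test_memory.py hunks above is a rename from a camelCase identifier to its snake_case equivalent. As a rough illustration only (a hypothetical sketch, not the conversion tool actually used for this changeset, which would also need to special-case runs of capitals such as XML or JID), a naive converter along these lines reproduces the simple renames seen in this file:

import re

def camel_to_snake(name: str) -> str:
    """Naive camelCase -> snake_case conversion (illustration only).

    An underscore is inserted before every capital letter that is not at the
    start of the identifier, then the whole name is lower-cased. Runs of
    capitals and trailing digits would need extra handling that this sketch
    does not attempt.
    """
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

if __name__ == "__main__":
    for old in ("updateParams", "getParams", "assertParamNotExists", "paramsRegisterApp"):
        print(old, "->", camel_to_snake(old))
    # updateParams -> update_params
    # getParams -> get_params
    # assertParamNotExists -> assert_param_not_exists
    # paramsRegisterApp -> params_register_app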
--- a/sat/test/test_memory_crypto.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_memory_crypto.py Sat Apr 08 13:54:42 2023 +0200 @@ -28,7 +28,7 @@ from twisted.internet import defer -def getRandomUnicode(len): +def get_random_unicode(len): """Return a random unicode string""" return "".join(random.choice(string.letters + "éáúóâêûôßüöä") for i in range(len)) @@ -47,9 +47,9 @@ d_list.append(d) for key_len in (0, 2, 8, 10, 16, 24, 30, 32, 40): - key = getRandomUnicode(key_len) + key = get_random_unicode(key_len) for message_len in (0, 2, 16, 24, 32, 100): - message = getRandomUnicode(message_len) + message = get_random_unicode(message_len) test(key, message) return defer.DeferredList(d_list) @@ -62,7 +62,7 @@ d1 = PasswordHasher.verify(password, hashed) d1.addCallback(lambda result: self.assertTrue(result)) d_list.append(d1) - attempt = getRandomUnicode(10) + attempt = get_random_unicode(10) d2 = PasswordHasher.verify(attempt, hashed) d2.addCallback(lambda result: self.assertFalse(result)) d_list.append(d2)
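The crypto tests above chain their assertions as Deferred callbacks and return a DeferredList, relying on twisted.trial to wait for every callback before reporting the test result. A minimal standalone illustration of that pattern (a hypothetical example using only the standard Twisted API, not the Libervia test helpers):

from twisted.internet import defer
from twisted.trial import unittest


class DeferredAssertionsTest(unittest.TestCase):
    def test_all_callbacks_run(self):
        d_list = []
        for value in (1, 2, 3):
            # stand-in for an asynchronous call (e.g. PasswordHasher.verify above)
            d = defer.succeed(value * 2)
            d.addCallback(lambda result, v=value: self.assertEqual(result, v * 2))
            d_list.append(d)
        # trial waits for the returned DeferredList; fireOnOneErrback makes a
        # failing assertion fail the whole test instead of being swallowed
        return defer.DeferredList(d_list, fireOnOneErrback=True)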
--- a/sat/test/test_plugin_misc_groupblog.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_misc_groupblog.py Sat Apr 08 13:54:42 2023 +0200 @@ -167,7 +167,7 @@ def _initialise(self, profile_key): profile = profile_key - client = self.host.getClient(profile) + client = self.host.get_client(profile) if not self.__initialised: client.item_access_pubsub = jid.JID(SERVICE) xep_0060 = self.host.plugins["XEP-0060"] @@ -176,15 +176,15 @@ ) client.pubsub_client.parent = client self.psclient = client.pubsub_client - helpers.FakeSAT.getDiscoItems = self.psclient.service_getDiscoItems + helpers.FakeSAT.getDiscoItems = self.psclient.service_get_disco_items self.__initialised = True return defer.succeed((profile, client)) - def _addItem(self, profile, item, parent_node=None): - client = self.host.getClient(profile) - client.pubsub_client._addItem(item, parent_node) + def _add_item(self, profile, item, parent_node=None): + client = self.host.get_client(profile) + client.pubsub_client._add_item(item, parent_node) - def test_sendGroupBlog(self): + def test_send_group_blog(self): self._initialise(C.PROFILE[0]) d = self.psclient.items(SERVICE, NODE_ID) d.addCallback(lambda items: self.assertEqual(len(items), 0)) @@ -196,9 +196,9 @@ d.addCallback(lambda __: self.psclient.items(SERVICE, NODE_ID)) return d.addCallback(lambda items: self.assertEqual(len(items), 1)) - def test_deleteGroupBlog(self): + def test_delete_group_blog(self): pub_data = (SERVICE, NODE_ID, ITEM_ID_1) - self.host.bridge.expectCall( + self.host.bridge.expect_call( "personalEvent", C.JID_STR[0], "MICROBLOG_DELETE", @@ -214,7 +214,7 @@ ) return d.addCallback(self.assertEqual, None) - def test_updateGroupBlog(self): + def test_update_group_blog(self): pub_data = (SERVICE, NODE_ID, ITEM_ID_1) new_text = "silfu23RFWUP)IWNOEIOEFÖ" @@ -232,7 +232,7 @@ ) ) - def test_sendGroupBlogComment(self): + def test_send_group_blog_comment(self): self._initialise(C.PROFILE[0]) d = self.psclient.items(SERVICE, NODE_ID) d.addCallback(lambda items: self.assertEqual(len(items), 0)) @@ -244,7 +244,7 @@ d.addCallback(lambda __: self.psclient.items(SERVICE, COMMENTS_NODE_ID_1)) return d.addCallback(lambda items: self.assertEqual(len(items), 1)) - def test_getGroupBlogs(self): + def test_get_group_blogs(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -256,7 +256,7 @@ ) return d.addCallback(self.assertEqual, result) - def test_getGroupBlogsNoCount(self): + def test_get_group_blogs_no_count(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -270,7 +270,7 @@ ) return d.addCallback(self.assertEqual, result) - def test_getGroupBlogsWithIDs(self): + def test_get_group_blogs_with_i_ds(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -284,7 +284,7 @@ ) return d.addCallback(self.assertEqual, result) - def test_getGroupBlogsWithRSM(self): + def test_get_group_blogs_with_rsm(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -298,7 +298,7 @@ ) return d.addCallback(self.assertEqual, result) - def test_getGroupBlogsWithComments(self): + def test_get_group_blogs_with_comments(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -328,7 +328,7 @@ ) return d.addCallback(self.assertEqual, result) - def test_getGroupBlogsWithComments2(self): + def 
test_get_group_blogs_with_comments_2(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -361,7 +361,7 @@ return d.addCallback(self.assertEqual, result) - def test_getGroupBlogsAtom(self): + def test_get_group_blogs_atom(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -376,7 +376,7 @@ return d.addCallback(cb) - def test_getMassiveGroupBlogs(self): + def test_get_massive_group_blogs(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -400,7 +400,7 @@ d.addCallback(clean) d.addCallback(self.assertEqual, result) - def test_getMassiveGroupBlogsWithComments(self): + def test_get_massive_group_blogs_with_comments(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -429,7 +429,7 @@ d.addCallback(clean) d.addCallback(self.assertEqual, result) - def test_getGroupBlogComments(self): + def test_get_group_blog_comments(self): self._initialise(C.PROFILE[0]) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( @@ -446,12 +446,12 @@ ) return d.addCallback(self.assertEqual, result) - def test_subscribeGroupBlog(self): + def test_subscribe_group_blog(self): self._initialise(C.PROFILE[0]) d = self.plugin.subscribeGroupBlog(PUBLISHER, profile_key=C.PROFILE[0]) return d.addCallback(self.assertEqual, None) - def test_massiveSubscribeGroupBlogs(self): + def test_massive_subscribe_group_blogs(self): self._initialise(C.PROFILE[0]) d = self.plugin.massiveSubscribeGroupBlogs( "JID", [jid.JID(PUBLISHER)], profile_key=C.PROFILE[0] @@ -469,10 +469,10 @@ d.addCallback(clean) return d.addCallback(self.assertEqual, None) - def test_deleteAllGroupBlogs(self): + def test_delete_all_group_blogs(self): """Delete our main node and associated comments node""" self._initialise(C.PROFILE[0]) - self.host.profiles[C.PROFILE[0]].roster.addItem(jid.JID(OTHER_PUBLISHER)) + self.host.profiles[C.PROFILE[0]].roster.add_item(jid.JID(OTHER_PUBLISHER)) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( lambda __: self.psclient.publish( @@ -517,10 +517,10 @@ d.addCallback(lambda items: self.assertEqual(len(items), 2)) return d - def test_deleteAllGroupBlogsComments(self): + def test_delete_all_group_blogs_comments(self): """Delete the comments we posted on other node's""" self._initialise(C.PROFILE[0]) - self.host.profiles[C.PROFILE[0]].roster.addItem(jid.JID(OTHER_PUBLISHER)) + self.host.profiles[C.PROFILE[0]].roster.add_item(jid.JID(OTHER_PUBLISHER)) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( lambda __: self.psclient.publish( @@ -565,9 +565,9 @@ d.addCallback(lambda items: self.assertEqual(len(items), 0)) return d - def test_deleteAllGroupBlogsAndComments(self): + def test_delete_all_group_blogs_and_comments(self): self._initialise(C.PROFILE[0]) - self.host.profiles[C.PROFILE[0]].roster.addItem(jid.JID(OTHER_PUBLISHER)) + self.host.profiles[C.PROFILE[0]].roster.add_item(jid.JID(OTHER_PUBLISHER)) d = self.psclient.publish(SERVICE, NODE_ID, [ITEM_1]) d.addCallback( lambda __: self.psclient.publish(
--- a/sat/test/test_plugin_misc_radiocol.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_misc_radiocol.py Sat Apr 08 13:54:42 2023 +0200 @@ -73,10 +73,10 @@ self.host ) for profile in Const.PROFILE: - self.host.getClient(profile) # init self.host.profiles[profile] + self.host.get_client(profile) # init self.host.profiles[profile] self.songs = [] self.playlist = [] - self.sound_dir = self.host.memory.getConfig("", "media_dir") + "/test/sound/" + self.sound_dir = self.host.memory.config_get("", "media_dir") + "/test/sound/" try: for filename in os.listdir(self.sound_dir): if filename.endswith(".ogg") or filename.endswith(".mp3"): @@ -84,7 +84,7 @@ except OSError: raise SkipTest("The sound samples in sat_media/test/sound were not found") - def _buildPlayers(self, players=[]): + def _build_players(self, players=[]): """@return: the "started" content built with the given players""" content = "<started" if not players: @@ -96,7 +96,7 @@ content += "</started>" return content - def _expectedMessage(self, to_jid, type_, content): + def _expected_message(self, to_jid, type_, content): """ @param to_jid: recipient full jid @param type_: message type ('normal' or 'groupchat') @@ -118,56 +118,56 @@ plugin.RADIOC_TAG, ) - def _rejectSongCb(self, profile_index): + def _reject_song_cb(self, profile_index): """Check if the message "song_rejected" has been sent by the referee and process the command with the profile of the uploader @param profile_index: uploader's profile""" - sent = self.host.getSentMessage(0) + sent = self.host.get_sent_message(0) content = "<song_rejected xmlns='' reason='Too many songs in queue'/>" - self.assertEqualXML( + self.assert_equal_xml( sent.toXml(), - self._expectedMessage( + self._expected_message( JID( ROOM_JID.userhost() + "/" - + self.plugin_0045.getNick(0, profile_index), + + self.plugin_0045.get_nick(0, profile_index), "normal", content, ) ), ) - self._roomGameCmd( - sent, ["radiocolSongRejected", ROOM_JID.full(), "Too many songs in queue"] + self._room_game_cmd( + sent, ["radiocol_song_rejected", ROOM_JID.full(), "Too many songs in queue"] ) - def _noUploadCb(self): + def _no_upload_cb(self): """Check if the message "no_upload" has been sent by the referee and process the command with the profiles of each room users""" - sent = self.host.getSentMessage(0) + sent = self.host.get_sent_message(0) content = "<no_upload xmlns=''/>" - self.assertEqualXML( - sent.toXml(), self._expectedMessage(ROOM_JID, "groupchat", content) + self.assert_equal_xml( + sent.toXml(), self._expected_message(ROOM_JID, "groupchat", content) ) - self._roomGameCmd(sent, ["radiocolNoUpload", ROOM_JID.full()]) + self._room_game_cmd(sent, ["radiocol_no_upload", ROOM_JID.full()]) - def _uploadOkCb(self): + def _upload_ok_cb(self): """Check if the message "upload_ok" has been sent by the referee and process the command with the profiles of each room users""" - sent = self.host.getSentMessage(0) + sent = self.host.get_sent_message(0) content = "<upload_ok xmlns=''/>" - self.assertEqualXML( - sent.toXml(), self._expectedMessage(ROOM_JID, "groupchat", content) + self.assert_equal_xml( + sent.toXml(), self._expected_message(ROOM_JID, "groupchat", content) ) - self._roomGameCmd(sent, ["radiocolUploadOk", ROOM_JID.full()]) + self._room_game_cmd(sent, ["radiocol_upload_ok", ROOM_JID.full()]) - def _preloadCb(self, attrs, profile_index): + def _preload_cb(self, attrs, profile_index): """Check if the message "preload" has been sent by the referee and process the command with the profiles of each room 
users @param attrs: information dict about the song @param profile_index: profile index of the uploader """ - sent = self.host.getSentMessage(0) - attrs["sender"] = self.plugin_0045.getNick(0, profile_index) + sent = self.host.get_sent_message(0) + attrs["sender"] = self.plugin_0045.get_nick(0, profile_index) radiocol_elt = next(domish.generateElementsNamed(sent.elements(), "radiocol")) preload_elt = next(domish.generateElementsNamed( radiocol_elt.elements(), "preload" @@ -178,13 +178,13 @@ ) if sent.hasAttribute("from"): del sent["from"] - self.assertEqualXML( - sent.toXml(), self._expectedMessage(ROOM_JID, "groupchat", content) + self.assert_equal_xml( + sent.toXml(), self._expected_message(ROOM_JID, "groupchat", content) ) - self._roomGameCmd( + self._room_game_cmd( sent, [ - "radiocolPreload", + "radiocol_preload", ROOM_JID.full(), attrs["timestamp"], attrs["filename"], @@ -195,25 +195,25 @@ ], ) - def _playNextSongCb(self): + def _play_next_song_cb(self): """Check if the message "play" has been sent by the referee and process the command with the profiles of each room users""" - sent = self.host.getSentMessage(0) + sent = self.host.get_sent_message(0) filename = self.playlist.pop(0) content = "<play xmlns='' filename='%s' />" % filename - self.assertEqualXML( - sent.toXml(), self._expectedMessage(ROOM_JID, "groupchat", content) + self.assert_equal_xml( + sent.toXml(), self._expected_message(ROOM_JID, "groupchat", content) ) - self._roomGameCmd(sent, ["radiocolPlay", ROOM_JID.full(), filename]) + self._room_game_cmd(sent, ["radiocol_play", ROOM_JID.full(), filename]) game_data = self.plugin.games[ROOM_JID] if len(game_data["queue"]) == plugin.QUEUE_LIMIT - 1: - self._uploadOkCb() + self._upload_ok_cb() - def _addSongCb(self, d, filepath, profile_index): + def _add_song_cb(self, d, filepath, profile_index): """Check if the message "song_added" has been sent by the uploader and process the command with the profile of the referee - @param d: deferred value or failure got from self.plugin.radiocolSongAdded + @param d: deferred value or failure got from self.plugin.radiocol_song_added @param filepath: full path to the sound file @param profile_index: the profile index of the uploader """ @@ -250,9 +250,9 @@ content = "<song_added xmlns='' %s/>" % " ".join( ["%s='%s'" % (attr, attrs[attr]) for attr in attrs] ) - sent = self.host.getSentMessage(profile_index) - self.assertEqualXML( - sent.toXml(), self._expectedMessage(REFEREE_FULL, "normal", content) + sent = self.host.get_sent_message(profile_index) + self.assert_equal_xml( + sent.toXml(), self._expected_message(REFEREE_FULL, "normal", content) ) reject_song = len(game_data["queue"]) >= plugin.QUEUE_LIMIT @@ -262,20 +262,20 @@ and len(game_data["queue"]) + 1 == plugin.QUEUE_TO_START ) - self._roomGameCmd(sent, profile_index) # queue unchanged or +1 + self._room_game_cmd(sent, profile_index) # queue unchanged or +1 if reject_song: - self._rejectSongCb(profile_index) + self._reject_song_cb(profile_index) return if no_upload: - self._noUploadCb() - self._preloadCb(attrs, profile_index) + self._no_upload_cb() + self._preload_cb(attrs, profile_index) self.playlist.append(attrs["filename"]) if play_next: - self._playNextSongCb() # queue -1 + self._play_next_song_cb() # queue -1 - def _roomGameCmd(self, sent, from_index=0, call=[]): + def _room_game_cmd(self, sent, from_index=0, call=[]): """Process a command. It is also possible to call this method as - _roomGameCmd(sent, call) instead of _roomGameCmd(sent, from_index, call). 
+ _room_game_cmd(sent, call) instead of _room_game_cmd(sent, from_index, call). If from index is a list, it is assumed that it is containing the value for call and from_index will take its default value. @param sent: the sent message that we need to process @@ -287,14 +287,14 @@ call = from_index from_index = 0 - sent["from"] = ROOM_JID.full() + "/" + self.plugin_0045.getNick(0, from_index) + sent["from"] = ROOM_JID.full() + "/" + self.plugin_0045.get_nick(0, from_index) recipient = JID(sent["to"]).resource # The message could have been sent to a room user (room_jid + '/' + nick), # but when it is received, the 'to' attribute of the message has been # changed to the recipient own JID. We need to simulate that here. if recipient: - room = self.plugin_0045.getRoom(0, 0) + room = self.plugin_0045.get_room(0, 0) sent["to"] = ( Const.JID_STR[0] if recipient == room.nick @@ -302,33 +302,33 @@ ) for index in range(0, len(Const.PROFILE)): - nick = self.plugin_0045.getNick(0, index) + nick = self.plugin_0045.get_nick(0, index) if nick: if not recipient or nick == recipient: if call and ( - self.plugin.isPlayer(ROOM_JID, nick) - or call[0] == "radiocolStarted" + self.plugin.is_player(ROOM_JID, nick) + or call[0] == "radiocol_started" ): args = copy.deepcopy(call) args.append(Const.PROFILE[index]) - self.host.bridge.expectCall(*args) + self.host.bridge.expect_call(*args) self.plugin.room_game_cmd(sent, Const.PROFILE[index]) - def _syncCb(self, sync_data, profile_index): + def _sync_cb(self, sync_data, profile_index): """Synchronize one player when he joins a running game. @param sync_data: result from self.plugin.getSyncData @param profile_index: index of the profile to be synchronized """ for nick in sync_data: - expected = self._expectedMessage( + expected = self._expected_message( JID(ROOM_JID.userhost() + "/" + nick), "normal", sync_data[nick] ) - sent = self.host.getSentMessage(0) - self.assertEqualXML(sent.toXml(), expected) + sent = self.host.get_sent_message(0) + self.assert_equal_xml(sent.toXml(), expected) for elt in sync_data[nick]: if elt.name == "preload": - self.host.bridge.expectCall( - "radiocolPreload", + self.host.bridge.expect_call( + "radiocol_preload", ROOM_JID.full(), elt["timestamp"], elt["filename"], @@ -339,48 +339,48 @@ Const.PROFILE[profile_index], ) elif elt.name == "play": - self.host.bridge.expectCall( - "radiocolPlay", + self.host.bridge.expect_call( + "radiocol_play", ROOM_JID.full(), elt["filename"], Const.PROFILE[profile_index], ) elif elt.name == "no_upload": - self.host.bridge.expectCall( - "radiocolNoUpload", ROOM_JID.full(), Const.PROFILE[profile_index] + self.host.bridge.expect_call( + "radiocol_no_upload", ROOM_JID.full(), Const.PROFILE[profile_index] ) sync_data[nick] - self._roomGameCmd(sent, []) + self._room_game_cmd(sent, []) - def _joinRoom(self, room, nicks, player_index, sync=True): + def _join_room(self, room, nicks, player_index, sync=True): """Make a player join a room and update the list of nicks @param room: wokkel.muc.Room instance from the referee perspective @param nicks: list of the players which will be updated @param player_index: profile index of the new player @param sync: set to True to synchronize data """ - user_nick = self.plugin_0045.joinRoom(0, player_index) - self.plugin.userJoinedTrigger(room, room.roster[user_nick], PROFILE) + user_nick = self.plugin_0045.join_room(0, player_index) + self.plugin.user_joined_trigger(room, room.roster[user_nick], PROFILE) if player_index not in PLAYERS_INDICES: # this user is actually not a player - 
self.assertFalse(self.plugin.isPlayer(ROOM_JID, user_nick)) + self.assertFalse(self.plugin.is_player(ROOM_JID, user_nick)) to_jid, type_ = (JID(ROOM_JID.userhost() + "/" + user_nick), "normal") else: # this user is a player - self.assertTrue(self.plugin.isPlayer(ROOM_JID, user_nick)) + self.assertTrue(self.plugin.is_player(ROOM_JID, user_nick)) nicks.append(user_nick) to_jid, type_ = (ROOM_JID, "groupchat") # Check that the message "players" has been sent by the referee - expected = self._expectedMessage(to_jid, type_, self._buildPlayers(nicks)) - sent = self.host.getSentMessage(0) - self.assertEqualXML(sent.toXml(), expected) + expected = self._expected_message(to_jid, type_, self._build_players(nicks)) + sent = self.host.get_sent_message(0) + self.assert_equal_xml(sent.toXml(), expected) # Process the command with the profiles of each room users - self._roomGameCmd( + self._room_game_cmd( sent, [ - "radiocolStarted", + "radiocol_started", ROOM_JID.full(), REFEREE_FULL.full(), nicks, @@ -389,21 +389,21 @@ ) if sync: - self._syncCb(self.plugin._getSyncData(ROOM_JID, [user_nick]), player_index) + self._sync_cb(self.plugin._get_sync_data(ROOM_JID, [user_nick]), player_index) - def _leaveRoom(self, room, nicks, player_index): + def _leave_room(self, room, nicks, player_index): """Make a player leave a room and update the list of nicks @param room: wokkel.muc.Room instance from the referee perspective @param nicks: list of the players which will be updated @param player_index: profile index of the new player """ - user_nick = self.plugin_0045.getNick(0, player_index) + user_nick = self.plugin_0045.get_nick(0, player_index) user = room.roster[user_nick] - self.plugin_0045.leaveRoom(0, player_index) - self.plugin.userLeftTrigger(room, user, PROFILE) + self.plugin_0045.leave_room(0, player_index) + self.plugin.user_left_trigger(room, user, PROFILE) nicks.remove(user_nick) - def _uploadSong(self, song_index, profile_index): + def _upload_song(self, song_index, profile_index): """Upload the song of index song_index (modulo self.songs size) from the profile of index profile_index. 
@param song_index: index of the song or None to test with non existing file @@ -420,7 +420,7 @@ expect_io_error = False try: - d = self.plugin.radiocolSongAdded( + d = self.plugin.radiocol_song_added( REFEREE_FULL, dst_filepath, Const.PROFILE[profile_index] ) except IOError: @@ -428,7 +428,7 @@ return self.assertFalse(expect_io_error) - cb = lambda defer: self._addSongCb(defer, dst_filepath, profile_index) + cb = lambda defer: self._add_song_cb(defer, dst_filepath, profile_index) def eb(failure): if not isinstance(failure, Failure): @@ -451,20 +451,20 @@ self.reinit() # create game - self.plugin.prepareRoom(OTHER_PLAYERS, ROOM_JID, PROFILE) - self.assertTrue(self.plugin._gameExists(ROOM_JID, True)) - room = self.plugin_0045.getRoom(0, 0) - nicks = [self.plugin_0045.getNick(0, 0)] + self.plugin.prepare_room(OTHER_PLAYERS, ROOM_JID, PROFILE) + self.assertTrue(self.plugin._game_exists(ROOM_JID, True)) + room = self.plugin_0045.get_room(0, 0) + nicks = [self.plugin_0045.get_nick(0, 0)] - sent = self.host.getSentMessage(0) - self.assertEqualXML( + sent = self.host.get_sent_message(0) + self.assert_equal_xml( sent.toXml(), - self._expectedMessage(ROOM_JID, "groupchat", self._buildPlayers(nicks)), + self._expected_message(ROOM_JID, "groupchat", self._build_players(nicks)), ) - self._roomGameCmd( + self._room_game_cmd( sent, [ - "radiocolStarted", + "radiocol_started", ROOM_JID.full(), REFEREE_FULL.full(), nicks, @@ -472,42 +472,42 @@ ], ) - self._joinRoom(room, nicks, 1) # player joins - self._joinRoom(room, nicks, 4) # user not playing joins + self._join_room(room, nicks, 1) # player joins + self._join_room(room, nicks, 4) # user not playing joins song_index = 0 - self._uploadSong( + self._upload_song( song_index, 0 ) # ogg or mp3 file should exist in sat_media/test/song - self._uploadSong(None, 0) # non existing file + self._upload_song(None, 0) # non existing file # another songs are added by Const.JID[1] until the radio starts + 1 to fill the queue # when the first song starts + 1 to be rejected because the queue is full for song_index in range(1, plugin.QUEUE_TO_START + 1): - self._uploadSong(song_index, 1) + self._upload_song(song_index, 1) - self.plugin.playNext(Const.MUC[0], PROFILE) # simulate the end of the first song - self._playNextSongCb() - self._uploadSong( + self.plugin.play_next(Const.MUC[0], PROFILE) # simulate the end of the first song + self._play_next_song_cb() + self._upload_song( song_index, 1 ) # now the song is accepted and the queue is full again - self._joinRoom(room, nicks, 3) # new player joins + self._join_room(room, nicks, 3) # new player joins - self.plugin.playNext(Const.MUC[0], PROFILE) # the second song finishes - self._playNextSongCb() - self._uploadSong(0, 3) # the player who recently joined re-upload the first file + self.plugin.play_next(Const.MUC[0], PROFILE) # the second song finishes + self._play_next_song_cb() + self._upload_song(0, 3) # the player who recently joined re-upload the first file - self._leaveRoom(room, nicks, 1) # one player leaves - self._joinRoom(room, nicks, 1) # and join again + self._leave_room(room, nicks, 1) # one player leaves + self._join_room(room, nicks, 1) # and join again - self.plugin.playNext(Const.MUC[0], PROFILE) # empty the queue - self._playNextSongCb() - self.plugin.playNext(Const.MUC[0], PROFILE) - self._playNextSongCb() + self.plugin.play_next(Const.MUC[0], PROFILE) # empty the queue + self._play_next_song_cb() + self.plugin.play_next(Const.MUC[0], PROFILE) + self._play_next_song_cb() for filename in self.playlist: - 
self.plugin.deleteFile("/tmp/" + filename) + self.plugin.delete_file("/tmp/" + filename) return defer.succeed(None)
--- a/sat/test/test_plugin_misc_room_game.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_misc_room_game.py Sat Apr 08 13:54:42 2023 +0200 @@ -65,13 +65,13 @@ self.host ) for profile in Const.PROFILE: - self.host.getClient(profile) # init self.host.profiles[profile] + self.host.get_client(profile) # init self.host.profiles[profile] - def initGame(self, muc_index, user_index): - self.plugin_0045.joinRoom(user_index, muc_index) - self.plugin._initGame(JID(Const.MUC_STR[muc_index]), Const.JID[user_index].user) + def init_game(self, muc_index, user_index): + self.plugin_0045.join_room(user_index, muc_index) + self.plugin._init_game(JID(Const.MUC_STR[muc_index]), Const.JID[user_index].user) - def _expectedMessage(self, to, type_, tag, players=[]): + def _expected_message(self, to, type_, tag, players=[]): content = "<%s" % tag if not players: content += "/>" @@ -88,45 +88,45 @@ content, ) - def test_createOrInvite_solo(self): + def test_create_or_invite_solo(self): self.reinit() - self.plugin_0045.joinRoom(0, 0) - self.plugin._createOrInvite(self.plugin_0045.getRoom(0, 0), [], Const.PROFILE[0]) - self.assertTrue(self.plugin._gameExists(ROOM_JID, True)) + self.plugin_0045.join_room(0, 0) + self.plugin._create_or_invite(self.plugin_0045.get_room(0, 0), [], Const.PROFILE[0]) + self.assertTrue(self.plugin._game_exists(ROOM_JID, True)) - def test_createOrInvite_multi_not_waiting(self): + def test_create_or_invite_multi_not_waiting(self): self.reinit() - self.plugin_0045.joinRoom(0, 0) + self.plugin_0045.join_room(0, 0) other_players = [Const.JID[1], Const.JID[2]] - self.plugin._createOrInvite( - self.plugin_0045.getRoom(0, 0), other_players, Const.PROFILE[0] + self.plugin._create_or_invite( + self.plugin_0045.get_room(0, 0), other_players, Const.PROFILE[0] ) - self.assertTrue(self.plugin._gameExists(ROOM_JID, True)) + self.assertTrue(self.plugin._game_exists(ROOM_JID, True)) - def test_createOrInvite_multi_waiting(self): + def test_create_or_invite_multi_waiting(self): self.reinit(player_init={"score": 0}) - self.plugin_0045.joinRoom(0, 0) + self.plugin_0045.join_room(0, 0) other_players = [Const.JID[1], Const.JID[2]] - self.plugin._createOrInvite( - self.plugin_0045.getRoom(0, 0), other_players, Const.PROFILE[0] + self.plugin._create_or_invite( + self.plugin_0045.get_room(0, 0), other_players, Const.PROFILE[0] ) - self.assertTrue(self.plugin._gameExists(ROOM_JID, False)) - self.assertFalse(self.plugin._gameExists(ROOM_JID, True)) + self.assertTrue(self.plugin._game_exists(ROOM_JID, False)) + self.assertFalse(self.plugin._game_exists(ROOM_JID, True)) - def test_initGame(self): + def test_init_game(self): self.reinit() - self.initGame(0, 0) - self.assertTrue(self.plugin.isReferee(ROOM_JID, Const.JID[0].user)) + self.init_game(0, 0) + self.assertTrue(self.plugin.is_referee(ROOM_JID, Const.JID[0].user)) self.assertEqual([], self.plugin.games[ROOM_JID]["players"]) - def test_checkJoinAuth(self): + def test_check_join_auth(self): self.reinit() check = lambda value: getattr(self, "assert%s" % value)( - self.plugin._checkJoinAuth(ROOM_JID, Const.JID[0], Const.JID[0].user) + self.plugin._check_join_auth(ROOM_JID, Const.JID[0], Const.JID[0].user) ) check(False) # to test the "invited" mode, the referee must be different than the user to test - self.initGame(0, 1) + self.init_game(0, 1) self.plugin.join_mode = self.plugin.ALL check(True) self.plugin.join_mode = self.plugin.INVITED @@ -138,71 +138,71 @@ self.plugin.games[ROOM_JID]["players"].append(Const.JID[0].user) check(True) - def 
test_updatePlayers(self): + def test_update_players(self): self.reinit() - self.initGame(0, 0) + self.init_game(0, 0) self.assertEqual(self.plugin.games[ROOM_JID]["players"], []) - self.plugin._updatePlayers(ROOM_JID, [], True, Const.PROFILE[0]) + self.plugin._update_players(ROOM_JID, [], True, Const.PROFILE[0]) self.assertEqual(self.plugin.games[ROOM_JID]["players"], []) - self.plugin._updatePlayers(ROOM_JID, ["user1"], True, Const.PROFILE[0]) + self.plugin._update_players(ROOM_JID, ["user1"], True, Const.PROFILE[0]) self.assertEqual(self.plugin.games[ROOM_JID]["players"], ["user1"]) - self.plugin._updatePlayers(ROOM_JID, ["user2", "user3"], True, Const.PROFILE[0]) + self.plugin._update_players(ROOM_JID, ["user2", "user3"], True, Const.PROFILE[0]) self.assertEqual( self.plugin.games[ROOM_JID]["players"], ["user1", "user2", "user3"] ) - self.plugin._updatePlayers( + self.plugin._update_players( ROOM_JID, ["user2", "user3"], True, Const.PROFILE[0] ) # should not be stored twice self.assertEqual( self.plugin.games[ROOM_JID]["players"], ["user1", "user2", "user3"] ) - def test_synchronizeRoom(self): + def test_synchronize_room(self): self.reinit() - self.initGame(0, 0) - self.plugin._synchronizeRoom(ROOM_JID, [Const.MUC[0]], Const.PROFILE[0]) + self.init_game(0, 0) + self.plugin._synchronize_room(ROOM_JID, [Const.MUC[0]], Const.PROFILE[0]) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "players", []), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "players", []), ) self.plugin.games[ROOM_JID]["players"].append("test1") - self.plugin._synchronizeRoom(ROOM_JID, [Const.MUC[0]], Const.PROFILE[0]) + self.plugin._synchronize_room(ROOM_JID, [Const.MUC[0]], Const.PROFILE[0]) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "players", ["test1"]), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "players", ["test1"]), ) self.plugin.games[ROOM_JID]["started"] = True self.plugin.games[ROOM_JID]["players"].append("test2") - self.plugin._synchronizeRoom(ROOM_JID, [Const.MUC[0]], Const.PROFILE[0]) + self.plugin._synchronize_room(ROOM_JID, [Const.MUC[0]], Const.PROFILE[0]) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "started", ["test1", "test2"]), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "started", ["test1", "test2"]), ) self.plugin.games[ROOM_JID]["players"].append("test3") self.plugin.games[ROOM_JID]["players"].append("test4") user1 = JID(ROOM_JID.userhost() + "/" + Const.JID[0].user) user2 = JID(ROOM_JID.userhost() + "/" + Const.JID[1].user) - self.plugin._synchronizeRoom(ROOM_JID, [user1, user2], Const.PROFILE[0]) - self.assertEqualXML( - self.host.getSentMessageXml(0), - self._expectedMessage( + self.plugin._synchronize_room(ROOM_JID, [user1, user2], Const.PROFILE[0]) + self.assert_equal_xml( + self.host.get_sent_message_xml(0), + self._expected_message( user1, "normal", "started", ["test1", "test2", "test3", "test4"] ), ) - self.assertEqualXML( - self.host.getSentMessageXml(0), - self._expectedMessage( + self.assert_equal_xml( + self.host.get_sent_message_xml(0), + self._expected_message( user2, "normal", "started", ["test1", "test2", "test3", "test4"] ), ) - def test_invitePlayers(self): + def test_invite_players(self): self.reinit() - self.initGame(0, 0) - self.plugin_0045.joinRoom(0, 1) + self.init_game(0, 0) + 
self.plugin_0045.join_room(0, 1) self.assertEqual(self.plugin.invitations[ROOM_JID], []) - room = self.plugin_0045.getRoom(0, 0) - nicks = self.plugin._invitePlayers( + room = self.plugin_0045.get_room(0, 0) + nicks = self.plugin._invite_players( room, [Const.JID[1], Const.JID[2]], Const.JID[0].user, Const.PROFILE[0] ) self.assertEqual( @@ -212,7 +212,7 @@ # the following assertion is True because Const.JID[1] and Const.JID[2] have the same userhost self.assertEqual(nicks, [Const.JID[1].user, Const.JID[2].user]) - nicks = self.plugin._invitePlayers( + nicks = self.plugin._invite_players( room, [Const.JID[1], Const.JID[3]], Const.JID[0].user, Const.PROFILE[0] ) self.assertEqual( @@ -222,11 +222,11 @@ # this time Const.JID[1] and Const.JID[3] have the same user but the host differs self.assertEqual(nicks, [Const.JID[1].user]) - def test_checkInviteAuth(self): + def test_check_invite_auth(self): def check(value, index): - nick = self.plugin_0045.getNick(0, index) + nick = self.plugin_0045.get_nick(0, index) getattr(self, "assert%s" % value)( - self.plugin._checkInviteAuth(ROOM_JID, nick) + self.plugin._check_invite_auth(ROOM_JID, nick) ) self.reinit() @@ -240,7 +240,7 @@ self.plugin.invite_mode = mode check(True, 0) - self.initGame(0, 0) + self.init_game(0, 0) self.plugin.invite_mode = self.plugin.FROM_ALL check(True, 0) check(True, 1) @@ -250,37 +250,37 @@ self.plugin.invite_mode = self.plugin.FROM_REFEREE check(True, 0) check(False, 1) - user_nick = self.plugin_0045.joinRoom(0, 1) + user_nick = self.plugin_0045.join_room(0, 1) self.plugin.games[ROOM_JID]["players"].append(user_nick) self.plugin.invite_mode = self.plugin.FROM_PLAYERS check(True, 0) check(True, 1) check(False, 2) - def test_isReferee(self): + def test_is_referee(self): self.reinit() - self.initGame(0, 0) - self.assertTrue(self.plugin.isReferee(ROOM_JID, self.plugin_0045.getNick(0, 0))) - self.assertFalse(self.plugin.isReferee(ROOM_JID, self.plugin_0045.getNick(0, 1))) + self.init_game(0, 0) + self.assertTrue(self.plugin.is_referee(ROOM_JID, self.plugin_0045.get_nick(0, 0))) + self.assertFalse(self.plugin.is_referee(ROOM_JID, self.plugin_0045.get_nick(0, 1))) - def test_isPlayer(self): + def test_is_player(self): self.reinit() - self.initGame(0, 0) - self.assertTrue(self.plugin.isPlayer(ROOM_JID, self.plugin_0045.getNick(0, 0))) - user_nick = self.plugin_0045.joinRoom(0, 1) + self.init_game(0, 0) + self.assertTrue(self.plugin.is_player(ROOM_JID, self.plugin_0045.get_nick(0, 0))) + user_nick = self.plugin_0045.join_room(0, 1) self.plugin.games[ROOM_JID]["players"].append(user_nick) - self.assertTrue(self.plugin.isPlayer(ROOM_JID, user_nick)) - self.assertFalse(self.plugin.isPlayer(ROOM_JID, self.plugin_0045.getNick(0, 2))) + self.assertTrue(self.plugin.is_player(ROOM_JID, user_nick)) + self.assertFalse(self.plugin.is_player(ROOM_JID, self.plugin_0045.get_nick(0, 2))) - def test_checkWaitAuth(self): + def test_check_wait_auth(self): def check(value, other_players, confirmed, rest): - room = self.plugin_0045.getRoom(0, 0) + room = self.plugin_0045.get_room(0, 0) self.assertEqual( - (value, confirmed, rest), self.plugin._checkWaitAuth(room, other_players) + (value, confirmed, rest), self.plugin._check_wait_auth(room, other_players) ) self.reinit() - self.initGame(0, 0) + self.init_game(0, 0) other_players = [Const.JID[1], Const.JID[3]] self.plugin.wait_mode = self.plugin.FOR_NONE check(True, [], [], []) @@ -292,22 +292,22 @@ check(True, [], [], []) check(False, [Const.JID[0]], [], [Const.JID[0]]) check(False, other_players, [], 
other_players) - self.plugin_0045.joinRoom(0, 1) + self.plugin_0045.join_room(0, 1) check(False, other_players, [], other_players) - self.plugin_0045.joinRoom(0, 4) + self.plugin_0045.join_room(0, 4) check( False, other_players, - [self.plugin_0045.getNickOfUser(0, 1, 0)], + [self.plugin_0045.get_nick_of_user(0, 1, 0)], [Const.JID[3]], ) - self.plugin_0045.joinRoom(0, 3) + self.plugin_0045.join_room(0, 3) check( True, other_players, [ - self.plugin_0045.getNickOfUser(0, 1, 0), - self.plugin_0045.getNickOfUser(0, 3, 0), + self.plugin_0045.get_nick_of_user(0, 1, 0), + self.plugin_0045.get_nick_of_user(0, 3, 0), ], [], ) @@ -318,189 +318,189 @@ True, other_players, [ - self.plugin_0045.getNickOfUser(0, 1, 0), - self.plugin_0045.getNickOfUser(0, 3, 0), - self.plugin_0045.getNickOfUser(0, 2, 0), + self.plugin_0045.get_nick_of_user(0, 1, 0), + self.plugin_0045.get_nick_of_user(0, 3, 0), + self.plugin_0045.get_nick_of_user(0, 2, 0), ], [], ) - def test_prepareRoom_trivial(self): + def test_prepare_room_trivial(self): self.reinit() other_players = [] - self.plugin.prepareRoom(other_players, ROOM_JID, PROFILE) - self.assertTrue(self.plugin._gameExists(ROOM_JID, True)) + self.plugin.prepare_room(other_players, ROOM_JID, PROFILE) + self.assertTrue(self.plugin._game_exists(ROOM_JID, True)) self.assertTrue( - self.plugin._checkJoinAuth(ROOM_JID, Const.JID[0], Const.JID[0].user) + self.plugin._check_join_auth(ROOM_JID, Const.JID[0], Const.JID[0].user) ) - self.assertTrue(self.plugin._checkInviteAuth(ROOM_JID, Const.JID[0].user)) - self.assertEqual((True, [], []), self.plugin._checkWaitAuth(ROOM_JID, [])) - self.assertTrue(self.plugin.isReferee(ROOM_JID, Const.JID[0].user)) - self.assertTrue(self.plugin.isPlayer(ROOM_JID, Const.JID[0].user)) + self.assertTrue(self.plugin._check_invite_auth(ROOM_JID, Const.JID[0].user)) + self.assertEqual((True, [], []), self.plugin._check_wait_auth(ROOM_JID, [])) + self.assertTrue(self.plugin.is_referee(ROOM_JID, Const.JID[0].user)) + self.assertTrue(self.plugin.is_player(ROOM_JID, Const.JID[0].user)) self.assertEqual( - (False, True), self.plugin._checkCreateGameAndInit(ROOM_JID, PROFILE) + (False, True), self.plugin._check_create_game_and_init(ROOM_JID, PROFILE) ) - def test_prepareRoom_invite(self): + def test_prepare_room_invite(self): self.reinit() other_players = [Const.JID[1], Const.JID[2]] - self.plugin.prepareRoom(other_players, ROOM_JID, PROFILE) - room = self.plugin_0045.getRoom(0, 0) + self.plugin.prepare_room(other_players, ROOM_JID, PROFILE) + room = self.plugin_0045.get_room(0, 0) - self.assertTrue(self.plugin._gameExists(ROOM_JID, True)) + self.assertTrue(self.plugin._game_exists(ROOM_JID, True)) self.assertTrue( - self.plugin._checkJoinAuth(ROOM_JID, Const.JID[1], Const.JID[1].user) + self.plugin._check_join_auth(ROOM_JID, Const.JID[1], Const.JID[1].user) ) self.assertFalse( - self.plugin._checkJoinAuth(ROOM_JID, Const.JID[3], Const.JID[3].user) + self.plugin._check_join_auth(ROOM_JID, Const.JID[3], Const.JID[3].user) ) - self.assertFalse(self.plugin._checkInviteAuth(ROOM_JID, Const.JID[1].user)) + self.assertFalse(self.plugin._check_invite_auth(ROOM_JID, Const.JID[1].user)) self.assertEqual( - (True, [], other_players), self.plugin._checkWaitAuth(room, other_players) + (True, [], other_players), self.plugin._check_wait_auth(room, other_players) ) - player2_nick = self.plugin_0045.joinRoom(0, 1) - self.plugin.userJoinedTrigger(room, room.roster[player2_nick], PROFILE) - self.assertTrue(self.plugin.isPlayer(ROOM_JID, player2_nick)) - 
self.assertTrue(self.plugin._checkInviteAuth(ROOM_JID, player2_nick)) - self.assertFalse(self.plugin.isReferee(ROOM_JID, player2_nick)) - self.assertTrue(self.plugin.isPlayer(ROOM_JID, player2_nick)) + player2_nick = self.plugin_0045.join_room(0, 1) + self.plugin.user_joined_trigger(room, room.roster[player2_nick], PROFILE) + self.assertTrue(self.plugin.is_player(ROOM_JID, player2_nick)) + self.assertTrue(self.plugin._check_invite_auth(ROOM_JID, player2_nick)) + self.assertFalse(self.plugin.is_referee(ROOM_JID, player2_nick)) + self.assertTrue(self.plugin.is_player(ROOM_JID, player2_nick)) self.assertTrue( - self.plugin.isPlayer(ROOM_JID, self.plugin_0045.getNickOfUser(0, 2, 0)) + self.plugin.is_player(ROOM_JID, self.plugin_0045.get_nick_of_user(0, 2, 0)) ) - self.assertFalse(self.plugin.isPlayer(ROOM_JID, "xxx")) + self.assertFalse(self.plugin.is_player(ROOM_JID, "xxx")) self.assertEqual( (False, False), - self.plugin._checkCreateGameAndInit(ROOM_JID, Const.PROFILE[1]), + self.plugin._check_create_game_and_init(ROOM_JID, Const.PROFILE[1]), ) - def test_prepareRoom_score1(self): + def test_prepare_room_score_1(self): self.reinit(player_init={"score": 0}) other_players = [Const.JID[1], Const.JID[2]] - self.plugin.prepareRoom(other_players, ROOM_JID, PROFILE) - room = self.plugin_0045.getRoom(0, 0) + self.plugin.prepare_room(other_players, ROOM_JID, PROFILE) + room = self.plugin_0045.get_room(0, 0) - self.assertFalse(self.plugin._gameExists(ROOM_JID, True)) + self.assertFalse(self.plugin._game_exists(ROOM_JID, True)) self.assertTrue( - self.plugin._checkJoinAuth(ROOM_JID, Const.JID[1], Const.JID[1].user) + self.plugin._check_join_auth(ROOM_JID, Const.JID[1], Const.JID[1].user) ) self.assertFalse( - self.plugin._checkJoinAuth(ROOM_JID, Const.JID[3], Const.JID[3].user) + self.plugin._check_join_auth(ROOM_JID, Const.JID[3], Const.JID[3].user) ) - self.assertFalse(self.plugin._checkInviteAuth(ROOM_JID, Const.JID[1].user)) + self.assertFalse(self.plugin._check_invite_auth(ROOM_JID, Const.JID[1].user)) self.assertEqual( - (False, [], other_players), self.plugin._checkWaitAuth(room, other_players) + (False, [], other_players), self.plugin._check_wait_auth(room, other_players) ) - user_nick = self.plugin_0045.joinRoom(0, 1) - self.plugin.userJoinedTrigger(room, room.roster[user_nick], PROFILE) - self.assertTrue(self.plugin.isPlayer(ROOM_JID, user_nick)) - self.assertFalse(self.plugin._checkInviteAuth(ROOM_JID, user_nick)) - self.assertFalse(self.plugin.isReferee(ROOM_JID, user_nick)) - self.assertTrue(self.plugin.isPlayer(ROOM_JID, user_nick)) + user_nick = self.plugin_0045.join_room(0, 1) + self.plugin.user_joined_trigger(room, room.roster[user_nick], PROFILE) + self.assertTrue(self.plugin.is_player(ROOM_JID, user_nick)) + self.assertFalse(self.plugin._check_invite_auth(ROOM_JID, user_nick)) + self.assertFalse(self.plugin.is_referee(ROOM_JID, user_nick)) + self.assertTrue(self.plugin.is_player(ROOM_JID, user_nick)) # the following assertion is True because Const.JID[1] and Const.JID[2] have the same userhost self.assertTrue( - self.plugin.isPlayer(ROOM_JID, self.plugin_0045.getNickOfUser(0, 2, 0)) + self.plugin.is_player(ROOM_JID, self.plugin_0045.get_nick_of_user(0, 2, 0)) ) # the following assertion is True because Const.JID[1] nick in the room is equal to Const.JID[3].user - self.assertTrue(self.plugin.isPlayer(ROOM_JID, Const.JID[3].user)) + self.assertTrue(self.plugin.is_player(ROOM_JID, Const.JID[3].user)) # but Const.JID[3] is actually not in the room - 
self.assertEqual(self.plugin_0045.getNickOfUser(0, 3, 0), None) + self.assertEqual(self.plugin_0045.get_nick_of_user(0, 3, 0), None) self.assertEqual( - (True, False), self.plugin._checkCreateGameAndInit(ROOM_JID, Const.PROFILE[0]) + (True, False), self.plugin._check_create_game_and_init(ROOM_JID, Const.PROFILE[0]) ) - def test_prepareRoom_score2(self): + def test_prepare_room_score_2(self): self.reinit(player_init={"score": 0}) other_players = [Const.JID[1], Const.JID[4]] - self.plugin.prepareRoom(other_players, ROOM_JID, PROFILE) - room = self.plugin_0045.getRoom(0, 0) + self.plugin.prepare_room(other_players, ROOM_JID, PROFILE) + room = self.plugin_0045.get_room(0, 0) - user_nick = self.plugin_0045.joinRoom(0, 1) - self.plugin.userJoinedTrigger(room, room.roster[user_nick], PROFILE) + user_nick = self.plugin_0045.join_room(0, 1) + self.plugin.user_joined_trigger(room, room.roster[user_nick], PROFILE) self.assertEqual( - (True, False), self.plugin._checkCreateGameAndInit(ROOM_JID, PROFILE) + (True, False), self.plugin._check_create_game_and_init(ROOM_JID, PROFILE) ) - user_nick = self.plugin_0045.joinRoom(0, 4) - self.plugin.userJoinedTrigger(room, room.roster[user_nick], PROFILE) + user_nick = self.plugin_0045.join_room(0, 4) + self.plugin.user_joined_trigger(room, room.roster[user_nick], PROFILE) self.assertEqual( - (False, True), self.plugin._checkCreateGameAndInit(ROOM_JID, PROFILE) + (False, True), self.plugin._check_create_game_and_init(ROOM_JID, PROFILE) ) - def test_userJoinedTrigger(self): + def test_user_joined_trigger(self): self.reinit(player_init={"xxx": "xyz"}) other_players = [Const.JID[1], Const.JID[3]] - self.plugin.prepareRoom(other_players, ROOM_JID, PROFILE) - nicks = [self.plugin_0045.getNick(0, 0)] + self.plugin.prepare_room(other_players, ROOM_JID, PROFILE) + nicks = [self.plugin_0045.get_nick(0, 0)] self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "players", nicks), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "players", nicks), ) self.assertTrue(len(self.plugin.invitations[ROOM_JID]) == 1) # wrong profile - user_nick = self.plugin_0045.joinRoom(0, 1) - room = self.plugin_0045.getRoom(0, 1) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[1]), OTHER_PROFILE) + user_nick = self.plugin_0045.join_room(0, 1) + room = self.plugin_0045.get_room(0, 1) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[1]), OTHER_PROFILE) self.assertEqual( - self.host.getSentMessage(0), None + self.host.get_sent_message(0), None ) # no new message has been sent - self.assertFalse(self.plugin._gameExists(ROOM_JID, True)) # game not started + self.assertFalse(self.plugin._game_exists(ROOM_JID, True)) # game not started # referee profile, user is allowed, wait for one more - room = self.plugin_0045.getRoom(0, 0) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[1]), PROFILE) + room = self.plugin_0045.get_room(0, 0) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[1]), PROFILE) nicks.append(user_nick) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "players", nicks), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "players", nicks), ) - self.assertFalse(self.plugin._gameExists(ROOM_JID, True)) # game not started + self.assertFalse(self.plugin._game_exists(ROOM_JID, True)) # game not started # referee profile, user is not allowed - user_nick = 
self.plugin_0045.joinRoom(0, 4) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[4]), PROFILE) + user_nick = self.plugin_0045.join_room(0, 4) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[4]), PROFILE) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage( + self.host.get_sent_message_xml(0), + self._expected_message( JID(ROOM_JID.userhost() + "/" + user_nick), "normal", "players", nicks ), ) - self.assertFalse(self.plugin._gameExists(ROOM_JID, True)) # game not started + self.assertFalse(self.plugin._game_exists(ROOM_JID, True)) # game not started # referee profile, user is allowed, everybody here - user_nick = self.plugin_0045.joinRoom(0, 3) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[3]), PROFILE) + user_nick = self.plugin_0045.join_room(0, 3) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[3]), PROFILE) nicks.append(user_nick) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "started", nicks), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "started", nicks), ) - self.assertTrue(self.plugin._gameExists(ROOM_JID, True)) # game started + self.assertTrue(self.plugin._game_exists(ROOM_JID, True)) # game started self.assertTrue(len(self.plugin.invitations[ROOM_JID]) == 0) # wait for none self.reinit() - self.plugin.prepareRoom(other_players, ROOM_JID, PROFILE) - self.assertNotEqual(self.host.getSentMessage(0), None) # init messages - room = self.plugin_0045.getRoom(0, 0) - nicks = [self.plugin_0045.getNick(0, 0)] - user_nick = self.plugin_0045.joinRoom(0, 3) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[3]), PROFILE) + self.plugin.prepare_room(other_players, ROOM_JID, PROFILE) + self.assertNotEqual(self.host.get_sent_message(0), None) # init messages + room = self.plugin_0045.get_room(0, 0) + nicks = [self.plugin_0045.get_nick(0, 0)] + user_nick = self.plugin_0045.join_room(0, 3) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[3]), PROFILE) nicks.append(user_nick) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "started", nicks), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "started", nicks), ) - self.assertTrue(self.plugin._gameExists(ROOM_JID, True)) + self.assertTrue(self.plugin._game_exists(ROOM_JID, True)) - def test_userLeftTrigger(self): + def test_user_left_trigger(self): self.reinit(player_init={"xxx": "xyz"}) other_players = [Const.JID[1], Const.JID[3], Const.JID[4]] - self.plugin.prepareRoom(other_players, ROOM_JID, PROFILE) - room = self.plugin_0045.getRoom(0, 0) - nicks = [self.plugin_0045.getNick(0, 0)] + self.plugin.prepare_room(other_players, ROOM_JID, PROFILE) + room = self.plugin_0045.get_room(0, 0) + nicks = [self.plugin_0045.get_nick(0, 0)] self.assertEqual( self.plugin.invitations[ROOM_JID][0][1], [ @@ -511,50 +511,50 @@ ) # one user joins - user_nick = self.plugin_0045.joinRoom(0, 1) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[1]), PROFILE) + user_nick = self.plugin_0045.join_room(0, 1) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[1]), PROFILE) nicks.append(user_nick) # the user leaves self.assertEqual(self.plugin.games[ROOM_JID]["players"], nicks) - room = self.plugin_0045.getRoom(0, 1) - # to not call self.plugin_0045.leaveRoom(0, 1) here, we are testing the trigger with a wrong profile - self.plugin.userLeftTrigger( + room 
= self.plugin_0045.get_room(0, 1) + # to not call self.plugin_0045.leave_room(0, 1) here, we are testing the trigger with a wrong profile + self.plugin.user_left_trigger( room, User(user_nick, Const.JID[1]), Const.PROFILE[1] ) # not the referee self.assertEqual(self.plugin.games[ROOM_JID]["players"], nicks) - room = self.plugin_0045.getRoom(0, 0) - user_nick = self.plugin_0045.leaveRoom(0, 1) - self.plugin.userLeftTrigger( + room = self.plugin_0045.get_room(0, 0) + user_nick = self.plugin_0045.leave_room(0, 1) + self.plugin.user_left_trigger( room, User(user_nick, Const.JID[1]), PROFILE ) # referee nicks.pop() self.assertEqual(self.plugin.games[ROOM_JID]["players"], nicks) # all the users join - user_nick = self.plugin_0045.joinRoom(0, 1) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[1]), PROFILE) + user_nick = self.plugin_0045.join_room(0, 1) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[1]), PROFILE) nicks.append(user_nick) - user_nick = self.plugin_0045.joinRoom(0, 3) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[3]), PROFILE) + user_nick = self.plugin_0045.join_room(0, 3) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[3]), PROFILE) nicks.append(user_nick) - user_nick = self.plugin_0045.joinRoom(0, 4) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[4]), PROFILE) + user_nick = self.plugin_0045.join_room(0, 4) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[4]), PROFILE) nicks.append(user_nick) self.assertEqual(self.plugin.games[ROOM_JID]["players"], nicks) self.assertTrue(len(self.plugin.invitations[ROOM_JID]) == 0) # one user leaves - user_nick = self.plugin_0045.leaveRoom(0, 4) - self.plugin.userLeftTrigger(room, User(user_nick, Const.JID[4]), PROFILE) + user_nick = self.plugin_0045.leave_room(0, 4) + self.plugin.user_left_trigger(room, User(user_nick, Const.JID[4]), PROFILE) nicks.pop() self.assertEqual( self.plugin.invitations[ROOM_JID][0][1], [Const.JID[4].userhostJID()] ) # another leaves - user_nick = self.plugin_0045.leaveRoom(0, 3) - self.plugin.userLeftTrigger(room, User(user_nick, Const.JID[3]), PROFILE) + user_nick = self.plugin_0045.leave_room(0, 3) + self.plugin.user_left_trigger(room, User(user_nick, Const.JID[3]), PROFILE) nicks.pop() self.assertEqual( self.plugin.invitations[ROOM_JID][0][1], @@ -562,64 +562,64 @@ ) # they can join again - user_nick = self.plugin_0045.joinRoom(0, 3) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[3]), PROFILE) + user_nick = self.plugin_0045.join_room(0, 3) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[3]), PROFILE) nicks.append(user_nick) - user_nick = self.plugin_0045.joinRoom(0, 4) - self.plugin.userJoinedTrigger(room, User(user_nick, Const.JID[4]), PROFILE) + user_nick = self.plugin_0045.join_room(0, 4) + self.plugin.user_joined_trigger(room, User(user_nick, Const.JID[4]), PROFILE) nicks.append(user_nick) self.assertEqual(self.plugin.games[ROOM_JID]["players"], nicks) self.assertTrue(len(self.plugin.invitations[ROOM_JID]) == 0) - def test__checkCreateGameAndInit(self): + def test_check_create_game_and_init(self): self.reinit() - helpers.muteLogging() + helpers.mute_logging() self.assertEqual( - (False, False), self.plugin._checkCreateGameAndInit(ROOM_JID, PROFILE) + (False, False), self.plugin._check_create_game_and_init(ROOM_JID, PROFILE) ) - helpers.unmuteLogging() + helpers.unmute_logging() - nick = self.plugin_0045.joinRoom(0, 0) + nick = self.plugin_0045.join_room(0, 0) self.assertEqual( 
- (True, False), self.plugin._checkCreateGameAndInit(ROOM_JID, PROFILE) + (True, False), self.plugin._check_create_game_and_init(ROOM_JID, PROFILE) ) - self.assertTrue(self.plugin._gameExists(ROOM_JID, False)) - self.assertFalse(self.plugin._gameExists(ROOM_JID, True)) - self.assertTrue(self.plugin.isReferee(ROOM_JID, nick)) + self.assertTrue(self.plugin._game_exists(ROOM_JID, False)) + self.assertFalse(self.plugin._game_exists(ROOM_JID, True)) + self.assertTrue(self.plugin.is_referee(ROOM_JID, nick)) - helpers.muteLogging() + helpers.mute_logging() self.assertEqual( - (False, False), self.plugin._checkCreateGameAndInit(ROOM_JID, OTHER_PROFILE) + (False, False), self.plugin._check_create_game_and_init(ROOM_JID, OTHER_PROFILE) ) - helpers.unmuteLogging() + helpers.unmute_logging() - self.plugin_0045.joinRoom(0, 1) + self.plugin_0045.join_room(0, 1) self.assertEqual( - (False, False), self.plugin._checkCreateGameAndInit(ROOM_JID, OTHER_PROFILE) + (False, False), self.plugin._check_create_game_and_init(ROOM_JID, OTHER_PROFILE) ) - self.plugin.createGame(ROOM_JID, [Const.JID[1]], PROFILE) + self.plugin.create_game(ROOM_JID, [Const.JID[1]], PROFILE) self.assertEqual( - (False, True), self.plugin._checkCreateGameAndInit(ROOM_JID, PROFILE) + (False, True), self.plugin._check_create_game_and_init(ROOM_JID, PROFILE) ) self.assertEqual( - (False, False), self.plugin._checkCreateGameAndInit(ROOM_JID, OTHER_PROFILE) + (False, False), self.plugin._check_create_game_and_init(ROOM_JID, OTHER_PROFILE) ) - def test_createGame(self): + def test_create_game(self): self.reinit(player_init={"xxx": "xyz"}) nicks = [] for i in [0, 1, 3, 4]: - nicks.append(self.plugin_0045.joinRoom(0, i)) + nicks.append(self.plugin_0045.join_room(0, i)) # game not exists - self.plugin.createGame(ROOM_JID, nicks, PROFILE) - self.assertTrue(self.plugin._gameExists(ROOM_JID, True)) + self.plugin.create_game(ROOM_JID, nicks, PROFILE) + self.assertTrue(self.plugin._game_exists(ROOM_JID, True)) self.assertEqual(self.plugin.games[ROOM_JID]["players"], nicks) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "started", nicks), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "started", nicks), ) for nick in nicks: self.assertEqual("init", self.plugin.games[ROOM_JID]["status"][nick]) @@ -635,20 +635,20 @@ # game exists, current profile is referee self.reinit(player_init={"xxx": "xyz"}) - self.initGame(0, 0) + self.init_game(0, 0) self.plugin.games[ROOM_JID]["started"] = True - self.plugin.createGame(ROOM_JID, nicks, PROFILE) + self.plugin.create_game(ROOM_JID, nicks, PROFILE) self.assertEqual( - self.host.getSentMessageXml(0), - self._expectedMessage(ROOM_JID, "groupchat", "started", nicks), + self.host.get_sent_message_xml(0), + self._expected_message(ROOM_JID, "groupchat", "started", nicks), ) # game exists, current profile is not referee self.reinit(player_init={"xxx": "xyz"}) - self.initGame(0, 0) + self.init_game(0, 0) self.plugin.games[ROOM_JID]["started"] = True - self.plugin_0045.joinRoom(0, 1) - self.plugin.createGame(ROOM_JID, nicks, OTHER_PROFILE) + self.plugin_0045.join_room(0, 1) + self.plugin.create_game(ROOM_JID, nicks, OTHER_PROFILE) self.assertEqual( - self.host.getSentMessage(0), None + self.host.get_sent_message(0), None ) # no sync message has been sent by other_profile
--- a/sat/test/test_plugin_misc_text_syntaxes.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_misc_text_syntaxes.py Sat Apr 08 13:54:42 2023 +0200 @@ -78,15 +78,15 @@ </img></body> </div>""" - d = self.text_syntaxes.cleanXHTML(self.EVIL_HTML1) - d.addCallback(self.assertEqualXML, expected, ignore_blank=True) + d = self.text_syntaxes.clean_xhtml(self.EVIL_HTML1) + d.addCallback(self.assert_equal_xml, expected, ignore_blank=True) return d def test_styles_sanitise(self): expected = """<p style="color: blue">test <strong>retest</strong><br/><span style="color: #cf2828; font-size: 3px; color: red; color: red !important; font-size: 100px !important; font-size: 100%; font-size: 100px; font-size: 100; font-size: 100 %; color: rgba(0, 0, 0, 0.1); color: rgb(35,79,255); background-color: no-repeat"> toto </span></p>""" - d = self.text_syntaxes.cleanXHTML(self.EVIL_HTML2) - d.addCallback(self.assertEqualXML, expected) + d = self.text_syntaxes.clean_xhtml(self.EVIL_HTML2) + d.addCallback(self.assert_equal_xml, expected) return d def test_html2text(self): @@ -105,11 +105,11 @@ d.addCallback(self.assertEqual, expected) return d - def test_removeXHTMLMarkups(self): + def test_remove_xhtml_markups(self): expected = """ a link another link a paragraph secret EVIL! of EVIL! Password: annoying EVIL! spam spam SPAM! """ - result = self.text_syntaxes._removeMarkups(self.EVIL_HTML1) + result = self.text_syntaxes._remove_markups(self.EVIL_HTML1) self.assertEqual(re.sub(r"\s+", " ", result).rstrip(), expected.rstrip()) expected = """test retest toto""" - result = self.text_syntaxes._removeMarkups(self.EVIL_HTML2) + result = self.text_syntaxes._remove_markups(self.EVIL_HTML2) self.assertEqual(re.sub(r"\s+", " ", result).rstrip(), expected.rstrip())
--- a/sat/test/test_plugin_xep_0033.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_xep_0033.py Sat Apr 08 13:54:42 2023 +0200 @@ -44,7 +44,7 @@ self.host = helpers.FakeSAT() self.plugin = plugin.XEP_0033(self.host) - def test_messageReceived(self): + def test_message_received(self): self.host.memory.reinit() xml = """ <message type="chat" from="%s" to="%s" id="test_1"> @@ -64,8 +64,8 @@ ) stanza = parseXml(xml.encode("utf-8")) treatments = defer.Deferred() - self.plugin.messageReceivedTrigger( - self.host.getClient(PROFILE), stanza, treatments + self.plugin.message_received_trigger( + self.host.get_client(PROFILE), stanza, treatments ) data = {"extra": {}} @@ -98,7 +98,7 @@ mess_data["xml"] = parseXml(original_stanza.encode("utf-8")) return mess_data - def _assertAddresses(self, mess_data): + def _assert_addresses(self, mess_data): """The mess_data that we got here has been modified by self.plugin.messageSendTrigger, check that the addresses element has been added to the stanza.""" expected = self._get_mess_data()["xml"] @@ -113,13 +113,13 @@ ) addresses_element = parseXml(addresses_extra.encode("utf-8")) expected.addChild(addresses_element) - self.assertEqualXML( + self.assert_equal_xml( mess_data["xml"].toXml().encode("utf-8"), expected.toXml().encode("utf-8") ) - def _checkSentAndStored(self): + def _check_sent_and_stored(self): """Check that all the recipients got their messages and that the history has been filled. - /!\ see the comments in XEP_0033.sendAndStoreMessage""" + /!\ see the comments in XEP_0033.send_and_store_message""" sent = [] stored = [] d_list = [] @@ -135,23 +135,23 @@ else: # feature not supported, use normal behavior sent.append(to_jid) stored.append(to_jid) - helpers.unmuteLogging() + helpers.unmute_logging() for to_s in (JID_STR_X_TO, JID_STR_X_CC, JID_STR_X_BCC): to_jid = JID(to_s) host = JID(to_jid.host) - helpers.muteLogging() - d = self.host.findFeaturesSet([plugin.NS_ADDRESS], jid_=host, profile=PROFILE) + helpers.mute_logging() + d = self.host.find_features_set([plugin.NS_ADDRESS], jid_=host, profile=PROFILE) d.addCallback(cb, to_jid) d_list.append(d) def cb_list(__): - msg = "/!\ see the comments in XEP_0033.sendAndStoreMessage" + msg = "/!\ see the comments in XEP_0033.send_and_store_message" sent_recipients = [ - JID(elt["to"]) for elt in self.host.getSentMessages(PROFILE_INDEX) + JID(elt["to"]) for elt in self.host.get_sent_messages(PROFILE_INDEX) ] - self.assertEqualUnsortedList(sent_recipients, sent, msg) - self.assertEqualUnsortedList(self.host.stored_messages, stored, msg) + self.assert_equal_unsorted_list(sent_recipients, sent, msg) + self.assert_equal_unsorted_list(self.host.stored_messages, stored, msg) return defer.DeferredList(d_list).addCallback(cb_list) @@ -164,48 +164,48 @@ """ pre_treatments = defer.Deferred() post_treatments = defer.Deferred() - helpers.muteLogging() + helpers.mute_logging() self.plugin.messageSendTrigger( - self.host.getClient[PROFILE], data, pre_treatments, post_treatments + self.host.get_client[PROFILE], data, pre_treatments, post_treatments ) post_treatments.callback(data) - helpers.unmuteLogging() + helpers.unmute_logging() post_treatments.addCallbacks( - self._assertAddresses, lambda failure: failure.trap(CancelError) + self._assert_addresses, lambda failure: failure.trap(CancelError) ) return post_treatments - def test_messageSendTriggerFeatureNotSupported(self): + def test_message_send_trigger_feature_not_supported(self): # feature is not supported, abort the message self.host.memory.reinit() data = 
self._get_mess_data() return self._trigger(data) - def test_messageSendTriggerFeatureSupported(self): + def test_message_send_trigger_feature_supported(self): # feature is supported by the main target server self.host.reinit() - self.host.addFeature(JID(JID_STR_TO), plugin.NS_ADDRESS, PROFILE) + self.host.add_feature(JID(JID_STR_TO), plugin.NS_ADDRESS, PROFILE) data = self._get_mess_data() d = self._trigger(data) - return d.addCallback(lambda __: self._checkSentAndStored()) + return d.addCallback(lambda __: self._check_sent_and_stored()) - def test_messageSendTriggerFeatureFullySupported(self): + def test_message_send_trigger_feature_fully_supported(self): # feature is supported by all target servers self.host.reinit() - self.host.addFeature(JID(JID_STR_TO), plugin.NS_ADDRESS, PROFILE) + self.host.add_feature(JID(JID_STR_TO), plugin.NS_ADDRESS, PROFILE) for dest in (JID_STR_X_TO, JID_STR_X_CC, JID_STR_X_BCC): - self.host.addFeature(JID(JID(dest).host), plugin.NS_ADDRESS, PROFILE) + self.host.add_feature(JID(JID(dest).host), plugin.NS_ADDRESS, PROFILE) data = self._get_mess_data() d = self._trigger(data) - return d.addCallback(lambda __: self._checkSentAndStored()) + return d.addCallback(lambda __: self._check_sent_and_stored()) - def test_messageSendTriggerFixWrongEntity(self): + def test_message_send_trigger_fix_wrong_entity(self): # check that a wrong recipient entity is fixed by the backend self.host.reinit() - self.host.addFeature(JID(JID_STR_TO), plugin.NS_ADDRESS, PROFILE) + self.host.add_feature(JID(JID_STR_TO), plugin.NS_ADDRESS, PROFILE) for dest in (JID_STR_X_TO, JID_STR_X_CC, JID_STR_X_BCC): - self.host.addFeature(JID(JID(dest).host), plugin.NS_ADDRESS, PROFILE) + self.host.add_feature(JID(JID(dest).host), plugin.NS_ADDRESS, PROFILE) data = self._get_mess_data() data["to"] = JID(JID_STR_X_TO) d = self._trigger(data) - return d.addCallback(lambda __: self._checkSentAndStored()) + return d.addCallback(lambda __: self._check_sent_and_stored())
--- a/sat/test/test_plugin_xep_0085.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_xep_0085.py Sat Apr 08 13:54:42 2023 +0200 @@ -33,7 +33,7 @@ def setUp(self): self.host = helpers.FakeSAT() self.plugin = plugin.XEP_0085(self.host) - self.host.memory.setParam( + self.host.memory.param_set( plugin.PARAM_NAME, True, plugin.PARAM_KEY, @@ -41,7 +41,7 @@ Const.PROFILE[0], ) - def test_messageReceived(self): + def test_message_received(self): for state in plugin.CHAT_STATES: xml = """ <message type="chat" from="%s" to="%s" id="test_1"> @@ -56,17 +56,17 @@ plugin.NS_CHAT_STATES, ) stanza = parseXml(xml.encode("utf-8")) - self.host.bridge.expectCall( - "chatStateReceived", Const.JID_STR[1], state, Const.PROFILE[0] + self.host.bridge.expect_call( + "chat_state_received", Const.JID_STR[1], state, Const.PROFILE[0] ) - self.plugin.messageReceivedTrigger( - self.host.getClient(Const.PROFILE[0]), stanza, None + self.plugin.message_received_trigger( + self.host.get_client(Const.PROFILE[0]), stanza, None ) - def test_messageSendTrigger(self): + def test_message_send_trigger(self): def cb(data): xml = data["xml"].toXml().encode("utf-8") - self.assertEqualXML(xml, expected.toXml().encode("utf-8")) + self.assert_equal_xml(xml, expected.toXml().encode("utf-8")) d_list = [] @@ -91,7 +91,7 @@ expected.addElement(state, plugin.NS_CHAT_STATES) post_treatments = defer.Deferred() self.plugin.messageSendTrigger( - self.host.getClient(Const.PROFILE[0]), mess_data, None, post_treatments + self.host.get_client(Const.PROFILE[0]), mess_data, None, post_treatments ) post_treatments.addCallback(cb)
--- a/sat/test/test_plugin_xep_0203.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_xep_0203.py Sat Apr 08 13:54:42 2023 +0200 @@ -63,5 +63,5 @@ parent.addElement("body", None, "text") stamp = datetime.datetime(2002, 9, 10, 23, 8, 25, tzinfo=tzutc()) elt = self.plugin.delay(stamp, JID("capulet.com"), "Offline Storage", parent) - self.assertEqualXML(elt.toXml(), delay_xml, True) - self.assertEqualXML(parent.toXml(), message_xml, True) + self.assert_equal_xml(elt.toXml(), delay_xml, True) + self.assert_equal_xml(parent.toXml(), message_xml, True)
--- a/sat/test/test_plugin_xep_0277.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_xep_0277.py Sat Apr 08 13:54:42 2023 +0200 @@ -77,7 +77,7 @@ def __init__(self, host): pass - def addPEPEvent(self, *args): + def add_pep_event(self, *args): pass self.host.plugins["XEP-0060"] = plugin_xep_0060.XEP_0060(self.host)
--- a/sat/test/test_plugin_xep_0297.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_xep_0297.py Sat Apr 08 13:54:42 2023 +0200 @@ -82,8 +82,8 @@ profile_key=C.PROFILE[0], ) d.addCallback( - lambda __: self.assertEqualXML( - self.host.getSentMessageXml(0), output, True + lambda __: self.assert_equal_xml( + self.host.get_sent_message_xml(0), output, True ) ) return d
--- a/sat/test/test_plugin_xep_0313.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_xep_0313.py Sat Apr 08 13:54:42 2023 +0200 @@ -42,11 +42,11 @@ def setUp(self): self.host = helpers.FakeSAT() self.plugin = XEP_0313(self.host) - self.client = self.host.getClient(C.PROFILE[0]) - mam_client = self.plugin.getHandler(C.PROFILE[0]) - mam_client.makeConnection(self.host.getClient(C.PROFILE[0]).xmlstream) + self.client = self.host.get_client(C.PROFILE[0]) + mam_client = self.plugin.get_handler(C.PROFILE[0]) + mam_client.makeConnection(self.host.get_client(C.PROFILE[0]).xmlstream) - def test_queryArchive(self): + def test_query_archive(self): xml = """ <iq type='set' id='%s' to='%s'> <query xmlns='urn:xmpp:mam:1'/> @@ -57,11 +57,11 @@ ) d = self.plugin.queryArchive(self.client, MAMRequest(), SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryArchivePubsub(self): + def test_query_archive_pubsub(self): xml = """ <iq type='set' id='%s' to='%s'> <query xmlns='urn:xmpp:mam:1' node='fdp/submitted/capulet.lit/sonnets' /> @@ -74,11 +74,11 @@ self.client, MAMRequest(node="fdp/submitted/capulet.lit/sonnets"), SERVICE_JID ) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryArchiveWith(self): + def test_query_archive_with(self): xml = """ <iq type='set' id='%s' to='%s'> <query xmlns='urn:xmpp:mam:1'> @@ -99,11 +99,11 @@ form = buildForm(with_jid=JID("juliet@capulet.lit")) d = self.plugin.queryArchive(self.client, MAMRequest(form), SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryArchiveStartEnd(self): + def test_query_archive_start_end(self): xml = """ <iq type='set' id='%s' to='%s'> <query xmlns='urn:xmpp:mam:1'> @@ -129,11 +129,11 @@ form = buildForm(start=start, end=end) d = self.plugin.queryArchive(self.client, MAMRequest(form), SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryArchiveStart(self): + def test_query_archive_start(self): xml = """ <iq type='set' id='%s' to='%s'> <query xmlns='urn:xmpp:mam:1'> @@ -155,11 +155,11 @@ form = buildForm(start=start) d = self.plugin.queryArchive(self.client, MAMRequest(form), SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryArchiveRSM(self): + def test_query_archive_rsm(self): xml = """ <iq type='set' id='%s' to='%s'> <query xmlns='urn:xmpp:mam:1'> @@ -185,11 +185,11 @@ rsm = RSMRequest(max_=10) d = self.plugin.queryArchive(self.client, MAMRequest(form, rsm), SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryArchiveRSMPaging(self): + def test_query_archive_rsm_paging(self): xml = """ <iq type='set' id='%s' to='%s'> <query xmlns='urn:xmpp:mam:1'> @@ -212,11 +212,11 @@ rsm = RSMRequest(max_=10, 
after="09af3-cc343-b409f") d = self.plugin.queryArchive(self.client, MAMRequest(form, rsm), SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryFields(self): + def test_query_fields(self): xml = """ <iq type='get' id="%s" to='%s'> <query xmlns='urn:xmpp:mam:1'/> @@ -227,11 +227,11 @@ ) d = self.plugin.queryFields(self.client, SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryArchiveFields(self): + def test_query_archive_fields(self): xml = """ <iq type='set' id='%s' to='%s'> <query xmlns='urn:xmpp:mam:1'> @@ -267,11 +267,11 @@ form = buildForm(extra_fields=extra_fields) d = self.plugin.queryArchive(self.client, MAMRequest(form), SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_queryPrefs(self): + def test_query_prefs(self): xml = """ <iq type='get' id='%s' to='%s'> <prefs xmlns='urn:xmpp:mam:1'> @@ -283,13 +283,13 @@ ("H_%d" % domish.Element._idCounter), SERVICE, ) - d = self.plugin.getPrefs(self.client, SERVICE_JID) + d = self.plugin.get_prefs(self.client, SERVICE_JID) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d - def test_setPrefs(self): + def test_set_prefs(self): xml = """ <iq type='set' id='%s' to='%s'> <prefs xmlns='urn:xmpp:mam:1' default='roster'> @@ -309,6 +309,6 @@ never = [JID("montague@montague.lit")] d = self.plugin.setPrefs(self.client, SERVICE_JID, always=always, never=never) d.addCallback( - lambda __: self.assertEqualXML(self.host.getSentMessageXml(0), xml, True) + lambda __: self.assert_equal_xml(self.host.get_sent_message_xml(0), xml, True) ) return d
--- a/sat/test/test_plugin_xep_0334.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/test/test_plugin_xep_0334.py Sat Apr 08 13:54:42 2023 +0200 @@ -35,7 +35,7 @@ self.host = helpers.FakeSAT() self.plugin = XEP_0334(self.host) - def test_messageSendTrigger(self): + def test_message_send_trigger(self): template_xml = """ <message from='romeo@montague.net/orchard' @@ -51,7 +51,7 @@ def cb(data, expected_xml): result_xml = data["xml"].toXml().encode("utf-8") - self.assertEqualXML(result_xml, expected_xml, True) + self.assert_equal_xml(result_xml, expected_xml, True) for key in HINTS + ("", "dummy_hint"): mess_data = { @@ -60,7 +60,7 @@ } treatments = defer.Deferred() self.plugin.messageSendTrigger( - self.host.getClient(C.PROFILE[0]), mess_data, defer.Deferred(), treatments + self.host.get_client(C.PROFILE[0]), mess_data, defer.Deferred(), treatments ) if treatments.callbacks: # the trigger added a callback expected_xml = template_xml % ('<%s xmlns="urn:xmpp:hints"/>' % key) @@ -70,7 +70,7 @@ return defer.DeferredList(d_list) - def test_messageReceivedTrigger(self): + def test_message_received_trigger(self): template_xml = """ <message from='romeo@montague.net/orchard' @@ -92,8 +92,8 @@ for key in HINTS + ("dummy_hint",): message = parseXml(template_xml % ('<%s xmlns="urn:xmpp:hints"/>' % key)) post_treat = defer.Deferred() - self.plugin.messageReceivedTrigger( - self.host.getClient(C.PROFILE[0]), message, post_treat + self.plugin.message_received_trigger( + self.host.get_client(C.PROFILE[0]), message, post_treat ) if post_treat.callbacks: assert key in ("no-permanent-storage", "no-storage")
--- a/sat/tools/async_trigger.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/async_trigger.py Sat Apr 08 13:54:42 2023 +0200 @@ -25,10 +25,10 @@ from twisted.internet import defer class TriggerManager(sync_trigger.TriggerManager): - """This is a TriggerManager with an new asyncPoint method""" + """This is a TriggerManager with an new async_point method""" @defer.inlineCallbacks - def asyncPoint(self, point_name, *args, **kwargs): + def async_point(self, point_name, *args, **kwargs): """This put a trigger point with potentially async Deferred All the triggers for that point will be run @@ -47,25 +47,25 @@ for priority, trigger in self.__triggers[point_name]: try: - cont = yield utils.asDeferred(trigger, *args, **kwargs) + cont = yield utils.as_deferred(trigger, *args, **kwargs) if can_cancel and not cont: defer.returnValue(False) except sync_trigger.SkipOtherTriggers: break defer.returnValue(True) - async def asyncReturnPoint( + async def async_return_point( self, point_name: str, *args, **kwargs ) -> Tuple[bool, Any]: - """Async version of returnPoint""" + """Async version of return_point""" if point_name not in self.__triggers: return True, None for priority, trigger in self.__triggers[point_name]: try: - cont, ret_value = await utils.asDeferred(trigger, *args, **kwargs) + cont, ret_value = await utils.as_deferred(trigger, *args, **kwargs) if not cont: return False, ret_value except sync_trigger.SkipOtherTriggers:
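The renamed ``async_point`` and ``async_return_point`` keep the same contract as their camelCase predecessors: registered triggers are awaited in priority order, and a trigger returning a falsy "continue" value cancels the default handling. A minimal sketch of how a plugin coroutine might call the return-point variant; the point name ``my_plugin_point``, the plugin class and the ``self.host.trigger`` attribute are assumptions for illustration::

    # hedged sketch: "my_plugin_point" is a hypothetical trigger point name
    class MyPlugin:
        def __init__(self, host):
            self.host = host

        async def process(self, client, element):
            # async_return_point gives (continue, return_value); a trigger
            # returning False as first element cancels default handling
            cont, ret = await self.host.trigger.async_return_point(
                "my_plugin_point", client, element
            )
            if not cont:
                return ret
            # default handling would go here
            return None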
--- a/sat/tools/common/data_format.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/data_format.py Sat Apr 08 13:54:42 2023 +0200 @@ -107,7 +107,7 @@ return dict -def getSubDict(name, dict_, sep="_"): +def get_sub_dict(name, dict_, sep="_"): """get a sub dictionary from a serialised dictionary look for keys starting with name, and create a dict with it
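``get_sub_dict`` is consumed below (in ``data_objects.py``) as ``dict(data_format.get_sub_dict("xhtml", self._extra))``, i.e. it yields key/value pairs extracted from a flat, serialised dictionary. A standalone sketch of the idea, not the module's actual implementation::

    # illustration only: collect entries whose keys share a "<name><sep>"
    # prefix, stripping that prefix from the returned keys
    def sub_dict_sketch(name, flat_dict, sep="_"):
        prefix = name + sep
        for key, value in flat_dict.items():
            if key.startswith(prefix):
                yield key[len(prefix):], value

    extra = {"xhtml": "<p>hi</p>", "xhtml_title": "<b>title</b>", "thread": "123"}
    print(dict(sub_dict_sketch("xhtml", extra)))
    # {'title': '<b>title</b>'}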
--- a/sat/tools/common/data_objects.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/data_objects.py Sat Apr 08 13:54:42 2023 +0200 @@ -43,7 +43,7 @@ self._subject_data = msg_data[5] self._type = msg_data[6] self._extra = data_format.deserialise(msg_data[7]) - self._html = dict(data_format.getSubDict("xhtml", self._extra)) + self._html = dict(data_format.get_sub_dict("xhtml", self._extra)) @property def id(self): @@ -203,7 +203,7 @@ def __init__(self, url=None): self.url = url - def formatUrl(self, *args, **kwargs): + def format_url(self, *args, **kwargs): """format URL using Python formatting values will be quoted before being used
--- a/sat/tools/common/dynamic_import.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/dynamic_import.py Sat Apr 08 13:54:42 2023 +0200 @@ -26,7 +26,7 @@ def bridge(name, module_path="sat.bridge"): - """Import bridge module + """import bridge module @param module_path(str): path of the module to import @param name(str): name of the bridge to import (e.g.: dbus)
--- a/sat/tools/common/email.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/email.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,7 +29,7 @@ log = getLogger(__name__) -def sendEmail(config, to_emails, subject="", body="", from_email=None): +def send_email(config, to_emails, subject="", body="", from_email=None): """Send an email using SàT configuration @param config (SafeConfigParser): the configuration instance @@ -42,11 +42,11 @@ """ if isinstance(to_emails, str): to_emails = to_emails.split() - email_host = tools_config.getConfig(config, None, "email_server") or "localhost" - email_from = from_email or tools_config.getConfig(config, None, "email_from") + email_host = tools_config.config_get(config, None, "email_server") or "localhost" + email_from = from_email or tools_config.config_get(config, None, "email_from") # we suppose that email domain and XMPP domain are identical - domain = tools_config.getConfig(config, None, "xmpp_domain") + domain = tools_config.config_get(config, None, "xmpp_domain") if domain is None: if email_from is not None: domain = email_from.split("@", 1)[-1] @@ -55,14 +55,14 @@ if email_from is None: email_from = "no_reply@" + domain - email_sender_domain = tools_config.getConfig( + email_sender_domain = tools_config.config_get( config, None, "email_sender_domain", domain ) - email_port = int(tools_config.getConfig(config, None, "email_port", 25)) - email_username = tools_config.getConfig(config, None, "email_username") - email_password = tools_config.getConfig(config, None, "email_password") - email_auth = C.bool(tools_config.getConfig(config, None, "email_auth", C.BOOL_FALSE)) - email_starttls = C.bool(tools_config.getConfig(config, None, "email_starttls", + email_port = int(tools_config.config_get(config, None, "email_port", 25)) + email_username = tools_config.config_get(config, None, "email_username") + email_password = tools_config.config_get(config, None, "email_password") + email_auth = C.bool(tools_config.config_get(config, None, "email_auth", C.BOOL_FALSE)) + email_starttls = C.bool(tools_config.config_get(config, None, "email_starttls", C.BOOL_FALSE)) msg = MIMEText(body, "plain", "UTF-8")
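A hedged usage sketch of the renamed helper; the recipient address, subject and body are placeholders, and actual delivery depends on the ``email_*`` options read above from the main configuration::

    from sat.tools.config import parse_main_conf
    from sat.tools.common.email import send_email

    conf = parse_main_conf()
    send_email(
        conf,
        to_emails=["admin@example.org"],   # placeholder recipient
        subject="backup finished",
        body="nightly backup completed without error",
    )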
--- a/sat/tools/common/regex.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/regex.py Sat Apr 08 13:54:42 2023 +0200 @@ -33,12 +33,12 @@ TEXT_WORD_MIN_LENGHT = 0 -def reJoin(exps): +def re_join(exps): """Join (OR) various regexes""" return re.compile("|".join(exps)) -def reSubDict(pattern, repl_dict, string): +def re_sub_dict(pattern, repl_dict, string): """Replace key, value found in dict according to pattern @param pattern(basestr): pattern using keys found in repl_dict @@ -49,29 +49,29 @@ return pattern.sub(lambda m: repl_dict[re.escape(m.group(0))], string) -path_escape_re = reJoin(list(path_escape.keys())) -path_escape_rev_re = reJoin(list(path_escape_rev.keys())) +path_escape_re = re_join(list(path_escape.keys())) +path_escape_rev_re = re_join(list(path_escape_rev.keys())) -def pathEscape(string): +def path_escape(string): """Escape string so it can be use in a file path @param string(basestr): string to escape @return (str, unicode): escaped string, usable in a file path """ - return reSubDict(path_escape_re, path_escape, string) + return re_sub_dict(path_escape_re, path_escape, string) -def pathUnescape(string): +def path_unescape(string): """Unescape string from value found in file path @param string(basestr): string found in file path @return (str, unicode): unescaped string """ - return reSubDict(path_escape_rev_re, path_escape_rev, string) + return re_sub_dict(path_escape_rev_re, path_escape_rev, string) -def ansiRemove(string): +def ansi_remove(string): """Remove ANSI escape codes from string @param string(basestr): string to filter @@ -80,7 +80,7 @@ return RE_ANSI_REMOVE.sub("", string) -def urlFriendlyText(text): +def url_friendly_text(text): """Convert text to url-friendly one""" # we change special chars to ascii one, # trick found at https://stackoverflow.com/a/3194567
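The pattern behind ``re_join``/``re_sub_dict`` is visible above: the escape table's keys are OR-ed into a single regex, and each match is replaced through a dict lookup keyed by ``re.escape()`` of the matched text. A self-contained illustration of that technique with a made-up escape table::

    import re

    # keys are re.escape()d so the lookup in the substitution matches them
    escape_map = {re.escape("/"): "%2F", re.escape(" "): "%20"}
    pattern = re.compile("|".join(escape_map.keys()))

    def sub_with_dict(pattern, repl_dict, string):
        # same technique as re_sub_dict: one regex, replacement via dict lookup
        return pattern.sub(lambda m: repl_dict[re.escape(m.group(0))], string)

    print(sub_with_dict(pattern, escape_map, "my file/name"))
    # my%20file%2Fname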
--- a/sat/tools/common/template.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/template.py Sat Apr 08 13:54:42 2023 +0200 @@ -162,7 +162,7 @@ return TemplateData(site, theme, template_path) @staticmethod - def getSitesAndThemes( + def get_sites_and_themes( site: str, theme: str, settings: Optional[dict] = None, @@ -205,7 +205,7 @@ raise exceptions.InternalError( "_get_template_f must not be used with absolute path") settings = self.sites_themes[site][theme]['settings'] - for site_to_check, theme_to_check in self.getSitesAndThemes( + for site_to_check, theme_to_check in self.get_sites_and_themes( site, theme, settings): try: base_path = self.sites_paths[site_to_check] @@ -328,12 +328,12 @@ scripts = [] tpl = "<script src={src} {attribute}></script>" for library, attribute in self.scripts: - library_path = self.renderer.getStaticPath(self.template_data, library) + library_path = self.renderer.get_static_path(self.template_data, library) if library_path is None: log.warning(_("Can't find {libary} javascript library").format( library=library)) continue - path = self.renderer.getFrontURL(library_path) + path = self.renderer.get_front_url(library_path) scripts.append(tpl.format(src=quoteattr(path), attribute=attribute)) return safe("\n".join(scripts)) @@ -377,11 +377,11 @@ } self.sites_themes = { } - conf = config.parseMainConf() - public_sites = config.getConfig(conf, None, "sites_path_public_dict", {}) + conf = config.parse_main_conf() + public_sites = config.config_get(conf, None, "sites_path_public_dict", {}) sites_data = [public_sites] if private: - private_sites = config.getConfig(conf, None, "sites_path_private_dict", {}) + private_sites = config.config_get(conf, None, "sites_path_private_dict", {}) sites_data.append(private_sites) for sites in sites_data: normalised = {} @@ -459,7 +459,7 @@ self.env._template_data = None self._locale_str = C.DEFAULT_LOCALE self._locale = Locale.parse(self._locale_str) - self.installTranslations() + self.install_translations() # we want to have access to SàT constants in templates self.env.globals["C"] = C @@ -490,7 +490,7 @@ "default": lambda o: o.to_json() if hasattr(o, "to_json") else None } - def getFrontURL(self, template_data, path=None): + def get_front_url(self, template_data, path=None): """Give front URL (i.e. 
URL seen by end-user) of a path @param template_data[TemplateData]: data of current template @@ -500,7 +500,7 @@ return self.env.filters["front_url"]({"template_data": template_data}, path or template_data.path) - def installTranslations(self): + def install_translations(self): # TODO: support multi translation # for now, only translations in sat_templates are handled self.translations = {} @@ -546,7 +546,7 @@ key=lambda l: l.language_name.lower())) - def setLocale(self, locale_str): + def set_locale(self, locale_str): """set current locale change current translation locale and self self._locale and self._locale_str @@ -582,7 +582,7 @@ self._locale = locale self._locale_str = locale_str - def getThemeAndRoot(self, template): + def get_theme_and_root(self, template): """retrieve theme and root dir of a given template @param template(unicode): template to parse @@ -600,13 +600,13 @@ raise exceptions.NotFound return theme, os.path.join(site_root_dir, C.TEMPLATE_TPL_DIR, theme) - def getThemesData(self, site_name): + def get_themes_data(self, site_name): try: return self.sites_themes[site_name] except KeyError: raise exceptions.NotFound(f"no theme found for {site_name}") - def getStaticPath( + def get_static_path( self, template_data: TemplateData, filename: str, @@ -640,7 +640,7 @@ else: return None - sites_and_themes = TemplateLoader.getSitesAndThemes(template_data.site, + sites_and_themes = TemplateLoader.get_sites_and_themes(template_data.site, template_data.theme, settings) for site, theme in sites_and_themes: @@ -653,7 +653,7 @@ return None - def _appendCSSPaths( + def _append_css_paths( self, template_data: TemplateData, css_files: list, @@ -669,16 +669,16 @@ with "_noscript" suffix """ name = name_root + ".css" - css_path = self.getStaticPath(template_data, name, settings) + css_path = self.get_static_path(template_data, name, settings) if css_path is not None: - css_files.append(self.getFrontURL(css_path)) + css_files.append(self.get_front_url(css_path)) noscript_name = name_root + "_noscript.css" - noscript_path = self.getStaticPath( + noscript_path = self.get_static_path( template_data, noscript_name, settings) if noscript_path is not None: - css_files_noscript.append(self.getFrontURL(noscript_path)) + css_files_noscript.append(self.get_front_url(noscript_path)) - def getCSSFiles(self, template_data): + def get_css_files(self, template_data): """Retrieve CSS files to use according template_data For each element of the path, a .css file is looked for in /static, and returned @@ -719,17 +719,17 @@ else: settings = self.sites_themes[site][template_data.theme]['settings'] - css_path = self.getStaticPath(template_data, 'fonts.css', settings) + css_path = self.get_static_path(template_data, 'fonts.css', settings) if css_path is not None: - css_files.append(self.getFrontURL(css_path)) + css_files.append(self.get_front_url(css_path)) for name_root in ('styles', 'styles_extra', 'highlight'): - self._appendCSSPaths( + self._append_css_paths( template_data, css_files, css_files_noscript, name_root, settings) for idx in range(len(path_elems)): name_root = "_".join(path_elems[:idx+1]) - self._appendCSSPaths( + self._append_css_paths( template_data, css_files, css_files_noscript, name_root, settings) return css_files, css_files_noscript @@ -1024,7 +1024,7 @@ template_source = self.env.get_template(template) if css_files is None: - css_files, css_files_noscript = self.getCSSFiles(template_data) + css_files, css_files_noscript = self.get_css_files(template_data) else: css_files_noscript = [] @@ 
-1045,7 +1045,7 @@ kwargs["css_content" + suffix] = "\n".join(css_contents) scripts_handler = ScriptsHandler(self, template_data) - self.setLocale(locale) + self.set_locale(locale) # XXX: theme used in template arguments is the requested theme, which may differ # from actual theme if the template doesn't exist in the requested theme.
--- a/sat/tools/common/template_xmlui.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/template_xmlui.py Sat Apr 08 13:54:42 2023 +0200 @@ -191,10 +191,10 @@ def __iter__(self): return iter(self.children) - def _xmluiAppend(self, widget): + def _xmlui_append(self, widget): self.children.append(widget) - def _xmluiRemove(self, widget): + def _xmlui_remove(self, widget): self.children.remove(widget)
--- a/sat/tools/common/tls.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/tls.py Sat Apr 08 13:54:42 2023 +0200 @@ -33,14 +33,14 @@ log = getLogger(__name__) -def getOptionsFromConfig(config, section=""): +def get_options_from_config(config, section=""): options = {} for option in ('tls_certificate', 'tls_private_key', 'tls_chain'): - options[option] = tools_config.getConfig(config, section, option) + options[option] = tools_config.config_get(config, section, option) return options -def TLSOptionsCheck(options): +def tls_options_check(options): """Check options coherence if TLS is activated, and update missing values Must be called only if TLS is activated @@ -52,7 +52,7 @@ options["tls_private_key"] = options["tls_certificate"] -def loadCertificates(f): +def load_certificates(f): """Read a .pem file with a list of certificates @param f (file): file obj (opened .pem file) @@ -78,7 +78,7 @@ return certificates -def loadPKey(f): +def load_p_key(f): """Read a private key from a .pem file @param f (file): file obj (opened .pem file) @@ -88,7 +88,7 @@ return OpenSSL.crypto.load_privatekey(OpenSSL.crypto.FILETYPE_PEM, f.read()) -def loadCertificate(f): +def load_certificate(f): """Read a public certificate from a .pem file @param f (file): file obj (opened .pem file) @@ -98,7 +98,7 @@ return OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, f.read()) -def getTLSContextFactory(options): +def get_tls_context_factory(options): """Load TLS certificate and build the context factory needed for listenSSL""" if ssl is None: raise ImportError("Python module pyOpenSSL is not installed!") @@ -106,9 +106,9 @@ cert_options = {} for name, option, method in [ - ("privateKey", "tls_private_key", loadPKey), - ("certificate", "tls_certificate", loadCertificate), - ("extraCertChain", "tls_chain", loadCertificates), + ("privateKey", "tls_private_key", load_p_key), + ("certificate", "tls_certificate", load_certificate), + ("extraCertChain", "tls_chain", load_certificates), ]: path = options[option] if not path:
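A hedged sketch of how the renamed helpers chain together for a TLS listener; the port and site factory are placeholders, and ``tls_options_check`` is only meant to run when TLS is actually enabled::

    from sat.tools.config import parse_main_conf
    from sat.tools.common import tls

    conf = parse_main_conf()
    # reads tls_certificate, tls_private_key and tls_chain from the config
    options = tls.get_options_from_config(conf, "")
    if options["tls_certificate"]:
        tls.tls_options_check(options)                   # fill in missing values
        context_factory = tls.get_tls_context_factory(options)
        # reactor.listenSSL(8443, site_factory, context_factory)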
--- a/sat/tools/common/uri.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/uri.py Sat Apr 08 13:54:42 2023 +0200 @@ -27,7 +27,7 @@ # FIXME: basic implementation, need to follow RFC 5122 -def parseXMPPUri(uri): +def parse_xmpp_uri(uri): """Parse an XMPP uri and return a dict with various information @param uri(unicode): uri to parse @@ -87,7 +87,7 @@ return data -def addPairs(uri, pairs): +def add_pairs(uri, pairs): for k, v in pairs.items(): uri.append( ";" @@ -97,7 +97,7 @@ ) -def buildXMPPUri(type_: Optional[str] = None, **kwargs: str) -> str: +def build_xmpp_uri(type_: Optional[str] = None, **kwargs: str) -> str: uri = ["xmpp:"] subtype = kwargs.pop("subtype", None) path = kwargs.pop("path") @@ -114,7 +114,7 @@ kwargs["node"] = "urn:xmpp:microblog:0" if kwargs: uri.append("?") - addPairs(uri, kwargs) + add_pairs(uri, kwargs) else: raise NotImplementedError("{type_} URI are not handled yet".format(type_=type_))
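A hedged usage sketch of the renamed URI helpers, assuming the ``pubsub`` type is handled as before; the service JID and node are placeholders, and the exact keys returned by ``parse_xmpp_uri`` depend on the URI content::

    from sat.tools.common import uri

    xmpp_uri = uri.build_xmpp_uri(
        "pubsub",
        path="pubsub.example.org",          # placeholder pubsub service
        node="urn:xmpp:microblog:0",
    )
    print(xmpp_uri)
    print(uri.parse_xmpp_uri(xmpp_uri))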
--- a/sat/tools/common/utils.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/common/utils.py Sat Apr 08 13:54:42 2023 +0200 @@ -111,7 +111,7 @@ self._dict.update({i: None for i in items}) -def parseSize(size): +def parse_size(size): """Parse a file size with optional multiple symbole""" try: return int(size) @@ -143,7 +143,7 @@ raise ValueError(f"invalid size: {e}") -def getSizeMultiplier(size, suffix="o"): +def get_size_multiplier(size, suffix="o"): """Get multiplier of a file size""" size = int(size) # cf. https://stackoverflow.com/a/1094933 (thanks) @@ -154,6 +154,6 @@ return size, f"Yi{suffix}" -def getHumanSize(size, suffix="o", sep=" "): - size, symbol = getSizeMultiplier(size, suffix) +def get_human_size(size, suffix="o", sep=" "): + size, symbol = get_size_multiplier(size, suffix) return f"{size:.2f}{sep}{symbol}"
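``get_size_multiplier``/``get_human_size`` follow the usual binary-prefix loop (the Stack Overflow answer referenced in the hunk above), with ``o`` (octet) as default suffix. A standalone sketch of the technique, not the module's exact code::

    def human_size_sketch(size, suffix="o", sep=" "):
        size = float(size)
        for prefix in ("", "Ki", "Mi", "Gi", "Ti", "Pi", "Ei", "Zi"):
            if abs(size) < 1024.0:
                return f"{size:.2f}{sep}{prefix}{suffix}"
            size /= 1024.0
        return f"{size:.2f}{sep}Yi{suffix}"

    print(human_size_sketch(2_500_000))   # 2.38 Mio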
--- a/sat/tools/config.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/config.py Sat Apr 08 13:54:42 2023 +0200 @@ -34,7 +34,7 @@ log = getLogger(__name__) -def fixConfigOption(section, option, value, silent=True): +def fix_config_option(section, option, value, silent=True): """Force a configuration option value the option will be written in the first found user config file, a new user @@ -75,7 +75,7 @@ config_file=target_file)) -def parseMainConf(log_filenames=False): +def parse_main_conf(log_filenames=False): """Look for main .ini configuration file, and parse it @param log_filenames(bool): if True, log filenames of read config files @@ -99,7 +99,7 @@ return config -def getConfig(config, section, name, default=None): +def config_get(config, section, name, default=None): """Get a configuration option @param config (ConfigParser): the configuration instance @@ -145,7 +145,7 @@ return value -def getConf( +def get_conf( conf: ConfigParser, prefix: str, section: str, @@ -168,4 +168,4 @@ """ # XXX: This is a temporary method until parameters are refactored value = os.getenv(f"LIBERVIA_{prefix}_{name}".upper()) - return value or getConfig(conf, section, f"{prefix}_{name}", default) + return value or config_get(conf, section, f"{prefix}_{name}", default)
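A hedged usage sketch of the renamed accessors; ``email_server`` is an option already read elsewhere in this changeset, and the default value is a placeholder::

    from sat.tools.config import parse_main_conf, config_get

    conf = parse_main_conf(log_filenames=True)
    email_server = config_get(conf, None, "email_server", "localhost")
    print(email_server)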
--- a/sat/tools/image.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/image.py Sat Apr 08 13:54:42 2023 +0200 @@ -53,7 +53,7 @@ report = {} image = Image.open(path) if max_size is None: - max_size = tuple(host.memory.getConfig(None, "image_max", (1200, 720))) + max_size = tuple(host.memory.config_get(None, "image_max", (1200, 720))) if image.size > max_size: report['too_large'] = True if image.size[0] > max_size[0]:
--- a/sat/tools/sat_defer.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/sat_defer.py Sat Apr 08 13:54:42 2023 +0200 @@ -35,7 +35,7 @@ KEY_NEXT = "next_defer" -def stanza2NotFound(failure_): +def stanza_2_not_found(failure_): """Convert item-not-found StanzaError to exceptions.NotFound""" failure_.trap(jabber_error.StanzaError) if failure_.value.condition == 'item-not-found': @@ -95,7 +95,7 @@ timeout=timeout, resettable_timeout=False ) - def newSession(self, deferreds, profile): + def new_session(self, deferreds, profile): """Launch a new session with a list of deferreds @param deferreds(list[defer.Deferred]): list of deferred to call @@ -103,7 +103,7 @@ @param return (tupe[str, defer.Deferred]): tuple with session id and a deferred wich fire *WITHOUT RESULT* when all results are received """ data = {KEY_NEXT: defer.Deferred()} - session_id, session_data = super(RTDeferredSessions, self).newSession( + session_id, session_data = super(RTDeferredSessions, self).new_session( data, profile=profile ) if isinstance(deferreds, dict): @@ -120,7 +120,7 @@ d.addErrback(self._errback, d, session_id, profile) return session_id - def _purgeSession( + def _purge_session( self, session_id, reason="timeout", no_warning=False, got_result=False ): """Purge the session @@ -155,14 +155,14 @@ "RTDeferredList cancelled: {} (profile {})".format(reason, profile) ) - super(RTDeferredSessions, self)._purgeSession(session_id) + super(RTDeferredSessions, self)._purge_session(session_id) def _gotResult(self, session_id, profile): """Method called after each callback or errback manage the next_defer deferred """ - session_data = self.profileGet(session_id, profile) + session_data = self.profile_get(session_id, profile) defer_next = session_data[KEY_NEXT] if not defer_next.called: defer_next.callback(None) @@ -182,9 +182,9 @@ @param reason (unicode): reason of the cancellation @param no_log(bool): if True, don't log the cancellation """ - self._purgeSession(session_id, reason=reason, no_warning=no_log) + self._purge_session(session_id, reason=reason, no_warning=no_log) - def getResults( + def get_results( self, session_id, on_success=None, on_error=None, profile=C.PROF_KEY_NONE ): """Get current results of a real-time deferred session @@ -212,7 +212,7 @@ """ if profile == C.PROF_KEY_NONE: raise exceptions.ProfileNotSetError - session_data = self.profileGet(session_id, profile) + session_data = self.profile_get(session_id, profile) @defer.inlineCallbacks def next_cb(__): @@ -261,7 +261,7 @@ else: # no more data to get, the result have been gotten, # we can cleanly finish the session - self._purgeSession(session_id, got_result=True) + self._purge_session(session_id, got_result=True) defer.returnValue((len(filtered_data), results))
--- a/sat/tools/stream.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/stream.py Sat Apr 08 13:54:42 2023 +0200 @@ -37,7 +37,7 @@ class IStreamProducer(interface.Interface): - def startStream(consumer): + def start_stream(consumer): """start producing the stream @return (D): deferred fired when stream is finished @@ -68,18 +68,18 @@ @param path(Path, str): path to the file to get or write to @param mode(str): same as for built-in "open" function @param uid(unicode, None): unique id identifing this progressing element - This uid will be used with self.host.progressGet + This uid will be used with self.host.progress_get will be automaticaly generated if None @param size(None, int): size of the file (when known in advance) @param data_cb(None, callable): method to call on each data read/write can be used to do processing like calculating hash. if data_cb return a non None value, it will be used instead of the data read/to write - @param auto_end_signals(bool): if True, progressFinished and progressError signals + @param auto_end_signals(bool): if True, progress_finished and progress_error signals are automatically sent. - if False, you'll have to call self.progressFinished and self.progressError + if False, you'll have to call self.progress_finished and self.progress_error yourself. - progressStarted signal is always sent automatically + progress_started signal is always sent automatically @param check_size_with_read(bool): if True, size will be checked using number of bytes read or written. This is useful when data_cb modifiy len of file. @param pre_close_cb: @@ -92,11 +92,11 @@ self.data_cb = data_cb self.auto_end_signals = auto_end_signals self.pre_close_cb = pre_close_cb - metadata = self.getProgressMetadata() - self.host.registerProgressCb( - self.uid, self.getProgress, metadata, profile=client.profile + metadata = self.get_progress_metadata() + self.host.register_progress_cb( + self.uid, self.get_progress, metadata, profile=client.profile ) - self.host.bridge.progressStarted(self.uid, metadata, client.profile) + self.host.bridge.progress_started(self.uid, metadata, client.profile) self._transfer_count = 0 if check_size_with_read else None @@ -111,7 +111,7 @@ else: self._transfer_count = None - def checkSize(self): + def check_size(self): """Check that current size correspond to given size must be used when the transfer is supposed to be finished @@ -142,7 +142,7 @@ self.pre_close_cb() if error is None: try: - size_ok = self.checkSize() + size_ok = self.check_size() except exceptions.NotFound: size_ok = True if not size_ok: @@ -154,12 +154,12 @@ if self.auto_end_signals: if error is None: - self.progressFinished(progress_metadata) + self.progress_finished(progress_metadata) else: assert progress_metadata is None - self.progressError(error) + self.progress_error(error) - self.host.removeProgressCb(self.uid, self.profile) + self.host.remove_progress_cb(self.uid, self.profile) if error is not None: log.error(f"file {self._file} closed with an error: {error}") @@ -167,13 +167,13 @@ def closed(self): return self._file.closed - def progressFinished(self, metadata=None): + def progress_finished(self, metadata=None): if metadata is None: metadata = {} - self.host.bridge.progressFinished(self.uid, metadata, self.profile) + self.host.bridge.progress_finished(self.uid, metadata, self.profile) - def progressError(self, error): - self.host.bridge.progressError(self.uid, error, self.profile) + def progress_error(self, error): + self.host.bridge.progress_error(self.uid, error, self.profile) def 
flush(self): self._file.flush() @@ -207,8 +207,8 @@ def mode(self): return self._file.mode - def getProgressMetadata(self): - """Return progression metadata as given to progressStarted + def get_progress_metadata(self): + """Return progression metadata as given to progress_started @return (dict): metadata (check bridge for documentation) """ @@ -230,7 +230,7 @@ return metadata - def getProgress(self, progress_id, profile): + def get_progress(self, progress_id, profile): ret = {"position": self._file.tell()} if self.size: ret["size"] = self.size @@ -252,7 +252,7 @@ def registerProducer(self, producer, streaming): pass - def startStream(self, consumer): + def start_stream(self, consumer): return self.beginFileTransfer(self.file_obj, consumer) def write(self, data):
--- a/sat/tools/trigger.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/trigger.py Sat Apr 08 13:54:42 2023 +0200 @@ -111,7 +111,7 @@ break return True - def returnPoint(self, point_name, *args, **kwargs): + def return_point(self, point_name, *args, **kwargs): """Like point but trigger must return (continue, return_value) All triggers for that point must return a tuple with 2 values:
--- a/sat/tools/utils.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/utils.py Sat Apr 08 13:54:42 2023 +0200 @@ -67,7 +67,7 @@ def partial(func, *fixed_args, **fixed_kwargs): # FIXME: temporary hack to workaround the fact that inspect.getargspec is not working with functools.partial - # making partial unusable with current D-bus module (in addMethod). + # making partial unusable with current D-bus module (in add_method). # Should not be needed anywore once moved to Python 3 ori_args = inspect.getargspec(func).args @@ -93,7 +93,7 @@ return method -def asDeferred(func, *args, **kwargs): +def as_deferred(func, *args, **kwargs): """Call a method and return a Deferred the method can be a simple callable, a Deferred or a coroutine. @@ -184,7 +184,7 @@ return dt.timestamp() -def generatePassword(vocabulary=None, size=20): +def generate_password(vocabulary=None, size=20): """Generate a password with random characters. @param vocabulary(iterable): characters to use to create password @@ -199,7 +199,7 @@ return "".join([random.choice(vocabulary) for i in range(15)]) -def getRepositoryData(module, as_string=True, is_path=False): +def get_repository_data(module, as_string=True, is_path=False): """Retrieve info on current mecurial repository Data is gotten by using the following methods, in order:
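``as_deferred`` lets callers treat plain callables, Deferreds and coroutines uniformly. A hedged sketch of that usage; the two check functions are placeholders::

    from twisted.internet import defer
    from sat.tools.utils import as_deferred

    def sync_check():
        return True

    async def async_check():
        return True

    @defer.inlineCallbacks
    def run_checks():
        results = []
        for func in (sync_check, async_check):
            # both callables go through the same Deferred-based path
            result = yield as_deferred(func)
            results.append(result)
        return results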
--- a/sat/tools/web.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/web.py Sat Apr 08 13:54:42 2023 +0200 @@ -74,7 +74,7 @@ treq_client_no_ssl = HTTPClient(http_client.Agent(reactor, NoCheckContextFactory())) -async def downloadFile( +async def download_file( url: str, dest: Union[str, Path, BufferedIOBase], max_size: Optional[int] = None
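A hedged sketch of calling the renamed coroutine; URL, destination and size limit are placeholders, and it has to run under a Twisted reactor (e.g. through ``twisted.internet.task.react``)::

    from pathlib import Path

    from twisted.internet import defer, task
    from sat.tools.web import download_file

    def main(reactor):
        return defer.ensureDeferred(
            download_file(
                "https://example.org/some_file.png",   # placeholder URL
                Path("/tmp/some_file.png"),            # placeholder destination
                max_size=5 * 1024 * 1024,              # assumed upper size limit (5 MiB)
            )
        )

    # task.react(main)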
--- a/sat/tools/xml_tools.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat/tools/xml_tools.py Sat Apr 08 13:54:42 2023 +0200 @@ -47,7 +47,7 @@ # method to clean XHTML, receive raw unsecure XML or HTML, must return cleaned raw XHTML # this method must be set during runtime -cleanXHTML = None +clean_xhtml = None # TODO: move XMLUI stuff in a separate module # TODO: rewrite this with lxml or ElementTree or domish.Element: it's complicated and difficult to maintain with current minidom implementation @@ -55,7 +55,7 @@ # Helper functions -def _dataFormField2XMLUIData(field, read_only=False): +def _data_form_field_2_xmlui_data(field, read_only=False): """Get data needed to create an XMLUI's Widget from Wokkel's data_form's Field. The attribute field can be modified (if it's fixed and it has no value). @@ -144,14 +144,14 @@ return widget_type, widget_args, widget_kwargs -def dataForm2Widgets(form_ui, form, read_only=False, prepend=None, filters=None): +def data_form_2_widgets(form_ui, form, read_only=False, prepend=None, filters=None): """Complete an existing XMLUI with widget converted from XEP-0004 data forms. @param form_ui (XMLUI): XMLUI instance @param form (data_form.Form): Wokkel's implementation of data form @param read_only (bool): if True and it makes sense, create a read only input widget @param prepend(iterable, None): widgets to prepend to main LabelContainer - if not None, must be an iterable of *args for addWidget. Those widgets will + if not None, must be an iterable of *args for add_widget. Those widgets will be added first to the container. @param filters(dict, None): if not None, a dictionary of callable: key is the name of the widget to filter @@ -165,14 +165,14 @@ if form.instructions: form_ui.addText("\n".join(form.instructions), "instructions") - form_ui.changeContainer("label") + form_ui.change_container("label") if prepend is not None: for widget_args in prepend: - form_ui.addWidget(*widget_args) + form_ui.add_widget(*widget_args) for field in form.fieldList: - widget_type, widget_args, widget_kwargs = _dataFormField2XMLUIData( + widget_type, widget_args, widget_kwargs = _data_form_field_2_xmlui_data( field, read_only ) try: @@ -190,12 +190,12 @@ else: form_ui.addEmpty() - form_ui.addWidget(widget_type, *widget_args, **widget_kwargs) + form_ui.add_widget(widget_type, *widget_args, **widget_kwargs) return form_ui -def dataForm2XMLUI(form, submit_id, session_id=None, read_only=False): +def data_form_2_xmlui(form, submit_id, session_id=None, read_only=False): """Take a data form (Wokkel's XEP-0004 implementation) and convert it to a SàT XMLUI. 
@param form (data_form.Form): a Form instance @@ -205,13 +205,13 @@ @return: XMLUI instance """ form_ui = XMLUI("form", "vertical", submit_id=submit_id, session_id=session_id) - return dataForm2Widgets(form_ui, form, read_only=read_only) + return data_form_2_widgets(form_ui, form, read_only=read_only) -def dataForm2dataDict(form: data_form.Form) -> dict: +def data_form_2_data_dict(form: data_form.Form) -> dict: """Convert data form to a simple dict, easily serialisable - see dataDict2dataForm for a description of the format + see data_dict_2_data_form for a description of the format """ fields = [] data_dict = { @@ -253,7 +253,7 @@ return data_dict -def dataDict2dataForm(data_dict): +def data_dict_2_data_form(data_dict): """Convert serialisable dict of data to a data form The format of the dict is as follow: @@ -299,7 +299,7 @@ ) -def dataFormEltResult2XMLUIData(form_xml): +def data_form_elt_result_2_xmlui_data(form_xml): """Parse a data form result (not parsed by Wokkel's XEP-0004 implementation). The raw data form is used because Wokkel doesn't manage result items parsing yet. @@ -337,12 +337,12 @@ continue field = data_form.Field.fromElement(elt) - xmlui_data.append(_dataFormField2XMLUIData(field)) + xmlui_data.append(_data_form_field_2_xmlui_data(field)) return headers, xmlui_data -def XMLUIData2AdvancedList(xmlui, headers, xmlui_data): +def xmlui_data_2_advanced_list(xmlui, headers, xmlui_data): """Take a raw data form result (not parsed by Wokkel's XEP-0004 implementation) and convert it to an advanced list. The raw data form is used because Wokkel doesn't manage result items parsing yet. @@ -354,15 +354,15 @@ adv_list = AdvancedListContainer( xmlui, headers=headers, columns=len(headers), parent=xmlui.current_container ) - xmlui.changeContainer(adv_list) + xmlui.change_container(adv_list) for widget_type, widget_args, widget_kwargs in xmlui_data: - xmlui.addWidget(widget_type, *widget_args, **widget_kwargs) + xmlui.add_widget(widget_type, *widget_args, **widget_kwargs) return xmlui -def dataFormResult2AdvancedList(xmlui, form_xml): +def data_form_result_2_advanced_list(xmlui, form_xml): """Take a raw data form result (not parsed by Wokkel's XEP-0004 implementation) and convert it to an advanced list. The raw data form is used because Wokkel doesn't manage result items parsing yet. @@ -370,11 +370,11 @@ @param form_xml (domish.Element): element of the data form @return: the completed XMLUI instance """ - headers, xmlui_data = dataFormEltResult2XMLUIData(form_xml) - XMLUIData2AdvancedList(xmlui, headers, xmlui_data) + headers, xmlui_data = data_form_elt_result_2_xmlui_data(form_xml) + xmlui_data_2_advanced_list(xmlui, headers, xmlui_data) -def dataFormEltResult2XMLUI(form_elt, session_id=None): +def data_form_elt_result_2_xmlui(form_elt, session_id=None): """Take a raw data form (not parsed by XEP-0004) and convert it to a SàT XMLUI. The raw data form is used because Wokkel doesn't manage result items parsing yet. 
@@ -384,14 +384,14 @@ """ xml_ui = XMLUI("window", "vertical", session_id=session_id) try: - dataFormResult2AdvancedList(xml_ui, form_elt) + data_form_result_2_advanced_list(xml_ui, form_elt) except exceptions.DataError: parsed_form = data_form.Form.fromElement(form_elt) - dataForm2Widgets(xml_ui, parsed_form, read_only=True) + data_form_2_widgets(xml_ui, parsed_form, read_only=True) return xml_ui -def dataFormResult2XMLUI(result_form, base_form, session_id=None, prepend=None, +def data_form_result_2_xmlui(result_form, base_form, session_id=None, prepend=None, filters=None, read_only=True): """Convert data form result to SàT XMLUI. @@ -399,12 +399,12 @@ @param base_form (data_form.Form): initial form (i.e. of form type "form") this one is necessary to reconstruct options when needed (e.g. list elements) @param session_id (unicode): session id to return with the data - @param prepend: same as for [dataForm2Widgets] - @param filters: same as for [dataForm2Widgets] - @param read_only: same as for [dataForm2Widgets] + @param prepend: same as for [data_form_2_widgets] + @param filters: same as for [data_form_2_widgets] + @param read_only: same as for [data_form_2_widgets] @return: XMLUI instance """ - # we deepcopy the form because _dataFormField2XMLUIData can modify the value + # we deepcopy the form because _data_form_field_2_xmlui_data can modify the value # FIXME: check if it's really important, the only modified value seems to be # the replacement of None by "" on fixed fields # form = deepcopy(result_form) @@ -416,11 +416,11 @@ continue field.options = base_field.options[:] xml_ui = XMLUI("window", "vertical", session_id=session_id) - dataForm2Widgets(xml_ui, form, read_only=read_only, prepend=prepend, filters=filters) + data_form_2_widgets(xml_ui, form, read_only=read_only, prepend=prepend, filters=filters) return xml_ui -def _cleanValue(value): +def _clean_value(value): """Workaround method to avoid DBus types with D-Bus bridge. @param value: value to clean @@ -433,7 +433,7 @@ return value -def XMLUIResult2DataFormResult(xmlui_data): +def xmlui_result_2_data_form_result(xmlui_data): """ Extract form data from a XMLUI return. @param xmlui_data (dict): data returned by frontends for XMLUI form @@ -451,11 +451,11 @@ # FIXME: workaround to handle multiple values. Proper serialisation must # be done in XMLUI value = value.split("\t") - ret[key[len(SAT_FORM_PREFIX) :]] = _cleanValue(value) + ret[key[len(SAT_FORM_PREFIX) :]] = _clean_value(value) return ret -def formEscape(name): +def form_escape(name): """Return escaped name for forms. @param name (unicode): form name @@ -464,23 +464,23 @@ return "%s%s" % (SAT_FORM_PREFIX, name) -def isXMLUICancelled(raw_xmlui): +def is_xmlui_cancelled(raw_xmlui): """Tell if an XMLUI has been cancelled by checking raw XML""" return C.bool(raw_xmlui.get('cancelled', C.BOOL_FALSE)) -def XMLUIResultToElt(xmlui_data): +def xmlui_result_to_elt(xmlui_data): """Construct result domish.Element from XMLUI result. @param xmlui_data (dict): data returned by frontends for XMLUI form @return: domish.Element """ form = data_form.Form("submit") - form.makeFields(XMLUIResult2DataFormResult(xmlui_data)) + form.makeFields(xmlui_result_2_data_form_result(xmlui_data)) return form.toElement() -def tupleList2dataForm(values): +def tuple_list_2_data_form(values): """Convert a list of tuples (name, value) to a wokkel submit data form. 
@param values (list): list of tuples @@ -494,7 +494,7 @@ return form -def paramsXML2XMLUI(xml): +def params_xml_2_xmlui(xml): """Convert the XML for parameter to a SàT XML User Interface. @param xml (unicode) @@ -516,7 +516,7 @@ raise exceptions.DataError( _("INTERNAL ERROR: params categories must have a name") ) - tabs_cont.addTab(category_name, label=label, container=LabelContainer) + tabs_cont.add_tab(category_name, label=label, container=LabelContainer) for param in category.getElementsByTagName("param"): widget_kwargs = {} @@ -530,12 +530,12 @@ callback_id = param.getAttribute("callback_id") or None if type_ == "list": - options, selected = _paramsGetListOptions(param) + options, selected = _params_get_list_options(param) widget_kwargs["options"] = options widget_kwargs["selected"] = selected widget_kwargs["styles"] = ["extensible"] elif type_ == "jids_list": - widget_kwargs["jids"] = _paramsGetListJids(param) + widget_kwargs["jids"] = _params_get_list_jids(param) if type_ in ("button", "text"): param_ui.addEmpty() @@ -562,12 +562,12 @@ param_name, ) - param_ui.addWidget(type_, **widget_kwargs) + param_ui.add_widget(type_, **widget_kwargs) return param_ui.toXml() -def _paramsGetListOptions(param): +def _params_get_list_options(param): """Retrieve the options for list element. The <option/> tags must be direct children of <param/>. @@ -599,7 +599,7 @@ return (options, selected) -def _paramsGetListJids(param): +def _params_get_list_jids(param): """Retrive jids from a jids_list element. the <jid/> tags must be direct children of <param/> @@ -679,9 +679,9 @@ self.elem.setAttribute("name", name) self.elem.setAttribute("label", label) if selected: - self.setSelected(selected) + self.set_selected(selected) - def setSelected(self, selected=False): + def set_selected(self, selected=False): """Set the tab selected. @param selected (bool): set to True to select this tab @@ -824,7 +824,7 @@ super(Container, self).__init__(xmlui, parent) self.elem.setAttribute("type", self.type) - def getParentContainer(self): + def get_parent_container(self): """ Return first parent container @return: parent container or None @@ -856,7 +856,7 @@ class TabsContainer(Container): type = "tabs" - def addTab(self, name, label=None, selected=None, container=VerticalContainer): + def add_tab(self, name, label=None, selected=None, container=VerticalContainer): """Add a tab. 
@param name (unicode): tab name @@ -869,15 +869,15 @@ label = name tab_elt = TabElement(self, name, label, selected) new_container = container(self.xmlui, tab_elt) - return self.xmlui.changeContainer(new_container) + return self.xmlui.change_container(new_container) def end(self): """ Called when we have finished tabs change current container to first container parent """ - parent_container = self.getParentContainer() - self.xmlui.changeContainer(parent_container) + parent_container = self.get_parent_container() + self.xmlui.change_container(parent_container) class AdvancedListContainer(Container): @@ -928,9 +928,9 @@ raise exceptions.DataError( _("Headers lenght doesn't correspond to columns") ) - self.addHeaders(headers) + self.add_headers(headers) if items: - self.addItems(items) + self.add_items(items) self.elem.setAttribute("columns", str(self._columns)) if callback_id is not None: self.elem.setAttribute("callback", callback_id) @@ -940,18 +940,18 @@ self.elem.setAttribute("auto_index", "true") self.next_row_idx = None - def addHeaders(self, headers): + def add_headers(self, headers): for header in headers: self.addHeader(header) def addHeader(self, header): pass # TODO - def addItems(self, items): + def add_items(self, items): for item in items: self.append(item) - def setRowIndex(self, idx): + def set_row_index(self, idx): """ Set index for next row index are returned when a row is selected, in data's "index" key @@ -974,8 +974,8 @@ """ if self._item_idx % self._columns != 0: raise exceptions.DataError(_("Incorrect number of items in list")) - parent_container = self.getParentContainer() - self.xmlui.changeContainer(parent_container) + parent_container = self.get_parent_container() + self.xmlui.change_container(parent_container) ## Widgets ## @@ -1004,7 +1004,7 @@ xmlui.named_widgets[name] = self self.elem.setAttribute("type", self.type) - def setInternalCallback(self, callback, fields, data_elts=None): + def set_internal_callback(self, callback, fields, data_elts=None): """Set an internal UI callback when the widget value is changed. The internal callbacks are NO callback ids, they are strings from @@ -1169,10 +1169,10 @@ word, set to False only if you made the XHTML yourself) """ if clean: - if cleanXHTML is None: + if clean_xhtml is None: raise exceptions.NotFound( "No cleaning method set, can't clean the XHTML") - value = cleanXHTML(value) + value = clean_xhtml(value) super(XHTMLBoxWidget, self).__init__( xmlui, value=value, name=name, parent=parent, read_only=read_only) @@ -1279,10 +1279,10 @@ # because we would not have the labels log.warning(_('empty "options" list')) super(ListWidget, self).__init__(xmlui, name, parent) - self.addOptions(options, selected) - self.setStyles(styles) + self.add_options(options, selected) + self.set_styles(styles) - def addOptions(self, options, selected=None): + def add_options(self, options, selected=None): """Add options to a multi-values element (e.g. 
list) """ if selected: if isinstance(selected, str): @@ -1294,7 +1294,7 @@ value = option if isinstance(option, str) else option[0] OptionElement(self, option, value in selected) - def setStyles(self, styles): + def set_styles(self, styles): if not styles.issubset(self.STYLES): raise exceptions.DataError(_("invalid styles")) for style in styles: @@ -1302,7 +1302,7 @@ # TODO: check flags incompatibily (noselect and multi) like in __init__ def setStyle(self, style): - self.setStyles([style]) + self.set_styles([style]) @property def value(self): @@ -1336,9 +1336,9 @@ if not jids: log.debug("empty jids list") else: - self.addJids(jids) + self.add_jids(jids) - def addJids(self, jids): + def add_jids(self, jids): for jid_ in jids: JidElement(self, jid_) @@ -1499,20 +1499,20 @@ if panel_type == C.XMLUI_DIALOG: if dialog_opt is None: dialog_opt = {} - self._createDialog(dialog_opt) + self._create_dialog(dialog_opt) return - self.main_container = self._createContainer(container, TopElement(self)) + self.main_container = self._create_container(container, TopElement(self)) self.current_container = self.main_container self.named_widgets = {} @staticmethod - def creatorWrapper(widget_cls, is_input): + def creator_wrapper(widget_cls, is_input): # TODO: once moved to Python 3, use functools.partialmethod and - # remove the creatorWrapper - def createWidget(self, *args, **kwargs): + # remove the creator_wrapper + def create_widget(self, *args, **kwargs): if self.type == C.XMLUI_DIALOG: raise exceptions.InternalError(_( - "createWidget can't be used with dialogs")) + "create_widget can't be used with dialogs")) if "parent" not in kwargs: kwargs["parent"] = self.current_container if "name" not in kwargs and is_input: @@ -1521,7 +1521,7 @@ args = list(args) kwargs["name"] = args.pop(0) return widget_cls(self, *args, **kwargs) - return createWidget + return create_widget @classmethod def _introspect(cls): @@ -1547,10 +1547,10 @@ # .format(creator_name=creator_name, is_input=is_input)) assert not hasattr(cls, creator_name) - # XXX: we need to use creatorWrapper because we are in a loop + # XXX: we need to use creator_wrapper because we are in a loop # and Python 2 doesn't support default values in kwargs # when using *args, **kwargs - setattr(cls, creator_name, cls.creatorWrapper(obj, is_input)) + setattr(cls, creator_name, cls.creator_wrapper(obj, is_input)) elif issubclass(obj, Container): if obj.__name__ == "Container": @@ -1602,7 +1602,7 @@ else: raise exceptions.DataError("session_id can't be empty") - def _createDialog(self, dialog_opt): + def _create_dialog(self, dialog_opt): dialog_type = dialog_opt.setdefault(C.XMLUI_DATA_TYPE, C.XMLUI_DIALOG_MESSAGE) if ( dialog_type in [C.XMLUI_DIALOG_CONFIRM, C.XMLUI_DIALOG_FILE] @@ -1630,7 +1630,7 @@ except KeyError: pass - def _createContainer(self, container, parent=None, **kwargs): + def _create_container(self, container, parent=None, **kwargs): """Create a container element @param type: container type (cf init doc) @@ -1642,15 +1642,15 @@ new_container = cls(self, parent=parent, **kwargs) return new_container - def changeContainer(self, container, **kwargs): + def change_container(self, container, **kwargs): """Change the current container @param container: either container type (container it then created), or an Container instance""" if isinstance(container, str): - self.current_container = self._createContainer( + self.current_container = self._create_container( container, - self.current_container.getParentContainer() or self.main_container, + 
self.current_container.get_parent_container() or self.main_container, **kwargs ) else: @@ -1660,7 +1660,7 @@ assert isinstance(self.current_container, Container) return self.current_container - def addWidget(self, type_, *args, **kwargs): + def add_widget(self, type_, *args, **kwargs): """Convenience method to add an element""" if "parent" not in kwargs: kwargs["parent"] = self.current_container @@ -1702,13 +1702,13 @@ return note_xmlui -def quickNote(host, client, message, title="", level=C.XMLUI_DATA_LVL_INFO): +def quick_note(host, client, message, title="", level=C.XMLUI_DATA_LVL_INFO): """more sugar to do the whole note process""" note_ui = note(message, title, level) - host.actionNew({"xmlui": note_ui.toXml()}, profile=client.profile) + host.action_new({"xmlui": note_ui.toXml()}, profile=client.profile) -def deferredUI(host, xmlui, chained=False): +def deferred_ui(host, xmlui, chained=False): """create a deferred linked to XMLUI @param xmlui(XMLUI): instance of the XMLUI @@ -1720,15 +1720,15 @@ assert xmlui.submit_id == "" xmlui_d = defer.Deferred() - def onSubmit(data, profile): + def on_submit(data, profile): xmlui_d.callback(data) return xmlui_d if chained else {} - xmlui.submit_id = host.registerCallback(onSubmit, with_data=True, one_shot=True) + xmlui.submit_id = host.register_callback(on_submit, with_data=True, one_shot=True) return xmlui_d -def deferXMLUI(host, xmlui, action_extra=None, security_limit=C.NO_SECURITY_LIMIT, +def defer_xmlui(host, xmlui, action_extra=None, security_limit=C.NO_SECURITY_LIMIT, chained=False, profile=C.PROF_KEY_NONE): """Create a deferred linked to XMLUI @@ -1736,17 +1736,17 @@ Must be an XMLUI that you can submit, with submit_id set to '' @param profile: %(doc_profile)s @param action_extra(None, dict): extra action to merge with xmlui - mainly used to add meta informations (see actionNew doc) + mainly used to add meta informations (see action_new doc) @param security_limit: %(doc_security_limit)s @param chained(bool): True if the Deferred result must be returned to the frontend useful when backend is in a series of dialogs with an ui @return (data): a deferred which fire the data """ - xmlui_d = deferredUI(host, xmlui, chained) + xmlui_d = deferred_ui(host, xmlui, chained) action_data = {"xmlui": xmlui.toXml()} if action_extra is not None: action_data.update(action_extra) - host.actionNew( + host.action_new( action_data, security_limit=security_limit, keep_id=xmlui.submit_id, @@ -1755,7 +1755,7 @@ return xmlui_d -def deferDialog(host, message, title="Please confirm", type_=C.XMLUI_DIALOG_CONFIRM, +def defer_dialog(host, message, title="Please confirm", type_=C.XMLUI_DIALOG_CONFIRM, options=None, action_extra=None, security_limit=C.NO_SECURITY_LIMIT, chained=False, profile=C.PROF_KEY_NONE): """Create a submitable dialog and manage it with a deferred @@ -1766,7 +1766,7 @@ @param options(None, dict): if not None, will be used to update (extend) dialog_opt arguments of XMLUI @param action_extra(None, dict): extra action to merge with xmlui - mainly used to add meta informations (see actionNew doc) + mainly used to add meta informations (see action_new doc) @param security_limit: %(doc_security_limit)s @param chained(bool): True if the Deferred result must be returned to the frontend useful when backend is in a series of dialogs with an ui @@ -1778,19 +1778,19 @@ if options is not None: dialog_opt.update(options) dialog = XMLUI(C.XMLUI_DIALOG, title=title, dialog_opt=dialog_opt, submit_id="") - return deferXMLUI(host, dialog, action_extra, 
security_limit, chained, profile) + return defer_xmlui(host, dialog, action_extra, security_limit, chained, profile) -def deferConfirm(*args, **kwargs): - """call deferDialog and return a boolean instead of the whole data dict""" - d = deferDialog(*args, **kwargs) +def defer_confirm(*args, **kwargs): + """call defer_dialog and return a boolean instead of the whole data dict""" + d = defer_dialog(*args, **kwargs) d.addCallback(lambda data: C.bool(data["answer"])) return d # Misc other funtions -def elementCopy( +def element_copy( element: domish.Element, with_parent: bool = True, with_children: bool = True @@ -1815,7 +1815,7 @@ return new_elt -def isXHTMLField(field): +def is_xhtml_field(field): """Check if a data_form.Field is an XHTML one""" return (field.fieldType is None and field.ext_type == "xml" and field.value.uri == C.NS_XHTML) @@ -1826,7 +1826,7 @@ # XXX: Found at http://stackoverflow.com/questions/2093400/how-to-create-twisted-words-xish-domish-element-entirely-from-raw-xml/2095942#2095942 - def _escapeHTML(self, matchobj): + def _escape_html(self, matchobj): entity = matchobj.group(1) if entity in XML_ENTITIES: # we don't escape XML entities @@ -1854,23 +1854,23 @@ raw_xml = "<div>{}</div>".format(raw_xml) # avoid ParserError on HTML escaped chars - raw_xml = html_entity_re.sub(self._escapeHTML, raw_xml) + raw_xml = html_entity_re.sub(self._escape_html, raw_xml) self.result = None - def onStart(elem): + def on_start(elem): self.result = elem - def onEnd(): + def on_end(): pass def onElement(elem): self.result.addChild(elem) parser = domish.elementStream() - parser.DocumentStartEvent = onStart + parser.DocumentStartEvent = on_start parser.ElementEvent = onElement - parser.DocumentEndEvent = onEnd + parser.DocumentEndEvent = on_end tmp = domish.Element((None, "s")) if force_spaces: raw_xml = raw_xml.replace("\n", " ").replace("\t", " ") @@ -1888,8 +1888,8 @@ parse = ElementParser() -# FIXME: this method is duplicated from frontends.tools.xmlui.getText -def getText(node): +# FIXME: this method is duplicated from frontends.tools.xmlui.get_text +def get_text(node): """Get child text nodes of a domish.Element. @param node (domish.Element) @@ -1902,7 +1902,7 @@ return "".join(data) -def findAll(elt, namespaces=None, names=None): +def find_all(elt, namespaces=None, names=None): """Find child element at any depth matching criteria @param elt(domish.Element): top parent of the elements to find @@ -1924,11 +1924,11 @@ and (not namespaces or child.uri in namespaces) ): yield child - for found in findAll(child, namespaces, names): + for found in find_all(child, namespaces, names): yield found -def findAncestor( +def find_ancestor( elt, name: str, namespace: Optional[Union[str, Iterable[str]]] = None @@ -1958,12 +1958,12 @@ current = current.parent -def pFmtElt(elt, indent=0, defaultUri=""): +def p_fmt_elt(elt, indent=0, defaultUri=""): """Pretty format a domish.Element""" strings = [] for child in elt.children: if domish.IElement.providedBy(child): - strings.append(pFmtElt(child, indent+2, defaultUri=elt.defaultUri)) + strings.append(p_fmt_elt(child, indent+2, defaultUri=elt.defaultUri)) else: strings.append(f"{(indent+2)*' '}{child!s}") if elt.children: @@ -1977,9 +1977,9 @@ return '\n'.join(strings) -def ppElt(elt): +def pp_elt(elt): """Pretty print a domish.Element""" - print(pFmtElt(elt)) + print(p_fmt_elt(elt)) # ElementTree
--- a/sat_frontends/bridge/dbus_bridge.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/bridge/dbus_bridge.py Sat Apr 08 13:54:42 2023 +0200 @@ -32,8 +32,8 @@ # Interface prefix -const_INT_PREFIX = config.getConfig( - config.parseMainConf(), +const_INT_PREFIX = config.config_get( + config.parse_main_conf(), "", "bridge_dbus_int_prefix", "org.libervia.Libervia") @@ -66,9 +66,9 @@ return BridgeException(name, message, condition) -class Bridge: +class bridge: - def bridgeConnect(self, callback, errback): + def bridge_connect(self, callback, errback): try: self.sessions_bus = dbus.SessionBus() self.db_object = self.sessions_bus.get_object(const_INT_PREFIX, @@ -105,7 +105,7 @@ except AttributeError: # The attribute is not found, we try the plugin proxy to find the requested method - def getPluginMethod(*args, **kwargs): + def get_plugin_method(*args, **kwargs): # We first check if we have an async call. We detect this in two ways: # - if we have the 'callback' and 'errback' keyword arguments # - or if the last two arguments are callable @@ -156,9 +156,18 @@ return self.db_plugin_iface.get_dbus_method(name)(*args, **kwargs) raise e - return getPluginMethod + return get_plugin_method - def actionsGet(self, profile_key="@DEFAULT@", callback=None, errback=None): + def action_launch(self, callback_id, data, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.action_launch(callback_id, data, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def actions_get(self, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -170,9 +179,32 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.actionsGet(profile_key, **kwargs) + return self.db_core_iface.actions_get(profile_key, **kwargs) - def addContact(self, entity_jid, profile_key="@DEFAULT@", callback=None, errback=None): + def config_get(self, section, name, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + kwargs={} + if callback is not None: + kwargs['timeout'] = const_TIMEOUT + kwargs['reply_handler'] = callback + kwargs['error_handler'] = error_handler + return str(self.db_core_iface.config_get(section, name, **kwargs)) + + def connect(self, profile_key="@DEFAULT@", password='', options={}, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.connect(profile_key, password, options, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def contact_add(self, entity_jid, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -184,108 +216,27 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.addContact(entity_jid, profile_key, **kwargs) - - def asyncDeleteProfile(self, profile, callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error 
- error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.asyncDeleteProfile(profile, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def asyncGetParamA(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return str(self.db_core_iface.asyncGetParamA(name, category, attribute, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) + return self.db_core_iface.contact_add(entity_jid, profile_key, **kwargs) - def asyncGetParamsValuesFromCategory(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.asyncGetParamsValuesFromCategory(category, security_limit, app, extra, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def connect(self, profile_key="@DEFAULT@", password='', options={}, callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.connect(profile_key, password, options, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def contactGet(self, arg_0, profile_key="@DEFAULT@", callback=None, errback=None): + def contact_del(self, entity_jid, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.contactGet(arg_0, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def delContact(self, entity_jid, profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.delContact(entity_jid, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return self.db_core_iface.contact_del(entity_jid, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def devicesInfosGet(self, bare_jid, profile_key, callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return str(self.db_core_iface.devicesInfosGet(bare_jid, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) - - def discoFindByFeatures(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@", callback=None, errback=None): + def contact_get(self, arg_0, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return 
self.db_core_iface.discoFindByFeatures(namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def discoInfos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.discoInfos(entity_jid, node, use_cache, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return self.db_core_iface.contact_get(arg_0, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def discoItems(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.discoItems(entity_jid, node, use_cache, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def disconnect(self, profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.disconnect(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def encryptionNamespaceGet(self, arg_0, callback=None, errback=None): + def contact_update(self, entity_jid, name, groups, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -297,9 +248,77 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.encryptionNamespaceGet(arg_0, **kwargs)) + return self.db_core_iface.contact_update(entity_jid, name, groups, profile_key, **kwargs) + + def contacts_get(self, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.contacts_get(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def contacts_get_from_group(self, group, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + kwargs={} + if callback is not None: + kwargs['timeout'] = const_TIMEOUT + kwargs['reply_handler'] = callback + kwargs['error_handler'] = error_handler + return self.db_core_iface.contacts_get_from_group(group, profile_key, **kwargs) + + def devices_infos_get(self, bare_jid, profile_key, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return str(self.db_core_iface.devices_infos_get(bare_jid, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) - def encryptionPluginsGet(self, callback=None, errback=None): + def disco_find_by_features(self, namespaces, identities, bare_jid=False, service=True, 
roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.disco_find_by_features(namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def disco_infos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.disco_infos(entity_jid, node, use_cache, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def disco_items(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.disco_items(entity_jid, node, use_cache, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def disconnect(self, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.disconnect(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def encryption_namespace_get(self, arg_0, callback=None, errback=None): if callback is None: error_handler = None else: @@ -311,18 +330,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.encryptionPluginsGet(**kwargs)) + return str(self.db_core_iface.encryption_namespace_get(arg_0, **kwargs)) - def encryptionTrustUIGet(self, to_jid, namespace, profile_key, callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return str(self.db_core_iface.encryptionTrustUIGet(to_jid, namespace, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) - - def getConfig(self, section, name, callback=None, errback=None): + def encryption_plugins_get(self, callback=None, errback=None): if callback is None: error_handler = None else: @@ -334,18 +344,18 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.getConfig(section, name, **kwargs)) + return str(self.db_core_iface.encryption_plugins_get(**kwargs)) - def getContacts(self, profile_key="@DEFAULT@", callback=None, errback=None): + def encryption_trust_ui_get(self, to_jid, namespace, profile_key, callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.getContacts(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return str(self.db_core_iface.encryption_trust_ui_get(to_jid, namespace, profile_key, 
timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) - def getContactsFromGroup(self, group, profile_key="@DEFAULT@", callback=None, errback=None): + def entities_data_get(self, jids, keys, profile, callback=None, errback=None): if callback is None: error_handler = None else: @@ -357,9 +367,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.getContactsFromGroup(group, profile_key, **kwargs) + return self.db_core_iface.entities_data_get(jids, keys, profile, **kwargs) - def getEntitiesData(self, jids, keys, profile, callback=None, errback=None): + def entity_data_get(self, jid, keys, profile, callback=None, errback=None): if callback is None: error_handler = None else: @@ -371,32 +381,27 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.getEntitiesData(jids, keys, profile, **kwargs) + return self.db_core_iface.entity_data_get(jid, keys, profile, **kwargs) - def getEntityData(self, jid, keys, profile, callback=None, errback=None): + def features_get(self, profile_key, callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - kwargs={} - if callback is not None: - kwargs['timeout'] = const_TIMEOUT - kwargs['reply_handler'] = callback - kwargs['error_handler'] = error_handler - return self.db_core_iface.getEntityData(jid, keys, profile, **kwargs) + return self.db_core_iface.features_get(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def getFeatures(self, profile_key, callback=None, errback=None): + def history_get(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@", callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.getFeatures(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return self.db_core_iface.history_get(from_jid, to_jid, limit, between, filters, profile, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def getMainResource(self, contact_jid, profile_key="@DEFAULT@", callback=None, errback=None): + def image_check(self, arg_0, callback=None, errback=None): if callback is None: error_handler = None else: @@ -408,23 +413,36 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.getMainResource(contact_jid, profile_key, **kwargs)) + return str(self.db_core_iface.image_check(arg_0, **kwargs)) - def getParamA(self, name, category, attribute="value", profile_key="@DEFAULT@", callback=None, errback=None): + def image_convert(self, source, dest, arg_2, extra, callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - kwargs={} - if callback is not None: - kwargs['timeout'] = const_TIMEOUT - kwargs['reply_handler'] = callback - kwargs['error_handler'] = error_handler - return str(self.db_core_iface.getParamA(name, category, attribute, profile_key, **kwargs)) + return str(self.db_core_iface.image_convert(source, dest, arg_2, extra, timeout=const_TIMEOUT, 
reply_handler=callback, error_handler=error_handler)) + + def image_generate_preview(self, image_path, profile_key, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return str(self.db_core_iface.image_generate_preview(image_path, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) - def getParamsCategories(self, callback=None, errback=None): + def image_resize(self, image_path, width, height, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return str(self.db_core_iface.image_resize(image_path, width, height, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) + + def is_connected(self, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -436,18 +454,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.getParamsCategories(**kwargs) + return self.db_core_iface.is_connected(profile_key, **kwargs) - def getParamsUI(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return str(self.db_core_iface.getParamsUI(security_limit, app, extra, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) - - def getPresenceStatuses(self, profile_key="@DEFAULT@", callback=None, errback=None): + def main_resource_get(self, contact_jid, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -459,18 +468,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.getPresenceStatuses(profile_key, **kwargs) + return str(self.db_core_iface.main_resource_get(contact_jid, profile_key, **kwargs)) - def getReady(self, callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.getReady(timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def getVersion(self, callback=None, errback=None): + def menu_help_get(self, menu_id, language, callback=None, errback=None): if callback is None: error_handler = None else: @@ -482,9 +482,18 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.getVersion(**kwargs)) + return str(self.db_core_iface.menu_help_get(menu_id, language, **kwargs)) - def getWaitingSub(self, profile_key="@DEFAULT@", callback=None, errback=None): + def menu_launch(self, menu_type, path, data, security_limit, profile_key, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.menu_launch(menu_type, path, data, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=callback, 
error_handler=error_handler) + + def menus_get(self, language, security_limit, callback=None, errback=None): if callback is None: error_handler = None else: @@ -496,18 +505,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.getWaitingSub(profile_key, **kwargs) + return self.db_core_iface.menus_get(language, security_limit, **kwargs) - def historyGet(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.historyGet(from_jid, to_jid, limit, between, filters, profile, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def imageCheck(self, arg_0, callback=None, errback=None): + def message_encryption_get(self, to_jid, profile_key, callback=None, errback=None): if callback is None: error_handler = None else: @@ -519,36 +519,36 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.imageCheck(arg_0, **kwargs)) + return str(self.db_core_iface.message_encryption_get(to_jid, profile_key, **kwargs)) - def imageConvert(self, source, dest, arg_2, extra, callback=None, errback=None): + def message_encryption_start(self, to_jid, namespace='', replace=False, profile_key="@NONE@", callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return str(self.db_core_iface.imageConvert(source, dest, arg_2, extra, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) + return self.db_core_iface.message_encryption_start(to_jid, namespace, replace, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def imageGeneratePreview(self, image_path, profile_key, callback=None, errback=None): + def message_encryption_stop(self, to_jid, profile_key, callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return str(self.db_core_iface.imageGeneratePreview(image_path, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) + return self.db_core_iface.message_encryption_stop(to_jid, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def imageResize(self, image_path, width, height, callback=None, errback=None): + def message_send(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@", callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return str(self.db_core_iface.imageResize(image_path, width, height, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) + return self.db_core_iface.message_send(to_jid, message, subject, mess_type, extra, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def isConnected(self, profile_key="@DEFAULT@", callback=None, errback=None): + def namespaces_get(self, callback=None, errback=None): if callback is None: error_handler = None else: @@ 
-560,18 +560,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.isConnected(profile_key, **kwargs) + return self.db_core_iface.namespaces_get(**kwargs) - def launchAction(self, callback_id, data, profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.launchAction(callback_id, data, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def loadParamsTemplate(self, filename, callback=None, errback=None): + def param_get_a(self, name, category, attribute="value", profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -583,9 +574,18 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.loadParamsTemplate(filename, **kwargs) + return str(self.db_core_iface.param_get_a(name, category, attribute, profile_key, **kwargs)) - def menuHelpGet(self, menu_id, language, callback=None, errback=None): + def param_get_a_async(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return str(self.db_core_iface.param_get_a_async(name, category, attribute, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) + + def param_set(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -597,18 +597,18 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.menuHelpGet(menu_id, language, **kwargs)) + return self.db_core_iface.param_set(name, value, category, security_limit, profile_key, **kwargs) - def menuLaunch(self, menu_type, path, data, security_limit, profile_key, callback=None, errback=None): + def param_ui_get(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.menuLaunch(menu_type, path, data, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return str(self.db_core_iface.param_ui_get(security_limit, app, extra, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) - def menusGet(self, language, security_limit, callback=None, errback=None): + def params_categories_get(self, callback=None, errback=None): if callback is None: error_handler = None else: @@ -620,9 +620,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.menusGet(language, security_limit, **kwargs) + return self.db_core_iface.params_categories_get(**kwargs) - def messageEncryptionGet(self, to_jid, profile_key, callback=None, errback=None): + def params_register_app(self, xml, security_limit=-1, app='', callback=None, errback=None): 
if callback is None: error_handler = None else: @@ -634,36 +634,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.messageEncryptionGet(to_jid, profile_key, **kwargs)) - - def messageEncryptionStart(self, to_jid, namespace='', replace=False, profile_key="@NONE@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.messageEncryptionStart(to_jid, namespace, replace, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return self.db_core_iface.params_register_app(xml, security_limit, app, **kwargs) - def messageEncryptionStop(self, to_jid, profile_key, callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.messageEncryptionStop(to_jid, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def messageSend(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.messageSend(to_jid, message, subject, mess_type, extra, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def namespacesGet(self, callback=None, errback=None): + def params_template_load(self, filename, callback=None, errback=None): if callback is None: error_handler = None else: @@ -675,9 +648,46 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.namespacesGet(**kwargs) + return self.db_core_iface.params_template_load(filename, **kwargs) + + def params_template_save(self, filename, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + kwargs={} + if callback is not None: + kwargs['timeout'] = const_TIMEOUT + kwargs['reply_handler'] = callback + kwargs['error_handler'] = error_handler + return self.db_core_iface.params_template_save(filename, **kwargs) - def paramsRegisterApp(self, xml, security_limit=-1, app='', callback=None, errback=None): + def params_values_from_category_get_async(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.params_values_from_category_get_async(category, security_limit, app, extra, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def presence_set(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + kwargs={} + if callback is not None: + kwargs['timeout'] = const_TIMEOUT + 
kwargs['reply_handler'] = callback + kwargs['error_handler'] = error_handler + return self.db_core_iface.presence_set(to_jid, show, statuses, profile_key, **kwargs) + + def presence_statuses_get(self, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -689,45 +699,54 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.paramsRegisterApp(xml, security_limit, app, **kwargs) + return self.db_core_iface.presence_statuses_get(profile_key, **kwargs) - def privateDataDelete(self, namespace, key, arg_2, callback=None, errback=None): + def private_data_delete(self, namespace, key, arg_2, callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.privateDataDelete(namespace, key, arg_2, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return self.db_core_iface.private_data_delete(namespace, key, arg_2, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def privateDataGet(self, namespace, key, profile_key, callback=None, errback=None): + def private_data_get(self, namespace, key, profile_key, callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return str(self.db_core_iface.privateDataGet(namespace, key, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) + return str(self.db_core_iface.private_data_get(namespace, key, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler)) - def privateDataSet(self, namespace, key, data, profile_key, callback=None, errback=None): + def private_data_set(self, namespace, key, data, profile_key, callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.privateDataSet(namespace, key, data, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return self.db_core_iface.private_data_set(namespace, key, data, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def profileCreate(self, profile, password='', component='', callback=None, errback=None): + def profile_create(self, profile, password='', component='', callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.profileCreate(profile, password, component, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return self.db_core_iface.profile_create(profile, password, component, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def profileIsSessionStarted(self, profile_key="@DEFAULT@", callback=None, errback=None): + def profile_delete_async(self, profile, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.profile_delete_async(profile, timeout=const_TIMEOUT, 
reply_handler=callback, error_handler=error_handler) + + def profile_is_session_started(self, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -739,9 +758,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.profileIsSessionStarted(profile_key, **kwargs) + return self.db_core_iface.profile_is_session_started(profile_key, **kwargs) - def profileNameGet(self, profile_key="@DEFAULT@", callback=None, errback=None): + def profile_name_get(self, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -753,32 +772,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return str(self.db_core_iface.profileNameGet(profile_key, **kwargs)) + return str(self.db_core_iface.profile_name_get(profile_key, **kwargs)) - def profileSetDefault(self, profile, callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - kwargs={} - if callback is not None: - kwargs['timeout'] = const_TIMEOUT - kwargs['reply_handler'] = callback - kwargs['error_handler'] = error_handler - return self.db_core_iface.profileSetDefault(profile, **kwargs) - - def profileStartSession(self, password='', profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.profileStartSession(password, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def profilesListGet(self, clients=True, components=False, callback=None, errback=None): + def profile_set_default(self, profile, callback=None, errback=None): if callback is None: error_handler = None else: @@ -790,9 +786,18 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.profilesListGet(clients, components, **kwargs) + return self.db_core_iface.profile_set_default(profile, **kwargs) - def progressGet(self, id, profile, callback=None, errback=None): + def profile_start_session(self, password='', profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.profile_start_session(password, profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def profiles_list_get(self, clients=True, components=False, callback=None, errback=None): if callback is None: error_handler = None else: @@ -804,9 +809,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.progressGet(id, profile, **kwargs) + return self.db_core_iface.profiles_list_get(clients, components, **kwargs) - def progressGetAll(self, profile, callback=None, errback=None): + def progress_get(self, id, profile, callback=None, errback=None): if callback is None: error_handler = None else: @@ -818,9 +823,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return 
self.db_core_iface.progressGetAll(profile, **kwargs) + return self.db_core_iface.progress_get(id, profile, **kwargs) - def progressGetAllMetadata(self, profile, callback=None, errback=None): + def progress_get_all(self, profile, callback=None, errback=None): if callback is None: error_handler = None else: @@ -832,18 +837,9 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.progressGetAllMetadata(profile, **kwargs) + return self.db_core_iface.progress_get_all(profile, **kwargs) - def rosterResync(self, profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.rosterResync(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - - def saveParamsTemplate(self, filename, callback=None, errback=None): + def progress_get_all_metadata(self, profile, callback=None, errback=None): if callback is None: error_handler = None else: @@ -855,18 +851,36 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.saveParamsTemplate(filename, **kwargs) + return self.db_core_iface.progress_get_all_metadata(profile, **kwargs) - def sessionInfosGet(self, profile_key, callback=None, errback=None): + def ready_get(self, callback=None, errback=None): if callback is None: error_handler = None else: if errback is None: errback = log.error error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - return self.db_core_iface.sessionInfosGet(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + return self.db_core_iface.ready_get(timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def roster_resync(self, profile_key="@DEFAULT@", callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.roster_resync(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) - def setParam(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@", callback=None, errback=None): + def session_infos_get(self, profile_key, callback=None, errback=None): + if callback is None: + error_handler = None + else: + if errback is None: + errback = log.error + error_handler = lambda err:errback(dbus_to_bridge_exception(err)) + return self.db_core_iface.session_infos_get(profile_key, timeout=const_TIMEOUT, reply_handler=callback, error_handler=error_handler) + + def sub_waiting_get(self, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: error_handler = None else: @@ -878,21 +892,7 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.setParam(name, value, category, security_limit, profile_key, **kwargs) - - def setPresence(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@", callback=None, errback=None): - if callback is None: - error_handler = None - else: - if errback is None: - errback = log.error - error_handler = lambda err:errback(dbus_to_bridge_exception(err)) - kwargs={} - if callback is not None: - kwargs['timeout'] = const_TIMEOUT - kwargs['reply_handler'] 
= callback - kwargs['error_handler'] = error_handler - return self.db_core_iface.setPresence(to_jid, show, statuses, profile_key, **kwargs) + return self.db_core_iface.sub_waiting_get(profile_key, **kwargs) def subscription(self, sub_type, entity, profile_key="@DEFAULT@", callback=None, errback=None): if callback is None: @@ -908,7 +908,7 @@ kwargs['error_handler'] = error_handler return self.db_core_iface.subscription(sub_type, entity, profile_key, **kwargs) - def updateContact(self, entity_jid, name, groups, profile_key="@DEFAULT@", callback=None, errback=None): + def version_get(self, callback=None, errback=None): if callback is None: error_handler = None else: @@ -920,10 +920,10 @@ kwargs['timeout'] = const_TIMEOUT kwargs['reply_handler'] = callback kwargs['error_handler'] = error_handler - return self.db_core_iface.updateContact(entity_jid, name, groups, profile_key, **kwargs) + return str(self.db_core_iface.version_get(**kwargs)) -class AIOBridge(Bridge): +class AIOBridge(bridge): def register_signal(self, functionName, handler, iface="core"): loop = asyncio.get_running_loop() @@ -936,7 +936,7 @@ return object.__getattribute__(self, name) except AttributeError: # The attribute is not found, we try the plugin proxy to find the requested method - def getPluginMethod(*args, **kwargs): + def get_plugin_method(*args, **kwargs): loop = asyncio.get_running_loop() fut = loop.create_future() method = getattr(self.db_plugin_iface, name) @@ -954,7 +954,7 @@ ) except ValueError as e: if e.args[0].startswith("Unable to guess signature"): - # same hack as for Bridge.__getattribute__ + # same hack as for bridge.__getattribute__ log.warning("using hack to work around inspection issue") proxy = self.db_plugin_iface.proxy_object IN_PROGRESS = proxy.INTROSPECT_STATE_INTROSPECT_IN_PROGRESS @@ -972,55 +972,39 @@ raise e return fut - return getPluginMethod + return get_plugin_method - def bridgeConnect(self): + def bridge_connect(self): loop = asyncio.get_running_loop() fut = loop.create_future() - super().bridgeConnect( + super().bridge_connect( callback=lambda: loop.call_soon_threadsafe(fut.set_result, None), errback=lambda e: loop.call_soon_threadsafe(fut.set_exception, e) ) return fut - def actionsGet(self, profile_key="@DEFAULT@"): - loop = asyncio.get_running_loop() - fut = loop.create_future() - reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) - error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.actionsGet(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) - return fut - - def addContact(self, entity_jid, profile_key="@DEFAULT@"): + def action_launch(self, callback_id, data, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.addContact(entity_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.action_launch(callback_id, data, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def asyncDeleteProfile(self, profile): + def actions_get(self, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: 
loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.asyncDeleteProfile(profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.actions_get(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def asyncGetParamA(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@"): + def config_get(self, section, name): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.asyncGetParamA(name, category, attribute, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) - return fut - - def asyncGetParamsValuesFromCategory(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@"): - loop = asyncio.get_running_loop() - fut = loop.create_future() - reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) - error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.asyncGetParamsValuesFromCategory(category, security_limit, app, extra, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.config_get(section, name, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut def connect(self, profile_key="@DEFAULT@", password='', options={}): @@ -1031,52 +1015,84 @@ self.db_core_iface.connect(profile_key, password, options, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def contactGet(self, arg_0, profile_key="@DEFAULT@"): + def contact_add(self, entity_jid, profile_key="@DEFAULT@"): + loop = asyncio.get_running_loop() + fut = loop.create_future() + reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) + error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) + self.db_core_iface.contact_add(entity_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + return fut + + def contact_del(self, entity_jid, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.contactGet(arg_0, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.contact_del(entity_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def delContact(self, entity_jid, profile_key="@DEFAULT@"): + def contact_get(self, arg_0, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.delContact(entity_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) 
+ self.db_core_iface.contact_get(arg_0, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def devicesInfosGet(self, bare_jid, profile_key): + def contact_update(self, entity_jid, name, groups, profile_key="@DEFAULT@"): + loop = asyncio.get_running_loop() + fut = loop.create_future() + reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) + error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) + self.db_core_iface.contact_update(entity_jid, name, groups, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + return fut + + def contacts_get(self, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.devicesInfosGet(bare_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.contacts_get(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def discoFindByFeatures(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@"): + def contacts_get_from_group(self, group, profile_key="@DEFAULT@"): + loop = asyncio.get_running_loop() + fut = loop.create_future() + reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) + error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) + self.db_core_iface.contacts_get_from_group(group, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + return fut + + def devices_infos_get(self, bare_jid, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.discoFindByFeatures(namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.devices_infos_get(bare_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def discoInfos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + def disco_find_by_features(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.discoInfos(entity_jid, node, use_cache, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.disco_find_by_features(namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def discoItems(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + def 
disco_infos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.discoItems(entity_jid, node, use_cache, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.disco_infos(entity_jid, node, use_cache, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + return fut + + def disco_items(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + loop = asyncio.get_running_loop() + fut = loop.create_future() + reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) + error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) + self.db_core_iface.disco_items(entity_jid, node, use_cache, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut def disconnect(self, profile_key="@DEFAULT@"): @@ -1087,412 +1103,396 @@ self.db_core_iface.disconnect(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def encryptionNamespaceGet(self, arg_0): + def encryption_namespace_get(self, arg_0): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.encryptionNamespaceGet(arg_0, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.encryption_namespace_get(arg_0, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def encryptionPluginsGet(self): + def encryption_plugins_get(self): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.encryptionPluginsGet(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.encryption_plugins_get(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def encryptionTrustUIGet(self, to_jid, namespace, profile_key): + def encryption_trust_ui_get(self, to_jid, namespace, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.encryptionTrustUIGet(to_jid, namespace, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.encryption_trust_ui_get(to_jid, namespace, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getConfig(self, section, name): + def entities_data_get(self, jids, keys, profile): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: 
loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getConfig(section, name, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.entities_data_get(jids, keys, profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getContacts(self, profile_key="@DEFAULT@"): + def entity_data_get(self, jid, keys, profile): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getContacts(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.entity_data_get(jid, keys, profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getContactsFromGroup(self, group, profile_key="@DEFAULT@"): + def features_get(self, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getContactsFromGroup(group, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.features_get(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getEntitiesData(self, jids, keys, profile): + def history_get(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getEntitiesData(jids, keys, profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.history_get(from_jid, to_jid, limit, between, filters, profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getEntityData(self, jid, keys, profile): + def image_check(self, arg_0): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getEntityData(jid, keys, profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.image_check(arg_0, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getFeatures(self, profile_key): + def image_convert(self, source, dest, arg_2, extra): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getFeatures(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.image_convert(source, dest, arg_2, extra, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getMainResource(self, 
contact_jid, profile_key="@DEFAULT@"): + def image_generate_preview(self, image_path, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getMainResource(contact_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.image_generate_preview(image_path, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getParamA(self, name, category, attribute="value", profile_key="@DEFAULT@"): + def image_resize(self, image_path, width, height): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getParamA(name, category, attribute, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.image_resize(image_path, width, height, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getParamsCategories(self): + def is_connected(self, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getParamsCategories(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.is_connected(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getParamsUI(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@"): + def main_resource_get(self, contact_jid, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getParamsUI(security_limit, app, extra, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.main_resource_get(contact_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getPresenceStatuses(self, profile_key="@DEFAULT@"): + def menu_help_get(self, menu_id, language): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getPresenceStatuses(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.menu_help_get(menu_id, language, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getReady(self): + def menu_launch(self, menu_type, path, data, security_limit, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: 
loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getReady(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.menu_launch(menu_type, path, data, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getVersion(self): + def menus_get(self, language, security_limit): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getVersion(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.menus_get(language, security_limit, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def getWaitingSub(self, profile_key="@DEFAULT@"): - loop = asyncio.get_running_loop() - fut = loop.create_future() - reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) - error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.getWaitingSub(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) - return fut - - def historyGet(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@"): + def message_encryption_get(self, to_jid, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.historyGet(from_jid, to_jid, limit, between, filters, profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.message_encryption_get(to_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def imageCheck(self, arg_0): + def message_encryption_start(self, to_jid, namespace='', replace=False, profile_key="@NONE@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.imageCheck(arg_0, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.message_encryption_start(to_jid, namespace, replace, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def imageConvert(self, source, dest, arg_2, extra): + def message_encryption_stop(self, to_jid, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.imageConvert(source, dest, arg_2, extra, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.message_encryption_stop(to_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def imageGeneratePreview(self, image_path, profile_key): + def message_send(self, to_jid, message, subject={}, 
mess_type="auto", extra={}, profile_key="@NONE@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.imageGeneratePreview(image_path, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.message_send(to_jid, message, subject, mess_type, extra, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def imageResize(self, image_path, width, height): + def namespaces_get(self): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.imageResize(image_path, width, height, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.namespaces_get(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def isConnected(self, profile_key="@DEFAULT@"): + def param_get_a(self, name, category, attribute="value", profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.isConnected(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.param_get_a(name, category, attribute, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def launchAction(self, callback_id, data, profile_key="@DEFAULT@"): + def param_get_a_async(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.launchAction(callback_id, data, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.param_get_a_async(name, category, attribute, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def loadParamsTemplate(self, filename): + def param_set(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.loadParamsTemplate(filename, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.param_set(name, value, category, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def menuHelpGet(self, menu_id, language): + def param_ui_get(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda 
ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.menuHelpGet(menu_id, language, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.param_ui_get(security_limit, app, extra, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def menuLaunch(self, menu_type, path, data, security_limit, profile_key): + def params_categories_get(self): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.menuLaunch(menu_type, path, data, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.params_categories_get(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def menusGet(self, language, security_limit): + def params_register_app(self, xml, security_limit=-1, app=''): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.menusGet(language, security_limit, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.params_register_app(xml, security_limit, app, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def messageEncryptionGet(self, to_jid, profile_key): + def params_template_load(self, filename): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.messageEncryptionGet(to_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.params_template_load(filename, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def messageEncryptionStart(self, to_jid, namespace='', replace=False, profile_key="@NONE@"): - loop = asyncio.get_running_loop() - fut = loop.create_future() - reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) - error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.messageEncryptionStart(to_jid, namespace, replace, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) - return fut - - def messageEncryptionStop(self, to_jid, profile_key): + def params_template_save(self, filename): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.messageEncryptionStop(to_jid, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.params_template_save(filename, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) 
return fut - def messageSend(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@"): + def params_values_from_category_get_async(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.messageSend(to_jid, message, subject, mess_type, extra, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.params_values_from_category_get_async(category, security_limit, app, extra, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def namespacesGet(self): + def presence_set(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.namespacesGet(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.presence_set(to_jid, show, statuses, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def paramsRegisterApp(self, xml, security_limit=-1, app=''): + def presence_statuses_get(self, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.paramsRegisterApp(xml, security_limit, app, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.presence_statuses_get(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def privateDataDelete(self, namespace, key, arg_2): + def private_data_delete(self, namespace, key, arg_2): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.privateDataDelete(namespace, key, arg_2, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.private_data_delete(namespace, key, arg_2, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def privateDataGet(self, namespace, key, profile_key): + def private_data_get(self, namespace, key, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.privateDataGet(namespace, key, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.private_data_get(namespace, key, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def privateDataSet(self, namespace, key, data, profile_key): + def 
private_data_set(self, namespace, key, data, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.privateDataSet(namespace, key, data, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.private_data_set(namespace, key, data, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def profileCreate(self, profile, password='', component=''): + def profile_create(self, profile, password='', component=''): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.profileCreate(profile, password, component, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.profile_create(profile, password, component, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def profileIsSessionStarted(self, profile_key="@DEFAULT@"): + def profile_delete_async(self, profile): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.profileIsSessionStarted(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.profile_delete_async(profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def profileNameGet(self, profile_key="@DEFAULT@"): + def profile_is_session_started(self, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.profileNameGet(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.profile_is_session_started(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def profileSetDefault(self, profile): + def profile_name_get(self, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.profileSetDefault(profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.profile_name_get(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def profileStartSession(self, password='', profile_key="@DEFAULT@"): + def profile_set_default(self, profile): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - 
self.db_core_iface.profileStartSession(password, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.profile_set_default(profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def profilesListGet(self, clients=True, components=False): + def profile_start_session(self, password='', profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.profilesListGet(clients, components, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.profile_start_session(password, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def progressGet(self, id, profile): + def profiles_list_get(self, clients=True, components=False): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.progressGet(id, profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.profiles_list_get(clients, components, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def progressGetAll(self, profile): + def progress_get(self, id, profile): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.progressGetAll(profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.progress_get(id, profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def progressGetAllMetadata(self, profile): + def progress_get_all(self, profile): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.progressGetAllMetadata(profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.progress_get_all(profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def rosterResync(self, profile_key="@DEFAULT@"): + def progress_get_all_metadata(self, profile): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.rosterResync(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.progress_get_all_metadata(profile, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def saveParamsTemplate(self, filename): + def ready_get(self): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: 
loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.saveParamsTemplate(filename, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.ready_get(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def sessionInfosGet(self, profile_key): + def roster_resync(self, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.sessionInfosGet(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.roster_resync(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def setParam(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@"): + def session_infos_get(self, profile_key): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.setParam(name, value, category, security_limit, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.session_infos_get(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def setPresence(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@"): + def sub_waiting_get(self, profile_key="@DEFAULT@"): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.setPresence(to_jid, show, statuses, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.sub_waiting_get(profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut def subscription(self, sub_type, entity, profile_key="@DEFAULT@"): @@ -1503,10 +1503,10 @@ self.db_core_iface.subscription(sub_type, entity, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut - def updateContact(self, entity_jid, name, groups, profile_key="@DEFAULT@"): + def version_get(self): loop = asyncio.get_running_loop() fut = loop.create_future() reply_handler = lambda ret=None: loop.call_soon_threadsafe(fut.set_result, ret) error_handler = lambda err: loop.call_soon_threadsafe(fut.set_exception, dbus_to_bridge_exception(err)) - self.db_core_iface.updateContact(entity_jid, name, groups, profile_key, timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) + self.db_core_iface.version_get(timeout=const_TIMEOUT, reply_handler=reply_handler, error_handler=error_handler) return fut
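
The renamed ``AIOBridge`` methods above keep the original pattern: each call wraps the callback-style D-Bus bridge method in an ``asyncio`` future created on the running loop, so frontends simply ``await`` them under the new snake_case names. A minimal usage sketch, assuming the class is importable from ``sat_frontends.bridge.dbus_bridge`` and takes no constructor arguments (only the renamed method names come from the diff above, the rest is illustrative)::

    import asyncio

    # assumed module path; only AIOBridge and the snake_case method
    # names below are taken from the changeset itself
    from sat_frontends.bridge.dbus_bridge import AIOBridge

    async def main():
        bridge = AIOBridge()
        # bridge_connect() (formerly bridgeConnect) resolves once the
        # D-Bus connection to the backend is established
        await bridge.bridge_connect()
        # renamed getters: getVersion -> version_get,
        # profilesListGet -> profiles_list_get
        version = await bridge.version_get()
        profiles = await bridge.profiles_list_get(clients=True, components=False)
        print(f"backend {version}, profiles: {profiles}")

    asyncio.run(main())
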
--- a/sat_frontends/bridge/pb.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/bridge/pb.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,7 +55,7 @@ setattr(self, method_name, handler) -class Bridge(object): +class bridge(object): def __init__(self): self.signals_handler = SignalsHandler() @@ -75,7 +75,7 @@ ) ) - def remoteCallback(self, result, callback): + def remote_callback(self, result, callback): """call callback with argument or None if result is not None not argument is used, @@ -112,11 +112,11 @@ callback = args.pop() d = self.root.callRemote(name, *args, **kwargs) if callback is not None: - d.addCallback(self.remoteCallback, callback) + d.addCallback(self.remote_callback, callback) if errback is not None: d.addErrback(errback) - def _initBridgeEb(self, failure_): + def _init_bridge_eb(self, failure_): log.error("Can't init bridge: {msg}".format(msg=failure_)) return failure_ @@ -127,28 +127,28 @@ """ self.root = root d = root.callRemote("initBridge", self.signals_handler) - d.addErrback(self._initBridgeEb) + d.addErrback(self._init_bridge_eb) return d - def getRootObjectEb(self, failure_): + def get_root_object_eb(self, failure_): """Call errback with appropriate bridge error""" if failure_.check(ConnectionRefusedError, ConnectError): raise exceptions.BridgeExceptionNoService else: raise failure_ - def bridgeConnect(self, callback, errback): + def bridge_connect(self, callback, errback): factory = pb.PBClientFactory() - conf = config.parseMainConf() - getConf = partial(config.getConf, conf, "bridge_pb", "") - conn_type = getConf("connection_type", "unix_socket") + conf = config.parse_main_conf() + get_conf = partial(config.get_conf, conf, "bridge_pb", "") + conn_type = get_conf("connection_type", "unix_socket") if conn_type == "unix_socket": - local_dir = Path(config.getConfig(conf, "", "local_dir")).resolve() + local_dir = Path(config.config_get(conf, "", "local_dir")).resolve() socket_path = local_dir / "bridge_pb" reactor.connectUNIX(str(socket_path), factory) elif conn_type == "socket": - host = getConf("host", "localhost") - port = int(getConf("port", 8789)) + host = get_conf("host", "localhost") + port = int(get_conf("port", 8789)) reactor.connectTCP(host, port, factory) else: raise ValueError(f"Unknown pb connection type: {conn_type!r}") @@ -156,7 +156,7 @@ d.addCallback(self._set_root) if callback is not None: d.addCallback(lambda __: callback()) - d.addErrback(self.getRootObjectEb) + d.addErrback(self.get_root_object_eb) if errback is not None: d.addErrback(lambda failure_: errback(failure_.value)) return d @@ -165,8 +165,8 @@ self.signals_handler.register_signal(functionName, handler, iface) - def actionsGet(self, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("actionsGet", profile_key) + def action_launch(self, callback_id, data, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("action_launch", callback_id, data, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -174,26 +174,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def addContact(self, entity_jid, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("addContact", entity_jid, profile_key) - if callback is not None: - d.addCallback(lambda __: callback()) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def asyncDeleteProfile(self, profile, callback=None, errback=None): - d = 
self.root.callRemote("asyncDeleteProfile", profile) - if callback is not None: - d.addCallback(lambda __: callback()) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def asyncGetParamA(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("asyncGetParamA", name, category, attribute, security_limit, profile_key) + def actions_get(self, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("actions_get", profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -201,8 +183,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def asyncGetParamsValuesFromCategory(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("asyncGetParamsValuesFromCategory", category, security_limit, app, extra, profile_key) + def config_get(self, section, name, callback=None, errback=None): + d = self.root.callRemote("config_get", section, name) if callback is not None: d.addCallback(callback) if errback is None: @@ -219,8 +201,26 @@ else: d.addErrback(self._errback, ori_errback=errback) - def contactGet(self, arg_0, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("contactGet", arg_0, profile_key) + def contact_add(self, entity_jid, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("contact_add", entity_jid, profile_key) + if callback is not None: + d.addCallback(lambda __: callback()) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def contact_del(self, entity_jid, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("contact_del", entity_jid, profile_key) + if callback is not None: + d.addCallback(lambda __: callback()) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def contact_get(self, arg_0, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("contact_get", arg_0, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -228,8 +228,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def delContact(self, entity_jid, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("delContact", entity_jid, profile_key) + def contact_update(self, entity_jid, name, groups, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("contact_update", entity_jid, name, groups, profile_key) if callback is not None: d.addCallback(lambda __: callback()) if errback is None: @@ -237,8 +237,17 @@ else: d.addErrback(self._errback, ori_errback=errback) - def devicesInfosGet(self, bare_jid, profile_key, callback=None, errback=None): - d = self.root.callRemote("devicesInfosGet", bare_jid, profile_key) + def contacts_get(self, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("contacts_get", profile_key) + if callback is not None: + d.addCallback(callback) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def contacts_get_from_group(self, group, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("contacts_get_from_group", group, profile_key) if 
callback is not None: d.addCallback(callback) if errback is None: @@ -246,8 +255,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def discoFindByFeatures(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("discoFindByFeatures", namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key) + def devices_infos_get(self, bare_jid, profile_key, callback=None, errback=None): + d = self.root.callRemote("devices_infos_get", bare_jid, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -255,8 +264,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def discoInfos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("discoInfos", entity_jid, node, use_cache, profile_key) + def disco_find_by_features(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("disco_find_by_features", namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -264,8 +273,17 @@ else: d.addErrback(self._errback, ori_errback=errback) - def discoItems(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("discoItems", entity_jid, node, use_cache, profile_key) + def disco_infos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("disco_infos", entity_jid, node, use_cache, profile_key) + if callback is not None: + d.addCallback(callback) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def disco_items(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("disco_items", entity_jid, node, use_cache, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -282,8 +300,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def encryptionNamespaceGet(self, arg_0, callback=None, errback=None): - d = self.root.callRemote("encryptionNamespaceGet", arg_0) + def encryption_namespace_get(self, arg_0, callback=None, errback=None): + d = self.root.callRemote("encryption_namespace_get", arg_0) if callback is not None: d.addCallback(callback) if errback is None: @@ -291,8 +309,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def encryptionPluginsGet(self, callback=None, errback=None): - d = self.root.callRemote("encryptionPluginsGet") + def encryption_plugins_get(self, callback=None, errback=None): + d = self.root.callRemote("encryption_plugins_get") if callback is not None: d.addCallback(callback) if errback is None: @@ -300,8 +318,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def encryptionTrustUIGet(self, to_jid, namespace, profile_key, callback=None, errback=None): - d = self.root.callRemote("encryptionTrustUIGet", to_jid, namespace, profile_key) + def encryption_trust_ui_get(self, to_jid, namespace, profile_key, callback=None, errback=None): + d = self.root.callRemote("encryption_trust_ui_get", to_jid, namespace, profile_key) if callback is not None: 
d.addCallback(callback) if errback is None: @@ -309,8 +327,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getConfig(self, section, name, callback=None, errback=None): - d = self.root.callRemote("getConfig", section, name) + def entities_data_get(self, jids, keys, profile, callback=None, errback=None): + d = self.root.callRemote("entities_data_get", jids, keys, profile) if callback is not None: d.addCallback(callback) if errback is None: @@ -318,8 +336,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getContacts(self, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("getContacts", profile_key) + def entity_data_get(self, jid, keys, profile, callback=None, errback=None): + d = self.root.callRemote("entity_data_get", jid, keys, profile) if callback is not None: d.addCallback(callback) if errback is None: @@ -327,8 +345,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getContactsFromGroup(self, group, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("getContactsFromGroup", group, profile_key) + def features_get(self, profile_key, callback=None, errback=None): + d = self.root.callRemote("features_get", profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -336,17 +354,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getEntitiesData(self, jids, keys, profile, callback=None, errback=None): - d = self.root.callRemote("getEntitiesData", jids, keys, profile) - if callback is not None: - d.addCallback(callback) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def getEntityData(self, jid, keys, profile, callback=None, errback=None): - d = self.root.callRemote("getEntityData", jid, keys, profile) + def history_get(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@", callback=None, errback=None): + d = self.root.callRemote("history_get", from_jid, to_jid, limit, between, filters, profile) if callback is not None: d.addCallback(callback) if errback is None: @@ -354,8 +363,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getFeatures(self, profile_key, callback=None, errback=None): - d = self.root.callRemote("getFeatures", profile_key) + def image_check(self, arg_0, callback=None, errback=None): + d = self.root.callRemote("image_check", arg_0) if callback is not None: d.addCallback(callback) if errback is None: @@ -363,8 +372,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getMainResource(self, contact_jid, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("getMainResource", contact_jid, profile_key) + def image_convert(self, source, dest, arg_2, extra, callback=None, errback=None): + d = self.root.callRemote("image_convert", source, dest, arg_2, extra) if callback is not None: d.addCallback(callback) if errback is None: @@ -372,8 +381,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getParamA(self, name, category, attribute="value", profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("getParamA", name, category, attribute, profile_key) + def image_generate_preview(self, image_path, profile_key, callback=None, errback=None): + d = self.root.callRemote("image_generate_preview", image_path, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -381,8 +390,8 @@ else: d.addErrback(self._errback, 
ori_errback=errback) - def getParamsCategories(self, callback=None, errback=None): - d = self.root.callRemote("getParamsCategories") + def image_resize(self, image_path, width, height, callback=None, errback=None): + d = self.root.callRemote("image_resize", image_path, width, height) if callback is not None: d.addCallback(callback) if errback is None: @@ -390,8 +399,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getParamsUI(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("getParamsUI", security_limit, app, extra, profile_key) + def is_connected(self, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("is_connected", profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -399,8 +408,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getPresenceStatuses(self, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("getPresenceStatuses", profile_key) + def main_resource_get(self, contact_jid, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("main_resource_get", contact_jid, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -408,17 +417,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getReady(self, callback=None, errback=None): - d = self.root.callRemote("getReady") - if callback is not None: - d.addCallback(lambda __: callback()) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def getVersion(self, callback=None, errback=None): - d = self.root.callRemote("getVersion") + def menu_help_get(self, menu_id, language, callback=None, errback=None): + d = self.root.callRemote("menu_help_get", menu_id, language) if callback is not None: d.addCallback(callback) if errback is None: @@ -426,8 +426,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def getWaitingSub(self, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("getWaitingSub", profile_key) + def menu_launch(self, menu_type, path, data, security_limit, profile_key, callback=None, errback=None): + d = self.root.callRemote("menu_launch", menu_type, path, data, security_limit, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -435,8 +435,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def historyGet(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@", callback=None, errback=None): - d = self.root.callRemote("historyGet", from_jid, to_jid, limit, between, filters, profile) + def menus_get(self, language, security_limit, callback=None, errback=None): + d = self.root.callRemote("menus_get", language, security_limit) if callback is not None: d.addCallback(callback) if errback is None: @@ -444,26 +444,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def imageCheck(self, arg_0, callback=None, errback=None): - d = self.root.callRemote("imageCheck", arg_0) - if callback is not None: - d.addCallback(callback) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def imageConvert(self, source, dest, arg_2, extra, callback=None, errback=None): - d = self.root.callRemote("imageConvert", source, dest, arg_2, extra) - if callback is not None: - d.addCallback(callback) - if errback is None: - 
d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def imageGeneratePreview(self, image_path, profile_key, callback=None, errback=None): - d = self.root.callRemote("imageGeneratePreview", image_path, profile_key) + def message_encryption_get(self, to_jid, profile_key, callback=None, errback=None): + d = self.root.callRemote("message_encryption_get", to_jid, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -471,8 +453,35 @@ else: d.addErrback(self._errback, ori_errback=errback) - def imageResize(self, image_path, width, height, callback=None, errback=None): - d = self.root.callRemote("imageResize", image_path, width, height) + def message_encryption_start(self, to_jid, namespace='', replace=False, profile_key="@NONE@", callback=None, errback=None): + d = self.root.callRemote("message_encryption_start", to_jid, namespace, replace, profile_key) + if callback is not None: + d.addCallback(lambda __: callback()) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def message_encryption_stop(self, to_jid, profile_key, callback=None, errback=None): + d = self.root.callRemote("message_encryption_stop", to_jid, profile_key) + if callback is not None: + d.addCallback(lambda __: callback()) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def message_send(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@", callback=None, errback=None): + d = self.root.callRemote("message_send", to_jid, message, subject, mess_type, extra, profile_key) + if callback is not None: + d.addCallback(lambda __: callback()) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def namespaces_get(self, callback=None, errback=None): + d = self.root.callRemote("namespaces_get") if callback is not None: d.addCallback(callback) if errback is None: @@ -480,8 +489,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def isConnected(self, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("isConnected", profile_key) + def param_get_a(self, name, category, attribute="value", profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("param_get_a", name, category, attribute, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -489,17 +498,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def launchAction(self, callback_id, data, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("launchAction", callback_id, data, profile_key) - if callback is not None: - d.addCallback(callback) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def loadParamsTemplate(self, filename, callback=None, errback=None): - d = self.root.callRemote("loadParamsTemplate", filename) + def param_get_a_async(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("param_get_a_async", name, category, attribute, security_limit, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -507,17 +507,17 @@ else: d.addErrback(self._errback, ori_errback=errback) - def menuHelpGet(self, menu_id, language, callback=None, 
errback=None): - d = self.root.callRemote("menuHelpGet", menu_id, language) + def param_set(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("param_set", name, value, category, security_limit, profile_key) if callback is not None: - d.addCallback(callback) + d.addCallback(lambda __: callback()) if errback is None: d.addErrback(self._generic_errback) else: d.addErrback(self._errback, ori_errback=errback) - def menuLaunch(self, menu_type, path, data, security_limit, profile_key, callback=None, errback=None): - d = self.root.callRemote("menuLaunch", menu_type, path, data, security_limit, profile_key) + def param_ui_get(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("param_ui_get", security_limit, app, extra, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -525,8 +525,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def menusGet(self, language, security_limit, callback=None, errback=None): - d = self.root.callRemote("menusGet", language, security_limit) + def params_categories_get(self, callback=None, errback=None): + d = self.root.callRemote("params_categories_get") if callback is not None: d.addCallback(callback) if errback is None: @@ -534,17 +534,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def messageEncryptionGet(self, to_jid, profile_key, callback=None, errback=None): - d = self.root.callRemote("messageEncryptionGet", to_jid, profile_key) - if callback is not None: - d.addCallback(callback) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def messageEncryptionStart(self, to_jid, namespace='', replace=False, profile_key="@NONE@", callback=None, errback=None): - d = self.root.callRemote("messageEncryptionStart", to_jid, namespace, replace, profile_key) + def params_register_app(self, xml, security_limit=-1, app='', callback=None, errback=None): + d = self.root.callRemote("params_register_app", xml, security_limit, app) if callback is not None: d.addCallback(lambda __: callback()) if errback is None: @@ -552,26 +543,26 @@ else: d.addErrback(self._errback, ori_errback=errback) - def messageEncryptionStop(self, to_jid, profile_key, callback=None, errback=None): - d = self.root.callRemote("messageEncryptionStop", to_jid, profile_key) + def params_template_load(self, filename, callback=None, errback=None): + d = self.root.callRemote("params_template_load", filename) if callback is not None: - d.addCallback(lambda __: callback()) + d.addCallback(callback) if errback is None: d.addErrback(self._generic_errback) else: d.addErrback(self._errback, ori_errback=errback) - def messageSend(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@", callback=None, errback=None): - d = self.root.callRemote("messageSend", to_jid, message, subject, mess_type, extra, profile_key) + def params_template_save(self, filename, callback=None, errback=None): + d = self.root.callRemote("params_template_save", filename) if callback is not None: - d.addCallback(lambda __: callback()) + d.addCallback(callback) if errback is None: d.addErrback(self._generic_errback) else: d.addErrback(self._errback, ori_errback=errback) - def namespacesGet(self, callback=None, errback=None): - d = self.root.callRemote("namespacesGet") + def params_values_from_category_get_async(self, category, security_limit=-1, 
app="", extra="", profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("params_values_from_category_get_async", category, security_limit, app, extra, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -579,8 +570,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def paramsRegisterApp(self, xml, security_limit=-1, app='', callback=None, errback=None): - d = self.root.callRemote("paramsRegisterApp", xml, security_limit, app) + def presence_set(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("presence_set", to_jid, show, statuses, profile_key) if callback is not None: d.addCallback(lambda __: callback()) if errback is None: @@ -588,8 +579,17 @@ else: d.addErrback(self._errback, ori_errback=errback) - def privateDataDelete(self, namespace, key, arg_2, callback=None, errback=None): - d = self.root.callRemote("privateDataDelete", namespace, key, arg_2) + def presence_statuses_get(self, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("presence_statuses_get", profile_key) + if callback is not None: + d.addCallback(callback) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def private_data_delete(self, namespace, key, arg_2, callback=None, errback=None): + d = self.root.callRemote("private_data_delete", namespace, key, arg_2) if callback is not None: d.addCallback(lambda __: callback()) if errback is None: @@ -597,8 +597,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def privateDataGet(self, namespace, key, profile_key, callback=None, errback=None): - d = self.root.callRemote("privateDataGet", namespace, key, profile_key) + def private_data_get(self, namespace, key, profile_key, callback=None, errback=None): + d = self.root.callRemote("private_data_get", namespace, key, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -606,8 +606,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def privateDataSet(self, namespace, key, data, profile_key, callback=None, errback=None): - d = self.root.callRemote("privateDataSet", namespace, key, data, profile_key) + def private_data_set(self, namespace, key, data, profile_key, callback=None, errback=None): + d = self.root.callRemote("private_data_set", namespace, key, data, profile_key) if callback is not None: d.addCallback(lambda __: callback()) if errback is None: @@ -615,8 +615,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def profileCreate(self, profile, password='', component='', callback=None, errback=None): - d = self.root.callRemote("profileCreate", profile, password, component) + def profile_create(self, profile, password='', component='', callback=None, errback=None): + d = self.root.callRemote("profile_create", profile, password, component) if callback is not None: d.addCallback(lambda __: callback()) if errback is None: @@ -624,8 +624,17 @@ else: d.addErrback(self._errback, ori_errback=errback) - def profileIsSessionStarted(self, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("profileIsSessionStarted", profile_key) + def profile_delete_async(self, profile, callback=None, errback=None): + d = self.root.callRemote("profile_delete_async", profile) + if callback is not None: + d.addCallback(lambda __: callback()) + if errback is None: + d.addErrback(self._generic_errback) + else: + 
d.addErrback(self._errback, ori_errback=errback) + + def profile_is_session_started(self, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("profile_is_session_started", profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -633,8 +642,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def profileNameGet(self, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("profileNameGet", profile_key) + def profile_name_get(self, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("profile_name_get", profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -642,8 +651,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def profileSetDefault(self, profile, callback=None, errback=None): - d = self.root.callRemote("profileSetDefault", profile) + def profile_set_default(self, profile, callback=None, errback=None): + d = self.root.callRemote("profile_set_default", profile) if callback is not None: d.addCallback(lambda __: callback()) if errback is None: @@ -651,17 +660,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def profileStartSession(self, password='', profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("profileStartSession", password, profile_key) - if callback is not None: - d.addCallback(callback) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def profilesListGet(self, clients=True, components=False, callback=None, errback=None): - d = self.root.callRemote("profilesListGet", clients, components) + def profile_start_session(self, password='', profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("profile_start_session", password, profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -669,8 +669,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def progressGet(self, id, profile, callback=None, errback=None): - d = self.root.callRemote("progressGet", id, profile) + def profiles_list_get(self, clients=True, components=False, callback=None, errback=None): + d = self.root.callRemote("profiles_list_get", clients, components) if callback is not None: d.addCallback(callback) if errback is None: @@ -678,8 +678,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def progressGetAll(self, profile, callback=None, errback=None): - d = self.root.callRemote("progressGetAll", profile) + def progress_get(self, id, profile, callback=None, errback=None): + d = self.root.callRemote("progress_get", id, profile) if callback is not None: d.addCallback(callback) if errback is None: @@ -687,8 +687,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def progressGetAllMetadata(self, profile, callback=None, errback=None): - d = self.root.callRemote("progressGetAllMetadata", profile) + def progress_get_all(self, profile, callback=None, errback=None): + d = self.root.callRemote("progress_get_all", profile) if callback is not None: d.addCallback(callback) if errback is None: @@ -696,17 +696,8 @@ else: d.addErrback(self._errback, ori_errback=errback) - def rosterResync(self, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("rosterResync", profile_key) - if callback is not None: - d.addCallback(lambda __: callback()) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, 
ori_errback=errback) - - def saveParamsTemplate(self, filename, callback=None, errback=None): - d = self.root.callRemote("saveParamsTemplate", filename) + def progress_get_all_metadata(self, profile, callback=None, errback=None): + d = self.root.callRemote("progress_get_all_metadata", profile) if callback is not None: d.addCallback(callback) if errback is None: @@ -714,8 +705,26 @@ else: d.addErrback(self._errback, ori_errback=errback) - def sessionInfosGet(self, profile_key, callback=None, errback=None): - d = self.root.callRemote("sessionInfosGet", profile_key) + def ready_get(self, callback=None, errback=None): + d = self.root.callRemote("ready_get") + if callback is not None: + d.addCallback(lambda __: callback()) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def roster_resync(self, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("roster_resync", profile_key) + if callback is not None: + d.addCallback(lambda __: callback()) + if errback is None: + d.addErrback(self._generic_errback) + else: + d.addErrback(self._errback, ori_errback=errback) + + def session_infos_get(self, profile_key, callback=None, errback=None): + d = self.root.callRemote("session_infos_get", profile_key) if callback is not None: d.addCallback(callback) if errback is None: @@ -723,19 +732,10 @@ else: d.addErrback(self._errback, ori_errback=errback) - def setParam(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("setParam", name, value, category, security_limit, profile_key) + def sub_waiting_get(self, profile_key="@DEFAULT@", callback=None, errback=None): + d = self.root.callRemote("sub_waiting_get", profile_key) if callback is not None: - d.addCallback(lambda __: callback()) - if errback is None: - d.addErrback(self._generic_errback) - else: - d.addErrback(self._errback, ori_errback=errback) - - def setPresence(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("setPresence", to_jid, show, statuses, profile_key) - if callback is not None: - d.addCallback(lambda __: callback()) + d.addCallback(callback) if errback is None: d.addErrback(self._generic_errback) else: @@ -750,10 +750,10 @@ else: d.addErrback(self._errback, ori_errback=errback) - def updateContact(self, entity_jid, name, groups, profile_key="@DEFAULT@", callback=None, errback=None): - d = self.root.callRemote("updateContact", entity_jid, name, groups, profile_key) + def version_get(self, callback=None, errback=None): + d = self.root.callRemote("version_get") if callback is not None: - d.addCallback(lambda __: callback()) + d.addCallback(callback) if errback is None: d.addErrback(self._generic_errback) else: @@ -768,7 +768,7 @@ return super().register_signal(name, async_handler, iface) -class AIOBridge(Bridge): +class AIOBridge(bridge): def __init__(self): self.signals_handler = AIOSignalsHandler() @@ -785,32 +785,22 @@ d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - async def bridgeConnect(self): - d = super().bridgeConnect(callback=None, errback=None) + async def bridge_connect(self): + d = super().bridge_connect(callback=None, errback=None) return await d.asFuture(asyncio.get_event_loop()) - def actionsGet(self, profile_key="@DEFAULT@"): - d = self.root.callRemote("actionsGet", profile_key) + def action_launch(self, callback_id, data, profile_key="@DEFAULT@"): + d = 
self.root.callRemote("action_launch", callback_id, data, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def addContact(self, entity_jid, profile_key="@DEFAULT@"): - d = self.root.callRemote("addContact", entity_jid, profile_key) + def actions_get(self, profile_key="@DEFAULT@"): + d = self.root.callRemote("actions_get", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def asyncDeleteProfile(self, profile): - d = self.root.callRemote("asyncDeleteProfile", profile) - d.addErrback(self._errback) - return d.asFuture(asyncio.get_event_loop()) - - def asyncGetParamA(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@"): - d = self.root.callRemote("asyncGetParamA", name, category, attribute, security_limit, profile_key) - d.addErrback(self._errback) - return d.asFuture(asyncio.get_event_loop()) - - def asyncGetParamsValuesFromCategory(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@"): - d = self.root.callRemote("asyncGetParamsValuesFromCategory", category, security_limit, app, extra, profile_key) + def config_get(self, section, name): + d = self.root.callRemote("config_get", section, name) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) @@ -819,33 +809,53 @@ d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def contactGet(self, arg_0, profile_key="@DEFAULT@"): - d = self.root.callRemote("contactGet", arg_0, profile_key) + def contact_add(self, entity_jid, profile_key="@DEFAULT@"): + d = self.root.callRemote("contact_add", entity_jid, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def delContact(self, entity_jid, profile_key="@DEFAULT@"): - d = self.root.callRemote("delContact", entity_jid, profile_key) + def contact_del(self, entity_jid, profile_key="@DEFAULT@"): + d = self.root.callRemote("contact_del", entity_jid, profile_key) + d.addErrback(self._errback) + return d.asFuture(asyncio.get_event_loop()) + + def contact_get(self, arg_0, profile_key="@DEFAULT@"): + d = self.root.callRemote("contact_get", arg_0, profile_key) + d.addErrback(self._errback) + return d.asFuture(asyncio.get_event_loop()) + + def contact_update(self, entity_jid, name, groups, profile_key="@DEFAULT@"): + d = self.root.callRemote("contact_update", entity_jid, name, groups, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def devicesInfosGet(self, bare_jid, profile_key): - d = self.root.callRemote("devicesInfosGet", bare_jid, profile_key) + def contacts_get(self, profile_key="@DEFAULT@"): + d = self.root.callRemote("contacts_get", profile_key) + d.addErrback(self._errback) + return d.asFuture(asyncio.get_event_loop()) + + def contacts_get_from_group(self, group, profile_key="@DEFAULT@"): + d = self.root.callRemote("contacts_get_from_group", group, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def discoFindByFeatures(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@"): - d = self.root.callRemote("discoFindByFeatures", namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key) + def devices_infos_get(self, bare_jid, profile_key): + d = self.root.callRemote("devices_infos_get", bare_jid, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def discoInfos(self, entity_jid, node=u'', 
use_cache=True, profile_key="@DEFAULT@"): - d = self.root.callRemote("discoInfos", entity_jid, node, use_cache, profile_key) + def disco_find_by_features(self, namespaces, identities, bare_jid=False, service=True, roster=True, own_jid=True, local_device=False, profile_key="@DEFAULT@"): + d = self.root.callRemote("disco_find_by_features", namespaces, identities, bare_jid, service, roster, own_jid, local_device, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def discoItems(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): - d = self.root.callRemote("discoItems", entity_jid, node, use_cache, profile_key) + def disco_infos(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + d = self.root.callRemote("disco_infos", entity_jid, node, use_cache, profile_key) + d.addErrback(self._errback) + return d.asFuture(asyncio.get_event_loop()) + + def disco_items(self, entity_jid, node=u'', use_cache=True, profile_key="@DEFAULT@"): + d = self.root.callRemote("disco_items", entity_jid, node, use_cache, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) @@ -854,258 +864,248 @@ d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def encryptionNamespaceGet(self, arg_0): - d = self.root.callRemote("encryptionNamespaceGet", arg_0) + def encryption_namespace_get(self, arg_0): + d = self.root.callRemote("encryption_namespace_get", arg_0) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def encryptionPluginsGet(self): - d = self.root.callRemote("encryptionPluginsGet") + def encryption_plugins_get(self): + d = self.root.callRemote("encryption_plugins_get") d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def encryptionTrustUIGet(self, to_jid, namespace, profile_key): - d = self.root.callRemote("encryptionTrustUIGet", to_jid, namespace, profile_key) + def encryption_trust_ui_get(self, to_jid, namespace, profile_key): + d = self.root.callRemote("encryption_trust_ui_get", to_jid, namespace, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getConfig(self, section, name): - d = self.root.callRemote("getConfig", section, name) + def entities_data_get(self, jids, keys, profile): + d = self.root.callRemote("entities_data_get", jids, keys, profile) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getContacts(self, profile_key="@DEFAULT@"): - d = self.root.callRemote("getContacts", profile_key) + def entity_data_get(self, jid, keys, profile): + d = self.root.callRemote("entity_data_get", jid, keys, profile) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getContactsFromGroup(self, group, profile_key="@DEFAULT@"): - d = self.root.callRemote("getContactsFromGroup", group, profile_key) + def features_get(self, profile_key): + d = self.root.callRemote("features_get", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getEntitiesData(self, jids, keys, profile): - d = self.root.callRemote("getEntitiesData", jids, keys, profile) - d.addErrback(self._errback) - return d.asFuture(asyncio.get_event_loop()) - - def getEntityData(self, jid, keys, profile): - d = self.root.callRemote("getEntityData", jid, keys, profile) + def history_get(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@"): + d = self.root.callRemote("history_get", from_jid, to_jid, limit, between, filters, profile) 
d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getFeatures(self, profile_key): - d = self.root.callRemote("getFeatures", profile_key) + def image_check(self, arg_0): + d = self.root.callRemote("image_check", arg_0) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getMainResource(self, contact_jid, profile_key="@DEFAULT@"): - d = self.root.callRemote("getMainResource", contact_jid, profile_key) + def image_convert(self, source, dest, arg_2, extra): + d = self.root.callRemote("image_convert", source, dest, arg_2, extra) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getParamA(self, name, category, attribute="value", profile_key="@DEFAULT@"): - d = self.root.callRemote("getParamA", name, category, attribute, profile_key) + def image_generate_preview(self, image_path, profile_key): + d = self.root.callRemote("image_generate_preview", image_path, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getParamsCategories(self): - d = self.root.callRemote("getParamsCategories") + def image_resize(self, image_path, width, height): + d = self.root.callRemote("image_resize", image_path, width, height) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getParamsUI(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@"): - d = self.root.callRemote("getParamsUI", security_limit, app, extra, profile_key) + def is_connected(self, profile_key="@DEFAULT@"): + d = self.root.callRemote("is_connected", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getPresenceStatuses(self, profile_key="@DEFAULT@"): - d = self.root.callRemote("getPresenceStatuses", profile_key) + def main_resource_get(self, contact_jid, profile_key="@DEFAULT@"): + d = self.root.callRemote("main_resource_get", contact_jid, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getReady(self): - d = self.root.callRemote("getReady") + def menu_help_get(self, menu_id, language): + d = self.root.callRemote("menu_help_get", menu_id, language) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getVersion(self): - d = self.root.callRemote("getVersion") + def menu_launch(self, menu_type, path, data, security_limit, profile_key): + d = self.root.callRemote("menu_launch", menu_type, path, data, security_limit, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def getWaitingSub(self, profile_key="@DEFAULT@"): - d = self.root.callRemote("getWaitingSub", profile_key) + def menus_get(self, language, security_limit): + d = self.root.callRemote("menus_get", language, security_limit) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def historyGet(self, from_jid, to_jid, limit, between=True, filters='', profile="@NONE@"): - d = self.root.callRemote("historyGet", from_jid, to_jid, limit, between, filters, profile) + def message_encryption_get(self, to_jid, profile_key): + d = self.root.callRemote("message_encryption_get", to_jid, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def imageCheck(self, arg_0): - d = self.root.callRemote("imageCheck", arg_0) + def message_encryption_start(self, to_jid, namespace='', replace=False, profile_key="@NONE@"): + d = self.root.callRemote("message_encryption_start", to_jid, namespace, replace, profile_key) d.addErrback(self._errback) return 
d.asFuture(asyncio.get_event_loop()) - def imageConvert(self, source, dest, arg_2, extra): - d = self.root.callRemote("imageConvert", source, dest, arg_2, extra) + def message_encryption_stop(self, to_jid, profile_key): + d = self.root.callRemote("message_encryption_stop", to_jid, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def imageGeneratePreview(self, image_path, profile_key): - d = self.root.callRemote("imageGeneratePreview", image_path, profile_key) + def message_send(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@"): + d = self.root.callRemote("message_send", to_jid, message, subject, mess_type, extra, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def imageResize(self, image_path, width, height): - d = self.root.callRemote("imageResize", image_path, width, height) + def namespaces_get(self): + d = self.root.callRemote("namespaces_get") d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def isConnected(self, profile_key="@DEFAULT@"): - d = self.root.callRemote("isConnected", profile_key) + def param_get_a(self, name, category, attribute="value", profile_key="@DEFAULT@"): + d = self.root.callRemote("param_get_a", name, category, attribute, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def launchAction(self, callback_id, data, profile_key="@DEFAULT@"): - d = self.root.callRemote("launchAction", callback_id, data, profile_key) + def param_get_a_async(self, name, category, attribute="value", security_limit=-1, profile_key="@DEFAULT@"): + d = self.root.callRemote("param_get_a_async", name, category, attribute, security_limit, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def loadParamsTemplate(self, filename): - d = self.root.callRemote("loadParamsTemplate", filename) + def param_set(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@"): + d = self.root.callRemote("param_set", name, value, category, security_limit, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def menuHelpGet(self, menu_id, language): - d = self.root.callRemote("menuHelpGet", menu_id, language) + def param_ui_get(self, security_limit=-1, app='', extra='', profile_key="@DEFAULT@"): + d = self.root.callRemote("param_ui_get", security_limit, app, extra, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def menuLaunch(self, menu_type, path, data, security_limit, profile_key): - d = self.root.callRemote("menuLaunch", menu_type, path, data, security_limit, profile_key) + def params_categories_get(self): + d = self.root.callRemote("params_categories_get") d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def menusGet(self, language, security_limit): - d = self.root.callRemote("menusGet", language, security_limit) + def params_register_app(self, xml, security_limit=-1, app=''): + d = self.root.callRemote("params_register_app", xml, security_limit, app) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def messageEncryptionGet(self, to_jid, profile_key): - d = self.root.callRemote("messageEncryptionGet", to_jid, profile_key) + def params_template_load(self, filename): + d = self.root.callRemote("params_template_load", filename) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def messageEncryptionStart(self, to_jid, namespace='', replace=False, 
profile_key="@NONE@"): - d = self.root.callRemote("messageEncryptionStart", to_jid, namespace, replace, profile_key) + def params_template_save(self, filename): + d = self.root.callRemote("params_template_save", filename) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def messageEncryptionStop(self, to_jid, profile_key): - d = self.root.callRemote("messageEncryptionStop", to_jid, profile_key) + def params_values_from_category_get_async(self, category, security_limit=-1, app="", extra="", profile_key="@DEFAULT@"): + d = self.root.callRemote("params_values_from_category_get_async", category, security_limit, app, extra, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def messageSend(self, to_jid, message, subject={}, mess_type="auto", extra={}, profile_key="@NONE@"): - d = self.root.callRemote("messageSend", to_jid, message, subject, mess_type, extra, profile_key) + def presence_set(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@"): + d = self.root.callRemote("presence_set", to_jid, show, statuses, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def namespacesGet(self): - d = self.root.callRemote("namespacesGet") + def presence_statuses_get(self, profile_key="@DEFAULT@"): + d = self.root.callRemote("presence_statuses_get", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def paramsRegisterApp(self, xml, security_limit=-1, app=''): - d = self.root.callRemote("paramsRegisterApp", xml, security_limit, app) + def private_data_delete(self, namespace, key, arg_2): + d = self.root.callRemote("private_data_delete", namespace, key, arg_2) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def privateDataDelete(self, namespace, key, arg_2): - d = self.root.callRemote("privateDataDelete", namespace, key, arg_2) + def private_data_get(self, namespace, key, profile_key): + d = self.root.callRemote("private_data_get", namespace, key, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def privateDataGet(self, namespace, key, profile_key): - d = self.root.callRemote("privateDataGet", namespace, key, profile_key) + def private_data_set(self, namespace, key, data, profile_key): + d = self.root.callRemote("private_data_set", namespace, key, data, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def privateDataSet(self, namespace, key, data, profile_key): - d = self.root.callRemote("privateDataSet", namespace, key, data, profile_key) + def profile_create(self, profile, password='', component=''): + d = self.root.callRemote("profile_create", profile, password, component) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def profileCreate(self, profile, password='', component=''): - d = self.root.callRemote("profileCreate", profile, password, component) + def profile_delete_async(self, profile): + d = self.root.callRemote("profile_delete_async", profile) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def profileIsSessionStarted(self, profile_key="@DEFAULT@"): - d = self.root.callRemote("profileIsSessionStarted", profile_key) + def profile_is_session_started(self, profile_key="@DEFAULT@"): + d = self.root.callRemote("profile_is_session_started", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def profileNameGet(self, profile_key="@DEFAULT@"): - d = 
self.root.callRemote("profileNameGet", profile_key) + def profile_name_get(self, profile_key="@DEFAULT@"): + d = self.root.callRemote("profile_name_get", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def profileSetDefault(self, profile): - d = self.root.callRemote("profileSetDefault", profile) + def profile_set_default(self, profile): + d = self.root.callRemote("profile_set_default", profile) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def profileStartSession(self, password='', profile_key="@DEFAULT@"): - d = self.root.callRemote("profileStartSession", password, profile_key) + def profile_start_session(self, password='', profile_key="@DEFAULT@"): + d = self.root.callRemote("profile_start_session", password, profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def profilesListGet(self, clients=True, components=False): - d = self.root.callRemote("profilesListGet", clients, components) + def profiles_list_get(self, clients=True, components=False): + d = self.root.callRemote("profiles_list_get", clients, components) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def progressGet(self, id, profile): - d = self.root.callRemote("progressGet", id, profile) + def progress_get(self, id, profile): + d = self.root.callRemote("progress_get", id, profile) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def progressGetAll(self, profile): - d = self.root.callRemote("progressGetAll", profile) + def progress_get_all(self, profile): + d = self.root.callRemote("progress_get_all", profile) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def progressGetAllMetadata(self, profile): - d = self.root.callRemote("progressGetAllMetadata", profile) + def progress_get_all_metadata(self, profile): + d = self.root.callRemote("progress_get_all_metadata", profile) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def rosterResync(self, profile_key="@DEFAULT@"): - d = self.root.callRemote("rosterResync", profile_key) - d.addErrback(self._errback) - return d.asFuture(asyncio.get_event_loop()) - - def saveParamsTemplate(self, filename): - d = self.root.callRemote("saveParamsTemplate", filename) + def ready_get(self): + d = self.root.callRemote("ready_get") d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def sessionInfosGet(self, profile_key): - d = self.root.callRemote("sessionInfosGet", profile_key) + def roster_resync(self, profile_key="@DEFAULT@"): + d = self.root.callRemote("roster_resync", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def setParam(self, name, value, category, security_limit=-1, profile_key="@DEFAULT@"): - d = self.root.callRemote("setParam", name, value, category, security_limit, profile_key) + def session_infos_get(self, profile_key): + d = self.root.callRemote("session_infos_get", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def setPresence(self, to_jid='', show='', statuses={}, profile_key="@DEFAULT@"): - d = self.root.callRemote("setPresence", to_jid, show, statuses, profile_key) + def sub_waiting_get(self, profile_key="@DEFAULT@"): + d = self.root.callRemote("sub_waiting_get", profile_key) d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) @@ -1114,7 +1114,7 @@ d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop()) - def updateContact(self, entity_jid, 
name, groups, profile_key="@DEFAULT@"): - d = self.root.callRemote("updateContact", entity_jid, name, groups, profile_key) + def version_get(self): + d = self.root.callRemote("version_get") d.addErrback(self._errback) return d.asFuture(asyncio.get_event_loop())
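Editor's note on the hunks above: the bridge frontend API is only renamed here, not reshaped — the callback/errback-based class keeps its ``callback``/``errback`` parameters, and ``AIOBridge`` keeps returning asyncio futures via ``asFuture``, just under snake_case names. The following is a minimal, hedged usage sketch of the asyncio side; it assumes a running backend and the asyncio/Twisted event-loop glue that jp sets up (see ``get_jp_loop`` in the next file), and the import path is an assumption since this excerpt does not show the file header — only the method names and signatures come from the hunks above::

    from sat_frontends.bridge.pb import AIOBridge  # import path assumed, not shown in this excerpt


    async def show_backend_infos():
        """Hedged sketch: snake_case names replace bridgeConnect/getReady/getVersion."""
        bridge = AIOBridge()
        await bridge.bridge_connect()          # was bridgeConnect
        await bridge.ready_get()               # was getReady
        version = await bridge.version_get()   # was getVersion
        profiles = await bridge.profiles_list_get(clients=True, components=False)
        print(f"backend {version}, profiles: {profiles}")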
--- a/sat_frontends/jp/base.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/base.py Sat Apr 08 13:54:42 2023 +0200 @@ -46,7 +46,7 @@ from sat.tools.common.ansi import ANSI as A from sat.core import exceptions import sat_frontends.jp -from sat_frontends.jp.loops import QuitException, getJPLoop +from sat_frontends.jp.loops import QuitException, get_jp_loop from sat_frontends.jp.constants import Const as C from sat_frontends.bridge.bridge_frontend import BridgeException from sat_frontends.tools import misc @@ -55,9 +55,9 @@ ## bridge handling # we get bridge name from conf and initialise the right class accordingly -main_config = config.parseMainConf() -bridge_name = config.getConfig(main_config, '', 'bridge', 'dbus') -JPLoop = getJPLoop(bridge_name) +main_config = config.parse_main_conf() +bridge_name = config.config_get(main_config, '', 'bridge', 'dbus') +JPLoop = get_jp_loop(bridge_name) try: @@ -118,7 +118,7 @@ def get_config(self, name, section=C.CONFIG_SECTION, default=None): """Retrieve a setting value from sat.conf""" - return config.getConfig(self.sat_conf, section, name, default=default) + return config.config_get(self.sat_conf, section, name, default=default) def guess_background(self): # cf. https://unix.stackexchange.com/a/245568 (thanks!) @@ -204,7 +204,7 @@ C.A_DIRECTORY = A.BOLD + A.FG_MAGENTA C.A_FILE = A.FG_BLACK - def _bridgeConnected(self): + def _bridge_connected(self): self.parser = argparse.ArgumentParser( formatter_class=argparse.RawDescriptionHelpFormatter, description=DESCRIPTION) self._make_parents() @@ -231,7 +231,7 @@ async def set_progress_id(self, progress_id): # because we use async, we need an explicit setter self._progress_id = progress_id - await self.replayCache('progress_ids_cache') + await self.replay_cache('progress_ids_cache') @property def watch_progress(self): @@ -254,7 +254,7 @@ except AttributeError: return 0 - async def replayCache(self, cache_attribute): + async def replay_cache(self, cache_attribute): """Replay cached signals @param cache_attribute(str): name of the attribute containing the cache @@ -294,14 +294,14 @@ if inspect.isawaitable(ret): await ret - def addOnQuitCallback(self, callback, *args, **kwargs): + def add_on_quit_callback(self, callback, *args, **kwargs): """Add a callback which will be called on quit command @param callback(callback): method to call """ self._onQuitCallbacks.append((callback, args, kwargs)) - def getOutputChoices(self, output_type): + def get_output_choices(self, output_type): """Return valid output filters for output_type @param output_type: True for default, @@ -612,7 +612,7 @@ url = self.get_xmpp_uri_from_http(url) try: - uri_data = uri.parseXMPPUri(url) + uri_data = uri.parse_xmpp_uri(url) except ValueError: self.parser.error(_('invalid XMPP URL: {url}').format(url=url)) else: @@ -694,7 +694,7 @@ async def main(self, args, namespace): try: - await self.bridge.bridgeConnect() + await self.bridge.bridge_connect() except Exception as e: if isinstance(e, exceptions.BridgeExceptionNoService): print( @@ -711,9 +711,9 @@ ) self.quit(C.EXIT_BRIDGE_ERROR, raise_exc=False) return - await self.bridge.getReady() - self.version = await self.bridge.getVersion() - self._bridgeConnected() + await self.bridge.ready_get() + self.version = await self.bridge.version_get() + self._bridge_connected() self.import_plugins() try: self.args = self.parser.parse_args(args, namespace=None) @@ -756,14 +756,14 @@ res = await self.ainput(f"{message} (y/N)? 
") return res in ("y", "Y") - async def confirmOrQuit(self, message, cancel_message=_("action cancelled by user")): + async def confirm_or_quit(self, message, cancel_message=_("action cancelled by user")): """Request user to confirm action, and quit if he doesn't""" confirmed = await self.confirm(message) if not confirmed: self.disp(cancel_message) self.quit(C.EXIT_USER_CANCELLED) - def quitFromSignal(self, exit_code=0): + def quit_from_signal(self, exit_code=0): r"""Same as self.quit, but from a signal handler /!\: return must be used after calling this method ! @@ -805,7 +805,7 @@ nodes2jid = {} try: - contacts = await self.bridge.getContacts(self.profile) + contacts = await self.bridge.contacts_get(self.profile) except BridgeException as e: if e.classname == "AttributeError": # we may get an AttributeError if we use a component profile @@ -872,7 +872,7 @@ self.disp('') return pwd - async def connectOrPrompt(self, method, err_msg=None): + async def connect_or_prompt(self, method, err_msg=None): """Try to connect/start profile session and prompt for password if needed @param method(callable): bridge method to either connect or start profile session @@ -907,7 +907,7 @@ """ # FIXME: need better exit codes - self.profile = await self.bridge.profileNameGet(self.args.profile) + self.profile = await self.bridge.profile_name_get(self.args.profile) if not self.profile: log.error( @@ -922,12 +922,12 @@ pass else: if start_session: - await self.connectOrPrompt( - lambda pwd: self.bridge.profileStartSession(pwd, self.profile), + await self.connect_or_prompt( + lambda pwd: self.bridge.profile_start_session(pwd, self.profile), err_msg="Can't start {profile}'s session: {e}" ) return - elif not await self.bridge.profileIsSessionStarted(self.profile): + elif not await self.bridge.profile_is_session_started(self.profile): if not self.args.connect: self.disp(_( "Session for [{profile}] is not started, please start it " @@ -945,13 +945,13 @@ # creation/deletion) return elif self.args.connect is True: # if connection is asked, we connect the profile - await self.connectOrPrompt( + await self.connect_or_prompt( lambda pwd: self.bridge.connect(self.profile, pwd, {}), err_msg = 'Can\'t connect profile "{profile!s}": {e}' ) return else: - if not await self.bridge.isConnected(self.profile): + if not await self.bridge.is_connected(self.profile): log.error( _("Profile [{profile}] is not connected, please connect it " "before using jp, or use --connect option") @@ -966,14 +966,14 @@ _jid = JID(param_jid) if not _jid.resource: #if the resource is not given, we try to add the main resource - main_resource = await self.bridge.getMainResource(param_jid, self.profile) + main_resource = await self.bridge.main_resource_get(param_jid, self.profile) if main_resource: return f"{_jid.bare}/{main_resource}" return param_jid async def get_profile_jid(self): """Retrieve current profile bare JID if possible""" - full_jid = await self.bridge.asyncGetParamA( + full_jid = await self.bridge.param_get_a_async( "JabberID", "Connection", profile_key=self.profile ) return full_jid.rsplit("/", 1)[0] @@ -1054,7 +1054,7 @@ assert use_output in C.OUTPUT_TYPES self._output_type = use_output output_parent = argparse.ArgumentParser(add_help=False) - choices = set(self.host.getOutputChoices(use_output)) + choices = set(self.host.get_output_choices(use_output)) choices.update(extra_outputs) if not choices: raise exceptions.InternalError( @@ -1125,14 +1125,14 @@ async def set_progress_id(self, progress_id): return await 
self.host.set_progress_id(progress_id) - async def progressStartedHandler(self, uid, metadata, profile): + async def progress_started_handler(self, uid, metadata, profile): if profile != self.profile: return if self.progress_id is None: # the progress started message can be received before the id - # so we keep progressStarted signals in cache to replay they + # so we keep progress_started signals in cache to replay they # when the progress_id is received - cache_data = (self.progressStartedHandler, uid, metadata, profile) + cache_data = (self.progress_started_handler, uid, metadata, profile) try: cache = self.host.progress_ids_cache except AttributeError: @@ -1140,14 +1140,14 @@ cache.append(cache_data) else: if self.host.watch_progress and uid == self.progress_id: - await self.onProgressStarted(metadata) + await self.on_progress_started(metadata) while True: await asyncio.sleep(PROGRESS_DELAY) - cont = await self.progressUpdate() + cont = await self.progress_update() if not cont: break - async def progressFinishedHandler(self, uid, metadata, profile): + async def progress_finished_handler(self, uid, metadata, profile): if profile != self.profile: return if uid == self.progress_id: @@ -1155,26 +1155,26 @@ self.host.pbar.finish() except AttributeError: pass - await self.onProgressFinished(metadata) + await self.on_progress_finished(metadata) if self.host.quit_on_progress_end: - self.host.quitFromSignal() + self.host.quit_from_signal() - async def progressErrorHandler(self, uid, message, profile): + async def progress_error_handler(self, uid, message, profile): if profile != self.profile: return if uid == self.progress_id: if self.args.progress: self.disp('') # progress is not finished, so we skip a line if self.host.quit_on_progress_end: - await self.onProgressError(message) - self.host.quitFromSignal(C.EXIT_ERROR) + await self.on_progress_error(message) + self.host.quit_from_signal(C.EXIT_ERROR) - async def progressUpdate(self): + async def progress_update(self): """This method is continualy called to update the progress bar @return (bool): False to stop being called """ - data = await self.host.bridge.progressGet(self.progress_id, self.profile) + data = await self.host.bridge.progress_get(self.progress_id, self.profile) if data: try: size = data['size'] @@ -1214,38 +1214,38 @@ elif self.host.pbar is not None: return False - await self.onProgressUpdate(data) + await self.on_progress_update(data) return True - async def onProgressStarted(self, metadata): + async def on_progress_started(self, metadata): """Called when progress has just started can be overidden by a command - @param metadata(dict): metadata as sent by bridge.progressStarted + @param metadata(dict): metadata as sent by bridge.progress_started """ self.disp(_("Operation started"), 2) - async def onProgressUpdate(self, metadata): + async def on_progress_update(self, metadata): """Method called on each progress updata can be overidden by a command to handle progress metadata - @para metadata(dict): metadata as returned by bridge.progressGet + @para metadata(dict): metadata as returned by bridge.progress_get """ pass - async def onProgressFinished(self, metadata): + async def on_progress_finished(self, metadata): """Called when progress has just finished can be overidden by a command - @param metadata(dict): metadata as sent by bridge.progressFinished + @param metadata(dict): metadata as sent by bridge.progress_finished """ self.disp(_("Operation successfully finished"), 2) - async def onProgressError(self, e): + async def 
on_progress_error(self, e): """Called when a progress failed - @param error_msg(unicode): error message as sent by bridge.progressError + @param error_msg(unicode): error message as sent by bridge.progress_error """ self.disp(_("Error while doing operation: {e}").format(e=e), error=True) @@ -1260,7 +1260,7 @@ _('trying to use output when use_output has not been set')) return self.host.output(output_type, self.args.output, self.extra_outputs, data) - def getPubsubExtra(self, extra: Optional[dict] = None) -> str: + def get_pubsub_extra(self, extra: Optional[dict] = None) -> str: """Helper method to compute extra data from pubsub arguments @param extra: base extra dict, or None to generate a new one @@ -1328,7 +1328,7 @@ for cls in subcommands: cls(self) - def overridePubsubFlags(self, new_flags: Set[str]) -> None: + def override_pubsub_flags(self, new_flags: Set[str]) -> None: """Replace pubsub_flags given in __init__ useful when a command is extending an other command (e.g. blog command which does @@ -1356,11 +1356,11 @@ # we need to register the following signal even if we don't display the # progress bar self.host.bridge.register_signal( - "progressStarted", self.progressStartedHandler) + "progress_started", self.progress_started_handler) self.host.bridge.register_signal( - "progressFinished", self.progressFinishedHandler) + "progress_finished", self.progress_finished_handler) self.host.bridge.register_signal( - "progressError", self.progressErrorHandler) + "progress_error", self.progress_error_handler) if self.need_connect is not None: await self.host.connect_profile() @@ -1387,7 +1387,7 @@ def __init__(self, *args, **kwargs): super(CommandAnswering, self).__init__(*args, **kwargs) - async def onActionNew(self, action_data, action_id, security_limit, profile): + async def on_action_new(self, action_data, action_id, security_limit, profile): if profile != self.profile: return try: @@ -1398,7 +1398,7 @@ except KeyError: pass else: - self.onXMLUI(xml_ui) + self.on_xmlui(xml_ui) else: try: callback = self.action_callbacks[action_type] @@ -1407,7 +1407,7 @@ else: await callback(action_data, action_id, security_limit, profile) - def onXMLUI(self, xml_ui): + def on_xmlui(self, xml_ui): """Display a dialog received from the backend. @param xml_ui (unicode): dialog XML representation @@ -1422,7 +1422,7 @@ async def start_answering(self): """Auto reply to confirmation requests""" - self.host.bridge.register_signal("actionNew", self.onActionNew) - actions = await self.host.bridge.actionsGet(self.profile) + self.host.bridge.register_signal("action_new", self.on_action_new) + actions = await self.host.bridge.actions_get(self.profile) for action_data, action_id, security_limit in actions: - await self.onActionNew(action_data, action_id, security_limit, self.profile) + await self.on_action_new(action_data, action_id, security_limit, self.profile)
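The handler renames in ``base.py`` mirror the renamed signals and methods on the bridge side: ``progressStarted``/``progressFinished``/``progressError`` become ``progress_started``/``progress_finished``/``progress_error``, and session handling now goes through ``profile_name_get``, ``profile_is_session_started``, ``profile_start_session`` and ``is_connected``. A condensed, hedged sketch of that session logic follows; the helper name and control flow are illustrative, only the bridge calls are taken from the hunks above::

    async def ensure_profile_session(bridge, profile_key="@DEFAULT@", password=""):
        """Hedged sketch of jp-like session handling with the renamed bridge API."""
        profile = await bridge.profile_name_get(profile_key)        # was profileNameGet
        if not await bridge.profile_is_session_started(profile):    # was profileIsSessionStarted
            await bridge.profile_start_session(password, profile)   # was profileStartSession
        if not await bridge.is_connected(profile):                   # was isConnected
            await bridge.connect(profile, password, {})              # connect itself is unchanged
        return profile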
--- a/sat_frontends/jp/cmd_account.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_account.py Sat Apr 08 13:54:42 2023 +0200 @@ -80,7 +80,7 @@ async def start(self): try: - await self.host.bridge.inBandAccountNew( + await self.host.bridge.in_band_account_new( self.args.jid, self.args.password, self.args.email, @@ -112,7 +112,7 @@ self.disp(_("creating profile"), 2) try: - await self.host.bridge.profileCreate( + await self.host.bridge.profile_create( self.args.profile, self.args.password, "", @@ -141,7 +141,7 @@ self.disp(_("profile created"), 1) try: - await self.host.bridge.profileStartSession( + await self.host.bridge.profile_start_session( self.args.password, self.args.profile, ) @@ -150,7 +150,7 @@ self.host.quit(C.EXIT_BRIDGE_ERRBACK) try: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "JabberID", self.args.jid, "Connection", @@ -161,7 +161,7 @@ self.host.quit(C.EXIT_BRIDGE_ERRBACK) try: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "Password", self.args.password, "Connection", @@ -190,7 +190,7 @@ async def start(self): try: - await self.host.bridge.inBandPasswordChange( + await self.host.bridge.in_band_password_change( self.args.password, self.args.profile, ) @@ -217,7 +217,7 @@ async def start(self): try: - jid_str = await self.host.bridge.asyncGetParamA( + jid_str = await self.host.bridge.param_get_a_async( "JabberID", "Connection", profile_key=self.profile, @@ -233,10 +233,10 @@ f"This is the XMPP account of profile {self.profile!r}\n" f"Are you sure that you want to delete this account?" ) - await self.host.confirmOrQuit(message, _("Account deletion cancelled")) + await self.host.confirm_or_quit(message, _("Account deletion cancelled")) try: - await self.host.bridge.inBandUnregister(jid_.domain, self.args.profile) + await self.host.bridge.in_band_unregister(jid_.domain, self.args.profile) except Exception as e: self.disp(f"can't delete XMPP account with jid {jid_!r}: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK)
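Outside of jp, the equivalent profile bootstrap done by the account command would look roughly like this; the helper name is illustrative, while the bridge calls and their signatures come from the hunks above and from the bridge frontend earlier in this changeset::

    async def init_new_profile(bridge, profile, jid_s, password):
        """Hedged sketch of the renamed profile-setup sequence."""
        await bridge.profile_create(profile, password, "")       # was profileCreate
        await bridge.profile_start_session(password, profile)    # was profileStartSession
        # parameters now use param_set / param_get_a_async (were setParam / asyncGetParamA)
        await bridge.param_set("JabberID", jid_s, "Connection", profile_key=profile)
        await bridge.param_set("Password", password, "Connection", profile_key=profile)
        return await bridge.param_get_a_async("JabberID", "Connection", profile_key=profile)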
--- a/sat_frontends/jp/cmd_adhoc.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_adhoc.py Sat Apr 08 13:54:42 2023 +0200 @@ -75,7 +75,7 @@ if self.args.loop: flags.append(FLAG_LOOP) try: - bus_name, methods = await self.host.bridge.adHocDBusAddAuto( + bus_name, methods = await self.host.bridge.ad_hoc_dbus_add_auto( name, list(jids), self.args.groups, @@ -145,7 +145,7 @@ async def start(self): try: - xmlui_raw = await self.host.bridge.adHocRun( + xmlui_raw = await self.host.bridge.ad_hoc_run( self.args.jid, self.args.node, self.profile, @@ -159,7 +159,7 @@ await xmlui.show(workflow) if not workflow: if xmlui.type == "form": - await xmlui.submitForm() + await xmlui.submit_form() self.host.quit() @@ -181,7 +181,7 @@ async def start(self): try: - xmlui_raw = await self.host.bridge.adHocList( + xmlui_raw = await self.host.bridge.ad_hoc_list( self.args.jid, self.profile, )
--- a/sat_frontends/jp/cmd_application.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_application.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,7 +55,7 @@ self.args.filters = ['available'] try: - found_apps = await self.host.bridge.applicationsList(self.args.filters) + found_apps = await self.host.bridge.applications_list(self.args.filters) except Exception as e: self.disp(f"can't get applications list: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -80,7 +80,7 @@ async def start(self): try: - await self.host.bridge.applicationStart( + await self.host.bridge.application_start( self.args.name, "", ) @@ -117,7 +117,7 @@ args = [self.args.name, "name"] else: args = [self.args.id, "instance"] - await self.host.bridge.applicationStop( + await self.host.bridge.application_stop( *args, "", ) @@ -161,7 +161,7 @@ args = [self.args.name, "name"] else: args = [self.args.id, "instance"] - exposed_data_raw = await self.host.bridge.applicationExposedGet( + exposed_data_raw = await self.host.bridge.application_exposed_get( *args, "", )
--- a/sat_frontends/jp/cmd_avatar.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_avatar.py Sat Apr 08 13:54:42 2023 +0200 @@ -47,9 +47,9 @@ ) self.parser.add_argument("jid", nargs='?', default='', help=_("entity")) - async def showImage(self, path): - sat_conf = config.parseMainConf() - cmd = config.getConfig(sat_conf, C.CONFIG_SECTION, "image_cmd") + async def show_image(self, path): + sat_conf = config.parse_main_conf() + cmd = config.config_get(sat_conf, C.CONFIG_SECTION, "image_cmd") cmds = [cmd] + DISPLAY_CMD if cmd else DISPLAY_CMD for cmd in cmds: try: @@ -72,7 +72,7 @@ async def start(self): try: - avatar_data_raw = await self.host.bridge.avatarGet( + avatar_data_raw = await self.host.bridge.avatar_get( self.args.jid, not self.args.no_cache, self.profile, @@ -91,7 +91,7 @@ self.disp(avatar_path) if self.args.show: - await self.showImage(avatar_path) + await self.show_image(avatar_path) self.host.quit() @@ -117,7 +117,7 @@ self.host.quit(C.EXIT_BAD_ARG) path = os.path.abspath(path) try: - await self.host.bridge.avatarSet(path, self.args.jid, self.profile) + await self.host.bridge.avatar_set(path, self.args.jid, self.profile) except Exception as e: self.disp(f"can't set avatar: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK)
--- a/sat_frontends/jp/cmd_blocking.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_blocking.py Sat Apr 08 13:54:42 2023 +0200 @@ -43,7 +43,7 @@ async def start(self): try: - blocked_jids = await self.host.bridge.blockingList( + blocked_jids = await self.host.bridge.blocking_list( self.profile, ) except Exception as e: @@ -72,7 +72,7 @@ async def start(self): try: - await self.host.bridge.blockingBlock( + await self.host.bridge.blocking_block( self.args.entities, self.profile ) @@ -108,7 +108,7 @@ async def start(self): if self.args.entities == ["all"]: if not self.args.force: - await self.host.confirmOrQuit( + await self.host.confirm_or_quit( _("All entities will be unblocked, are you sure"), _("unblock cancelled") ) @@ -117,7 +117,7 @@ self.parser.error(_('--force is only allowed when "all" is used as target')) try: - await self.host.bridge.blockingUnblock( + await self.host.bridge.blocking_unblock( self.args.entities, self.profile )
--- a/sat_frontends/jp/cmd_blog.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_blog.py Sat Apr 08 13:54:42 2023 +0200 @@ -92,7 +92,7 @@ ALLOWER_ATTACH_MD_KEY = ("desc", "media_type", "external") -async def guessSyntaxFromPath(host, sat_conf, path): +async def guess_syntax_from_path(host, sat_conf, path): """Return syntax guessed according to filename extension @param sat_conf(ConfigParser.ConfigParser): instance opened on sat configuration @@ -107,7 +107,7 @@ return k # if not found, we use current syntax - return await host.bridge.getParamA("Syntax", "Composition", "value", host.profile) + return await host.bridge.param_get_a("Syntax", "Composition", "value", host.profile) class BlogPublishCommon: @@ -121,13 +121,13 @@ """ if self.args.syntax is None: self.default_syntax_used = True - return await self.host.bridge.getParamA( + return await self.host.bridge.param_get_a( "Syntax", "Composition", "value", self.profile ) else: self.default_syntax_used = False try: - syntax = await self.host.bridge.syntaxGet(self.args.syntax) + syntax = await self.host.bridge.syntax_get(self.args.syntax) self.current_syntax = self.args.syntax = syntax except Exception as e: if e.classname == "NotFound": @@ -215,7 +215,7 @@ elif self.current_syntax == SYNTAX_XHTML: mb_data["content_xhtml"] = content else: - mb_data["content_xhtml"] = await self.host.bridge.syntaxConvert( + mb_data["content_xhtml"] = await self.host.bridge.syntax_convert( content, self.current_syntax, SYNTAX_XHTML, False, self.profile ) @@ -311,7 +311,7 @@ await self.set_mb_data_content(content, mb_data) try: - item_id = await self.host.bridge.mbSend( + item_id = await self.host.bridge.mb_send( self.args.service, self.args.node, data_format.serialise(mb_data), @@ -381,7 +381,7 @@ return common.format_time(item["published"]) def format_url(self, item, keys): - return uri.buildXMPPUri( + return uri.build_xmpp_uri( "pubsub", subtype="microblog", path=self.metadata["service"], @@ -509,12 +509,12 @@ async def start(self): try: mb_data = data_format.deserialise( - await self.host.bridge.mbGet( + await self.host.bridge.mb_get( self.args.service, self.args.node, self.args.max, self.args.items, - self.getPubsubExtra(), + self.get_pubsub_extra(), self.profile, ) ) @@ -556,7 +556,7 @@ help=_('add "publish: False" to metadata'), ) - def buildMetadataFile(self, content_file_path, mb_data=None): + def build_metadata_file(self, content_file_path, mb_data=None): """Build a metadata file using json The file is named after content_file_path, with extension replaced by @@ -620,7 +620,7 @@ async def edit(self, content_file_path, content_file_obj, mb_data=None): """Edit the file contening the content using editor, and publish it""" # we first create metadata file - meta_ori, meta_file_path = self.buildMetadataFile(content_file_path, mb_data) + meta_ori, meta_file_path = self.build_metadata_file(content_file_path, mb_data) coroutines = [] @@ -646,7 +646,7 @@ # we launch editor coroutines.append( - self.runEditor( + self.run_editor( "blog_editor_args", content_file_path, content_file_obj, @@ -665,20 +665,20 @@ mb_data = data_format.serialise(mb_data) - await self.host.bridge.mbSend( + await self.host.bridge.mb_send( self.pubsub_service, self.pubsub_node, mb_data, self.profile ) self.disp("Blog item published") - def getTmpSuff(self): + def get_tmp_suff(self): # we get current syntax to determine file extension return SYNTAX_EXT.get(self.current_syntax, SYNTAX_EXT[""]) - async def getItemData(self, service, node, item): + async def get_item_data(self, 
service, node, item): items = [item] if item else [] mb_data = data_format.deserialise( - await self.host.bridge.mbGet( + await self.host.bridge.mb_get( service, node, 1, items, data_format.serialise({}), self.profile ) ) @@ -689,12 +689,12 @@ except KeyError: content = item["content"] if content: - content = await self.host.bridge.syntaxConvert( + content = await self.host.bridge.syntax_convert( content, "text", SYNTAX_XHTML, False, self.profile ) if content and self.current_syntax != SYNTAX_XHTML: - content = await self.host.bridge.syntaxConvert( + content = await self.host.bridge.syntax_convert( content, SYNTAX_XHTML, self.current_syntax, False, self.profile ) @@ -716,7 +716,7 @@ async def start(self): # if there are user defined extension, we use them SYNTAX_EXT.update( - config.getConfig(self.sat_conf, C.CONFIG_SECTION, CONF_SYNTAX_EXT, {}) + config.config_get(self.sat_conf, C.CONFIG_SECTION, CONF_SYNTAX_EXT, {}) ) self.current_syntax = await self.get_current_syntax() @@ -727,7 +727,7 @@ content_file_path, content_file_obj, mb_data, - ) = await self.getItemPath() + ) = await self.get_item_path() await self.edit(content_file_path, content_file_obj, mb_data=mb_data) self.host.quit() @@ -749,7 +749,7 @@ async def start(self): try: - await self.host.bridge.mbRename( + await self.host.bridge.mb_rename( self.args.service, self.args.node, self.args.item, @@ -779,7 +779,7 @@ async def start(self): try: - repeat_id = await self.host.bridge.mbRepeat( + repeat_id = await self.host.bridge.mb_repeat( self.args.service, self.args.node, self.args.item, @@ -821,13 +821,13 @@ help=_("path to the content file"), ) - async def showPreview(self): - # we implement showPreview here so we don't have to import webbrowser and urllib + async def show_preview(self): + # we implement show_preview here so we don't have to import webbrowser and urllib # when preview is not used url = "file:{}".format(self.urllib.parse.quote(self.preview_file_path)) self.webbrowser.open_new_tab(url) - async def _launchPreviewExt(self, cmd_line, opt_name): + async def _launch_preview_ext(self, cmd_line, opt_name): url = "file:{}".format(self.urllib.parse.quote(self.preview_file_path)) args = common.parse_args( self.host, cmd_line, url=url, preview_file=self.preview_file_path @@ -840,19 +840,19 @@ self.host.quit(1) subprocess.Popen(args) - async def openPreviewExt(self): - await self._launchPreviewExt(self.open_cb_cmd, "blog_preview_open_cmd") + async def open_preview_ext(self): + await self._launch_preview_ext(self.open_cb_cmd, "blog_preview_open_cmd") - async def updatePreviewExt(self): - await self._launchPreviewExt(self.update_cb_cmd, "blog_preview_update_cmd") + async def update_preview_ext(self): + await self._launch_preview_ext(self.update_cb_cmd, "blog_preview_update_cmd") - async def updateContent(self): + async def update_content(self): with self.content_file_path.open("rb") as f: content = f.read().decode("utf-8-sig") if content and self.syntax != SYNTAX_XHTML: # we use safe=True because we want to have a preview as close as possible # to what the people will see - content = await self.host.bridge.syntaxConvert( + content = await self.host.bridge.syntax_convert( content, self.syntax, SYNTAX_XHTML, True, self.profile ) @@ -896,30 +896,30 @@ sat_conf = self.sat_conf SYNTAX_EXT.update( - config.getConfig(sat_conf, C.CONFIG_SECTION, CONF_SYNTAX_EXT, {}) + config.config_get(sat_conf, C.CONFIG_SECTION, CONF_SYNTAX_EXT, {}) ) try: - self.open_cb_cmd = config.getConfig( + self.open_cb_cmd = config.config_get( sat_conf, 
C.CONFIG_SECTION, "blog_preview_open_cmd", Exception ) except (NoOptionError, NoSectionError): self.open_cb_cmd = None - open_cb = self.showPreview + open_cb = self.show_preview else: - open_cb = self.openPreviewExt + open_cb = self.open_preview_ext - self.update_cb_cmd = config.getConfig( + self.update_cb_cmd = config.config_get( sat_conf, C.CONFIG_SECTION, "blog_preview_update_cmd", self.open_cb_cmd ) if self.update_cb_cmd is None: - update_cb = self.showPreview + update_cb = self.show_preview else: - update_cb = self.updatePreviewExt + update_cb = self.update_preview_ext # which file do we need to edit? if self.args.file == "current": - self.content_file_path = self.getCurrentFile(self.profile) + self.content_file_path = self.get_current_file(self.profile) else: try: self.content_file_path = Path(self.args.file).resolve(strict=True) @@ -927,7 +927,7 @@ self.disp(_('File "{file}" doesn\'t exist!').format(file=self.args.file)) self.host.quit(C.EXIT_NOT_FOUND) - self.syntax = await guessSyntaxFromPath( + self.syntax = await guess_syntax_from_path( self.host, sat_conf, self.content_file_path ) @@ -935,7 +935,7 @@ preview_file = tempfile.NamedTemporaryFile(suffix=".xhtml", delete=False) self.preview_file_path = preview_file.name preview_file.close() - await self.updateContent() + await self.update_content() if aionotify is None: # XXX: we don't delete file automatically because browser needs it @@ -989,7 +989,7 @@ # as a workaround, we do a little rest and try again await asyncio.sleep(1) await watcher.setup(loop) - await self.updateContent() + await self.update_content() await update_cb() except FileNotFoundError: self.disp("The file seems to have been deleted.", error=True) @@ -1007,7 +1007,7 @@ class Import(base.CommandBase): def __init__(self, host): - super(Import, self).__init__( + super().__init__( host, "import", use_pubsub=True, @@ -1054,10 +1054,10 @@ ), ) - async def onProgressStarted(self, metadata): + async def on_progress_started(self, metadata): self.disp(_("Blog upload started"), 2) - async def onProgressFinished(self, metadata): + async def on_progress_finished(self, metadata): self.disp(_("Blog uploaded successfully"), 2) redirections = { k[len(URL_REDIRECT_PREFIX) :]: v @@ -1085,7 +1085,7 @@ ).format(conf=conf) ) - async def onProgressError(self, error_msg): + async def on_progress_error(self, error_msg): self.disp( _("Error while uploading blog: {error_msg}").format(error_msg=error_msg), error=True, @@ -1163,12 +1163,12 @@ def __init__(self, host): super().__init__(host) - self.overridePubsubFlags({C.SERVICE, C.SINGLE_ITEM}) + self.override_pubsub_flags({C.SERVICE, C.SINGLE_ITEM}) async def start(self): if not self.args.node: - namespaces = await self.host.bridge.namespacesGet() + namespaces = await self.host.bridge.namespaces_get() try: ns_microblog = namespaces["microblog"] except KeyError: @@ -1183,11 +1183,11 @@ def __init__(self, host): super().__init__(host) - self.overridePubsubFlags({C.SERVICE, C.SINGLE_ITEM}) + self.override_pubsub_flags({C.SERVICE, C.SINGLE_ITEM}) async def start(self): if not self.args.node: - namespaces = await self.host.bridge.namespacesGet() + namespaces = await self.host.bridge.namespaces_get() try: ns_microblog = namespaces["microblog"] except KeyError:
--- a/sat_frontends/jp/cmd_bookmarks.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_bookmarks.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,7 +55,7 @@ async def start(self): try: - data = await self.host.bridge.bookmarksList( + data = await self.host.bridge.bookmarks_list( self.args.type, self.args.location, self.host.profile ) except Exception as e: @@ -107,10 +107,10 @@ async def start(self): if not self.args.force: - await self.host.confirmOrQuit(_("Are you sure to delete this bookmark?")) + await self.host.confirm_or_quit(_("Are you sure to delete this bookmark?")) try: - await self.host.bridge.bookmarksRemove( + await self.host.bridge.bookmarks_remove( self.args.type, self.args.bookmark, self.args.location, self.host.profile ) except Exception as e: @@ -151,7 +151,7 @@ if self.args.name is not None: data["name"] = self.args.name try: - await self.host.bridge.bookmarksAdd( + await self.host.bridge.bookmarks_add( self.args.type, self.args.bookmark, data,
--- a/sat_frontends/jp/cmd_debug.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_debug.py Sat Apr 08 13:54:42 2023 +0200 @@ -28,7 +28,7 @@ class BridgeCommon(object): - def evalArgs(self): + def eval_args(self): if self.args.arg: try: return eval("[{}]".format(",".join(self.args.arg))) @@ -67,7 +67,7 @@ elif "profile" in argspec.args: kwargs["profile"] = self.profile - args = self.evalArgs() + args = self.eval_args() try: ret = await method( @@ -100,12 +100,12 @@ self.parser.add_argument("arg", nargs="*", help=_("argument of the signal")) async def start(self): - args = self.evalArgs() + args = self.eval_args() json_args = json.dumps(args) # XXX: we use self.args.profile and not self.profile # because we want the raw profile_key (so plugin handle C.PROF_KEY_NONE) try: - await self.host.bridge.debugFakeSignal( + await self.host.bridge.debug_signal_fake( self.args.signal, json_args, self.args.profile ) except Exception as e: @@ -115,11 +115,11 @@ self.host.quit() -class Bridge(base.CommandBase): +class bridge(base.CommandBase): subcommands = (Method, Signal) def __init__(self, host): - super(Bridge, self).__init__( + super(bridge, self).__init__( host, "bridge", use_profile=False, help=_("bridge s(t)imulation") ) @@ -144,7 +144,7 @@ help=_("stream direction filter"), ) - async def printXML(self, direction, xml_data, profile): + async def print_xml(self, direction, xml_data, profile): if self.args.direction == "in" and direction != "IN": return if self.args.direction == "out" and direction != "OUT": @@ -186,7 +186,7 @@ self.disp("") async def start(self): - self.host.bridge.register_signal("xmlLog", self.printXML, "plugin") + self.host.bridge.register_signal("xml_log", self.print_xml, "plugin") class Theme(base.CommandBase): @@ -220,7 +220,7 @@ class Debug(base.CommandBase): - subcommands = (Bridge, Monitor, Theme) + subcommands = (bridge, Monitor, Theme) def __init__(self, host): super(Debug, self).__init__(
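The eval_args helper renamed above is what lets the debug bridge commands take arbitrary Python literals as positional arguments: the arguments are joined with commas and evaluated as a single Python list, so string arguments need Python quotes of their own on top of shell quoting. A standalone rework of that logic, for illustration only (error reporting and the empty-argument case are handled by the real BridgeCommon class):

    def eval_args(raw_args):
        # mirrors the eval("[{}]".format(",".join(...))) call shown in the hunk above
        if not raw_args:
            return []  # empty-argument handling assumed here
        return eval("[{}]".format(",".join(raw_args)))

    print(eval_args(['"some_profile"', '42', '{"answer": True}']))
    # ['some_profile', 42, {'answer': True}]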
--- a/sat_frontends/jp/cmd_encryption.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_encryption.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,7 +49,7 @@ async def start(self): try: - plugins_ser = await self.host.bridge.encryptionPluginsGet() + plugins_ser = await self.host.bridge.encryption_plugins_get() plugins = data_format.deserialise(plugins_ser, type_check=list) except Exception as e: self.disp(f"can't retrieve plugins: {e}", error=True) @@ -77,7 +77,7 @@ jids = await self.host.check_jids([self.args.jid]) jid = jids[0] try: - serialised = await self.host.bridge.messageEncryptionGet(jid, self.profile) + serialised = await self.host.bridge.message_encryption_get(jid, self.profile) except Exception as e: self.disp(f"can't get session: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -117,7 +117,7 @@ async def start(self): if self.args.name is not None: try: - namespace = await self.host.bridge.encryptionNamespaceGet(self.args.name) + namespace = await self.host.bridge.encryption_namespace_get(self.args.name) except Exception as e: self.disp(f"can't get encryption namespace: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -130,7 +130,7 @@ jid = jids[0] try: - await self.host.bridge.messageEncryptionStart( + await self.host.bridge.message_encryption_start( jid, namespace, not self.args.encrypt_noreplace, self.profile) except Exception as e: @@ -157,7 +157,7 @@ jids = await self.host.check_jids([self.args.jid]) jid = jids[0] try: - await self.host.bridge.messageEncryptionStop(jid, self.profile) + await self.host.bridge.message_encryption_stop(jid, self.profile) except Exception as e: self.disp(f"can't end encrypted session: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -187,7 +187,7 @@ async def start(self): if self.args.name is not None: try: - namespace = await self.host.bridge.encryptionNamespaceGet(self.args.name) + namespace = await self.host.bridge.encryption_namespace_get(self.args.name) except Exception as e: self.disp(f"can't get encryption namespace: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -200,7 +200,7 @@ jid = jids[0] try: - xmlui_raw = await self.host.bridge.encryptionTrustUIGet( + xmlui_raw = await self.host.bridge.encryption_trust_ui_get( jid, namespace, self.profile) except Exception as e: self.disp(f"can't get encryption session trust UI: {e}", error=True) @@ -209,7 +209,7 @@ xmlui = xmlui_manager.create(self.host, xmlui_raw) await xmlui.show() if xmlui.type != C.XMLUI_DIALOG: - await xmlui.submitForm() + await xmlui.submit_form() self.host.quit() class EncryptionTrust(base.CommandBase):
--- a/sat_frontends/jp/cmd_event.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_event.py Sat Apr 08 13:54:42 2023 +0200 @@ -62,11 +62,11 @@ async def start(self): try: - events_data_s = await self.host.bridge.eventsGet( + events_data_s = await self.host.bridge.events_get( self.args.service, self.args.node, self.args.items, - self.getPubsubExtra(), + self.get_pubsub_extra(), self.profile, ) except Exception as e: @@ -325,7 +325,7 @@ if self.args.end is None: self.parser.error("--end or --duration must be set") try: - await self.host.bridge.eventCreate( + await self.host.bridge.event_create( data_format.serialise(event_data), self.args.id, self.args.node, @@ -359,7 +359,7 @@ async def start(self): event_data = self.get_event_data() try: - await self.host.bridge.eventModify( + await self.host.bridge.event_modify( data_format.serialise(event_data), self.args.item, self.args.service, @@ -394,7 +394,7 @@ async def start(self): try: - event_data_s = await self.host.bridge.eventInviteeGet( + event_data_s = await self.host.bridge.event_invitee_get( self.args.service, self.args.node, self.args.item, @@ -436,7 +436,7 @@ # TODO: handle RSVP with XMLUI in a similar way as for `ad-hoc run` fields = dict(self.args.fields) if self.args.fields else {} try: - self.host.bridge.eventInviteeSet( + self.host.bridge.event_invitee_set( self.args.service, self.args.node, self.args.item, @@ -522,7 +522,7 @@ show_table = OUTPUT_OPT_TABLE in self.args.output_opts - table = common.Table.fromListDict( + table = common.Table.from_list_dict( self.host, data, ("nick",) + (("jid",) if self.host.verbosity else ()) + ("attend", "guests"), @@ -596,7 +596,7 @@ else: # we get prefilled data with all people try: - affiliations = await self.host.bridge.psNodeAffiliationsGet( + affiliations = await self.host.bridge.ps_node_affiliations_get( self.args.service, self.args.node, self.profile, @@ -615,7 +615,7 @@ } try: - event_data = await self.host.bridge.eventInviteesList( + event_data = await self.host.bridge.event_invitees_list( self.args.service, self.args.node, self.profile, @@ -647,7 +647,7 @@ # we get nicknames for everybody, make it easier for organisers for jid_, data in prefilled.items(): - id_data = await self.host.bridge.identityGet(jid_, [], True, self.profile) + id_data = await self.host.bridge.identity_get(jid_, [], True, self.profile) id_data = data_format.deserialise(id_data) data["nick"] = id_data["nicknames"][0] @@ -716,7 +716,7 @@ emails_extra = self.args.email[1:] try: - await self.host.bridge.eventInviteByEmail( + await self.host.bridge.event_invite_by_email( self.args.service, self.args.node, self.args.item,
--- a/sat_frontends/jp/cmd_file.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_file.py Sat Apr 08 13:54:42 2023 +0200 @@ -82,19 +82,19 @@ help=_("end-to-end encrypt the file transfer") ) - async def onProgressStarted(self, metadata): + async def on_progress_started(self, metadata): self.disp(_("File copy started"), 2) - async def onProgressFinished(self, metadata): + async def on_progress_finished(self, metadata): self.disp(_("File sent successfully"), 2) - async def onProgressError(self, error_msg): + async def on_progress_error(self, error_msg): if error_msg == C.PROGRESS_ERROR_DECLINED: self.disp(_("The file has been refused by your contact")) else: self.disp(_("Error while sending file: {}").format(error_msg), error=True) - async def gotId(self, data, file_): + async def got_id(self, data, file_): """Called when a progress id has been received @param pid(unicode): progress id @@ -134,7 +134,7 @@ if self.args.bz2: with tempfile.NamedTemporaryFile("wb", delete=False) as buf: - self.host.addOnQuitCallback(os.unlink, buf.name) + self.host.add_on_quit_callback(os.unlink, buf.name) self.disp(_("bz2 is an experimental option, use with caution")) # FIXME: check free space self.disp(_("Starting compression, please wait...")) @@ -150,7 +150,7 @@ self.disp(_("Done !"), 1) try: - send_data = await self.host.bridge.fileSend( + send_data = await self.host.bridge.file_send( self.args.jid, buf.name, self.args.name or archive_name, @@ -162,12 +162,12 @@ self.disp(f"can't send file: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) else: - await self.gotId(send_data, file_) + await self.got_id(send_data, file_) else: for file_ in self.args.files: path = os.path.abspath(file_) try: - send_data = await self.host.bridge.fileSend( + send_data = await self.host.bridge.file_send( self.args.jid, path, self.args.name, @@ -179,7 +179,7 @@ self.disp(f"can't send file {file_!r}: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) else: - await self.gotId(send_data, file_) + await self.got_id(send_data, file_) class Request(base.CommandBase): @@ -241,13 +241,13 @@ help=_("overwrite existing file without confirmation"), ) - async def onProgressStarted(self, metadata): + async def on_progress_started(self, metadata): self.disp(_("File copy started"), 2) - async def onProgressFinished(self, metadata): + async def on_progress_finished(self, metadata): self.disp(_("File received successfully"), 2) - async def onProgressError(self, error_msg): + async def on_progress_error(self, error_msg): if error_msg == C.PROGRESS_ERROR_DECLINED: self.disp(_("The file request has been refused")) else: @@ -267,7 +267,7 @@ message = _("File {path} already exists! 
Do you want to overwrite?").format( path=path ) - await self.host.confirmOrQuit(message, _("file request cancelled")) + await self.host.confirm_or_quit(message, _("file request cancelled")) self.full_dest_jid = await self.host.get_full_jid(self.args.jid) extra = {} @@ -276,7 +276,7 @@ if self.args.namespace: extra["namespace"] = self.args.namespace try: - progress_id = await self.host.bridge.fileJingleRequest( + progress_id = await self.host.bridge.file_jingle_request( self.full_dest_jid, path, self.args.name, @@ -303,9 +303,9 @@ ) self._overwrite_refused = False # True when one overwrite as already been refused self.action_callbacks = { - C.META_TYPE_FILE: self.onFileAction, - C.META_TYPE_OVERWRITE: self.onOverwriteAction, - C.META_TYPE_NOT_IN_ROSTER_LEAK: self.onNotInRosterAction, + C.META_TYPE_FILE: self.on_file_action, + C.META_TYPE_OVERWRITE: self.on_overwrite_action, + C.META_TYPE_NOT_IN_ROSTER_LEAK: self.on_not_in_roster_action, } def add_parser_options(self): @@ -335,10 +335,10 @@ help=_("destination path (default: working directory)"), ) - async def onProgressStarted(self, metadata): + async def on_progress_started(self, metadata): self.disp(_("File copy started"), 2) - async def onProgressFinished(self, metadata): + async def on_progress_finished(self, metadata): self.disp(_("File received successfully"), 2) if metadata.get("hash_verified", False): try: @@ -350,10 +350,10 @@ else: self.disp(_("hash can't be verified"), 1) - async def onProgressError(self, e): + async def on_progress_error(self, e): self.disp(_("Error while receiving file: {e}").format(e=e), error=True) - def getXmluiId(self, action_data): + def get_xmlui_id(self, action_data): # FIXME: we temporarily use ElementTree, but a real XMLUI managing module # should be available in the futur # TODO: XMLUI module @@ -368,10 +368,10 @@ self.disp(_("Invalid XMLUI received"), error=True) return xmlui_id - async def onFileAction(self, action_data, action_id, security_limit, profile): - xmlui_id = self.getXmluiId(action_data) + async def on_file_action(self, action_data, action_id, security_limit, profile): + xmlui_id = self.get_xmlui_id(action_data) if xmlui_id is None: - return self.host.quitFromSignal(1) + return self.host.quit_from_signal(1) try: from_jid = jid.JID(action_data["meta_from_jid"]) except KeyError: @@ -386,18 +386,18 @@ if not self.bare_jids or from_jid.bare in self.bare_jids: if self._overwrite_refused: self.disp(_("File refused because overwrite is needed"), error=True) - await self.host.bridge.launchAction( + await self.host.bridge.action_launch( xmlui_id, {"cancelled": C.BOOL_TRUE}, profile_key=profile ) - return self.host.quitFromSignal(2) + return self.host.quit_from_signal(2) await self.set_progress_id(progress_id) xmlui_data = {"path": self.path} - await self.host.bridge.launchAction(xmlui_id, xmlui_data, profile_key=profile) + await self.host.bridge.action_launch(xmlui_id, xmlui_data, profile_key=profile) - async def onOverwriteAction(self, action_data, action_id, security_limit, profile): - xmlui_id = self.getXmluiId(action_data) + async def on_overwrite_action(self, action_data, action_id, security_limit, profile): + xmlui_id = self.get_xmlui_id(action_data) if xmlui_id is None: - return self.host.quitFromSignal(1) + return self.host.quit_from_signal(1) try: progress_id = action_data["meta_progress_id"] except KeyError: @@ -412,13 +412,13 @@ self.disp(_("Refused to overwrite"), 2) self._overwrite_refused = True - xmlui_data = {"answer": C.boolConst(self.args.force)} - await 
self.host.bridge.launchAction(xmlui_id, xmlui_data, profile_key=profile) + xmlui_data = {"answer": C.bool_const(self.args.force)} + await self.host.bridge.action_launch(xmlui_id, xmlui_data, profile_key=profile) - async def onNotInRosterAction(self, action_data, action_id, security_limit, profile): - xmlui_id = self.getXmluiId(action_data) + async def on_not_in_roster_action(self, action_data, action_id, security_limit, profile): + xmlui_id = self.get_xmlui_id(action_data) if xmlui_id is None: - return self.host.quitFromSignal(1) + return self.host.quit_from_signal(1) try: from_jid = jid.JID(action_data["meta_from_jid"]) except ValueError: @@ -442,11 +442,11 @@ xmlui = xmlui_manager.create(self.host, action_data["xmlui"]) confirmed = await self.host.confirm(xmlui.dlg.message) - xmlui_data = {"answer": C.boolConst(confirmed)} - await self.host.bridge.launchAction(xmlui_id, xmlui_data, profile_key=profile) + xmlui_data = {"answer": C.bool_const(confirmed)} + await self.host.bridge.action_launch(xmlui_id, xmlui_data, profile_key=profile) if not confirmed and not self.args.multiple: self.disp(_("Session refused for {from_jid}").format(from_jid=from_jid)) - self.host.quitFromSignal(0) + self.host.quit_from_signal(0) async def start(self): self.bare_jids = [jid.JID(jid_).bare for jid_ in self.args.jids] @@ -489,16 +489,16 @@ help=_("URI of the file to retrieve or JSON of the whole attachment") ) - async def onProgressStarted(self, metadata): + async def on_progress_started(self, metadata): self.disp(_("File download started"), 2) - async def onProgressFinished(self, metadata): + async def on_progress_finished(self, metadata): self.disp(_("File downloaded successfully"), 2) - async def onProgressError(self, error_msg): + async def on_progress_error(self, error_msg): self.disp(_("Error while downloading file: {}").format(error_msg), error=True) - async def gotId(self, data): + async def got_id(self, data): """Called when a progress id has been received""" try: await self.set_progress_id(data["progress"]) @@ -532,12 +532,12 @@ message = _("File {path} already exists! 
Do you want to overwrite?").format( path=dest_file ) - await self.host.confirmOrQuit(message, _("file download cancelled")) + await self.host.confirm_or_quit(message, _("file download cancelled")) options = {} try: - download_data_s = await self.host.bridge.fileDownload( + download_data_s = await self.host.bridge.file_download( data_format.serialise(attachment), str(dest_file), data_format.serialise(options), @@ -548,7 +548,7 @@ self.disp(f"error while trying to download a file: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) else: - await self.gotId(download_data) + await self.got_id(download_data) class Upload(base.CommandBase): @@ -576,10 +576,10 @@ help=_(r"ignore invalide TLS certificate (/!\ Dangerous /!\)"), ) - async def onProgressStarted(self, metadata): + async def on_progress_started(self, metadata): self.disp(_("File upload started"), 2) - async def onProgressFinished(self, metadata): + async def on_progress_finished(self, metadata): self.disp(_("File uploaded successfully"), 2) try: url = metadata["url"] @@ -590,10 +590,10 @@ # XXX: url is displayed alone on a line to make parsing easier self.disp(url) - async def onProgressError(self, error_msg): + async def on_progress_error(self, error_msg): self.disp(_("Error while uploading file: {}").format(error_msg), error=True) - async def gotId(self, data, file_): + async def got_id(self, data, file_): """Called when a progress id has been received @param pid(unicode): progress id @@ -633,7 +633,7 @@ path = os.path.abspath(file_) try: - upload_data = await self.host.bridge.fileUpload( + upload_data = await self.host.bridge.file_upload( path, "", self.full_dest_jid, @@ -644,7 +644,7 @@ self.disp(f"error while trying to upload a file: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) else: - await self.gotId(upload_data, file_) + await self.got_id(upload_data, file_) class ShareAffiliationsSet(base.CommandBase): @@ -687,7 +687,7 @@ async def start(self): affiliations = dict(self.args.affiliations) try: - affiliations = await self.host.bridge.FISAffiliationsSet( + affiliations = await self.host.bridge.fis_affiliations_set( self.args.jid, self.args.namespace, self.args.path, @@ -730,7 +730,7 @@ async def start(self): try: - affiliations = await self.host.bridge.FISAffiliationsGet( + affiliations = await self.host.bridge.fis_affiliations_get( self.args.jid, self.args.namespace, self.args.path, @@ -793,7 +793,7 @@ async def start(self): configuration = dict(self.args.fields) try: - configuration = await self.host.bridge.FISConfigurationSet( + configuration = await self.host.bridge.fis_configuration_set( self.args.jid, self.args.namespace, self.args.path, @@ -836,7 +836,7 @@ async def start(self): try: - configuration = await self.host.bridge.FISConfigurationGet( + configuration = await self.host.bridge.fis_configuration_get( self.args.jid, self.args.namespace, self.args.path, @@ -900,7 +900,7 @@ def _size_filter(self, size, row): if not size: return "" - return A.color(A.BOLD, utils.getHumanSize(size)) + return A.color(A.BOLD, utils.get_human_size(size)) def default_output(self, files_data): """display files a way similar to ls""" @@ -914,7 +914,7 @@ show_header = True keys = ("name", "type", "size", "file_hash") headers = ("name", "type", "size", "hash") - table = common.Table.fromListDict( + table = common.Table.from_list_dict( self.host, files_data, keys=keys, @@ -926,7 +926,7 @@ async def start(self): try: - files_data = await self.host.bridge.FISList( + files_data = await self.host.bridge.fis_list( self.args.jid, 
self.args.path, {}, @@ -987,7 +987,7 @@ else: access = {} try: - name = await self.host.bridge.FISSharePath( + name = await self.host.bridge.fis_share_path( self.args.name, self.path, json.dumps(access, ensure_ascii=False), @@ -1059,7 +1059,7 @@ else: extra["thumb_url"] = self.args.thumbnail try: - await self.host.bridge.FISInvite( + await self.host.bridge.fis_invite( self.args.jid, self.args.service, self.args.type,
--- a/sat_frontends/jp/cmd_forums.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_forums.py Sat Apr 08 13:54:42 2023 +0200 @@ -54,13 +54,13 @@ help=_("forum key (DEFAULT: default forums)"), ) - def getTmpSuff(self): + def get_tmp_suff(self): """return suffix used for content file""" return "json" async def publish(self, forums_raw): try: - await self.host.bridge.forumsSet( + await self.host.bridge.forums_set( forums_raw, self.args.service, self.args.node, @@ -76,7 +76,7 @@ async def start(self): try: - forums_json = await self.host.bridge.forumsGet( + forums_json = await self.host.bridge.forums_get( self.args.service, self.args.node, self.args.key, @@ -89,7 +89,7 @@ self.disp(f"can't get node configuration: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) - content_file_obj, content_file_path = self.getTmpFile() + content_file_obj, content_file_path = self.get_tmp_file() forums_json = forums_json.strip() if forums_json: # we loads and dumps to have pretty printed json @@ -98,7 +98,7 @@ f = codecs.getwriter("utf-8")(content_file_obj) json.dump(forums, f, ensure_ascii=False, indent=4) content_file_obj.seek(0) - await self.runEditor("forums_editor_args", content_file_path, content_file_obj) + await self.run_editor("forums_editor_args", content_file_path, content_file_obj) class Get(base.CommandBase): @@ -154,7 +154,7 @@ async def start(self): try: - forums_raw = await self.host.bridge.forumsGet( + forums_raw = await self.host.bridge.forums_get( self.args.service, self.args.node, self.args.key,
--- a/sat_frontends/jp/cmd_identity.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_identity.py Sat Apr 08 13:54:42 2023 +0200 @@ -48,7 +48,7 @@ async def start(self): jid_ = (await self.host.check_jids([self.args.jid]))[0] try: - data = await self.host.bridge.identityGet( + data = await self.host.bridge.identity_get( jid_, [], not self.args.no_cache, @@ -91,7 +91,7 @@ if not id_data: self.parser.error("At least one metadata must be set") try: - self.host.bridge.identitySet( + self.host.bridge.identity_set( data_format.serialise(id_data), self.profile, )
--- a/sat_frontends/jp/cmd_info.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_info.py Sat Apr 08 13:54:42 2023 +0200 @@ -230,7 +230,7 @@ # infos if infos_requested: try: - infos = await self.host.bridge.discoInfos( + infos = await self.host.bridge.disco_infos( jid, node=self.args.node, use_cache=self.args.use_cache, @@ -251,7 +251,7 @@ # items if items_requested: try: - items = await self.host.bridge.discoItems( + items = await self.host.bridge.disco_items( jid, node=self.args.node, use_cache=self.args.use_cache, @@ -298,7 +298,7 @@ jids = await self.host.check_jids([self.args.jid]) jid = jids[0] try: - data = await self.host.bridge.getSoftwareVersion(jid, self.host.profile) + data = await self.host.bridge.software_version_get(jid, self.host.profile) except Exception as e: self.disp(_("error while trying to get version: {e}").format(e=e), error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -340,7 +340,7 @@ async def start(self): try: - data = await self.host.bridge.sessionInfosGet(self.host.profile) + data = await self.host.bridge.session_infos_get(self.host.profile) except Exception as e: self.disp(_("Error getting session infos: {e}").format(e=e), error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -362,7 +362,7 @@ async def start(self): try: - data = await self.host.bridge.devicesInfosGet( + data = await self.host.bridge.devices_infos_get( self.args.jid, self.host.profile ) except Exception as e:
--- a/sat_frontends/jp/cmd_input.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_input.py Sat Apr 08 13:54:42 2023 +0200 @@ -117,7 +117,7 @@ def opt(self, type_): return lambda s: (type_, s) - def addValue(self, value): + def add_value(self, value): """add a parsed value according to arguments sequence""" self._values_ori.append(value) arguments = self.args.arguments @@ -329,7 +329,7 @@ if idx < self.args.row: continue for value in row: - self.addValue(value) + self.add_value(value) await self.runCommand() except exceptions.CancelError: # this row has been cancelled, we skip it
--- a/sat_frontends/jp/cmd_invitation.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_invitation.py Sat Apr 08 13:54:42 2023 +0200 @@ -128,7 +128,7 @@ ) try: - invitation_data = await self.host.bridge.invitationCreate( + invitation_data = await self.host.bridge.invitation_create( email, emails_extra, self.args.jid, @@ -178,7 +178,7 @@ async def start(self): try: - invitation_data = await self.host.bridge.invitationGet( + invitation_data = await self.host.bridge.invitation_get( self.args.id, ) except Exception as e: @@ -190,7 +190,7 @@ else: profile = invitation_data["guest_profile"] try: - await self.host.bridge.profileStartSession( + await self.host.bridge.profile_start_session( invitation_data["password"], profile, ) @@ -199,7 +199,7 @@ self.host.quit(C.EXIT_BRIDGE_ERRBACK) try: - jid_ = await self.host.bridge.asyncGetParamA( + jid_ = await self.host.bridge.param_get_a_async( "JabberID", "Connection", profile_key=profile, @@ -226,7 +226,7 @@ async def start(self): try: - await self.host.bridge.invitationDelete( + await self.host.bridge.invitation_delete( self.args.id, ) except Exception as e: @@ -302,7 +302,7 @@ ) extra[arg_name] = value try: - await self.host.bridge.invitationModify( + await self.host.bridge.invitation_modify( self.args.id, extra, self.args.replace, @@ -348,7 +348,7 @@ async def start(self): try: - data = await self.host.bridge.invitationList( + data = await self.host.bridge.invitation_list( self.args.profile, ) except Exception as e:
--- a/sat_frontends/jp/cmd_list.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_list.py Sat Apr 08 13:54:42 2023 +0200 @@ -52,13 +52,13 @@ await common.fill_well_known_uri(self, os.getcwd(), "tickets", meta_map={}) try: lists_data = data_format.deserialise( - await self.host.bridge.listGet( + await self.host.bridge.list_get( self.args.service, self.args.node, self.args.max, self.args.items, "", - self.getPubsubExtra(), + self.get_pubsub_extra(), self.profile, ), type_check=list, @@ -123,7 +123,7 @@ extra = {"update": update} try: - item_id = await self.host.bridge.listSet( + item_id = await self.host.bridge.list_set( self.args.service, self.args.node, values, @@ -171,9 +171,9 @@ message = _("Are you sure to delete list item {item_id} ?").format( item_id=self.args.item ) - await self.host.confirmOrQuit(message, _("item deletion cancelled")) + await self.host.confirm_or_quit(message, _("item deletion cancelled")) try: - await self.host.bridge.listDeleteItem( + await self.host.bridge.list_delete_item( self.args.service, self.args.node, self.args.item, @@ -192,7 +192,7 @@ # TODO: factorize with blog/import def __init__(self, host): - super(Import, self).__init__( + super().__init__( host, "import", use_progress=True, @@ -253,13 +253,13 @@ ), ) - async def onProgressStarted(self, metadata): + async def on_progress_started(self, metadata): self.disp(_("Tickets upload started"), 2) - async def onProgressFinished(self, metadata): + async def on_progress_finished(self, metadata): self.disp(_("Tickets uploaded successfully"), 2) - async def onProgressError(self, error_msg): + async def on_progress_error(self, error_msg): self.disp( _("Error while uploading tickets: {error_msg}").format(error_msg=error_msg), error=True,
--- a/sat_frontends/jp/cmd_merge_request.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_merge_request.py Sat Apr 08 13:54:42 2023 +0200 @@ -76,7 +76,7 @@ "You are going to publish your changes to service " "[{service}], are you sure ?" ).format(service=self.args.service) - await self.host.confirmOrQuit( + await self.host.confirm_or_quit( message, _("merge request publication cancelled") ) @@ -85,7 +85,7 @@ if self.args.labels is not None: values["labels"] = self.args.labels try: - published_id = await self.host.bridge.mergeRequestSet( + published_id = await self.host.bridge.merge_request_set( self.args.service, self.args.node, self.repository, @@ -133,7 +133,7 @@ extra = {} try: requests_data = data_format.deserialise( - await self.host.bridge.mergeRequestsGet( + await self.host.bridge.merge_requests_get( self.args.service, self.args.node, self.args.max, @@ -186,7 +186,7 @@ ) extra = {} try: - await self.host.bridge.mergeRequestsImport( + await self.host.bridge.merge_requests_import( self.repository, self.args.item, self.args.service,
--- a/sat_frontends/jp/cmd_message.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_message.py Sat Apr 08 13:54:42 2023 +0200 @@ -91,7 +91,7 @@ "jid", help=_("the destination jid") ) - async def sendStdin(self, dest_jid): + async def send_stdin(self, dest_jid): """Send incomming data on stdin to jabber contact @param dest_jid: destination jid @@ -123,7 +123,7 @@ if header: # first we sent the header try: - await self.host.bridge.messageSend( + await self.host.bridge.message_send( dest_jid, {self.args.lang: header}, subject, @@ -160,7 +160,7 @@ # first one del extra[C.KEY_ATTACHMENTS] try: - await self.host.bridge.messageSend( + await self.host.bridge.message_send( dest_jid, msg, subject, @@ -193,21 +193,21 @@ if self.args.encrypt is not None: try: - namespace = await self.host.bridge.encryptionNamespaceGet( + namespace = await self.host.bridge.encryption_namespace_get( self.args.encrypt) except Exception as e: self.disp(f"can't get encryption namespace: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) try: - await self.host.bridge.messageEncryptionStart( + await self.host.bridge.message_encryption_start( jid_, namespace, not self.args.encrypt_noreplace, self.profile ) except Exception as e: self.disp(f"can't start encryption session: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) - await self.sendStdin(jid_) + await self.send_stdin(jid_) class Retract(base.CommandBase): @@ -223,7 +223,7 @@ async def start(self): try: - await self.host.bridge.messageRetract( + await self.host.bridge.message_retract( self.args.message_id, self.profile ) @@ -286,7 +286,7 @@ if value is not None: extra[key] = str(value) try: - data, metadata_s, profile = await self.host.bridge.MAMGet( + data, metadata_s, profile = await self.host.bridge.mam_get( self.args.service, data_format.serialise(extra), self.profile) except Exception as e: self.disp(f"can't retrieve MAM archives: {e}", error=True) @@ -295,7 +295,7 @@ metadata = data_format.deserialise(metadata_s) try: - session_info = await self.host.bridge.sessionInfosGet(self.profile) + session_info = await self.host.bridge.session_infos_get(self.profile) except Exception as e: self.disp(f"can't get session infos: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK)
--- a/sat_frontends/jp/cmd_param.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_param.py Sat Apr 08 13:54:42 2023 +0200 @@ -50,11 +50,11 @@ async def start(self): if self.args.category is None: - categories = await self.host.bridge.getParamsCategories() + categories = await self.host.bridge.params_categories_get() print("\n".join(categories)) elif self.args.name is None: try: - values_dict = await self.host.bridge.asyncGetParamsValuesFromCategory( + values_dict = await self.host.bridge.params_values_from_category_get_async( self.args.category, self.args.security_limit, "", "", self.profile ) except Exception as e: @@ -67,7 +67,7 @@ print(f"{name}\t{value}") else: try: - value = await self.host.bridge.asyncGetParamA( + value = await self.host.bridge.param_get_a_async( self.args.name, self.args.category, self.args.attribute, @@ -100,7 +100,7 @@ async def start(self): try: - await self.host.bridge.setParam( + await self.host.bridge.param_set( self.args.name, self.args.value, self.args.category, @@ -131,7 +131,7 @@ async def start(self): """Save parameters template to XML file""" try: - await self.host.bridge.saveParamsTemplate(self.args.filename) + await self.host.bridge.params_template_save(self.args.filename) except Exception as e: self.disp(_("can't save parameters to file: {e}").format(e=e), error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -161,7 +161,7 @@ async def start(self): """Load parameters template from xml file""" try: - self.host.bridge.loadParamsTemplate(self.args.filename) + self.host.bridge.params_template_load(self.args.filename) except Exception as e: self.disp(_("can't load parameters from file: {e}").format(e=e), error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK)
--- a/sat_frontends/jp/cmd_pipe.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_pipe.py Sat Apr 08 13:54:42 2023 +0200 @@ -45,7 +45,7 @@ async def start(self): """ Create named pipe, and send stdin to it """ try: - port = await self.host.bridge.streamOut( + port = await self.host.bridge.stream_out( await self.host.get_full_jid(self.args.jid), self.profile, ) @@ -89,13 +89,13 @@ except IOError as e: sys.stderr.write(f"{e}\n") break - host.quitFromSignal() + host.quit_from_signal() class PipeIn(base.CommandAnswering): def __init__(self, host): super(PipeIn, self).__init__(host, "in", help=_("receive a pipe stream")) - self.action_callbacks = {"STREAM": self.onStreamAction} + self.action_callbacks = {"STREAM": self.on_stream_action} def add_parser_options(self): self.parser.add_argument( @@ -104,7 +104,7 @@ help=_('Jids accepted (none means "accept everything")'), ) - def getXmluiId(self, action_data): + def get_xmlui_id(self, action_data): try: xml_ui = action_data["xmlui"] except KeyError: @@ -113,13 +113,13 @@ ui = xmlui_manager.create(self.host, xml_ui) if not ui.submit_id: self.disp(_("Invalid XMLUI received"), error=True) - self.quitFromSignal(C.EXIT_INTERNAL_ERROR) + self.quit_from_signal(C.EXIT_INTERNAL_ERROR) return ui.submit_id - async def onStreamAction(self, action_data, action_id, security_limit, profile): - xmlui_id = self.getXmluiId(action_data) + async def on_stream_action(self, action_data, action_id, security_limit, profile): + xmlui_id = self.get_xmlui_id(action_data) if xmlui_id is None: - self.host.quitFromSignal(C.EXIT_ERROR) + self.host.quit_from_signal(C.EXIT_ERROR) try: from_jid = jid.JID(action_data["meta_from_jid"]) except KeyError: @@ -140,11 +140,11 @@ else: break xmlui_data = {"answer": C.BOOL_TRUE, "port": str(port)} - await self.host.bridge.launchAction( + await self.host.bridge.action_launch( xmlui_id, xmlui_data, profile_key=profile) async with server: await server.serve_forever() - self.host.quitFromSignal() + self.host.quit_from_signal() async def start(self): self.bare_jids = [jid.JID(jid_).bare for jid_ in self.args.jids]
--- a/sat_frontends/jp/cmd_profile.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_profile.py Sat Apr 08 13:54:42 2023 +0200 @@ -98,33 +98,33 @@ async def start(self): """Create a new profile""" - if self.args.profile in await self.host.bridge.profilesListGet(): + if self.args.profile in await self.host.bridge.profiles_list_get(): self.disp(f"Profile {self.args.profile} already exists.", error=True) self.host.quit(C.EXIT_BRIDGE_ERROR) try: - await self.host.bridge.profileCreate( + await self.host.bridge.profile_create( self.args.profile, self.args.password, self.args.component) except Exception as e: self.disp(f"can't create profile: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) try: - await self.host.bridge.profileStartSession( + await self.host.bridge.profile_start_session( self.args.password, self.args.profile) except Exception as e: self.disp(f"can't start profile session: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) if self.args.jid: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "JabberID", self.args.jid, "Connection", profile_key=self.args.profile) xmpp_pwd = self.args.password or self.args.xmpp_password if xmpp_pwd: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "Password", xmpp_pwd, "Connection", profile_key=self.args.profile) if self.args.autoconnect is not None: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "autoconnect_backend", self.args.autoconnect, "Connection", profile_key=self.args.profile) @@ -141,7 +141,7 @@ pass async def start(self): - print(await self.host.bridge.profileNameGet('@DEFAULT@')) + print(await self.host.bridge.profile_name_get('@DEFAULT@')) self.host.quit() @@ -154,15 +154,15 @@ self.parser.add_argument('-f', '--force', action='store_true', help=_('delete profile without confirmation')) async def start(self): - if self.args.profile not in await self.host.bridge.profilesListGet(): + if self.args.profile not in await self.host.bridge.profiles_list_get(): log.error(f"Profile {self.args.profile} doesn't exist.") self.host.quit(C.EXIT_NOT_FOUND) if not self.args.force: message = f"Are you sure to delete profile [{self.args.profile}] ?" 
cancel_message = "Profile deletion cancelled" - await self.host.confirmOrQuit(message, cancel_message) + await self.host.confirm_or_quit(message, cancel_message) - await self.host.bridge.asyncDeleteProfile(self.args.profile) + await self.host.bridge.profile_delete_async(self.args.profile) self.host.quit() @@ -187,7 +187,7 @@ data = {} for label, category, name in self.to_show: try: - value = await self.host.bridge.asyncGetParamA( + value = await self.host.bridge.param_get_a_async( name, category, profile_key=self.host.profile) except Exception as e: self.disp(f"can't get {name}/{category} param: {e}", error=True) @@ -218,7 +218,7 @@ clients, components = False, True else: clients, components = True, True - await self.output(await self.host.bridge.profilesListGet(clients, components)) + await self.output(await self.host.bridge.profiles_list_get(clients, components)) self.host.quit() @@ -251,19 +251,19 @@ if self.args.disable_password: self.args.password = '' if self.args.password is not None: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "Password", self.args.password, "General", profile_key=self.host.profile) if self.args.jid is not None: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "JabberID", self.args.jid, "Connection", profile_key=self.host.profile) if self.args.xmpp_password is not None: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "Password", self.args.xmpp_password, "Connection", profile_key=self.host.profile) if self.args.default: - await self.host.bridge.profileSetDefault(self.host.profile) + await self.host.bridge.profile_set_default(self.host.profile) if self.args.autoconnect is not None: - await self.host.bridge.setParam( + await self.host.bridge.param_set( "autoconnect_backend", self.args.autoconnect, "Connection", profile_key=self.host.profile)
--- a/sat_frontends/jp/cmd_pubsub.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_pubsub.py Sat Apr 08 13:54:42 2023 +0200 @@ -70,15 +70,15 @@ help=_("data key to filter"), ) - def removePrefix(self, key): + def remove_prefix(self, key): return key[7:] if key.startswith("pubsub#") else key - def filterKey(self, key): + def filter_key(self, key): return any((key == k or key == "pubsub#" + k) for k in self.args.keys) async def start(self): try: - config_dict = await self.host.bridge.psNodeConfigurationGet( + config_dict = await self.host.bridge.ps_node_configuration_get( self.args.service, self.args.node, self.profile, @@ -97,9 +97,9 @@ self.disp(f"Internal error: {e}", error=True) self.host.quit(C.EXIT_INTERNAL_ERROR) else: - key_filter = (lambda k: True) if not self.args.keys else self.filterKey + key_filter = (lambda k: True) if not self.args.keys else self.filter_key config_dict = { - self.removePrefix(k): v for k, v in config_dict.items() if key_filter(k) + self.remove_prefix(k): v for k, v in config_dict.items() if key_filter(k) } await self.output(config_dict) self.host.quit() @@ -150,7 +150,7 @@ async def start(self): options = self.get_config_options(self.args) try: - node_id = await self.host.bridge.psNodeCreate( + node_id = await self.host.bridge.ps_node_create( self.args.service, self.args.node, options, @@ -198,10 +198,10 @@ "Are you sure to delete node [{node}] on service " "[{service}]? This will delete ALL items from it!" ).format(node=self.args.node, service=self.args.service) - await self.host.confirmOrQuit(message, _("node purge cancelled")) + await self.host.confirm_or_quit(message, _("node purge cancelled")) try: - await self.host.bridge.psNodePurge( + await self.host.bridge.ps_node_purge( self.args.service, self.args.node, self.profile, @@ -243,10 +243,10 @@ message = _( "Are you sure to delete node [{node}] on " "service [{service}]?" 
).format(node=self.args.node, service=self.args.service) - await self.host.confirmOrQuit(message, _("node deletion cancelled")) + await self.host.confirm_or_quit(message, _("node deletion cancelled")) try: - await self.host.bridge.psNodeDelete( + await self.host.bridge.ps_node_delete( self.args.service, self.args.node, self.profile, @@ -290,7 +290,7 @@ help=_('don\'t prepend "pubsub#" prefix to field names'), ) - def getKeyName(self, k): + def get_key_name(self, k): if self.args.full_prefix or k.startswith("pubsub#"): return k else: @@ -298,10 +298,10 @@ async def start(self): try: - await self.host.bridge.psNodeConfigurationSet( + await self.host.bridge.ps_node_configuration_set( self.args.service, self.args.node, - {self.getKeyName(k): v for k, v in self.args.fields}, + {self.get_key_name(k): v for k, v in self.args.fields}, self.profile, ) except Exception as e: @@ -339,7 +339,7 @@ async def start(self): try: - element, etree = xml_tools.etreeParse( + element, etree = xml_tools.etree_parse( self, self.args.import_file, reraise=True ) except Exception as e: @@ -350,7 +350,7 @@ # so we wrap them here and try again self.args.import_file.seek(0) xml_buf = "<import>" + self.args.import_file.read() + "</import>" - element, etree = xml_tools.etreeParse(self, xml_buf) + element, etree = xml_tools.etree_parse(self, xml_buf) # we reverse element as we expect to have most recently published element first # TODO: make this more explicit and add an option @@ -366,7 +366,7 @@ items = [etree.tostring(i, encoding="unicode") for i in element] if self.args.admin: - method = self.host.bridge.psAdminItemsSend + method = self.host.bridge.ps_admin_items_send else: self.disp( _( @@ -374,7 +374,7 @@ "be changed" ) ) - method = self.host.bridge.psItemsSend + method = self.host.bridge.ps_items_send try: items_ids = await method( @@ -416,7 +416,7 @@ async def start(self): try: - affiliations = await self.host.bridge.psNodeAffiliationsGet( + affiliations = await self.host.bridge.ps_node_affiliations_get( self.args.service, self.args.node, self.profile, @@ -458,7 +458,7 @@ async def start(self): affiliations = dict(self.args.affiliations) try: - await self.host.bridge.psNodeAffiliationsSet( + await self.host.bridge.ps_node_affiliations_set( self.args.service, self.args.node, affiliations, @@ -505,9 +505,9 @@ async def start(self): if self.args.public: - method = self.host.bridge.psPublicNodeSubscriptionsGet + method = self.host.bridge.ps_public_node_subscriptions_get else: - method = self.host.bridge.psNodeSubscriptionsGet + method = self.host.bridge.ps_node_subscriptions_get try: subscriptions = await method( self.args.service, @@ -575,7 +575,7 @@ async def start(self): try: - self.host.bridge.psNodeSubscriptionsSet( + self.host.bridge.ps_node_subscriptions_set( self.args.service, self.args.node, self.args.subscriptions, @@ -618,7 +618,7 @@ async def start(self): try: - await self.host.bridge.psSchemaSet( + await self.host.bridge.ps_schema_set( self.args.service, self.args.node, self.args.schema, @@ -653,7 +653,7 @@ async def publish(self, schema): try: - await self.host.bridge.psSchemaSet( + await self.host.bridge.ps_schema_set( self.args.service, self.args.node, schema, @@ -666,7 +666,7 @@ self.disp(_("schema has been set"), 1) self.host.quit() - async def psSchemaGetCb(self, schema): + async def ps_schema_get_cb(self, schema): try: from lxml import etree except ImportError: @@ -676,7 +676,7 @@ error=True, ) self.host.quit(1) - content_file_obj, content_file_path = self.getTmpFile() + content_file_obj, 
content_file_path = self.get_tmp_file() schema = schema.strip() if schema: parser = etree.XMLParser(remove_blank_text=True) @@ -685,13 +685,13 @@ etree.tostring(schema_elt, encoding="utf-8", pretty_print=True) ) content_file_obj.seek(0) - await self.runEditor( + await self.run_editor( "pubsub_schema_editor_args", content_file_path, content_file_obj ) async def start(self): try: - schema = await self.host.bridge.psSchemaGet( + schema = await self.host.bridge.ps_schema_get( self.args.service, self.args.node, self.profile, @@ -703,7 +703,7 @@ self.disp(f"can't edit schema: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) - await self.psSchemaGetCb(schema) + await self.ps_schema_get_cb(schema) class NodeSchemaGet(base.CommandBase): @@ -724,7 +724,7 @@ async def start(self): try: - schema = await self.host.bridge.psSchemaGet( + schema = await self.host.bridge.ps_schema_get( self.args.service, self.args.node, self.profile, @@ -794,13 +794,13 @@ async def start(self): try: ps_result = data_format.deserialise( - await self.host.bridge.psCacheGet( + await self.host.bridge.ps_cache_get( self.args.service, self.args.node, self.args.max, self.args.items, self.args.sub_id, - self.getPubsubExtra(), + self.get_pubsub_extra(), self.profile, ) ) @@ -839,7 +839,7 @@ async def start(self): try: - await self.host.bridge.psCacheSync( + await self.host.bridge.ps_cache_sync( self.args.service, self.args.node, self.profile, @@ -910,7 +910,7 @@ async def start(self): if not self.args.force: - await self.host.confirmOrQuit( + await self.host.confirm_or_quit( _( "Are you sure to purge items from cache? You'll have to bypass cache " "or resynchronise nodes to access deleted items again." @@ -926,7 +926,7 @@ if value is not None: purge_data[key] = value try: - await self.host.bridge.psCachePurge( + await self.host.bridge.ps_cache_purge( data_format.serialise( purge_data ) @@ -956,7 +956,7 @@ async def start(self): if not self.args.force: - await self.host.confirmOrQuit( + await self.host.confirm_or_quit( _( "Are you sure to reset cache? All nodes and items will be removed " "from it, then it will be progressively refilled as if it were new. 
" @@ -965,7 +965,7 @@ _("Pubsub cache reset has been cancelled.") ) try: - await self.host.bridge.psCacheReset() + await self.host.bridge.ps_cache_reset() except Exception as e: self.disp(f"Internal error: {e}", error=True) self.host.quit(C.EXIT_INTERNAL_ERROR) @@ -1147,7 +1147,7 @@ self.args.keys.append("item_payload") try: found_items = data_format.deserialise( - await self.host.bridge.psCacheSearch( + await self.host.bridge.ps_cache_search( data_format.serialise(query) ), type_check=list, @@ -1249,8 +1249,8 @@ ) async def start(self): - element, etree = xml_tools.etreeParse(self, sys.stdin) - element = xml_tools.getPayload(self, element) + element, etree = xml_tools.etree_parse(self, sys.stdin) + element = xml_tools.get_payload(self, element) payload = etree.tostring(element, encoding="unicode") extra = {} if self.args.encrypt: @@ -1264,7 +1264,7 @@ extra["publish_options"] = publish_options try: - published_id = await self.host.bridge.psItemSend( + published_id = await self.host.bridge.ps_item_send( self.args.service, self.args.node, payload, @@ -1318,13 +1318,13 @@ extra["decrypt"] = False try: ps_result = data_format.deserialise( - await self.host.bridge.psItemsGet( + await self.host.bridge.ps_items_get( self.args.service, self.args.node, self.args.max, self.args.items, self.args.sub_id, - self.getPubsubExtra(extra), + self.get_pubsub_extra(extra), self.profile, ) ) @@ -1373,9 +1373,9 @@ message = _("Are you sure to delete item {item_id} ?").format( item_id=self.args.item ) - await self.host.confirmOrQuit(message, _("item deletion cancelled")) + await self.host.confirm_or_quit(message, _("item deletion cancelled")) try: - await self.host.bridge.psItemRetract( + await self.host.bridge.ps_item_retract( self.args.service, self.args.node, self.args.item, @@ -1432,7 +1432,7 @@ extra["encrypted_for"] = {"targets": self.args.encrypt_for} if self.args.sign: extra["signed"] = True - published_id = await self.host.bridge.psItemSend( + published_id = await self.host.bridge.ps_item_send( self.pubsub_service, self.pubsub_node, content, @@ -1445,7 +1445,7 @@ else: self.disp("Item published") - async def getItemData(self, service, node, item): + async def get_item_data(self, service, node, item): try: from lxml import etree except ImportError: @@ -1457,7 +1457,7 @@ self.host.quit(1) items = [item] if item else [] ps_result = data_format.deserialise( - await self.host.bridge.psItemsGet( + await self.host.bridge.ps_items_get( service, node, 1, items, "", data_format.serialise({}), self.profile ) ) @@ -1479,8 +1479,8 @@ self.pubsub_item, content_file_path, content_file_obj, - ) = await self.getItemPath() - await self.runEditor("pubsub_editor_args", content_file_path, content_file_obj) + ) = await self.get_item_path() + await self.run_editor("pubsub_editor_args", content_file_path, content_file_obj) self.host.quit() @@ -1500,7 +1500,7 @@ async def start(self): try: - await self.host.bridge.psItemRename( + await self.host.bridge.ps_item_rename( self.args.service, self.args.node, self.args.item, @@ -1537,7 +1537,7 @@ async def start(self): options = {} if self.args.public: - namespaces = await self.host.bridge.namespacesGet() + namespaces = await self.host.bridge.namespaces_get() try: ns_pps = namespaces["pps"] except KeyError: @@ -1549,7 +1549,7 @@ else: options[f"{{{ns_pps}}}public"] = True try: - sub_id = await self.host.bridge.psSubscribe( + sub_id = await self.host.bridge.ps_subscribe( self.args.service, self.args.node, data_format.serialise(options), @@ -1584,7 +1584,7 @@ async def start(self): 
try: - await self.host.bridge.psUnsubscribe( + await self.host.bridge.ps_unsubscribe( self.args.service, self.args.node, self.profile, @@ -1617,9 +1617,9 @@ async def start(self): if self.args.public: - method = self.host.bridge.psPublicSubscriptionsGet + method = self.host.bridge.ps_public_subscriptions_get else: - method = self.host.bridge.psSubscriptionsGet + method = self.host.bridge.ps_subscriptions_get try: subscriptions = data_format.deserialise( await method( @@ -1653,7 +1653,7 @@ async def start(self): try: - affiliations = await self.host.bridge.psAffiliationsGet( + affiliations = await self.host.bridge.ps_affiliations_get( self.args.service, self.args.node, self.profile, @@ -1692,14 +1692,14 @@ async def start(self): service = self.args.service or await self.host.get_profile_jid() if self.args.item: - anchor = uri.buildXMPPUri( + anchor = uri.build_xmpp_uri( "pubsub", path=service, node=self.args.node, item=self.args.item ) else: - anchor = uri.buildXMPPUri("pubsub", path=service, node=self.args.node) + anchor = uri.build_xmpp_uri("pubsub", path=service, node=self.args.node) try: - await self.host.bridge.referenceSend( + await self.host.bridge.reference_send( self.args.recipient, anchor, self.args.type, @@ -1899,17 +1899,17 @@ ) self.parser.add_argument("command", nargs=argparse.REMAINDER) - async def getItems(self, depth, service, node, items): + async def get_items(self, depth, service, node, items): self.to_get += 1 try: ps_result = data_format.deserialise( - await self.host.bridge.psItemsGet( + await self.host.bridge.ps_items_get( service, node, self.args.node_max, items, "", - self.getPubsubExtra(), + self.get_pubsub_extra(), self.profile, ) ) @@ -1922,7 +1922,7 @@ else: await self.search(ps_result, depth) - def _checkPubsubURL(self, match, found_nodes): + def _check_pubsub_url(self, match, found_nodes): """check that the matched URL is an xmpp: one @param found_nodes(list[unicode]): found_nodes @@ -1931,7 +1931,7 @@ url = match.group(0) if url.startswith("xmpp"): try: - url_data = uri.parseXMPPUri(url) + url_data = uri.parse_xmpp_uri(url) except ValueError: return if url_data["type"] == "pubsub": @@ -1940,13 +1940,13 @@ found_node["item"] = url_data["item"] found_nodes.append(found_node) - async def getSubNodes(self, item, depth): - """look for pubsub URIs in item, and getItems on the linked nodes""" + async def get_sub_nodes(self, item, depth): + """look for pubsub URIs in item, and get_items on the linked nodes""" found_nodes = [] - checkURI = partial(self._checkPubsubURL, found_nodes=found_nodes) + checkURI = partial(self._check_pubsub_url, found_nodes=found_nodes) strings.RE_URL.sub(checkURI, item) for data in found_nodes: - await self.getItems( + await self.get_items( depth + 1, data["service"], data["node"], @@ -2061,7 +2061,7 @@ return True, item - async def doItemAction(self, item, metadata): + async def do_item_action(self, item, metadata): """called when item has been kepts and the action need to be done @param item(unicode): accepted item @@ -2114,23 +2114,23 @@ ) async def search(self, ps_result, depth): - """callback of getItems + """callback of get_items this method filters items, get sub nodes if needed, do the requested action, and exit the command when everything is done - @param items_data(tuple): result of getItems + @param items_data(tuple): result of get_items @param depth(int): current depth level 0 for first node, 1 for first children, and so on """ for item in ps_result["items"]: if depth < self.args.max_depth: - await self.getSubNodes(item, depth) 
+ await self.get_sub_nodes(item, depth) keep, item = self.filter(item) if not keep: continue - await self.doItemAction(item, ps_result) + await self.do_item_action(item, ps_result) - # we check if we got all getItems results + # we check if we got all get_items results self.to_get -= 1 if self.to_get == 0: # yes, we can quit @@ -2160,7 +2160,7 @@ self.args.namespace = dict( self.args.namespace + [("pubsub", "http://jabber.org/protocol/pubsub")] ) - await self.getItems(0, self.args.service, self.args.node, self.args.items) + await self.get_items(0, self.args.service, self.args.node, self.args.items) class Transform(base.CommandBase): @@ -2208,7 +2208,7 @@ ), ) - async def psItemsSendCb(self, item_ids, metadata): + async def ps_items_send_cb(self, item_ids, metadata): if item_ids: self.disp( _("items published with ids {item_ids}").format( @@ -2218,15 +2218,15 @@ else: self.disp(_("items published")) if self.args.all: - return await self.handleNextPage(metadata) + return await self.handle_next_page(metadata) else: self.host.quit() - async def handleNextPage(self, metadata): + async def handle_next_page(self, metadata): """Retrieve new page through RSM or quit if we're in the last page use to handle --all option - @param metadata(dict): metadata as returned by psItemsGet + @param metadata(dict): metadata as returned by ps_items_get """ try: last = metadata["rsm"]["last"] @@ -2255,11 +2255,11 @@ ) ) - extra = self.getPubsubExtra() + extra = self.get_pubsub_extra() extra["rsm_after"] = last try: ps_result = await data_format.deserialise( - self.host.bridge.psItemsGet( + self.host.bridge.ps_items_get( self.args.service, self.args.node, self.args.rsm_max, @@ -2273,9 +2273,9 @@ self.disp(f"can't retrieve items: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) else: - await self.psItemsGetCb(ps_result) + await self.ps_items_get_cb(ps_result) - async def psItemsGetCb(self, ps_result): + async def ps_items_get_cb(self, ps_result): encoding = "utf-8" new_items = [] @@ -2283,7 +2283,7 @@ if self.check_duplicates: # this is used when we are not ordering by creation # to avoid infinite loop - item_elt, __ = xml_tools.etreeParse(self, item) + item_elt, __ = xml_tools.etree_parse(self, item) item_id = item_elt.get("id") if item_id in self.items_ids: self.disp( @@ -2322,12 +2322,12 @@ self.disp(cmd_std_err, error=True) cmd_std_out = cmd_std_out.decode(encoding).strip() if cmd_std_out == "DELETE": - item_elt, __ = xml_tools.etreeParse(self, item) + item_elt, __ = xml_tools.etree_parse(self, item) item_id = item_elt.get("id") self.disp(_("Deleting item {item_id}").format(item_id=item_id)) if self.args.apply: try: - await self.host.bridge.psItemRetract( + await self.host.bridge.ps_item_retract( self.args.service, self.args.node, item_id, @@ -2339,11 +2339,11 @@ self.host.quit(C.EXIT_BRIDGE_ERRBACK) continue elif cmd_std_out == "SKIP": - item_elt, __ = xml_tools.etreeParse(self, item) + item_elt, __ = xml_tools.etree_parse(self, item) item_id = item_elt.get("id") self.disp(_("Skipping item {item_id}").format(item_id=item_id)) continue - element, etree = xml_tools.etreeParse(self, cmd_std_out) + element, etree = xml_tools.etree_parse(self, cmd_std_out) # at this point command has been run and we have a etree.Element object if element.tag not in ("item", "{http://jabber.org/protocol/pubsub}item"): @@ -2367,13 +2367,13 @@ if not self.args.apply: # on dry run we have nothing to wait for, we can quit if self.args.all: - return await self.handleNextPage(ps_result) + return await 
self.handle_next_page(ps_result) self.host.quit() else: if self.args.admin: - bridge_method = self.host.bridge.psAdminItemsSend + bridge_method = self.host.bridge.ps_admin_items_send else: - bridge_method = self.host.bridge.psItemsSend + bridge_method = self.host.bridge.ps_items_send try: ps_items_send_result = await bridge_method( @@ -2387,7 +2387,7 @@ self.disp(f"can't send item: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) else: - await self.psItemsSendCb(ps_items_send_result, metadata=ps_result) + await self.ps_items_send_cb(ps_items_send_result, metadata=ps_result) async def start(self): if self.args.all and self.args.order_by != C.ORDER_BY_CREATION: @@ -2409,13 +2409,13 @@ try: ps_result = data_format.deserialise( - await self.host.bridge.psItemsGet( + await self.host.bridge.ps_items_get( self.args.service, self.args.node, self.args.max, self.args.items, "", - self.getPubsubExtra(), + self.get_pubsub_extra(), self.profile, ) ) @@ -2423,7 +2423,7 @@ self.disp(f"can't retrieve items: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) else: - await self.psItemsGetCb(ps_result) + await self.ps_items_get_cb(ps_result) class Uri(base.CommandBase): @@ -2457,13 +2457,13 @@ key = "path" if value: uri_args[key] = value - self.disp(uri.buildXMPPUri("pubsub", **uri_args)) + self.disp(uri.build_xmpp_uri("pubsub", **uri_args)) self.host.quit() async def start(self): if not self.args.service: try: - jid_ = await self.host.bridge.asyncGetParamA( + jid_ = await self.host.bridge.param_get_a_async( "JabberID", "Connection", profile_key=self.args.profile ) except Exception as e: @@ -2500,7 +2500,7 @@ async def start(self): try: - attached_data, __ = await self.host.bridge.psAttachmentsGet( + attached_data, __ = await self.host.bridge.ps_attachments_get( self.args.service, self.args.node, self.args.item, @@ -2587,7 +2587,7 @@ self.parser.error(_("At leat one attachment must be specified.")) try: - await self.host.bridge.psAttachmentsSet( + await self.host.bridge.ps_attachments_set( data_format.serialise(attachments_data), self.profile, ) @@ -2635,7 +2635,7 @@ } } try: - await self.host.bridge.psAttachmentsSet( + await self.host.bridge.ps_attachments_set( data_format.serialise(attachments_data), self.profile, ) @@ -2667,7 +2667,7 @@ async def start(self): try: - ret_s = await self.host.bridge.psSignatureCheck( + ret_s = await self.host.bridge.ps_signature_check( self.args.service, self.args.node, self.args.item, @@ -2718,7 +2718,7 @@ async def start(self): try: - await self.host.bridge.psSecretShare( + await self.host.bridge.ps_secret_share( self.args.recipient, self.args.service, self.args.node, @@ -2757,7 +2757,7 @@ async def start(self): try: - await self.host.bridge.psSecretRevoke( + await self.host.bridge.ps_secret_revoke( self.args.service, self.args.node, self.args.secret_id, @@ -2793,7 +2793,7 @@ async def start(self): try: - await self.host.bridge.psSecretRotate( + await self.host.bridge.ps_secret_rotate( self.args.service, self.args.node, self.args.recipients, @@ -2824,7 +2824,7 @@ async def start(self): try: - secrets = data_format.deserialise(await self.host.bridge.psSecretsList( + secrets = data_format.deserialise(await self.host.bridge.ps_secrets_list( self.args.service, self.args.node, self.profile, @@ -2885,7 +2885,7 @@ ) @staticmethod - def checkArgs(self): + def check_args(self): if self.args.type == "python_file": self.args.hook_arg = os.path.abspath(self.args.hook_arg) if not os.path.isfile(self.args.hook_arg): @@ -2894,9 +2894,9 @@ ) async def start(self): - 
self.checkArgs(self) + self.check_args(self) try: - await self.host.bridge.psHookAdd( + await self.host.bridge.ps_hook_add( self.args.service, self.args.node, self.args.type, @@ -2941,9 +2941,9 @@ ) async def start(self): - HookCreate.checkArgs(self) + HookCreate.check_args(self) try: - nb_deleted = await self.host.bridge.psHookRemove( + nb_deleted = await self.host.bridge.ps_hook_remove( self.args.service, self.args.node, self.args.type, @@ -2975,7 +2975,7 @@ async def start(self): try: - data = await self.host.bridge.psHookList( + data = await self.host.bridge.ps_hook_list( self.profile, ) except Exception as e:
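The renames in the pubsub commands above follow the same mechanical camelCase to snake_case rule as the rest of the changeset (psItemsGet becomes ps_items_get, confirmOrQuit becomes confirm_or_quit, psNodeConfigurationGet becomes ps_node_configuration_get); only a handful of bridge methods are additionally reworded, such as asyncGetParamA becoming param_get_a_async or getContacts becoming contacts_get. The helper below is a minimal illustrative sketch of that conversion rule, not code from the repository (camel_to_snake is an invented name):

    import re

    def camel_to_snake(name: str) -> str:
        """Illustrative conversion of a camelCase identifier to snake_case."""
        # split an uppercase run followed by lowercase, e.g. "IDAnd" -> "ID_And"
        name = re.sub(r"(?<=[A-Z])([A-Z][a-z])", r"_\1", name)
        # split between a lowercase/digit and an uppercase letter, e.g. "psItems" -> "ps_Items"
        name = re.sub(r"(?<=[a-z0-9])([A-Z])", r"_\1", name)
        return name.lower()

    assert camel_to_snake("psItemsGet") == "ps_items_get"
    assert camel_to_snake("confirmOrQuit") == "confirm_or_quit"
    assert camel_to_snake("psNodeConfigurationGet") == "ps_node_configuration_get"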
--- a/sat_frontends/jp/cmd_roster.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/cmd_roster.py Sat Apr 08 13:54:42 2023 +0200 @@ -64,7 +64,7 @@ async def start(self): try: - contacts = await self.host.bridge.getContacts(profile_key=self.host.profile) + contacts = await self.host.bridge.contacts_get(profile_key=self.host.profile) except Exception as e: self.disp(f"error while retrieving the contacts: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -72,7 +72,7 @@ contacts_dict = {} for contact_jid_s, data, groups in contacts: # FIXME: we have to convert string to bool here for historical reason - # getContacts format should be changed and serialised properly + # contacts_get format should be changed and serialised properly for key in ('from', 'to', 'ask'): if key in data: data[key] = C.bool(data[key]) @@ -107,7 +107,7 @@ groups = self.args.groups else: try: - entity_data = await self.host.bridge.contactGet( + entity_data = await self.host.bridge.contact_get( self.args.jid, self.host.profile) except Exception as e: self.disp(f"error while retrieving the contact: {e}", error=True) @@ -117,7 +117,7 @@ groups = list(groups.union(self.args.groups)) try: - await self.host.bridge.updateContact( + await self.host.bridge.contact_update( self.args.jid, name, groups, self.host.profile) except Exception as e: self.disp(f"error while updating the contact: {e}", error=True) @@ -142,9 +142,9 @@ message = _("Are you sure to delete {entity} from your roster?").format( entity=self.args.jid ) - await self.host.confirmOrQuit(message, _("entity deletion cancelled")) + await self.host.confirm_or_quit(message, _("entity deletion cancelled")) try: - await self.host.bridge.delContact( + await self.host.bridge.contact_del( self.args.jid, self.host.profile) except Exception as e: self.disp(f"error while deleting the entity: {e}", error=True) @@ -162,7 +162,7 @@ async def start(self): try: - contacts = await self.host.bridge.getContacts(profile_key=self.host.profile) + contacts = await self.host.bridge.contacts_get(profile_key=self.host.profile) except Exception as e: self.disp(f"error while retrieving the contacts: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -235,7 +235,7 @@ async def start(self): try: - contacts = await self.host.bridge.getContacts(self.host.profile) + contacts = await self.host.bridge.contacts_get(self.host.profile) except Exception as e: self.disp(f"error while retrieving the contacts: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -258,7 +258,7 @@ elif await self.ask_confirmation(no_sub, no_from, no_to): for contact in no_sub + no_from + no_to: try: - await self.host.bridge.delContact( + await self.host.bridge.contact_del( contact, profile_key=self.host.profile) except Exception as e: self.disp(f"can't delete contact {contact!r}: {e}", error=True) @@ -310,7 +310,7 @@ async def start(self): try: - await self.host.bridge.rosterResync(profile_key=self.host.profile) + await self.host.bridge.roster_resync(profile_key=self.host.profile) except Exception as e: self.disp(f"can't resynchronise roster: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK)
--- a/sat_frontends/jp/cmd_shell.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat_frontends/jp/cmd_shell.py Sat Apr 08 13:54:42 2023 +0200
@@ -295,7 +295,7 @@
     async def start(self):
         # FIXME: "shell" is currently kept synchronous as it works well as it
         # and it will be refactored soon.
-        default_profile = self.host.bridge.profileNameGet(C.PROF_KEY_DEFAULT)
+        default_profile = self.host.bridge.profile_name_get(C.PROF_KEY_DEFAULT)
         self._not_default_profile = self.profile != default_profile
         self.path = []
         self._cur_parser = self.host.parser
--- a/sat_frontends/jp/cmd_uri.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat_frontends/jp/cmd_uri.py Sat Apr 08 13:54:42 2023 +0200
@@ -43,7 +43,7 @@
         )

     async def start(self):
-        await self.output(uri.parseXMPPUri(self.args.uri))
+        await self.output(uri.parse_xmpp_uri(self.args.uri))
         self.host.quit()


@@ -68,7 +68,7 @@

     async def start(self):
         fields = dict(self.args.fields) if self.args.fields else {}
-        self.disp(uri.buildXMPPUri(self.args.type, path=self.args.path, **fields))
+        self.disp(uri.build_xmpp_uri(self.args.type, path=self.args.path, **fields))
         self.host.quit()
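Both URI subcommands above now delegate to uri.parse_xmpp_uri and uri.build_xmpp_uri. The snippet below is a rough usage sketch only: the import path and the example service, node and item values are assumptions, while the dictionary keys mirror the ones read elsewhere in this diff (type, path, node, item):

    # assumed import path; the hunks above only reference the module as "uri"
    from sat.tools.common import uri

    built = uri.build_xmpp_uri(
        "pubsub",
        path="pubsub.example.org",    # hypothetical service
        node="urn:xmpp:microblog:0",  # hypothetical node
        item="some-item-id",          # hypothetical item id
    )
    data = uri.parse_xmpp_uri(built)
    # cmd_pubsub.py reads back keys such as "type", "path", "node" and "item"
    print(data["type"], data["node"], data.get("item"))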
--- a/sat_frontends/jp/common.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/common.py Sat Apr 08 13:54:42 2023 +0200 @@ -65,13 +65,13 @@ def ansi_ljust(s, width): """ljust method handling ANSI escape codes""" - cleaned = regex.ansiRemove(s) + cleaned = regex.ansi_remove(s) return s + " " * (width - len(cleaned)) def ansi_center(s, width): """ljust method handling ANSI escape codes""" - cleaned = regex.ansiRemove(s) + cleaned = regex.ansi_remove(s) diff = width - len(cleaned) half = diff / 2 return half * " " + s + (half + diff % 2) * " " @@ -79,25 +79,25 @@ def ansi_rjust(s, width): """ljust method handling ANSI escape codes""" - cleaned = regex.ansiRemove(s) + cleaned = regex.ansi_remove(s) return " " * (width - len(cleaned)) + s -def getTmpDir(sat_conf, cat_dir, sub_dir=None): +def get_tmp_dir(sat_conf, cat_dir, sub_dir=None): """Return directory used to store temporary files @param sat_conf(ConfigParser.ConfigParser): instance opened on sat configuration @param cat_dir(str): directory of the category (e.g. "blog") @param sub_dir(str): sub directory where data need to be put profile can be used here, or special directory name - sub_dir will be escaped to be usable in path (use regex.pathUnescape to find + sub_dir will be escaped to be usable in path (use regex.path_unescape to find initial str) @return (Path): path to the dir """ - local_dir = config.getConfig(sat_conf, "", "local_dir", Exception) + local_dir = config.config_get(sat_conf, "", "local_dir", Exception) path_elts = [local_dir, cat_dir] if sub_dir is not None: - path_elts.append(regex.pathEscape(sub_dir)) + path_elts.append(regex.path_escape(sub_dir)) return Path(*path_elts) @@ -141,7 +141,7 @@ self.cat_dir = cat_dir self.use_metadata = use_metadata - def secureUnlink(self, path): + def secure_unlink(self, path): """Unlink given path after keeping it for a while This method is used to prevent accidental deletion of a draft @@ -152,7 +152,7 @@ path = Path(path).resolve() if not path.is_file: raise OSError("path must link to a regular file") - if path.parent != getTmpDir(self.sat_conf, self.cat_dir): + if path.parent != get_tmp_dir(self.sat_conf, self.cat_dir): self.disp( f"File {path} is not in SàT temporary hierarchy, we do not remove " f"it", 2, @@ -160,7 +160,7 @@ return # we have 2 files per draft with use_metadata, so we double max unlink_max = SECURE_UNLINK_MAX * 2 if self.use_metadata else SECURE_UNLINK_MAX - backup_dir = getTmpDir(self.sat_conf, self.cat_dir, SECURE_UNLINK_DIR) + backup_dir = get_tmp_dir(self.sat_conf, self.cat_dir, SECURE_UNLINK_DIR) if not os.path.exists(backup_dir): os.makedirs(backup_dir) filename = os.path.basename(path) @@ -179,7 +179,7 @@ self.host.disp("Purging backup file {}".format(path), 2) os.unlink(path) - async def runEditor( + async def run_editor( self, editor_args_opt, content_file_path, @@ -210,12 +210,12 @@ content_file_obj.close() # we prepare arguments - editor = config.getConfig(self.sat_conf, C.CONFIG_SECTION, "editor") or os.getenv( + editor = config.config_get(self.sat_conf, C.CONFIG_SECTION, "editor") or os.getenv( "EDITOR", "vi" ) try: # is there custom arguments in sat.conf ? 
- editor_args = config.getConfig( + editor_args = config.config_get( self.sat_conf, C.CONFIG_SECTION, editor_args_opt, Exception ) except (NoOptionError, NoSectionError): @@ -291,7 +291,7 @@ if len(content) == 0: self.disp("Content is empty, cancelling the edition") - if content_file_path.parent != getTmpDir(self.sat_conf, self.cat_dir): + if content_file_path.parent != get_tmp_dir(self.sat_conf, self.cat_dir): self.disp( "File are not in SàT temporary hierarchy, we do not remove them", 2, @@ -335,22 +335,22 @@ ) self.host.quit(1) - self.secureUnlink(content_file_path) + self.secure_unlink(content_file_path) if self.use_metadata: - self.secureUnlink(meta_file_path) + self.secure_unlink(meta_file_path) async def publish(self, content): # if metadata is needed, publish will be called with it last argument raise NotImplementedError - def getTmpFile(self): + def get_tmp_file(self): """Create a temporary file @return (tuple(file, Path)): opened (w+b) file object and file path """ - suff = "." + self.getTmpSuff() + suff = "." + self.get_tmp_suff() cat_dir_str = self.cat_dir - tmp_dir = getTmpDir(self.sat_conf, self.cat_dir, self.profile) + tmp_dir = get_tmp_dir(self.sat_conf, self.cat_dir, self.profile) if not tmp_dir.exists(): try: tmp_dir.mkdir(parents=True) @@ -372,7 +372,7 @@ self.disp(f"Can't create temporary file: {e}", error=True) self.host.quit(1) - def getCurrentFile(self, profile): + def get_current_file(self, profile): """Get most recently edited file @param profile(unicode): profile linked to the draft @@ -381,7 +381,7 @@ # we guess the item currently edited by choosing # the most recent file corresponding to temp file pattern # in tmp_dir, excluding metadata files - tmp_dir = getTmpDir(self.sat_conf, self.cat_dir, profile) + tmp_dir = get_tmp_dir(self.sat_conf, self.cat_dir, profile) available = [ p for p in tmp_dir.glob(f"{self.cat_dir}_*") @@ -395,15 +395,15 @@ self.host.quit(1) return max(available, key=lambda p: p.stat().st_mtime) - async def getItemData(self, service, node, item): + async def get_item_data(self, service, node, item): """return formatted content, metadata (or not if use_metadata is false), and item id""" raise NotImplementedError - def getTmpSuff(self): + def get_tmp_suff(self): """return suffix used for content file""" return "xml" - async def getItemPath(self): + async def get_item_path(self): """Retrieve item path (i.e. 
service and node) from item argument This method is obviously only useful for edition of PubSub based features @@ -415,7 +415,7 @@ if self.args.current: # user wants to continue current draft - content_file_path = self.getCurrentFile(self.profile) + content_file_path = self.get_current_file(self.profile) self.disp("Continuing edition of current draft", 2) content_file_obj = content_file_path.open("r+b") # we seek at the end of file in case of an item already exist @@ -430,15 +430,15 @@ content_file_obj.seek(0, os.SEEK_END) else: # we need a temporary file - content_file_obj, content_file_path = self.getTmpFile() + content_file_obj, content_file_path = self.get_tmp_file() if item or last_item: self.disp("Editing requested published item", 2) try: if self.use_metadata: - content, metadata, item = await self.getItemData(service, node, item) + content, metadata, item = await self.get_item_data(service, node, item) else: - content, item = await self.getItemData(service, node, item) + content, item = await self.get_item_data(service, node, item) except Exception as e: # FIXME: ugly but we have not good may to check errors in bridge if "item-not-found" in str(e): @@ -529,7 +529,7 @@ col_value = filter_(value) # we count size without ANSI code as they will change length of the # string when it's mostly style/color changes. - col_size = len(regex.ansiRemove(col_value)) + col_size = len(regex.ansi_remove(col_value)) else: col_value = str(value) col_size = len(col_value) @@ -558,7 +558,7 @@ return "\n".join(self._buffer) @staticmethod - def readDictValues(data, keys, defaults=None): + def read_dict_values(data, keys, defaults=None): if defaults is None: defaults = {} for key in keys: @@ -572,7 +572,7 @@ raise e @classmethod - def fromListDict( + def from_list_dict( cls, host, data, keys=None, headers=None, filters=None, defaults=None ): """Create a table from a list of dictionaries @@ -600,7 +600,7 @@ filters = {} filters = [filters.get(k) for k in keys] return cls( - host, (cls.readDictValues(d, keys, defaults) for d in data), headers, filters + host, (cls.read_dict_values(d, keys, defaults) for d in data), headers, filters ) def _headers(self, head_sep, headers, sizes, alignment="left", style=None): @@ -679,7 +679,7 @@ if not self.sizes: # the table is empty return - col_sep_size = len(regex.ansiRemove(col_sep)) + col_sep_size = len(regex.ansi_remove(col_sep)) # if we have columns to hide, we remove them from headers and size if not hide_cols: @@ -784,7 +784,7 @@ host = command.host try: - uris_data = await host.bridge.URIFind(path, [key]) + uris_data = await host.bridge.uri_find(path, [key]) except Exception as e: host.disp(f"can't find {key} URI: {e}", error=True) host.quit(C.EXIT_BRIDGE_ERRBACK) @@ -824,7 +824,7 @@ values.extend(json.loads(new_values_json)) setattr(args, dest, values) - parsed_uri = xmpp_uri.parseXMPPUri(uri) + parsed_uri = xmpp_uri.parse_xmpp_uri(uri) try: args.service = parsed_uri["path"] args.node = parsed_uri["node"]
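Since the frontend helpers in common.py are renamed together with the bridge methods they wrap, external scripts still written against the old camelCase names stop working in one go. A purely hypothetical transition shim, not part of this changeset, could forward the old attribute lookups to the new names:

    # hypothetical mapping of a few old camelCase bridge names visible in this
    # diff to their new snake_case equivalents
    LEGACY_NAMES = {
        "psItemsGet": "ps_items_get",
        "getContacts": "contacts_get",
        "launchAction": "action_launch",
    }

    class LegacyBridgeProxy:
        """Forward old camelCase attribute access to the renamed bridge methods."""

        def __init__(self, bridge):
            self._bridge = bridge

        def __getattr__(self, name):
            # fall back to the requested name when no legacy alias is known
            return getattr(self._bridge, LEGACY_NAMES.get(name, name))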
--- a/sat_frontends/jp/loops.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat_frontends/jp/loops.py Sat Apr 08 13:54:42 2023 +0200
@@ -35,7 +35,7 @@
 """


-def getJPLoop(bridge_name):
+def get_jp_loop(bridge_name):
     if 'dbus' in bridge_name:
         import signal
         import threading
--- a/sat_frontends/jp/output_template.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat_frontends/jp/output_template.py Sat Apr 08 13:54:42 2023 +0200
@@ -65,7 +65,7 @@
         data to a dict usable by the template.
         """
         # media_dir is needed for the template
-        self.host.media_dir = self.host.bridge.getConfig("", "media_dir")
+        self.host.media_dir = self.host.bridge.config_get("", "media_dir")
         cmd = self.host.command
         try:
             template_path = cmd.TEMPLATE
@@ -113,7 +113,7 @@
         tmp_file = os.path.join(tmp_dir, template_name)
         with open(tmp_file, "w") as f:
             f.write(rendered.encode("utf-8"))
-        theme, theme_root_path = self.renderer.getThemeAndRoot(template_path)
+        theme, theme_root_path = self.renderer.get_theme_and_root(template_path)
         if theme is None:
             # we have an absolute path
             webbrowser
--- a/sat_frontends/jp/xml_tools.py Fri Apr 07 15:18:39 2023 +0200
+++ b/sat_frontends/jp/xml_tools.py Sat Apr 08 13:54:42 2023 +0200
@@ -20,8 +20,8 @@
 from sat.core.i18n import _
 from sat_frontends.jp.constants import Const as C

-def etreeParse(cmd, raw_xml, reraise=False):
-    """Import lxml and parse raw XML
+def etree_parse(cmd, raw_xml, reraise=False):
+    """import lxml and parse raw XML

     @param cmd(CommandBase): current command instance
     @param raw_xml(file, str): an XML bytestring, string or file-like object
@@ -51,7 +51,7 @@
         )
     return element, etree

-def getPayload(cmd, element):
+def get_payload(cmd, element):
     """Retrieve payload element and exit with an error if not found

     @param element(etree.Element): root element
--- a/sat_frontends/jp/xmlui_manager.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/jp/xmlui_manager.py Sat Apr 08 13:54:42 2023 +0200 @@ -74,7 +74,7 @@ """ raise NotImplementedError(self.__class__) - def verboseName(self, elems=None, value=None): + def verbose_name(self, elems=None, value=None): """add name in color to the elements helper method to display name which can then be used to automate commands @@ -114,7 +114,7 @@ super(InputWidget, self).__init__(xmlui_parent, value) self.read_only = read_only - def _xmluiGetValue(self): + def _xmlui_get_value(self): return self.value @@ -141,13 +141,13 @@ def value(self, value): self.selected = [value] - def _xmluiSelectValue(self, value): + def _xmlui_select_value(self, value): self.value = value - def _xmluiSelectValues(self, values): + def _xmlui_select_values(self, values): self.values = values - def _xmluiGetSelectedValues(self): + def _xmlui_get_selected_values(self): return self.values @property @@ -219,7 +219,7 @@ self.disp(self.value) else: elems = [] - self.verboseName(elems) + self.verbose_name(elems) if self.value: elems.append(_("(enter: {value})").format(value=self.value)) elems.extend([C.A_HEADER, "> "]) @@ -243,7 +243,7 @@ # TODO: use a more advanced input method async def show(self): - self.verboseName() + self.verbose_name() if self.read_only or self.root.read_only: self.disp(self.value) else: @@ -275,7 +275,7 @@ # FIXME: we use bridge in a blocking way as permitted by python-dbus # this only for now to make it simpler, it must be refactored # to use async when jp will be fully async (expected for 0.8) - self.value = await self.host.bridge.syntaxConvert( + self.value = await self.host.bridge.syntax_convert( self.value, C.SYNTAX_XHTML, "markdown", False, self.host.profile ) await super(XHTMLBoxWidget, self).show() @@ -294,14 +294,14 @@ return # list display - self.verboseName() + self.verbose_name() for idx, (value, label) in enumerate(self.options): elems = [] if not self.root.read_only: elems.extend([C.A_SUBHEADER, str(idx), A.RESET, ": "]) elems.append(label) - self.verboseName(elems, value) + self.verbose_name(elems, value) self.disp(A.color(*elems)) if self.root.read_only: @@ -350,13 +350,13 @@ choice = None while choice not in ("0", "1"): elems = [C.A_HEADER, _("your choice (0,1): ")] - self.verboseName(elems) + self.verbose_name(elems) choice = await self.host.ainput(A.color(*elems)) self.value = bool(int(choice)) self.disp("") - def _xmluiGetValue(self): - return C.boolConst(self.value) + def _xmlui_get_value(self): + return C.bool_const(self.value) ## Containers ## @@ -371,10 +371,10 @@ def __iter__(self): return iter(self.children) - def _xmluiAppend(self, widget): + def _xmlui_append(self, widget): self.children.append(widget) - def _xmluiRemove(self, widget): + def _xmlui_remove(self, widget): self.children.remove(widget) async def show(self): @@ -486,9 +486,9 @@ input_ = await self.host.ainput(f"{self.message} (y/n)? 
") input_ = input_.lower() if input_ == "y": - self._xmluiValidated() + self._xmlui_validated() else: - self._xmluiCancelled() + self._xmlui_cancelled() ## Factory ## @@ -502,7 +502,7 @@ class XMLUIPanel(xmlui_base.AIOXMLUIPanel): widget_factory = WidgetFactory() - _actions = 0 # use to keep track of bridge's launchAction calls + _actions = 0 # use to keep track of bridge's action_launch calls read_only = False values_only = False workflow = None @@ -557,11 +557,11 @@ if workflow: XMLUIPanel.workflow = workflow if XMLUIPanel.workflow: - await self.runWorkflow() + await self.run_workflow() else: await self.main_cont.show() - async def runWorkflow(self): + async def run_workflow(self): """loop into workflow commands and execute commands SUBMIT will interrupt workflow (which will be continue on callback) @@ -574,7 +574,7 @@ except IndexError: break if cmd == SUBMIT: - await self.onFormSubmitted() + await self.on_form_submitted() self.submit_id = None # avoid double submit return elif isinstance(cmd, list): @@ -585,22 +585,22 @@ widget.value = value await self.show() - async def submitForm(self, callback=None): + async def submit_form(self, callback=None): XMLUIPanel._submit_cb = callback - await self.onFormSubmitted() + await self.on_form_submitted() - async def onFormSubmitted(self, ignore=None): + async def on_form_submitted(self, ignore=None): # self.submitted is a Q&D workaround to avoid # double submit when a workflow is set if self.submitted: return self.submitted = True - await super(XMLUIPanel, self).onFormSubmitted(ignore) + await super(XMLUIPanel, self).on_form_submitted(ignore) - def _xmluiClose(self): + def _xmlui_close(self): pass - async def _launchActionCb(self, data): + async def _launch_action_cb(self, data): XMLUIPanel._actions -= 1 assert XMLUIPanel._actions >= 0 if "xmlui" in data: @@ -608,7 +608,7 @@ xmlui = create(self.host, xmlui_raw) await xmlui.show() if xmlui.submit_id: - await xmlui.onFormSubmitted() + await xmlui.on_form_submitted() # TODO: handle data other than XMLUI if not XMLUIPanel._actions: if self._submit_cb is None: @@ -616,10 +616,10 @@ else: self._submit_cb() - async def _xmluiLaunchAction(self, action_id, data): + async def _xmlui_launch_action(self, action_id, data): XMLUIPanel._actions += 1 try: - data = await self.host.bridge.launchAction( + data = await self.host.bridge.action_launch( action_id, data, self.profile, @@ -628,7 +628,7 @@ self.disp(f"can't launch XMLUI action: {e}", error=True) self.host.quit(C.EXIT_BRIDGE_ERRBACK) else: - await self._launchActionCb(data) + await self._launch_action_cb(data) class XMLUIDialog(xmlui_base.XMLUIDialog): @@ -639,7 +639,7 @@ async def show(self, __=None): await self.dlg.show() - def _xmluiClose(self): + def _xmlui_close(self): pass
--- a/sat_frontends/primitivus/base.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/base.py Sat Apr 08 13:54:42 2023 +0200 @@ -20,7 +20,7 @@ from sat.core.i18n import _, D_ from sat_frontends.primitivus.constants import Const as C from sat.core import log_config -log_config.satConfigure(C.LOG_BACKEND_STANDARD, C) +log_config.sat_configure(C.LOG_BACKEND_STANDARD, C) from sat.core import log as logging log = logging.getLogger(__name__) from sat.tools import config as sat_config @@ -45,8 +45,8 @@ import sys ## bridge handling # we get bridge name from conf and initialise the right class accordingly -main_config = sat_config.parseMainConf() -bridge_name = sat_config.getConfig(main_config, '', 'bridge', 'dbus') +main_config = sat_config.parse_main_conf() +bridge_name = sat_config.config_get(main_config, '', 'bridge', 'dbus') if 'dbus' not in bridge_name: print(u"only D-Bus bridge is currently supported") sys.exit(3) @@ -63,8 +63,8 @@ a_key['MODE_COMMAND']: (C.MODE_COMMAND, u':')} #XXX: captions *MUST* be unicode super(EditBar, self).__init__(modes) self.host = host - self.setCompletionMethod(self._text_completion) - urwid.connect_signal(self, 'click', self.onTextEntered) + self.set_completion_method(self._text_completion) + urwid.connect_signal(self, 'click', self.on_text_entered) def _text_completion(self, text, completion_data, mode): if mode == C.MODE_INSERTION: @@ -78,47 +78,47 @@ else: return text - def onTextEntered(self, editBar): + def on_text_entered(self, editBar): """Called when text is entered in the main edit bar""" if self.mode == C.MODE_INSERTION: if isinstance(self.host.selected_widget, quick_chat.QuickChat): chat_widget = self.host.selected_widget - self.host.messageSend( + self.host.message_send( chat_widget.target, {'': editBar.get_edit_text()}, # TODO: handle language mess_type = C.MESS_TYPE_GROUPCHAT if chat_widget.type == C.CHAT_GROUP else C.MESS_TYPE_CHAT, # TODO: put this in QuickChat - errback=lambda failure: self.host.showDialog(_("Error while sending message ({})").format(failure), type="error"), + errback=lambda failure: self.host.show_dialog(_("Error while sending message ({})").format(failure), type="error"), profile_key=chat_widget.profile ) editBar.set_edit_text('') elif self.mode == C.MODE_COMMAND: - self.commandHandler() + self.command_handler() - def commandHandler(self): + def command_handler(self): #TODO: separate class with auto documentation (with introspection) # and completion method tokens = self.get_edit_text().split(' ') command, args = tokens[0], tokens[1:] if command == 'quit': - self.host.onExit() + self.host.on_exit() raise urwid.ExitMainLoop() elif command == 'messages': - wid = sat_widgets.GenericList(logging.memoryGet()) - self.host.selectWidget(wid) + wid = sat_widgets.GenericList(logging.memory_get()) + self.host.select_widget(wid) # FIXME: reactivate the command # elif command == 'presence': # values = [value for value in commonConst.PRESENCE.keys()] # values = [value if value else 'online' for value in values] # the empty value actually means 'online' # if args and args[0] in values: # presence = '' if args[0] == 'online' else args[0] - # self.host.status_bar.onChange(user_data=sat_widgets.ClickableText(commonConst.PRESENCE[presence])) + # self.host.status_bar.on_change(user_data=sat_widgets.ClickableText(commonConst.PRESENCE[presence])) # else: - # self.host.status_bar.onPresenceClick() + # self.host.status_bar.on_presence_click() # elif command == 'status': # if args: - # 
self.host.status_bar.onChange(user_data=sat_widgets.AdvancedEdit(args[0])) + # self.host.status_bar.on_change(user_data=sat_widgets.AdvancedEdit(args[0])) # else: - # self.host.status_bar.onStatusClick() + # self.host.status_bar.on_status_click() elif command == 'history': widget = self.host.selected_widget if isinstance(widget, quick_chat.QuickChat): @@ -126,22 +126,22 @@ limit = int(args[0]) except (IndexError, ValueError): limit = 50 - widget.updateHistory(size=limit, profile=widget.profile) + widget.update_history(size=limit, profile=widget.profile) elif command == 'search': widget = self.host.selected_widget if isinstance(widget, quick_chat.QuickChat): pattern = " ".join(args) if not pattern: - self.host.notif_bar.addMessage(D_("Please specify the globbing pattern to search for")) + self.host.notif_bar.add_message(D_("Please specify the globbing pattern to search for")) else: - widget.updateHistory(size=C.HISTORY_LIMIT_NONE, filters={'search': pattern}, profile=widget.profile) + widget.update_history(size=C.HISTORY_LIMIT_NONE, filters={'search': pattern}, profile=widget.profile) elif command == 'filter': # FIXME: filter is now only for current widget, # need to be able to set it globally or per widget widget = self.host.selected_widget # FIXME: Q&D way, need to be more generic if isinstance(widget, quick_chat.QuickChat): - widget.setFilter(args) + widget.set_filter(args) elif command in ('topic', 'suject', 'title'): try: new_title = args[0].strip() @@ -149,12 +149,12 @@ new_title = None widget = self.host.selected_widget if isinstance(widget, quick_chat.QuickChat) and widget.type == C.CHAT_GROUP: - widget.onSubjectDialog(new_title) + widget.on_subject_dialog(new_title) else: return self.set_edit_text('') - def _historyCb(self, text): + def _history_cb(self, text): self.set_edit_text(text) self.set_edit_pos(len(text)) @@ -163,25 +163,25 @@ and move the index of the temporary history stack.""" if key == a_key['MODAL_ESCAPE']: # first save the text to the current mode, then change to NORMAL - self.host._updateInputHistory(self.get_edit_text(), mode=self.mode) - self.host._updateInputHistory(mode=C.MODE_NORMAL) + self.host._update_input_history(self.get_edit_text(), mode=self.mode) + self.host._update_input_history(mode=C.MODE_NORMAL) if self._mode == C.MODE_NORMAL and key in self._modes: - self.host._updateInputHistory(mode=self._modes[key][0]) + self.host._update_input_history(mode=self._modes[key][0]) if key == a_key['HISTORY_PREV']: - self.host._updateInputHistory(self.get_edit_text(), -1, self._historyCb, self.mode) + self.host._update_input_history(self.get_edit_text(), -1, self._history_cb, self.mode) return elif key == a_key['HISTORY_NEXT']: - self.host._updateInputHistory(self.get_edit_text(), +1, self._historyCb, self.mode) + self.host._update_input_history(self.get_edit_text(), +1, self._history_cb, self.mode) return elif key == a_key['EDIT_ENTER']: - self.host._updateInputHistory(self.get_edit_text(), mode=self.mode) + self.host._update_input_history(self.get_edit_text(), mode=self.mode) else: if (self._mode == C.MODE_INSERTION and isinstance(self.host.selected_widget, quick_chat.QuickChat) and key not in sat_widgets.FOCUS_KEYS and key not in (a_key['HISTORY_PREV'], a_key['HISTORY_NEXT']) and self.host.sync): - self.host.bridge.chatStateComposing(self.host.selected_widget.target, self.host.selected_widget.profile) + self.host.bridge.chat_state_composing(self.host.selected_widget.target, self.host.selected_widget.profile) return super(EditBar, self).keypress(size, key) @@ 
-203,12 +203,12 @@ for position in self.positions: setattr(self, position, - property(lambda: self, self.widgetGet(position=position), - lambda pos, new_wid: self.widgetSet(new_wid, position=pos)) + property(lambda: self, self.widget_get(position=position), + lambda pos, new_wid: self.widget_set(new_wid, position=pos)) ) self.focus_position = len(self.contents)-1 - def getVisiblePositions(self, keep=None): + def get_visible_positions(self, keep=None): """Return positions that are not hidden in the right order @param keep: if not None, this position will be keep in the right order, even if it's hidden @@ -241,27 +241,27 @@ if focus in self._hidden: return - self.focus_position = self.getVisiblePositions().index(focus) + self.focus_position = self.get_visible_positions().index(focus) return return super(PrimitivusTopWidget, self).keypress(size, key) - def widgetGet(self, position): + def widget_get(self, position): if not position in self.positions: raise ValueError("Unknown position {}".format(position)) return getattr(self, "_{}".format(position)) - def widgetSet(self, widget, position): + def widget_set(self, widget, position): if not position in self.positions: raise ValueError("Unknown position {}".format(position)) return setattr(self, "_{}".format(position), widget) - def hideSwitch(self, position): + def hide_switch(self, position): if not position in self.can_hide: raise ValueError("Can't switch position {}".format(position)) hide = not position in self._hidden - widget = self.widgetGet(position) - idx = self.getVisiblePositions(position).index(position) + widget = self.widget_get(position) + idx = self.get_visible_positions(position).index(position) if hide: del self.contents[idx] self._hidden.add(position) @@ -271,11 +271,11 @@ def show(self, position): if position in self._hidden: - self.hideSwitch(position) + self.hide_switch(position) def hide(self, position): if not position in self._hidden: - self.hideSwitch(position) + self.hide_switch(position) class PrimitivusApp(QuickApp, InputHistory): @@ -289,33 +289,33 @@ sys.exit(3) else: log.debug(u"Loading {} bridge".format(bridge_name)) - QuickApp.__init__(self, bridge_factory=bridge_module.Bridge, xmlui=xmlui, check_options=quick_utils.check_options, connect_bridge=False) + QuickApp.__init__(self, bridge_factory=bridge_module.bridge, xmlui=xmlui, check_options=quick_utils.check_options, connect_bridge=False) ## main loop setup ## event_loop = urwid.GLibEventLoop if 'dbus' in bridge_name else urwid.TwistedEventLoop - self.loop = urwid.MainLoop(urwid.SolidFill(), C.PALETTE, event_loop=event_loop(), input_filter=self.inputFilter, unhandled_input=self.keyHandler) + self.loop = urwid.MainLoop(urwid.SolidFill(), C.PALETTE, event_loop=event_loop(), input_filter=self.input_filter, unhandled_input=self.key_handler) @classmethod def run(cls): cls().start() - def onBridgeConnected(self): + def on_bridge_connected(self): ##misc setup## self._visible_widgets = set() self.notif_bar = sat_widgets.NotificationBar() - urwid.connect_signal(self.notif_bar, 'change', self.onNotification) + urwid.connect_signal(self.notif_bar, 'change', self.on_notification) - self.progress_wid = self.widgets.getOrCreateWidget(Progress, None, on_new_widget=None) - urwid.connect_signal(self.notif_bar.progress, 'click', lambda x: self.selectWidget(self.progress_wid)) + self.progress_wid = self.widgets.get_or_create_widget(Progress, None, on_new_widget=None) + urwid.connect_signal(self.notif_bar.progress, 'click', lambda x: self.select_widget(self.progress_wid)) 
self.__saved_overlay = None self.x_notify = Notify() # we already manage exit with a_key['APP_QUIT'], so we don't want C-c signal.signal(signal.SIGINT, signal.SIG_IGN) - sat_conf = sat_config.parseMainConf() + sat_conf = sat_config.parse_main_conf() self._bracketed_paste = C.bool( - sat_config.getConfig(sat_conf, C.CONFIG_SECTION, 'bracketed_paste', 'false') + sat_config.config_get(sat_conf, C.CONFIG_SECTION, 'bracketed_paste', 'false') ) if self._bracketed_paste: log.debug("setting bracketed paste mode as requested") @@ -323,7 +323,7 @@ self._bracketed_mode_set = True self.loop.widget = self.main_widget = ProfileManager(self) - self.postInit() + self.post_init() @property def visible_widgets(self): @@ -337,7 +337,7 @@ def mode(self, value): self.editBar.mode = value - def modeHint(self, value): + def mode_hint(self, value): """Change mode if make sens (i.e.: if there is nothing in the editBar)""" if not self.editBar.get_edit_text(): self.mode = value @@ -366,22 +366,22 @@ pass def start(self): - self.connectBridge() + self.connect_bridge() self.loop.run() - def postInit(self): + def post_init(self): try: - config.applyConfig(self) + config.apply_config(self) except Exception as e: log.error(u"configuration error: {}".format(e)) popup = self.alert(_(u"Configuration Error"), _(u"Something went wrong while reading the configuration, please check :messages")) if self.options.profile: self._early_popup = popup else: - self.showPopUp(popup) - super(PrimitivusApp, self).postInit(self.main_widget) + self.show_pop_up(popup) + super(PrimitivusApp, self).post_init(self.main_widget) - def keysToText(self, keys): + def keys_to_text(self, keys): """Generator return normal text from urwid keys""" for k in keys: if k == 'tab': @@ -391,7 +391,7 @@ elif is_wide_char(k,0) or (len(k)==1 and ord(k) >= 32): yield k - def inputFilter(self, input_, raw): + def input_filter(self, input_, raw): if self.__saved_overlay and input_ != a_key['OVERLAY_HIDE']: return @@ -439,7 +439,7 @@ if self.main_widget.focus == edit_bar: # XXX: if a paste is detected, we append it directly to the edit bar text # so the user can check it and press [enter] if it's OK - buf_paste = u''.join(self.keysToText(input_)) + buf_paste = u''.join(self.keys_to_text(input_)) pos = edit_bar.edit_pos edit_bar.set_edit_text(u'{}{}{}'.format(edit_bar.edit_text[:pos], buf_paste, edit_bar.edit_text[pos:])) edit_bar.edit_pos+=len(buf_paste) @@ -463,16 +463,16 @@ input_[input_.index(i)] = a_key['HISTORY_NEXT'] return input_ - def keyHandler(self, input_): + def key_handler(self, input_): if input_ == a_key['MENU_HIDE']: """User want to (un)hide the menu roller""" try: - self.main_widget.hideSwitch('menu') + self.main_widget.hide_switch('menu') except AttributeError: pass elif input_ == a_key['NOTIFICATION_NEXT']: """User wants to see next notification""" - self.notif_bar.showNext() + self.notif_bar.show_next() elif input_ == a_key['OVERLAY_HIDE']: """User wants to (un)hide overlay window""" if isinstance(self.loop.widget,urwid.Overlay): @@ -483,7 +483,7 @@ self.loop.widget = self.__saved_overlay self.__saved_overlay = None - elif input_ == a_key['DEBUG'] and 'D' in self.bridge.getVersion(): #Debug only for dev versions + elif input_ == a_key['DEBUG'] and 'D' in self.bridge.version_get(): #Debug only for dev versions self.debug() elif input_ == a_key['CONTACTS_HIDE']: #user wants to (un)hide the contact lists try: @@ -507,74 +507,74 @@ self.loop.widget = self.save_main_widget del self.save_main_widget try: - return self.menu_roller.checkShortcuts(input_) 
+ return self.menu_roller.check_shortcuts(input_) except AttributeError: return input_ - def addMenus(self, menu, type_filter, menu_data=None): + def add_menus(self, menu, type_filter, menu_data=None): """Add cached menus to instance @param menu: sat_widgets.Menu instance - @param type_filter: menu type like is sat.core.sat_main.importMenu + @param type_filter: menu type like is sat.core.sat_main.import_menu @param menu_data: data to send with these menus """ def add_menu_cb(callback_id): - self.launchAction(callback_id, menu_data, profile=self.current_profile) - for id_, type_, path, path_i18n, extra in self.bridge.menusGet("", C.NO_SECURITY_LIMIT ): # TODO: manage extra + self.action_launch(callback_id, menu_data, profile=self.current_profile) + for id_, type_, path, path_i18n, extra in self.bridge.menus_get("", C.NO_SECURITY_LIMIT ): # TODO: manage extra if type_ != type_filter: continue if len(path) != 2: raise NotImplementedError("Menu with a path != 2 are not implemented yet") - menu.addMenu(path_i18n[0], path_i18n[1], lambda dummy,id_=id_: add_menu_cb(id_)) + menu.add_menu(path_i18n[0], path_i18n[1], lambda dummy,id_=id_: add_menu_cb(id_)) - def _buildMenuRoller(self): + def _build_menu_roller(self): menu = sat_widgets.Menu(self.loop) general = _("General") - menu.addMenu(general, _("Connect"), self.onConnectRequest) - menu.addMenu(general, _("Disconnect"), self.onDisconnectRequest) - menu.addMenu(general, _("Parameters"), self.onParam) - menu.addMenu(general, _("About"), self.onAboutRequest) - menu.addMenu(general, _("Exit"), self.onExitRequest, a_key['APP_QUIT']) - menu.addMenu(_("Contacts")) # add empty menu to save the place in the menu order + menu.add_menu(general, _("Connect"), self.on_connect_request) + menu.add_menu(general, _("Disconnect"), self.on_disconnect_request) + menu.add_menu(general, _("Parameters"), self.on_param) + menu.add_menu(general, _("About"), self.on_about_request) + menu.add_menu(general, _("Exit"), self.on_exit_request, a_key['APP_QUIT']) + menu.add_menu(_("Contacts")) # add empty menu to save the place in the menu order groups = _("Groups") - menu.addMenu(groups) - menu.addMenu(groups, _("Join room"), self.onJoinRoomRequest, a_key['ROOM_JOIN']) + menu.add_menu(groups) + menu.add_menu(groups, _("Join room"), self.on_join_room_request, a_key['ROOM_JOIN']) #additionals menus #FIXME: do this in a more generic way (in quickapp) - self.addMenus(menu, C.MENU_GLOBAL) + self.add_menus(menu, C.MENU_GLOBAL) menu_roller = sat_widgets.MenuRoller([(_('Main menu'), menu, C.MENU_ID_MAIN)]) return menu_roller - def _buildMainWidget(self): + def _build_main_widget(self): self.contact_lists_pile = urwid.Pile([]) #self.center_part = urwid.Columns([('weight',2,self.contact_lists[profile]),('weight',8,Chat('',self))]) self.center_part = urwid.Columns([('weight', 2, self.contact_lists_pile), ('weight', 8, urwid.Filler(urwid.Text('')))]) self.editBar = EditBar(self) - self.menu_roller = self._buildMenuRoller() + self.menu_roller = self._build_menu_roller() self.main_widget = PrimitivusTopWidget(self.center_part, self.menu_roller, self.notif_bar, self.editBar) return self.main_widget def plugging_profiles(self): - self.loop.widget = self._buildMainWidget() + self.loop.widget = self._build_main_widget() self.redraw() try: # if a popup arrived before main widget is build, we need to show it now - self.showPopUp(self._early_popup) + self.show_pop_up(self._early_popup) except AttributeError: pass else: del self._early_popup - def profilePlugged(self, profile): - 
QuickApp.profilePlugged(self, profile) - contact_list = self.widgets.getOrCreateWidget(ContactList, None, on_new_widget=None, on_click=self.contactSelected, on_change=lambda w: self.redraw(), profile=profile) + def profile_plugged(self, profile): + QuickApp.profile_plugged(self, profile) + contact_list = self.widgets.get_or_create_widget(ContactList, None, on_new_widget=None, on_click=self.contact_selected, on_change=lambda w: self.redraw(), profile=profile) self.contact_lists_pile.contents.append((contact_list, ('weight', 1))) return contact_list - def isHidden(self): + def is_hidden(self): """Tells if the frontend window is hidden. @return bool @@ -590,11 +590,11 @@ @return (urwid_satext.Alert): the created Alert instance """ popup = sat_widgets.Alert(title, message) - popup.setCallback('ok', lambda dummy: self.removePopUp(popup)) - self.showPopUp(popup, width=75, height=20) + popup.set_callback('ok', lambda dummy: self.remove_pop_up(popup)) + self.show_pop_up(popup, width=75, height=20) return popup - def removePopUp(self, widget=None): + def remove_pop_up(self, widget=None): """Remove current pop-up, and if there is other in queue, show it @param widget(None, urwid.Widget): if not None remove this popup from front or queue @@ -607,19 +607,19 @@ current_popup = self.loop.widget.top_w if not current_popup == widget: try: - self.notif_bar.removePopUp(widget) + self.notif_bar.remove_pop_up(widget) except ValueError: log.warning(u"Trying to remove an unknown widget {}".format(widget)) return self.loop.widget = self.main_widget - next_popup = self.notif_bar.getNextPopup() + next_popup = self.notif_bar.get_next_popup() if next_popup: #we still have popup to show, we display it - self.showPopUp(next_popup) + self.show_pop_up(next_popup) else: self.redraw() - def showPopUp(self, pop_up_widget, width=None, height=None, align='center', + def show_pop_up(self, pop_up_widget, width=None, height=None, align='center', valign='middle'): """Show a pop-up window if possible, else put it in queue @@ -640,11 +640,11 @@ self.loop.widget = display_widget self.redraw() else: - self.notif_bar.addPopUp(pop_up_widget) + self.notif_bar.add_pop_up(pop_up_widget) - def barNotify(self, message): + def bar_notify(self, message): """"Notify message to user via notification bar""" - self.notif_bar.addMessage(message) + self.notif_bar.add_message(message) self.redraw() def notify(self, type_, entity=None, message=None, subject=None, callback=None, cb_args=None, widget=None, profile=C.PROF_KEY_NONE): @@ -653,15 +653,15 @@ # still do a desktop notification is the X window has not the focus super(PrimitivusApp, self).notify(type_, entity, message, subject, callback, cb_args, widget, profile) # we don't want notifications without message on desktop - if message is not None and not self.x_notify.hasFocus(): + if message is not None and not self.x_notify.has_focus(): if message is None: message = _("{app}: a new event has just happened{entity}").format( app=C.APP_NAME, entity=u' ({})'.format(entity) if entity else '') - self.x_notify.sendNotification(message) + self.x_notify.send_notification(message) - def newWidget(self, widget, user_action=False): + def new_widget(self, widget, user_action=False): """Method called when a new widget is created if suitable, the widget will be displayed @@ -672,9 +672,9 @@ # FIXME: when several widgets are possible (e.g. 
with :split) # do not replace current widget when self.selected_widget != None if user_action or self.selected_widget is None: - self.selectWidget(widget) + self.select_widget(widget) - def selectWidget(self, widget): + def select_widget(self, widget): """Display a widget if possible, else add it in the notification bar queue @@ -684,27 +684,27 @@ wid_idx = len(self.center_part.widget_list)-1 self.center_part.widget_list[wid_idx] = widget try: - self.menu_roller.removeMenu(C.MENU_ID_WIDGET) + self.menu_roller.remove_menu(C.MENU_ID_WIDGET) except KeyError: log.debug("No menu to delete") self.selected_widget = widget try: - onSelected = self.selected_widget.onSelected + on_selected = self.selected_widget.on_selected except AttributeError: pass else: - onSelected() + on_selected() self._visible_widgets = set([widget]) # XXX: we can only have one widget visible at the time for now self.contact_lists.select(None) - for wid in self.visible_widgets: # FIXME: check if widgets.getWidgets is not more appropriate + for wid in self.visible_widgets: # FIXME: check if widgets.get_widgets is not more appropriate if isinstance(wid, Chat): contact_list = self.contact_lists[wid.profile] contact_list.select(wid.target) self.redraw() - def removeWindow(self): + def remove_window(self): """Remove window showed on the right column""" #TODO: better Window management than this hack assert len(self.center_part.widget_list) <= 2 @@ -713,7 +713,7 @@ self.center_part.focus_position = 0 self.redraw() - def addProgress(self, pid, message, profile): + def add_progress(self, pid, message, profile): """Follow a SàT progression @param pid: progression id @@ -721,118 +721,118 @@ """ self.progress_wid.add(pid, message, profile) - def setProgress(self, percentage): + def set_progress(self, percentage): """Set the progression shown in notification bar""" - self.notif_bar.setProgress(percentage) + self.notif_bar.set_progress(percentage) - def contactSelected(self, contact_list, entity): - self.clearNotifs(entity, profile=contact_list.profile) + def contact_selected(self, contact_list, entity): + self.clear_notifs(entity, profile=contact_list.profile) if entity.resource: # we have clicked on a private MUC conversation - chat_widget = self.widgets.getOrCreateWidget(Chat, entity, on_new_widget=None, force_hash = Chat.getPrivateHash(contact_list.profile, entity), profile=contact_list.profile) + chat_widget = self.widgets.get_or_create_widget(Chat, entity, on_new_widget=None, force_hash = Chat.get_private_hash(contact_list.profile, entity), profile=contact_list.profile) else: - chat_widget = self.widgets.getOrCreateWidget(Chat, entity, on_new_widget=None, profile=contact_list.profile) - self.selectWidget(chat_widget) - self.menu_roller.addMenu(_('Chat menu'), chat_widget.getMenu(), C.MENU_ID_WIDGET) + chat_widget = self.widgets.get_or_create_widget(Chat, entity, on_new_widget=None, profile=contact_list.profile) + self.select_widget(chat_widget) + self.menu_roller.add_menu(_('Chat menu'), chat_widget.get_menu(), C.MENU_ID_WIDGET) - def _dialogOkCb(self, widget, data): + def _dialog_ok_cb(self, widget, data): popup, answer_cb, answer_data = data - self.removePopUp(popup) + self.remove_pop_up(popup) if answer_cb is not None: answer_cb(True, answer_data) - def _dialogCancelCb(self, widget, data): + def _dialog_cancel_cb(self, widget, data): popup, answer_cb, answer_data = data - self.removePopUp(popup) + self.remove_pop_up(popup) if answer_cb is not None: answer_cb(False, answer_data) - def showDialog(self, message, title="", 
type="info", answer_cb = None, answer_data = None): + def show_dialog(self, message, title="", type="info", answer_cb = None, answer_data = None): if type == 'info': popup = sat_widgets.Alert(title, message, ok_cb=answer_cb) if answer_cb is None: - popup.setCallback('ok', lambda dummy: self.removePopUp(popup)) + popup.set_callback('ok', lambda dummy: self.remove_pop_up(popup)) elif type == 'error': popup = sat_widgets.Alert(title, message, ok_cb=answer_cb) if answer_cb is None: - popup.setCallback('ok', lambda dummy: self.removePopUp(popup)) + popup.set_callback('ok', lambda dummy: self.remove_pop_up(popup)) elif type == 'yes/no': popup = sat_widgets.ConfirmDialog(message) - popup.setCallback('yes', self._dialogOkCb, (popup, answer_cb, answer_data)) - popup.setCallback('no', self._dialogCancelCb, (popup, answer_cb, answer_data)) + popup.set_callback('yes', self._dialog_ok_cb, (popup, answer_cb, answer_data)) + popup.set_callback('no', self._dialog_cancel_cb, (popup, answer_cb, answer_data)) else: popup = sat_widgets.Alert(title, message, ok_cb=answer_cb) if answer_cb is None: - popup.setCallback('ok', lambda dummy: self.removePopUp(popup)) + popup.set_callback('ok', lambda dummy: self.remove_pop_up(popup)) log.error(u'unmanaged dialog type: {}'.format(type)) - self.showPopUp(popup) + self.show_pop_up(popup) - def dialogFailure(self, failure): + def dialog_failure(self, failure): """Show a failure that has been returned by an asynchronous bridge method. @param failure (defer.Failure): Failure instance """ self.alert(failure.classname, failure.message) - def onNotification(self, notif_bar): + def on_notification(self, notif_bar): """Called when a new notification has been received""" if not isinstance(self.main_widget, PrimitivusTopWidget): #if we are not in the main configuration, we ignore the notifications bar return - if self.notif_bar.canHide(): + if self.notif_bar.can_hide(): #No notification left, we can hide the bar self.main_widget.hide('notif_bar') else: self.main_widget.show('notif_bar') self.redraw() # FIXME: invalidate cache in a more efficient way - def _actionManagerUnknownError(self): + def _action_manager_unknown_error(self): self.alert(_("Error"), _(u"Unmanaged action")) - def roomJoinedHandler(self, room_jid_s, room_nicks, user_nick, subject, profile): - super(PrimitivusApp, self).roomJoinedHandler(room_jid_s, room_nicks, user_nick, subject, profile) + def room_joined_handler(self, room_jid_s, room_nicks, user_nick, subject, profile): + super(PrimitivusApp, self).room_joined_handler(room_jid_s, room_nicks, user_nick, subject, profile) # if self.selected_widget is None: - # for contact_list in self.widgets.getWidgets(ContactList): + # for contact_list in self.widgets.get_widgets(ContactList): # if profile in contact_list.profiles: - # contact_list.setFocus(jid.JID(room_jid_s), True) + # contact_list.set_focus(jid.JID(room_jid_s), True) - def progressStartedHandler(self, pid, metadata, profile): - super(PrimitivusApp, self).progressStartedHandler(pid, metadata, profile) - self.addProgress(pid, metadata.get('name', _(u'unkown')), profile) + def progress_started_handler(self, pid, metadata, profile): + super(PrimitivusApp, self).progress_started_handler(pid, metadata, profile) + self.add_progress(pid, metadata.get('name', _(u'unkown')), profile) - def progressFinishedHandler(self, pid, metadata, profile): + def progress_finished_handler(self, pid, metadata, profile): log.info(u"Progress {} finished".format(pid)) - super(PrimitivusApp, self).progressFinishedHandler(pid, 
metadata, profile) + super(PrimitivusApp, self).progress_finished_handler(pid, metadata, profile) - def progressErrorHandler(self, pid, err_msg, profile): + def progress_error_handler(self, pid, err_msg, profile): log.warning(u"Progress {pid} error: {err_msg}".format(pid=pid, err_msg=err_msg)) - super(PrimitivusApp, self).progressErrorHandler(pid, err_msg, profile) + super(PrimitivusApp, self).progress_error_handler(pid, err_msg, profile) ##DIALOGS CALLBACKS## - def onJoinRoom(self, button, edit): - self.removePopUp() + def on_join_room(self, button, edit): + self.remove_pop_up() room_jid = jid.JID(edit.get_edit_text()) - self.bridge.mucJoin(room_jid, self.profiles[self.current_profile].whoami.node, {}, self.current_profile, callback=lambda dummy: None, errback=self.dialogFailure) + self.bridge.muc_join(room_jid, self.profiles[self.current_profile].whoami.node, {}, self.current_profile, callback=lambda dummy: None, errback=self.dialog_failure) #MENU EVENTS# - def onConnectRequest(self, menu): + def on_connect_request(self, menu): QuickApp.connect(self, self.current_profile) - def onDisconnectRequest(self, menu): + def on_disconnect_request(self, menu): self.disconnect(self.current_profile) - def onParam(self, menu): + def on_param(self, menu): def success(params): ui = xmlui.create(self, xml_data=params, profile=self.current_profile) ui.show() def failure(error): self.alert(_("Error"), _("Can't get parameters (%s)") % error) - self.bridge.getParamsUI(app=C.APP_NAME, profile_key=self.current_profile, callback=success, errback=failure) + self.bridge.param_ui_get(app=C.APP_NAME, profile_key=self.current_profile, callback=success, errback=failure) - def onExitRequest(self, menu): - QuickApp.onExit(self) + def on_exit_request(self, menu): + QuickApp.on_exit(self) try: if self._bracketed_mode_set: # we don't unset if bracketed paste mode was detected automatically (i.e. not in conf) log.debug("unsetting bracketed paste mode") @@ -841,21 +841,21 @@ pass raise urwid.ExitMainLoop() - def onJoinRoomRequest(self, menu): + def on_join_room_request(self, menu): """User wants to join a MUC room""" - pop_up_widget = sat_widgets.InputDialog(_("Entering a MUC room"), _("Please enter MUC's JID"), default_txt=self.bridge.mucGetDefaultService(), ok_cb=self.onJoinRoom) - pop_up_widget.setCallback('cancel', lambda dummy: self.removePopUp(pop_up_widget)) - self.showPopUp(pop_up_widget) + pop_up_widget = sat_widgets.InputDialog(_("Entering a MUC room"), _("Please enter MUC's JID"), default_txt=self.bridge.muc_get_default_service(), ok_cb=self.on_join_room) + pop_up_widget.set_callback('cancel', lambda dummy: self.remove_pop_up(pop_up_widget)) + self.show_pop_up(pop_up_widget) - def onAboutRequest(self, menu): - self.alert(_("About"), C.APP_NAME + " v" + self.bridge.getVersion()) + def on_about_request(self, menu): + self.alert(_("About"), C.APP_NAME + " v" + self.bridge.version_get()) #MISC CALLBACKS# - def setPresenceStatus(self, show='', status=None, profile=C.PROF_KEY_NONE): - contact_list_wid = self.widgets.getWidget(ContactList, profiles=profile) + def set_presence_status(self, show='', status=None, profile=C.PROF_KEY_NONE): + contact_list_wid = self.widgets.get_widget(ContactList, profiles=profile) if contact_list_wid is not None: - contact_list_wid.status_bar.setPresenceStatus(show, status) + contact_list_wid.status_bar.set_presence_status(show, status) else: log.warning(u"No ContactList widget found for profile {}".format(profile))
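
The renames in this hunk are purely mechanical: the bridge and host methods keep their signatures, only the names move to snake_case. A short sketch of updated call sites, assuming ``host`` is a connected PrimitivusApp instance and reusing the calls shown above (``room_jid`` is a ``jid.JID`` as in ``on_join_room``)::

    def show_about(host):
        # ex bridge.getVersion(); C.APP_NAME is used instead of the literal in the real code
        version = host.bridge.version_get()
        host.alert("About", "Primitivus v" + version)

    def join_room(host, room_jid):
        # ex bridge.mucJoin(); same arguments, only the name changed
        host.bridge.muc_join(
            room_jid,
            host.profiles[host.current_profile].whoami.node,
            {},
            host.current_profile,
            callback=lambda __: None,
            errback=host.dialog_failure,
        )
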
--- a/sat_frontends/primitivus/chat.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/chat.py Sat Apr 08 13:54:42 2023 +0200 @@ -53,9 +53,9 @@ @property def markup(self): return ( - self._generateInfoMarkup() + self._generate_info_markup() if self.mess_data.type == C.MESS_TYPE_INFO - else self._generateMarkup() + else self._generate_markup() ) @property @@ -104,10 +104,10 @@ canvas.set_cursor(self.get_cursor_coords(size)) return canvas - def _generateInfoMarkup(self): + def _generate_info_markup(self): return ("info_msg", self.message) - def _generateMarkup(self): + def _generate_markup(self): """Generate text markup according to message data and Widget options""" markup = [] d = self.mess_data @@ -163,12 +163,12 @@ def __init__(self, occupant_data): self.occupant_data = occupant_data occupant_data.widgets.add(self) - markup = self._generateMarkup() + markup = self._generate_markup() text = sat_widgets.ClickableText(markup) urwid.connect_signal( text, "click", - self.occupant_data.parent._occupantsClicked, + self.occupant_data.parent._occupants_clicked, user_args=[self.occupant_data], ) super(OccupantWidget, self).__init__(text) @@ -186,7 +186,7 @@ @property def markup(self): - return self._generateMarkup() + return self._generate_markup() @property def parent(self): @@ -217,7 +217,7 @@ canvas.set_cursor(self.get_cursor_coords(size)) return canvas - def _generateMarkup(self): + def _generate_markup(self): # TODO: role and affiliation are shown in a Q&D way # should be more intuitive and themable o = self.occupant_data @@ -240,7 +240,7 @@ self.parent = parent self.occupants_walker = urwid.SimpleListWalker([]) self.occupants_footer = urwid.Text("", align="center") - self.updateFooter() + self.update_footer() occupants_widget = urwid.Frame( urwid.ListBox(self.occupants_walker), footer=self.occupants_footer ) @@ -253,12 +253,12 @@ def clear(self): del self.occupants_walker[:] - def updateFooter(self): + def update_footer(self): """update footer widget""" txt = OCCUPANTS_FOOTER.format(len(self.parent.occupants)) self.occupants_footer.set_text(txt) - def getNicks(self, start=""): + def get_nicks(self, start=""): """Return nicks of all occupants @param start(unicode): only return nicknames which start with this text @@ -272,14 +272,14 @@ def addUser(self, occupant_data): """add a user to the list""" bisect.insort(self.occupants_walker, OccupantWidget(occupant_data)) - self.updateFooter() + self.update_footer() self.parent.host.redraw() # FIXME: should not be necessary def removeUser(self, occupant_data): """remove a user from the list""" for widget in occupant_data.widgets: self.occupants_walker.remove(widget) - self.updateFooter() + self.update_footer() self.parent.host.redraw() # FIXME: should not be necessary @@ -305,8 +305,8 @@ self.occupants_panel = sat_widgets.VerticalSeparator( self.occupants_widget ) - self._appendOccupantsPanel() - self.host.addListener("presence", self.presenceListener, [profiles]) + self._append_occupants_panel() + self.host.addListener("presence", self.presence_listener, [profiles]) # focus marker is a separator indicated last visible message before focus was lost self.focus_marker = None # link to current marker @@ -314,7 +314,7 @@ self.show_timestamp = True self.show_short_nick = False self.show_title = 1 # 0: clip title; 1: full title; 2: no title - self.postInit() + self.post_init() @property def message_widgets_rev(self): @@ -325,9 +325,9 @@ if self.type == C.CHAT_GROUP: widgets = [widget for (widget, options) in self.chat_colums.contents] if 
self.occupants_panel in widgets: - self._removeOccupantsPanel() + self._remove_occupants_panel() else: - self._appendOccupantsPanel() + self._append_occupants_panel() elif key == a_key["TIMESTAMP_HIDE"]: # user wants to (un)hide timestamp self.show_timestamp = not self.show_timestamp self.redraw() @@ -339,9 +339,9 @@ if self.subject: self.show_title = (self.show_title + 1) % 3 if self.show_title == 0: - self.setSubject(self.subject, "clip") + self.set_subject(self.subject, "clip") elif self.show_title == 1: - self.setSubject(self.subject, "space") + self.set_subject(self.subject, "space") elif self.show_title == 2: self.chat_widget.header = None self._invalidate() @@ -360,7 +360,7 @@ space = text.rfind(" ") start = text[space + 1 :] - words = self.occupants_widget.getNicks(start) + words = self.occupants_widget.get_nicks(start) if not words: return text try: @@ -373,24 +373,24 @@ word = completion_data["last_word"] = words[word_idx] return "{}{}{}".format(text[: space + 1], word, ": " if space < 0 else "") - def getMenu(self): + def get_menu(self): """Return Menu bar""" menu = sat_widgets.Menu(self.host.loop) if self.type == C.CHAT_GROUP: - self.host.addMenus(menu, C.MENU_ROOM, {"room_jid": self.target.bare}) + self.host.add_menus(menu, C.MENU_ROOM, {"room_jid": self.target.bare}) game = _("Game") - menu.addMenu(game, "Tarot", self.onTarotRequest) + menu.add_menu(game, "Tarot", self.on_tarot_request) elif self.type == C.CHAT_ONE2ONE: # FIXME: self.target is a bare jid, we need to check that contact_list = self.host.contact_lists[self.profile] if not self.target.resource: - full_jid = contact_list.getFullJid(self.target) + full_jid = contact_list.get_full_jid(self.target) else: full_jid = self.target - self.host.addMenus(menu, C.MENU_SINGLE, {"jid": full_jid}) + self.host.add_menus(menu, C.MENU_SINGLE, {"jid": full_jid}) return menu - def setFilter(self, args): + def set_filter(self, args): """set filtering of messages @param args(list[unicode]): filters following syntax "[filter]=[value]" @@ -403,9 +403,9 @@ lang = args[0][5:].strip() self.filters.append(lambda mess_data: lang in mess_data.message) - self.printMessages() + self.print_messages() - def presenceListener(self, entity, show, priority, statuses, profile): + def presence_listener(self, entity, show, priority, statuses, profile): """Update entity's presence status @param entity (jid.JID): entity updated @@ -421,7 +421,7 @@ # return # self.update(entity) - def createMessage(self, message): + def create_message(self, message): self.appendMessage(message) def _scrollDown(self): @@ -461,10 +461,10 @@ if not all([f(message) for f in self.filters]): return - if self.handleUserMoved(message): + if self.handle_user_moved(message): return - if ((self.host.selected_widget != self or not self.host.x_notify.hasFocus()) + if ((self.host.selected_widget != self or not self.host.x_notify.has_focus()) and self.focus_marker_set is not None): if not self.focus_marker_set and not self._locked and self.mess_walker: if self.focus_marker is not None: @@ -486,7 +486,7 @@ wid = MessageWidget(message) self.mess_walker.append(wid) self._scrollDown() - if self.isUserMoved(message): + if self.is_user_moved(message): return # no notification for moved messages # notifications @@ -528,33 +528,33 @@ if occupant is not None: self.occupants_widget.removeUser(occupant) - def occupantsClear(self): - super(Chat, self).occupantsClear() + def occupants_clear(self): + super(Chat, self).occupants_clear() self.occupants_widget.clear() - def _occupantsClicked(self, 
occupant, clicked_wid): + def _occupants_clicked(self, occupant, clicked_wid): assert self.type == C.CHAT_GROUP contact_list = self.host.contact_lists[self.profile] # we have a click on a nick, we need to create the widget if it doesn't exists - self.getOrCreatePrivateWidget(occupant.jid) + self.get_or_create_private_widget(occupant.jid) # now we select the new window - for contact_list in self.host.widgets.getWidgets( + for contact_list in self.host.widgets.get_widgets( ContactList, profiles=(self.profile,) ): - contact_list.setFocus(occupant.jid, True) + contact_list.set_focus(occupant.jid, True) - def _appendOccupantsPanel(self): + def _append_occupants_panel(self): self.chat_colums.contents.append((self.occupants_panel, ("weight", 2, False))) - def _removeOccupantsPanel(self): + def _remove_occupants_panel(self): for widget, options in self.chat_colums.contents: if widget is self.occupants_panel: self.chat_colums.contents.remove((widget, options)) break - def addGamePanel(self, widget): + def add_game_panel(self, widget): """Insert a game panel to this Chat dialog. @param widget (Widget): the game panel @@ -564,7 +564,7 @@ self.pile.contents.insert(1, (urwid.Filler(urwid.Divider("-"), ("fixed", 1)))) self.host.redraw() - def removeGamePanel(self, widget): + def remove_game_panel(self, widget): """Remove the game panel from this Chat dialog. @param widget (Widget): the game panel @@ -573,9 +573,9 @@ del self.pile.contents[0] self.host.redraw() - def setSubject(self, subject, wrap="space"): + def set_subject(self, subject, wrap="space"): """Set title for a group chat""" - quick_chat.QuickChat.setSubject(self, subject) + quick_chat.QuickChat.set_subject(self, subject) self.subj_wid = urwid.Text( str(subject.replace("\n", "|") if wrap == "clip" else subject), align="left" if wrap == "clip" else "center", @@ -586,7 +586,7 @@ ## Messages - def printMessages(self, clear=True): + def print_messages(self, clear=True): """generate message widgets @param clear(bool): clear message before printing if true @@ -604,7 +604,7 @@ except AttributeError: pass - def updateHistory(self, size=C.HISTORY_LIMIT_DEFAULT, filters=None, profile="@NONE@"): + def update_history(self, size=C.HISTORY_LIMIT_DEFAULT, filters=None, profile="@NONE@"): del self.mess_walker[:] if filters and "search" in filters: self.mess_walker.append( @@ -617,19 +617,19 @@ self.mess_walker.append( urwid.Text(_("Type ':history <lines>' to reset the chat history")) ) - super(Chat, self).updateHistory(size, filters, profile) + super(Chat, self).update_history(size, filters, profile) - def _onHistoryPrinted(self): + def _on_history_printed(self): """Refresh or scroll down the focus after the history is printed""" - self.printMessages(clear=False) - super(Chat, self)._onHistoryPrinted() + self.print_messages(clear=False) + super(Chat, self)._on_history_printed() - def onPrivateCreated(self, widget): - self.host.contact_lists[widget.profile].setSpecial( + def on_private_created(self, widget): + self.host.contact_lists[widget.profile].set_special( widget.target, C.CONTACT_SPECIAL_GROUP ) - def onSelected(self): + def on_selected(self): self.focus_marker_set = False def notify(self, contact="somebody", msg=""): @@ -646,62 +646,62 @@ # as that mean that he is probably watching discussion history self.mess_widgets.focus_position = len(self.mess_walker) - 1 self.host.redraw() - if not self.host.x_notify.hasFocus(): + if not self.host.x_notify.has_focus(): if self.type == C.CHAT_ONE2ONE: - self.host.x_notify.sendNotification( + 
self.host.x_notify.send_notification( _("Primitivus: %s is talking to you") % contact ) elif self.nick is not None and self.nick.lower() in msg.lower(): - self.host.x_notify.sendNotification( + self.host.x_notify.send_notification( _("Primitivus: %(user)s mentioned you in room '%(room)s'") % {"user": contact, "room": self.target} ) # MENU EVENTS # - def onTarotRequest(self, menu): + def on_tarot_request(self, menu): # TODO: move this to plugin_misc_tarot with dynamic menu if len(self.occupants) != 4: - self.host.showPopUp( + self.host.show_pop_up( sat_widgets.Alert( _("Can't start game"), _( "You need to be exactly 4 peoples in the room to start a Tarot game" ), - ok_cb=self.host.removePopUp, + ok_cb=self.host.remove_pop_up, ) ) else: - self.host.bridge.tarotGameCreate( + self.host.bridge.tarot_game_create( self.target, list(self.occupants), self.profile ) # MISC EVENTS # - def onDelete(self): + def on_delete(self): # FIXME: to be checked after refactoring - super(Chat, self).onDelete() + super(Chat, self).on_delete() if self.type == C.CHAT_GROUP: - self.host.removeListener("presence", self.presenceListener) + self.host.removeListener("presence", self.presence_listener) - def onChatState(self, from_jid, state, profile): - super(Chat, self).onChatState(from_jid, state, profile) + def on_chat_state(self, from_jid, state, profile): + super(Chat, self).on_chat_state(from_jid, state, profile) if self.type == C.CHAT_ONE2ONE: self.title_dynamic = C.CHAT_STATE_ICON[state] self.host.redraw() # FIXME: should not be necessary - def _onSubjectDialogCb(self, button, dialog): - self.changeSubject(dialog.text) - self.host.removePopUp(dialog) + def _on_subject_dialog_cb(self, button, dialog): + self.change_subject(dialog.text) + self.host.remove_pop_up(dialog) - def onSubjectDialog(self, new_subject=None): + def on_subject_dialog(self, new_subject=None): dialog = sat_widgets.InputDialog( _("Change title"), _("Enter the new title"), default_txt=new_subject if new_subject is not None else self.subject, ) - dialog.setCallback("ok", self._onSubjectDialogCb, dialog) - dialog.setCallback("cancel", lambda __: self.host.removePopUp(dialog)) - self.host.showPopUp(dialog) + dialog.set_callback("ok", self._on_subject_dialog_cb, dialog) + dialog.set_callback("cancel", lambda __: self.host.remove_pop_up(dialog)) + self.host.show_pop_up(dialog) quick_widgets.register(quick_chat.QuickChat, Chat)
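
Because Python resolves overrides by name, any out-of-tree subclass of these widgets has to be renamed in lockstep: a method still called ``onChatState`` would silently stop being called after this change. A hypothetical subclass, shown only to illustrate the pattern::

    from sat_frontends.primitivus.chat import Chat  # module path from the hunk above

    class VerboseChat(Chat):  # hypothetical subclass, for illustration only
        def on_chat_state(self, from_jid, state, profile):  # ex onChatState
            super().on_chat_state(from_jid, state, profile)
            # bar_notify (ex barNotify) is the host method renamed earlier in this changeset
            self.host.bar_notify("{} is now {}".format(from_jid, state))
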
--- a/sat_frontends/primitivus/config.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/config.py Sat Apr 08 13:54:42 2023 +0200 @@ -24,7 +24,7 @@ import configparser -def applyConfig(host): +def apply_config(host): """Parse configuration and apply found changes raise: can raise various Exceptions if the configuration is not valid
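
Callers of this helper follow the same rename; a sketch of an updated call site (the ``try``/``except`` is illustrative, the docstring above only promises "various Exceptions")::

    from sat_frontends.primitivus import config  # path from the hunk above

    def safe_apply_config(host):
        try:
            config.apply_config(host)  # ex config.applyConfig(host)
        except Exception as e:
            host.alert("Configuration error", str(e))
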
--- a/sat_frontends/primitivus/contact_list.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/contact_list.py Sat Apr 08 13:54:42 2023 +0200 @@ -46,21 +46,21 @@ # we now build the widget self.status_bar = StatusBar(host) - self.frame = sat_widgets.FocusFrame(self._buildList(), None, self.status_bar) + self.frame = sat_widgets.FocusFrame(self._build_list(), None, self.status_bar) PrimitivusWidget.__init__(self, self.frame, _("Contacts")) if on_click: urwid.connect_signal(self, "click", on_click, user_data) if on_change: urwid.connect_signal(self, "change", on_change, user_data) - self.host.addListener("notification", self.onNotification, [self.profile]) - self.host.addListener("notificationsClear", self.onNotification, [self.profile]) - self.postInit() + self.host.addListener("notification", self.on_notification, [self.profile]) + self.host.addListener("notificationsClear", self.on_notification, [self.profile]) + self.post_init() def update(self, entities=None, type_=None, profile=None): """Update display, keep focus""" # FIXME: full update is done each time, must handle entities, type_ and profile widget, position = self.frame.body.get_focus() - self.frame.body = self._buildList() + self.frame.body = self._build_list() if position: try: self.frame.body.focus_position = position @@ -85,27 +85,27 @@ elif ( key == a_key["DISCONNECTED_HIDE"] ): # user wants to (un)hide disconnected contacts - self.host.bridge.setParam( + self.host.bridge.param_set( C.SHOW_OFFLINE_CONTACTS, - C.boolConst(not self.contact_list.show_disconnected), + C.bool_const(not self.contact_list.show_disconnected), "General", profile_key=self.profile, ) elif key == a_key["RESOURCES_HIDE"]: # user wants to (un)hide contacts resources - self.contact_list.showResources(not self.contact_list.show_resources) + self.contact_list.show_resources(not self.contact_list.show_resources) self.update() return super(ContactList, self).keypress(size, key) # QuickWidget methods @staticmethod - def getWidgetHash(target, profiles): + def get_widget_hash(target, profiles): profiles = sorted(profiles) return tuple(profiles) # modify the contact list - def setFocus(self, text, select=False): + def set_focus(self, text, select=False): """give focus to the first element that matches the given text. You can also pass in text a sat_frontends.tools.jid.JID (it's a subclass of unicode). 
@@ -117,7 +117,7 @@ try: if isinstance(widget, sat_widgets.ClickableText): # contact group - value = widget.getValue() + value = widget.get_value() elif isinstance(widget, sat_widgets.SelectableText): # contact or muc value = widget.data @@ -128,36 +128,36 @@ if text.strip() == value.strip(): self.frame.body.focus_position = idx if select: - self._contactClicked(False, widget, True) + self._contact_clicked(False, widget, True) return except AttributeError: pass idx += 1 - log.debug("Not element found for {} in setFocus".format(text)) + log.debug("Not element found for {} in set_focus".format(text)) # events - def _groupClicked(self, group_wid): - group = group_wid.getValue() - data = self.contact_list.getGroupData(group) + def _group_clicked(self, group_wid): + group = group_wid.get_value() + data = self.contact_list.get_group_data(group) data[C.GROUP_DATA_FOLDED] = not data.setdefault(C.GROUP_DATA_FOLDED, False) - self.setFocus(group) + self.set_focus(group) self.update() - def _contactClicked(self, use_bare_jid, contact_wid, selected): + def _contact_clicked(self, use_bare_jid, contact_wid, selected): """Method called when a contact is clicked - @param use_bare_jid: True if use_bare_jid is set in self._buildEntityWidget. + @param use_bare_jid: True if use_bare_jid is set in self._build_entity_widget. @param contact_wid: widget of the contact, must have the entity set in data attribute @param selected: boolean returned by the widget, telling if it is selected """ entity = contact_wid.data - self.host.modeHint(C.MODE_INSERTION) + self.host.mode_hint(C.MODE_INSERTION) self._emit("click", entity) - def onNotification(self, entity, notif, profile): - notifs = list(self.host.getNotifs(C.ENTITY_ALL, profile=self.profile)) + def on_notification(self, entity, notif, profile): + notifs = list(self.host.get_notifs(C.ENTITY_ALL, profile=self.profile)) if notifs: self.title_dynamic = "({})".format(len(notifs)) else: @@ -166,7 +166,7 @@ # Methods to build the widget - def _buildEntityWidget( + def _build_entity_widget( self, entity, keys=None, @@ -222,10 +222,10 @@ entity_attr = "default" notifs = list( - self.host.getNotifs(entity, exact_jid=special, profile=self.profile) + self.host.get_notifs(entity, exact_jid=special, profile=self.profile) ) mentions = list( - self.host.getNotifs(entity.bare, C.NOTIFY_MENTION, profile=self.profile) + self.host.get_notifs(entity.bare, C.NOTIFY_MENTION, profile=self.profile) ) if notifs or mentions: attr = 'cl_mention' if mentions else 'cl_notifs' @@ -245,11 +245,11 @@ widget.data = entity widget.comp = entity_txt.lower() # value to use for sorting urwid.connect_signal( - widget, "change", self._contactClicked, user_args=[use_bare_jid] + widget, "change", self._contact_clicked, user_args=[use_bare_jid] ) return widget - def _buildEntities(self, content, entities): + def _build_entities(self, content, entities): """Add entity representation in widget list @param content: widget list, e.g. 
SimpleListWalker @@ -262,7 +262,7 @@ for entity in entities: if ( entity in self.contact_list._specials - or not self.contact_list.entityVisible(entity) + or not self.contact_list.entity_visible(entity) ): continue markup_extra = [] @@ -288,7 +288,7 @@ status = self.contact_list.getCache(entity, "status", default=None) status_disp = ("status", "\n " + status) if status else "" markup_extra.append(status_disp) - widget = self._buildEntityWidget( + widget = self._build_entity_widget( entity, ("cache_nick", "cache_name", "node"), use_bare_jid=True, @@ -301,22 +301,22 @@ for widget in widgets: content.append(widget) - def _buildSpecials(self, content): + def _build_specials(self, content): """Build the special entities""" - specials = sorted(self.contact_list.getSpecials()) + specials = sorted(self.contact_list.get_specials()) current = None for entity in specials: if current is not None and current.bare == entity.bare: # nested entity (e.g. MUC private conversations) - widget = self._buildEntityWidget( + widget = self._build_entity_widget( entity, ("resource",), markup_prepend=" ", special=True ) else: # the special widgets if entity.resource: - widget = self._buildEntityWidget(entity, ("resource",), special=True) + widget = self._build_entity_widget(entity, ("resource",), special=True) else: - widget = self._buildEntityWidget( + widget = self._build_entity_widget( entity, ("cache_nick", "cache_name", "node"), with_show_attr=False, @@ -324,30 +324,30 @@ ) content.append(widget) - def _buildList(self): + def _build_list(self): """Build the main contact list widget""" content = urwid.SimpleListWalker([]) - self._buildSpecials(content) + self._build_specials(content) if self.contact_list._specials: content.append(urwid.Divider("=")) groups = list(self.contact_list._groups) groups.sort(key=lambda x: x.lower() if x else '') for group in groups: - data = self.contact_list.getGroupData(group) + data = self.contact_list.get_group_data(group) folded = data.get(C.GROUP_DATA_FOLDED, False) jids = list(data["jids"]) if group is not None and ( - self.contact_list.anyEntityVisible(jids) + self.contact_list.any_entity_visible(jids) or self.contact_list.show_empty_groups ): header = "[-]" if not folded else "[+]" widget = sat_widgets.ClickableText(group, header=header + " ") content.append(widget) - urwid.connect_signal(widget, "click", self._groupClicked) + urwid.connect_signal(widget, "click", self._group_clicked) if not folded: - self._buildEntities(content, jids) + self._build_entities(content, jids) not_in_roster = ( set(self.contact_list._cache) .difference(self.contact_list._roster) @@ -356,7 +356,7 @@ ) if not_in_roster: content.append(urwid.Divider("-")) - self._buildEntities(content, not_in_roster) + self._build_entities(content, not_in_roster) return urwid.ListBox(content)
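
The keypress handler above shows the usual bridge pattern for parameters. A sketch of the same call outside the widget; the literal parameter name is an assumption standing in for ``C.SHOW_OFFLINE_CONTACTS``, and the string serialisation stands in for ``C.bool_const``::

    def toggle_offline_contacts(bridge, show, profile):
        # ex bridge.setParam()
        bridge.param_set(
            "Show offline contacts",       # assumed name; the constant is used in the real code
            "true" if show else "false",   # what C.bool_const() would return
            "General",                     # parameter category, as in the hunk above
            profile_key=profile,
        )
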
--- a/sat_frontends/primitivus/game_tarot.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/game_tarot.py Sat Apr 08 13:54:42 2023 +0200 @@ -34,7 +34,7 @@ def __init__(self, card): self.__selected = False self.card = card - urwid.Text.__init__(self, card.getAttrText()) + urwid.Text.__init__(self, card.get_attr_text()) def selectable(self): return True @@ -55,16 +55,16 @@ def select(self, state=True): self.__selected = state - attr, txt = self.card.getAttrText() + attr, txt = self.card.get_attr_text() if self.__selected: attr += "_selected" self.set_text((attr, txt)) self._invalidate() - def isSelected(self): + def is_selected(self): return self.__selected - def getCard(self): + def get_card(self): return self.card def render(self, size, focus=False): @@ -102,12 +102,12 @@ self._emit("click", None) return key - def getSelected(self): + def get_selected(self): """Return a list of selected cards""" _selected = [] for wid in self.columns.widget_list: - if isinstance(wid, CardDisplayer) and wid.isSelected(): - _selected.append(wid.getCard()) + if isinstance(wid, CardDisplayer) and wid.is_selected(): + _selected.append(wid.get_card()) return _selected def update(self, hand): @@ -123,11 +123,11 @@ widget = CardDisplayer(card) self.columns.widget_list.append(widget) self.columns.column_types.append(("fixed", 3)) - urwid.connect_signal(widget, "click", self.__onClick) + urwid.connect_signal(widget, "click", self.__on_click) self.columns.contents.append((urwid.Text(""), ("weight", 1, False))) self.columns.focus_position = 1 - def __onClick(self, card_wid): + def __on_click(self, card_wid): self._emit("click", card_wid) @@ -141,7 +141,7 @@ """@param file: path of the PNG file""" TarotCard.__init__(self, (suit, value)) - def getAttrText(self): + def get_attr_text(self): """return text representation of the card with attributes""" try: value = "%02i" % int(self.value) @@ -169,7 +169,7 @@ color = "special" return ("card_%s" % color, "%s%s" % (value, suit)) - def getWidget(self): + def get_widget(self): """Return a widget representing the card""" return CardDisplayer(self) @@ -180,7 +180,7 @@ def __init__(self): self.top = self.left = self.bottom = self.right = None - def putCard(self, location, card): + def put_card(self, location, card): """Put a card on the table @param location: where to put the card (top, left, bottom or right) @param card: Card to play or None""" @@ -208,7 +208,7 @@ margin_center = max((max_col - Card.SIZE * 2 - len(separator)) / 2, 0) * " " for location in ["top", "left", "bottom", "right"]: card = getattr(self, location) - cards[location] = card.getAttrText() if card else Card.SIZE * " " + cards[location] = card.get_attr_text() if card else Card.SIZE * " " render_wid = [ urwid.Text([margin, cards["top"]]), urwid.Text([margin_center, cards["left"], separator, cards["right"]]), @@ -222,7 +222,7 @@ def __init__(self, parent, referee, players): QuickTarotGame.__init__(self, parent, referee, players) - self.loadCards() + self.load_cards() self.top = urwid.Pile([urwid.Padding(urwid.Text(self.top_nick), "center")]) # self.parent.host.debug() self.table = Table() @@ -244,18 +244,18 @@ ]), urwid.Padding(self.bottom_card_wid,'center') ])""" - self.hand_wid = Hand(selectable=True, on_click=self.onClick) + self.hand_wid = Hand(selectable=True, on_click=self.on_click) self.main_frame = urwid.Frame( self.center, header=self.top, footer=self.hand_wid, focus_part="footer" ) urwid.WidgetWrap.__init__(self, self.main_frame) - self.parent.host.bridge.tarotGameReady( + 
self.parent.host.bridge.tarot_game_ready( self.player_nick, referee, self.parent.profile ) - def loadCards(self): + def load_cards(self): """Load all the cards in memory""" - QuickTarotGame.loadCards(self) + QuickTarotGame.load_cards(self) for value in list(map(str, list(range(1, 22)))) + ["excuse"]: card = Card("atout", value) self.cards[card.suit, card.value] = card @@ -266,22 +266,22 @@ self.cards[card.suit, card.value] = card self.deck.append(card) - def tarotGameNewHandler(self, hand): + def tarot_game_new_handler(self, hand): """Start a new game, with given hand""" if hand is []: # reset the display after the scores have been showed - self.resetRound() + self.reset_round() for location in ["top", "left", "bottom", "right"]: - self.table.putCard(location, None) + self.table.put_card(location, None) self.parent.host.redraw() - self.parent.host.bridge.tarotGameReady( + self.parent.host.bridge.tarot_game_ready( self.player_nick, self.referee, self.parent.profile ) return - QuickTarotGame.tarotGameNewHandler(self, hand) + QuickTarotGame.tarot_game_new_handler(self, hand) self.hand_wid.update(self.hand) self.parent.host.redraw() - def tarotGameChooseContratHandler(self, xml_data): + def tarot_game_choose_contrat_handler(self, xml_data): """Called when the player has to select his contrat @param xml_data: SàT xml representation of the form""" form = xmlui.create( @@ -293,16 +293,16 @@ ) form.show(valign="top") - def tarotGameShowCardsHandler(self, game_stage, cards, data): + def tarot_game_show_cards_handler(self, game_stage, cards, data): """Display cards in the middle of the game (to show for e.g. chien ou poignée)""" - QuickTarotGame.tarotGameShowCardsHandler(self, game_stage, cards, data) + QuickTarotGame.tarot_game_show_cards_handler(self, game_stage, cards, data) self.center.widget_list[1] = urwid.Filler(Hand(self.to_show)) self.parent.host.redraw() - def tarotGameYourTurnHandler(self): - QuickTarotGame.tarotGameYourTurnHandler(self) + def tarot_game_your_turn_handler(self): + QuickTarotGame.tarot_game_your_turn_handler(self) - def tarotGameScoreHandler(self, xml_data, winners, loosers): + def tarot_game_score_handler(self, xml_data, winners, loosers): """Called when the round is over, display the scores @param xml_data: SàT xml representation of the form""" if not winners and not loosers: @@ -318,23 +318,23 @@ ) form.show() - def tarotGameInvalidCardsHandler(self, phase, played_cards, invalid_cards): + def tarot_game_invalid_cards_handler(self, phase, played_cards, invalid_cards): """Invalid cards have been played @param phase: phase of the game @param played_cards: all the cards played @param invalid_cards: cards which are invalid""" - QuickTarotGame.tarotGameInvalidCardsHandler( + QuickTarotGame.tarot_game_invalid_cards_handler( self, phase, played_cards, invalid_cards ) self.hand_wid.update(self.hand) if self._autoplay == None: # No dialog if there is autoplay - self.parent.host.barNotify(_("Cards played are invalid !")) + self.parent.host.bar_notify(_("Cards played are invalid !")) self.parent.host.redraw() - def tarotGameCardsPlayedHandler(self, player, cards): + def tarot_game_cards_played_handler(self, player, cards): """A card has been played by player""" - QuickTarotGame.tarotGameCardsPlayedHandler(self, player, cards) - self.table.putCard(self.getPlayerLocation(player), self.played[player]) + QuickTarotGame.tarot_game_cards_played_handler(self, player, cards) + self.table.put_card(self.get_player_location(player), self.played[player]) self._checkState() 
self.parent.host.redraw() @@ -356,7 +356,7 @@ self.hand_wid.update(self.hand) ##EVENTS## - def onClick(self, hand, card_wid): + def on_click(self, hand, card_wid): """Called when user do an action on the hand""" if not self.state in ["play", "ecart", "wait_for_ecart"]: # it's not our turn, we ignore the click @@ -364,16 +364,16 @@ return self._checkState() if self.state == "ecart": - if len(self.hand_wid.getSelected()) == 6: + if len(self.hand_wid.get_selected()) == 6: pop_up_widget = sat_widgets.ConfirmDialog( _("Do you put these cards in chien ?"), - yes_cb=self.onEcartDone, - no_cb=self.parent.host.removePopUp, + yes_cb=self.on_ecart_done, + no_cb=self.parent.host.remove_pop_up, ) - self.parent.host.showPopUp(pop_up_widget) + self.parent.host.show_pop_up(pop_up_widget) elif self.state == "play": - card = card_wid.getCard() - self.parent.host.bridge.tarotGamePlayCards( + card = card_wid.get_card() + self.parent.host.bridge.tarot_game_play_cards( self.player_nick, self.referee, [(card.suit, card.value)], @@ -383,15 +383,15 @@ self.hand_wid.update(self.hand) self.state = "wait" - def onEcartDone(self, button): + def on_ecart_done(self, button): """Called when player has finished his écart""" ecart = [] - for card in self.hand_wid.getSelected(): + for card in self.hand_wid.get_selected(): ecart.append((card.suit, card.value)) self.hand.remove(card) self.hand_wid.update(self.hand) - self.parent.host.bridge.tarotGamePlayCards( + self.parent.host.bridge.tarot_game_play_cards( self.player_nick, self.referee, ecart, self.parent.profile ) self.state = "wait" - self.parent.host.removePopUp() + self.parent.host.remove_pop_up()
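
All Tarot bridge calls keep their argument order, only the names change. A sketch of playing a single card with the new API (``card`` is a ``(suit, value)`` tuple, as built in ``on_ecart_done`` above)::

    def play_card(bridge, player_nick, referee, card, profile):
        # ex bridge.tarotGamePlayCards()
        suit, value = card  # e.g. ("atout", "21")
        bridge.tarot_game_play_cards(player_nick, referee, [(suit, value)], profile)
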
--- a/sat_frontends/primitivus/notify.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/notify.py Sat Apr 08 13:54:42 2023 +0200 @@ -33,7 +33,7 @@ from Xlib import display as X_display self.display = X_display.Display() - self.X11_id = self.getFocus() + self.X11_id = self.get_focus() except: pass @@ -51,24 +51,24 @@ except: self.freedesktop_int = None - def getFocus(self): + def get_focus(self): if not self.display: return 0 return self.display.get_input_focus().focus.id - def hasFocus(self): - return (self.getFocus() == self.X11_id) if self.display else True + def has_focus(self): + return (self.get_focus() == self.X11_id) if self.display else True - def useX11(self): + def use_x11(self): return bool(self.display) - def sendNotification(self, summ_mess, body_mess=""): + def send_notification(self, summ_mess, body_mess=""): """Send notification to the user if possible""" # TODO: check options before sending notifications if self.freedesktop_int: - self.sendFDNotification(summ_mess, body_mess) + self.send_fd_notification(summ_mess, body_mess) - def sendFDNotification(self, summ_mess, body_mess=""): + def send_fd_notification(self, summ_mess, body_mess=""): """Send notification with the FreeDesktop D-Bus API""" if self.freedesktop_int: app_name = "Primitivus"
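
A sketch of how the renamed notifier is used by the rest of the frontend; ``has_focus()`` falls back to ``True`` when X11 introspection is unavailable, so the notification is simply skipped in that case::

    def notify_if_hidden(x_notify, message):
        # ex hasFocus() / sendNotification()
        if not x_notify.has_focus():
            x_notify.send_notification(message)
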
--- a/sat_frontends/primitivus/profile_manager.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/profile_manager.py Sat Apr 08 13:54:42 2023 +0200 @@ -32,21 +32,21 @@ def __init__(self, host, autoconnect=None): QuickProfileManager.__init__(self, host, autoconnect) - # login & password box must be created before list because of onProfileChange + # login & password box must be created before list because of on_profile_change self.login_wid = sat_widgets.AdvancedEdit(_("Login:"), align="center") self.pass_wid = sat_widgets.Password(_("Password:"), align="center") style = ["no_first_select"] - profiles = host.bridge.profilesListGet() + profiles = host.bridge.profiles_list_get() profiles.sort() self.list_profile = sat_widgets.List( - profiles, style=style, align="center", on_change=self.onProfileChange + profiles, style=style, align="center", on_change=self.on_profile_change ) # new & delete buttons buttons = [ - urwid.Button(_("New"), self.onNewProfile), - urwid.Button(_("Delete"), self.onDeleteProfile), + urwid.Button(_("New"), self.on_new_profile), + urwid.Button(_("Delete"), self.on_delete_profile), ] buttons_flow = urwid.GridFlow( buttons, @@ -61,7 +61,7 @@ # connect button connect_button = sat_widgets.CustomButton( - _("Connect"), self.onConnectProfiles, align="center" + _("Connect"), self.on_connect_profiles, align="center" ) # we now build the widget @@ -87,7 +87,7 @@ def keypress(self, size, key): if key == a_key["APP_QUIT"]: - self.host.onExit() + self.host.on_exit() raise urwid.ExitMainLoop() elif key in (a_key["FOCUS_UP"], a_key["FOCUS_DOWN"]): focus_diff = 1 if key == a_key["FOCUS_DOWN"] else -1 @@ -107,122 +107,122 @@ return return super(ProfileManager, self).keypress(size, key) - def cancelDialog(self, button): - self.host.removePopUp() + def cancel_dialog(self, button): + self.host.remove_pop_up() - def newProfile(self, button, edit): + def new_profile(self, button, edit): """Create the profile""" name = edit.get_edit_text() - self.host.bridge.profileCreate( + self.host.bridge.profile_create( name, - callback=lambda: self.newProfileCreated(name), - errback=self.profileCreationFailure, + callback=lambda: self.new_profile_created(name), + errback=self.profile_creation_failure, ) - def newProfileCreated(self, profile): + def new_profile_created(self, profile): # new profile will be selected, and a selected profile assume the session is started - self.host.bridge.profileStartSession( + self.host.bridge.profile_start_session( "", profile, - callback=lambda __: self.newProfileSessionStarted(profile), - errback=self.profileCreationFailure, + callback=lambda __: self.new_profile_session_started(profile), + errback=self.profile_creation_failure, ) - def newProfileSessionStarted(self, profile): - self.host.removePopUp() - self.refillProfiles() - self.list_profile.selectValue(profile) + def new_profile_session_started(self, profile): + self.host.remove_pop_up() + self.refill_profiles() + self.list_profile.select_value(profile) self.current.profile = profile - self.getConnectionParams(profile) + self.get_connection_params(profile) self.host.redraw() - def profileCreationFailure(self, reason): - self.host.removePopUp() - message = self._getErrorMessage(reason) + def profile_creation_failure(self, reason): + self.host.remove_pop_up() + message = self._get_error_message(reason) self.host.alert(_("Can't create profile"), message) - def deleteProfile(self, button): - self._deleteProfile() - self.host.removePopUp() + def delete_profile(self, button): + self._delete_profile() + 
self.host.remove_pop_up() - def onNewProfile(self, e): + def on_new_profile(self, e): pop_up_widget = sat_widgets.InputDialog( _("New profile"), _("Please enter a new profile name"), - cancel_cb=self.cancelDialog, - ok_cb=self.newProfile, + cancel_cb=self.cancel_dialog, + ok_cb=self.new_profile, ) - self.host.showPopUp(pop_up_widget) + self.host.show_pop_up(pop_up_widget) - def onDeleteProfile(self, e): + def on_delete_profile(self, e): if self.current.profile: pop_up_widget = sat_widgets.ConfirmDialog( _("Are you sure you want to delete the profile {} ?").format( self.current.profile ), - no_cb=self.cancelDialog, - yes_cb=self.deleteProfile, + no_cb=self.cancel_dialog, + yes_cb=self.delete_profile, ) - self.host.showPopUp(pop_up_widget) + self.host.show_pop_up(pop_up_widget) - def onConnectProfiles(self, button): + def on_connect_profiles(self, button): """Connect the profiles and start the main widget @param button: the connect button """ - self._onConnectProfiles() + self._on_connect_profiles() - def resetFields(self): + def reset_fields(self): """Set profile to None, and reset fields""" - super(ProfileManager, self).resetFields() - self.list_profile.unselectAll(invisible=True) + super(ProfileManager, self).reset_fields() + self.list_profile.unselect_all(invisible=True) - def setProfiles(self, profiles): + def set_profiles(self, profiles): """Update the list of profiles""" - self.list_profile.changeValues(profiles) + self.list_profile.change_values(profiles) self.host.redraw() - def getProfiles(self): - return self.list_profile.getSelectedValues() + def get_profiles(self): + return self.list_profile.get_selected_values() - def getJID(self): + def get_jid(self): return self.login_wid.get_edit_text() def getPassword(self): return self.pass_wid.get_edit_text() - def setJID(self, jid_): + def set_jid(self, jid_): self.login_wid.set_edit_text(jid_) self.current.login = jid_ self.host.redraw() # FIXME: redraw should be avoided - def setPassword(self, password): + def set_password(self, password): self.pass_wid.set_edit_text(password) self.current.password = password self.host.redraw() - def onProfileChange(self, list_wid, widget=None, selected=None): + def on_profile_change(self, list_wid, widget=None, selected=None): """This is called when a profile is selected in the profile list. @param list_wid: the List widget who sent the event """ - self.updateConnectionParams() + self.update_connection_params() focused = list_wid.focus - selected = focused.getState() if focused is not None else False + selected = focused.get_state() if focused is not None else False if not selected: # profile was just unselected return - focused.setState( + focused.set_state( False, invisible=True ) # we don't want the widget to be selected until we are sure we can access it def authenticate_cb(data, cb_id, profile): if C.bool(data.pop("validated", C.BOOL_FALSE)): self.current.profile = profile - focused.setState(True, invisible=True) - self.getConnectionParams(profile) + focused.set_state(True, invisible=True) + self.get_connection_params(profile) self.host.redraw() - self.host.actionManager(data, callback=authenticate_cb, profile=profile) + self.host.action_manager(data, callback=authenticate_cb, profile=profile) - self.host.launchAction( + self.host.action_launch( C.AUTHENTICATE_PROFILE_ID, callback=authenticate_cb, profile=focused.text )
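
Profile creation is a two-step asynchronous chain, as shown in ``new_profile``/``new_profile_created`` above. A condensed sketch of the same chain with the snake_case bridge names (the empty string mirrors the password argument used above)::

    def create_and_start(bridge, name, on_ready, on_error):
        def created():
            # ex bridge.profileStartSession()
            bridge.profile_start_session(
                "", name,
                callback=lambda __: on_ready(name),
                errback=on_error,
            )
        # ex bridge.profileCreate()
        bridge.profile_create(name, callback=created, errback=on_error)
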
--- a/sat_frontends/primitivus/progress.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/progress.py Sat Apr 08 13:54:42 2023 +0200 @@ -34,8 +34,8 @@ self.progress_dict = {} listbox = urwid.ListBox(self.progress_list) buttons = [] - buttons.append(sat_widgets.CustomButton(_("Clear progress list"), self._onClear)) - max_len = max([button.getSize() for button in buttons]) + buttons.append(sat_widgets.CustomButton(_("Clear progress list"), self._on_clear)) + max_len = max([button.get_size() for button in buttons]) buttons_wid = urwid.GridFlow(buttons, max_len, 1, 0, "center") main_wid = sat_widgets.FocusFrame(listbox, footer=buttons_wid) urwid.WidgetWrap.__init__(self, main_wid) @@ -50,11 +50,11 @@ "state": "init", } self.progress_list.append(column) - self.progressCB(self.host.loop, (progress_id, message, profile)) + self.progress_cb(self.host.loop, (progress_id, message, profile)) - def progressCB(self, loop, data): + def progress_cb(self, loop, data): progress_id, message, profile = data - data = self.host.bridge.progressGet(progress_id, profile) + data = self.host.bridge.progress_get(progress_id, profile) pbar = self.progress_dict[(progress_id, profile)]["progress"] if data: if self.progress_dict[(progress_id, profile)]["state"] == "init": @@ -63,33 +63,33 @@ pbar.done = float(data["size"]) pbar.set_completion(float(data["position"])) - self.updateNotBar() + self.update_not_bar() else: if self.progress_dict[(progress_id, profile)]["state"] == "progress": self.progress_dict[(progress_id, profile)]["state"] = "done" pbar.set_completion(pbar.done) - self.updateNotBar() + self.update_not_bar() return - loop.set_alarm_in(0.2, self.progressCB, (progress_id, message, profile)) + loop.set_alarm_in(0.2, self.progress_cb, (progress_id, message, profile)) - def _removeBar(self, progress_id, profile): + def _remove_bar(self, progress_id, profile): wid = self.progress_dict[(progress_id, profile)]["full"] self.progress_list.remove(wid) del (self.progress_dict[(progress_id, profile)]) - def _onClear(self, button): + def _on_clear(self, button): to_remove = [] for progress_id, profile in self.progress_dict: if self.progress_dict[(progress_id, profile)]["state"] == "done": to_remove.append((progress_id, profile)) for progress_id, profile in to_remove: - self._removeBar(progress_id, profile) - self.updateNotBar() + self._remove_bar(progress_id, profile) + self.update_not_bar() - def updateNotBar(self): + def update_not_bar(self): if not self.progress_dict: - self.host.setProgress(None) + self.host.set_progress(None) return progress = 0 nb_bars = 0 @@ -98,4 +98,4 @@ progress += pbar.current / pbar.done * 100 nb_bars += 1 av_progress = progress / float(nb_bars) - self.host.setProgress(av_progress) + self.host.set_progress(av_progress)
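
The progress pane polls the bridge every 0.2 seconds through the urwid main loop. A reduced sketch of the polling step, assuming the same ``position``/``size`` keys returned by ``progress_get`` above::

    def poll_progress(loop, bridge, progress_id, profile):
        data = bridge.progress_get(progress_id, profile)  # ex progressGet
        if data:  # empty once the transfer is finished, as handled above
            print("{}/{} bytes".format(data["position"], data["size"]))
            loop.set_alarm_in(
                0.2, lambda *__: poll_progress(loop, bridge, progress_id, profile)
            )
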
--- a/sat_frontends/primitivus/status.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/status.py Sat Apr 08 13:54:42 2023 +0200 @@ -31,7 +31,7 @@ status_prefix = urwid.Text("[") status_suffix = urwid.Text("]") self.status = sat_widgets.ClickableText("") - self.setPresenceStatus(C.PRESENCE_UNAVAILABLE, "") + self.set_presence_status(C.PRESENCE_UNAVAILABLE, "") urwid.Columns.__init__( self, [ @@ -41,25 +41,25 @@ ("weight", 1, status_suffix), ], ) - urwid.connect_signal(self.presence, "click", self.onPresenceClick) - urwid.connect_signal(self.status, "click", self.onStatusClick) + urwid.connect_signal(self.presence, "click", self.on_presence_click) + urwid.connect_signal(self.status, "click", self.on_status_click) - def onPresenceClick(self, sender=None): - if not self.host.bridge.isConnected( + def on_presence_click(self, sender=None): + if not self.host.bridge.is_connected( self.host.current_profile ): # FIXME: manage multi-profiles return options = [commonConst.PRESENCE[presence] for presence in commonConst.PRESENCE] list_widget = sat_widgets.GenericList( - options=options, option_type=sat_widgets.ClickableText, on_click=self.onChange + options=options, option_type=sat_widgets.ClickableText, on_click=self.on_change ) decorated = sat_widgets.LabelLine( list_widget, sat_widgets.SurroundedText(_("Set your presence")) ) - self.host.showPopUp(decorated) + self.host.show_pop_up(decorated) - def onStatusClick(self, sender=None): - if not self.host.bridge.isConnected( + def on_status_click(self, sender=None): + if not self.host.bridge.is_connected( self.host.current_profile ): # FIXME: manage multi-profiles return @@ -67,12 +67,12 @@ _("Set your status"), _("New status"), default_txt=self.status.get_text(), - cancel_cb=lambda _: self.host.removePopUp(), - ok_cb=self.onChange, + cancel_cb=lambda _: self.host.remove_pop_up(), + ok_cb=self.on_change, ) - self.host.showPopUp(pop_up_widget) + self.host.show_pop_up(pop_up_widget) - def onChange(self, sender=None, user_data=None): + def on_change(self, sender=None, user_data=None): new_value = user_data.get_text() previous = ( [key for key in C.PRESENCE if C.PRESENCE[key][0] == self.presence.get_text()][ @@ -100,13 +100,13 @@ ) in ( self.host.profiles ): # FIXME: for now all the profiles share the same status - self.host.bridge.setPresence( + self.host.bridge.presence_set( show=new[0], statuses=statuses, profile_key=profile ) - self.setPresenceStatus(new[0], new[1]) - self.host.removePopUp() + self.set_presence_status(new[0], new[1]) + self.host.remove_pop_up() - def setPresenceStatus(self, show, status): + def set_presence_status(self, show, status): show_icon, show_attr = C.PRESENCE.get(show) self.presence.set_text(("show_normal", show_icon)) if status is not None:
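
A sketch of the renamed presence call used by ``on_change`` above; the exact structure of the ``statuses`` mapping is not shown in this hunk, so the ``"default"`` key is an assumption::

    def set_away(bridge, status_text, profile):
        # ex bridge.setPresence()
        bridge.presence_set(
            show="away",                        # one of the C.PRESENCE keys
            statuses={"default": status_text},  # assumed key, see note above
            profile_key=profile,
        )
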
--- a/sat_frontends/primitivus/widget.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/widget.py Sat Apr 08 13:54:42 2023 +0200 @@ -32,7 +32,7 @@ self._title = title self._title_dynamic = None self._original_widget = w - urwid.WidgetWrap.__init__(self, self._getDecoration(w)) + urwid.WidgetWrap.__init__(self, self._get_decoration(w)) @property def title(self): @@ -65,8 +65,8 @@ @title.setter def title(self, value): self._title = value - if self.decorationVisible: - self.showDecoration() + if self.decoration_visible: + self.show_decoration() @property def title_dynamic(self): @@ -76,29 +76,29 @@ @title_dynamic.setter def title_dynamic(self, value): self._title_dynamic = value - if self.decorationVisible: - self.showDecoration() + if self.decoration_visible: + self.show_decoration() @property - def decorationVisible(self): + def decoration_visible(self): """True if the decoration is visible""" return isinstance(self._w, sat_widgets.LabelLine) def keypress(self, size, key): if key == a_key["DECORATION_HIDE"]: # user wants to (un)hide widget decoration - show = not self.decorationVisible - self.showDecoration(show) + show = not self.decoration_visible + self.show_decoration(show) else: return super(PrimitivusWidget, self).keypress(size, key) - def _getDecoration(self, widget): + def _get_decoration(self, widget): return sat_widgets.LabelLine(widget, self.title) - def showDecoration(self, show=True): + def show_decoration(self, show=True): """Show/Hide the decoration around the window""" self._w = ( - self._getDecoration(self._original_widget) if show else self._original_widget + self._get_decoration(self._original_widget) if show else self._original_widget ) - def getMenu(self): + def get_menu(self): raise NotImplementedError
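
``PrimitivusWidget`` still leaves ``get_menu()`` abstract (it raises ``NotImplementedError``), so subclasses rename their implementation as well. A hypothetical minimal subclass, for illustration only::

    import urwid

    from sat_frontends.primitivus.widget import PrimitivusWidget  # path from the hunk above

    class ClockWidget(PrimitivusWidget):  # hypothetical widget
        def __init__(self, host, menu):
            self.host = host
            self._menu = menu
            # (widget, title) arguments as used by the other Primitivus widgets
            PrimitivusWidget.__init__(self, urwid.Text("12:00"), "Clock")

        def get_menu(self):  # ex getMenu
            return self._menu
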
--- a/sat_frontends/primitivus/xmlui.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/primitivus/xmlui.py Sat Apr 08 13:54:42 2023 +0200 @@ -38,7 +38,7 @@ """" Call xmlui callback and ignore any extra argument """ args[-1](ctrl) - def _xmluiOnChange(self, callback): + def _xmlui_on_change(self, callback): """ Call callback with widget as only argument """ urwid.connect_signal(self, "change", self._event_callback, callback) @@ -93,10 +93,10 @@ return False return super(PrimitivusStringWidget, self).selectable() - def _xmluiSetValue(self, value): + def _xmlui_set_value(self, value): self.set_edit_text(value) - def _xmluiGetValue(self): + def _xmlui_get_value(self): return self.get_edit_text() @@ -116,10 +116,10 @@ return False return super(PrimitivusPasswordWidget, self).selectable() - def _xmluiSetValue(self, value): + def _xmlui_set_value(self, value): self.set_edit_text(value) - def _xmluiGetValue(self): + def _xmlui_get_value(self): return self.get_edit_text() @@ -135,10 +135,10 @@ return False return super(PrimitivusTextBoxWidget, self).selectable() - def _xmluiSetValue(self, value): + def _xmlui_set_value(self, value): self.set_edit_text(value) - def _xmluiGetValue(self): + def _xmlui_get_value(self): return self.get_edit_text() @@ -152,10 +152,10 @@ return False return super(PrimitivusBoolWidget, self).selectable() - def _xmluiSetValue(self, value): + def _xmlui_set_value(self, value): self.set_state(value == "true") - def _xmluiGetValue(self): + def _xmlui_get_value(self): return C.BOOL_TRUE if self.get_state() else C.BOOL_FALSE @@ -169,10 +169,10 @@ return False return super(PrimitivusIntWidget, self).selectable() - def _xmluiSetValue(self, value): + def _xmlui_set_value(self, value): self.set_edit_text(value) - def _xmluiGetValue(self): + def _xmlui_get_value(self): return self.get_edit_text() @@ -182,38 +182,38 @@ def __init__(self, _xmlui_parent, value, click_callback): sat_widgets.CustomButton.__init__(self, value, on_press=click_callback) - def _xmluiOnClick(self, callback): + def _xmlui_on_click(self, callback): urwid.connect_signal(self, "click", callback) class PrimitivusListWidget(xmlui.ListWidget, sat_widgets.List, PrimitivusEvents): def __init__(self, _xmlui_parent, options, selected, flags): sat_widgets.List.__init__(self, options=options, style=flags) - self._xmluiSelectValues(selected) + self._xmlui_select_values(selected) - def _xmluiSelectValue(self, value): - return self.selectValue(value) + def _xmlui_select_value(self, value): + return self.select_value(value) - def _xmluiSelectValues(self, values): - return self.selectValues(values) + def _xmlui_select_values(self, values): + return self.select_values(values) - def _xmluiGetSelectedValues(self): - return [option.value for option in self.getSelectedValues()] + def _xmlui_get_selected_values(self): + return [option.value for option in self.get_selected_values()] - def _xmluiAddValues(self, values, select=True): - current_values = self.getAllValues() + def _xmlui_add_values(self, values, select=True): + current_values = self.get_all_values() new_values = copy.deepcopy(current_values) for value in values: if value not in current_values: new_values.append(value) if select: - selected = self._xmluiGetSelectedValues() - self.changeValues(new_values) + selected = self._xmlui_get_selected_values() + self.change_values(new_values) if select: for value in values: if value not in selected: selected.append(value) - self._xmluiSelectValues(selected) + self._xmlui_select_values(selected) class 
PrimitivusJidsListWidget(xmlui.ListWidget, sat_widgets.List, PrimitivusEvents): @@ -224,11 +224,11 @@ option_type=lambda txt, align: sat_widgets.AdvancedEdit( edit_text=txt, align=align ), - on_change=self._onChange, + on_change=self._on_change, ) self.delete = 0 - def _onChange(self, list_widget, jid_widget=None, text=None): + def _on_change(self, list_widget, jid_widget=None, text=None): if jid_widget is not None: if jid_widget != list_widget.contents[-1] and not text: # if a field is empty, we delete the line (except for the last line) @@ -237,9 +237,9 @@ # we always want an empty field as last value to be able to add jids list_widget.contents.append(sat_widgets.AdvancedEdit()) - def _xmluiGetSelectedValues(self): + def _xmlui_get_selected_values(self): # XXX: there is not selection in this list, so we return all non empty values - return [jid_ for jid_ in self.getAllValues() if jid_] + return [jid_ for jid_ in self.get_all_values() if jid_] class PrimitivusAdvancedListContainer( @@ -253,19 +253,19 @@ self, columns=columns, options=options, row_selectable=selectable != "no" ) - def _xmluiAppend(self, widget): - self.addWidget(widget) + def _xmlui_append(self, widget): + self.add_widget(widget) - def _xmluiAddRow(self, idx): - self.setRowIndex(idx) + def _xmlui_add_row(self, idx): + self.set_row_index(idx) - def _xmluiGetSelectedWidgets(self): - return self.getSelectedWidgets() + def _xmlui_get_selected_widgets(self): + return self.get_selected_widgets() - def _xmluiGetSelectedIndex(self): - return self.getSelectedIndex() + def _xmlui_get_selected_index(self): + return self.get_selected_index() - def _xmluiOnSelect(self, callback): + def _xmlui_on_select(self, callback): """ Call callback with widget as only argument """ urwid.connect_signal(self, "click", self._event_callback, callback) @@ -277,11 +277,11 @@ options["FOCUS_ATTR"] = "param_selected" sat_widgets.TableContainer.__init__(self, columns=2, options=options) - def _xmluiAppend(self, widget): + def _xmlui_append(self, widget): if isinstance(widget, PrimitivusEmptyWidget): # we don't want highlight on empty widgets widget = urwid.AttrMap(widget, "default") - self.addWidget(widget) + self.add_widget(widget) class PrimitivusLabelContainer(PrimitivusPairsContainer, xmlui.LabelContainer): @@ -292,12 +292,12 @@ def __init__(self, _xmlui_parent): sat_widgets.TabsContainer.__init__(self) - def _xmluiAppend(self, widget): + def _xmlui_append(self, widget): self.body.append(widget) - def _xmluiAddTab(self, label, selected): + def _xmlui_add_tab(self, label, selected): tab = PrimitivusVerticalContainer(None) - self.addTab(label, tab, selected) + self.add_tab(label, tab, selected) return tab @@ -308,7 +308,7 @@ urwid.ListBox.__init__(self, urwid.SimpleListWalker([])) self._last_size = None - def _xmluiAppend(self, widget): + def _xmlui_append(self, widget): if "flow" not in widget.sizing(): widget = urwid.BoxAdapter(widget, self.BOX_HEIGHT) self.body.append(widget) @@ -331,11 +331,11 @@ def __init__(self, _xmlui_parent): self.host = _xmlui_parent.host - def _xmluiShow(self): - self.host.showPopUp(self) + def _xmlui_show(self): + self.host.show_pop_up(self) - def _xmluiClose(self): - self.host.removePopUp(self) + def _xmlui_close(self): + self.host.remove_pop_up(self) class PrimitivusMessageDialog(PrimitivusDialog, xmlui.MessageDialog, sat_widgets.Alert): @@ -343,7 +343,7 @@ PrimitivusDialog.__init__(self, _xmlui_parent) xmlui.MessageDialog.__init__(self, _xmlui_parent) sat_widgets.Alert.__init__( - self, title, message, ok_cb=lambda __: 
self._xmluiClose() + self, title, message, ok_cb=lambda __: self._xmlui_close() ) @@ -362,8 +362,8 @@ self, title, message, - no_cb=lambda __: self._xmluiCancelled(), - yes_cb=lambda __: self._xmluiValidated(), + no_cb=lambda __: self._xmlui_cancelled(), + yes_cb=lambda __: self._xmlui_validated(), ) @@ -379,8 +379,8 @@ style.append("dir") files_management.FileDialog.__init__( self, - ok_cb=lambda path: self._xmluiValidated({"path": path}), - cancel_cb=lambda __: self._xmluiCancelled(), + ok_cb=lambda path: self._xmlui_validated({"path": path}), + cancel_cb=lambda __: self._xmlui_cancelled(), message=message, title=title, style=style, @@ -433,7 +433,7 @@ PrimitivusWidget.__init__(self, self.main_cont, self.xmlui_title) - def _parseChilds(self, _xmlui_parent, current_node, wanted=("container",), data=None): + def _parse_childs(self, _xmlui_parent, current_node, wanted=("container",), data=None): # Small hack to always have a VerticalContainer as main container in Primitivus. # this used to be the default behaviour for all frontends, but now # TabsContainer can also be the main container. @@ -442,23 +442,23 @@ if node.nodeName == "container" and node.getAttribute("type") == "tabs": _xmlui_parent = self.widget_factory.createVerticalContainer(self) self.main_cont = _xmlui_parent - return super(XMLUIPanel, self)._parseChilds(_xmlui_parent, current_node, wanted, + return super(XMLUIPanel, self)._parse_childs(_xmlui_parent, current_node, wanted, data) - def constructUI(self, parsed_dom): - def postTreat(): + def construct_ui(self, parsed_dom): + def post_treat(): assert self.main_cont.body if self.type in ("form", "popup"): buttons = [] if self.type == "form": - buttons.append(urwid.Button(_("Submit"), self.onFormSubmitted)) + buttons.append(urwid.Button(_("Submit"), self.on_form_submitted)) if not "NO_CANCEL" in self.flags: - buttons.append(urwid.Button(_("Cancel"), self.onFormCancelled)) + buttons.append(urwid.Button(_("Cancel"), self.on_form_cancelled)) else: buttons.append( - urwid.Button(_("OK"), on_press=lambda __: self._xmluiClose()) + urwid.Button(_("OK"), on_press=lambda __: self._xmlui_close()) ) max_len = max([len(button.get_label()) for button in buttons]) grid_wid = urwid.GridFlow(buttons, max_len + 4, 1, 0, "center") @@ -467,17 +467,17 @@ tabs_cont = self.main_cont.body[0].base_widget assert isinstance(tabs_cont, sat_widgets.TabsContainer) buttons = [] - buttons.append(sat_widgets.CustomButton(_("Save"), self.onSaveParams)) + buttons.append(sat_widgets.CustomButton(_("Save"), self.on_save_params)) buttons.append( sat_widgets.CustomButton( - _("Cancel"), lambda x: self.host.removeWindow() + _("Cancel"), lambda x: self.host.remove_window() ) ) - max_len = max([button.getSize() for button in buttons]) + max_len = max([button.get_size() for button in buttons]) grid_wid = urwid.GridFlow(buttons, max_len, 1, 0, "center") - tabs_cont.addFooter(grid_wid) + tabs_cont.add_footer(grid_wid) - xmlui.XMLUIPanel.constructUI(self, parsed_dom, postTreat) + xmlui.XMLUIPanel.construct_ui(self, parsed_dom, post_treat) urwid.WidgetWrap.__init__(self, self.main_cont) def show(self, show_type=None, valign="middle"): @@ -501,18 +501,18 @@ self._dest = show_type if show_type == "popup": - self.host.showPopUp(self, valign=valign) + self.host.show_pop_up(self, valign=valign) elif show_type == "window": - self.host.newWidget(self, user_action=self.user_action) + self.host.new_widget(self, user_action=self.user_action) else: assert False self.host.redraw() - def _xmluiClose(self): + def _xmlui_close(self): if 
self._dest == "window": - self.host.removeWindow() + self.host.remove_window() elif self._dest == "popup": - self.host.removePopUp(self) + self.host.remove_pop_up(self) else: raise exceptions.InternalError( "self._dest unknown, are you sure you have called XMLUI.show ?" @@ -523,6 +523,6 @@ dialog_factory = GenericFactory() -xmlui.registerClass(xmlui.CLASS_PANEL, XMLUIPanel) -xmlui.registerClass(xmlui.CLASS_DIALOG, XMLUIDialog) +xmlui.register_class(xmlui.CLASS_PANEL, XMLUIPanel) +xmlui.register_class(xmlui.CLASS_DIALOG, XMLUIDialog) create = xmlui.create
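The ``_xmlui_*`` prefix seen throughout this file is the generic adapter contract shared by all frontends. The sketch below is a hypothetical control, assuming the generic module is importable as ``sat_frontends.tools.xmlui`` as in the other frontends and with constructor details elided; it only shows what a new Primitivus widget would implement with the renamed hooks:

    import urwid

    from sat_frontends.tools import xmlui  # assumed location of the generic xmlui module


    class MyStringWidget(xmlui.StringWidget, urwid.Edit):
        """Illustrative single-line control following the renamed contract."""

        def __init__(self, _xmlui_parent, value):
            urwid.Edit.__init__(self, edit_text=value)

        def _xmlui_set_value(self, value):   # formerly _xmluiSetValue
            self.set_edit_text(value)

        def _xmlui_get_value(self):          # formerly _xmluiGetValue
            return self.get_edit_text()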
--- a/sat_frontends/quick_frontend/constants.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/constants.py Sat Apr 08 13:54:42 2023 +0200 @@ -103,13 +103,13 @@ "notificationsClear", "widgetNew", "widgetDeleted", - "profilePlugged", + "profile_plugged", "contactsFilled", "disconnect", "gotMenus", "menu", - "progressFinished", - "progressError", + "progress_finished", + "progress_error", } # Notifications
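The renamed strings above are the listener types dispatched through ``call_listeners()`` in quick_app.py below; subscribing to one of them keeps the same shape. A hedged sketch, where ``host`` is assumed to be a running QuickApp instance (note that ``addListener`` itself is not renamed by this changeset):

    # per the addListener docstring, progress_finished listeners receive
    # (progress_id, metadata) positionally plus the profile keyword
    def on_progress_finished(progress_id, metadata, profile=None):
        print(f"progress {progress_id} finished for profile {profile}")

    host.addListener("progress_finished", on_progress_finished)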
--- a/sat_frontends/quick_frontend/quick_app.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_app.py Sat Apr 08 13:54:42 2023 +0200 @@ -63,23 +63,23 @@ def plug(self): """Plug the profile to the host""" # first of all we create the contact lists - self.host.contact_lists.addProfile(self.profile) + self.host.contact_lists.add_profile(self.profile) # we get the essential params - self.bridge.asyncGetParamA( + self.bridge.param_get_a_async( "JabberID", "Connection", profile_key=self.profile, callback=self._plug_profile_jid, - errback=self._getParamError, + errback=self._get_param_error, ) def _plug_profile_jid(self, jid_s): self.whoami = jid.JID(jid_s) # resource might change after the connection log.info(f"Our current jid is: {self.whoami}") - self.bridge.isConnected(self.profile, callback=self._plug_profile_isconnected) + self.bridge.is_connected(self.profile, callback=self._plug_profile_isconnected) - def _autodisconnectEb(self, failure_): + def _autodisconnect_eb(self, failure_): # XXX: we ignore error on this parameter, as Libervia can't access it log.warning( _("Error while trying to get autodisconnect param, ignoring: {}").format( @@ -91,24 +91,24 @@ def _plug_profile_isconnected(self, connected): self.connected = connected if connected: - self.host.profileConnected(self.profile) - self.bridge.asyncGetParamA( + self.host.profile_connected(self.profile) + self.bridge.param_get_a_async( "autodisconnect", "Connection", profile_key=self.profile, callback=self._plug_profile_autodisconnect, - errback=self._autodisconnectEb, + errback=self._autodisconnect_eb, ) def _plug_profile_autodisconnect(self, autodisconnect): if C.bool(autodisconnect): self._autodisconnect = True - self.bridge.asyncGetParamA( + self.bridge.param_get_a_async( "autoconnect", "Connection", profile_key=self.profile, callback=self._plug_profile_autoconnect, - errback=self._getParamError, + errback=self._get_param_error, ) def _plug_profile_autoconnect(self, value_str): @@ -124,79 +124,79 @@ # Profile can be connected or not # we get cached data self.connected = True - self.host.bridge.getFeatures( + self.host.bridge.features_get( profile_key=self.profile, - callback=self._plug_profile_getFeaturesCb, - errback=self._plug_profile_getFeaturesEb, + callback=self._plug_profile_get_features_cb, + errback=self._plug_profile_get_features_eb, ) - def _plug_profile_getFeaturesEb(self, failure): + def _plug_profile_get_features_eb(self, failure): log.error("Couldn't get features: {}".format(failure)) - self._plug_profile_getFeaturesCb({}) + self._plug_profile_get_features_cb({}) - def _plug_profile_getFeaturesCb(self, features): + def _plug_profile_get_features_cb(self, features): self.host.features = features - self.host.bridge.getEntitiesData([], ProfileManager.cache_keys_to_get, + self.host.bridge.entities_data_get([], ProfileManager.cache_keys_to_get, profile=self.profile, - callback=self._plug_profile_gotCachedValues, - errback=self._plug_profile_failedCachedValues) + callback=self._plug_profile_got_cached_values, + errback=self._plug_profile_failed_cached_values) - def _plug_profile_failedCachedValues(self, failure): + def _plug_profile_failed_cached_values(self, failure): log.error("Couldn't get cached values: {}".format(failure)) - self._plug_profile_gotCachedValues({}) + self._plug_profile_got_cached_values({}) - def _plug_profile_gotCachedValues(self, cached_values): + def _plug_profile_got_cached_values(self, cached_values): contact_list = self.host.contact_lists[self.profile] # add the contact 
list and its listener for entity_s, data in cached_values.items(): for key, value in data.items(): - self.host.entityDataUpdatedHandler(entity_s, key, value, self.profile) + self.host.entity_data_updated_handler(entity_s, key, value, self.profile) if not self.connected: - self.host.setPresenceStatus(C.PRESENCE_UNAVAILABLE, "", profile=self.profile) + self.host.set_presence_status(C.PRESENCE_UNAVAILABLE, "", profile=self.profile) else: contact_list.fill() - self.host.setPresenceStatus(profile=self.profile) + self.host.set_presence_status(profile=self.profile) # The waiting subscription requests - self.bridge.getWaitingSub( - self.profile, callback=self._plug_profile_gotWaitingSub + self.bridge.sub_waiting_get( + self.profile, callback=self._plug_profile_got_waiting_sub ) - def _plug_profile_gotWaitingSub(self, waiting_sub): + def _plug_profile_got_waiting_sub(self, waiting_sub): for sub in waiting_sub: - self.host.subscribeHandler(waiting_sub[sub], sub, self.profile) + self.host.subscribe_handler(waiting_sub[sub], sub, self.profile) - self.bridge.mucGetRoomsJoined( - self.profile, callback=self._plug_profile_gotRoomsJoined + self.bridge.muc_get_rooms_joined( + self.profile, callback=self._plug_profile_got_rooms_joined ) - def _plug_profile_gotRoomsJoined(self, rooms_args): + def _plug_profile_got_rooms_joined(self, rooms_args): # Now we open the MUC window where we already are: for room_args in rooms_args: - self.host.mucRoomJoinedHandler(*room_args, profile=self.profile) + self.host.muc_room_joined_handler(*room_args, profile=self.profile) # Presence must be requested after rooms are filled - self.host.bridge.getPresenceStatuses( - self.profile, callback=self._plug_profile_gotPresences + self.host.bridge.presence_statuses_get( + self.profile, callback=self._plug_profile_got_presences ) - def _plug_profile_gotPresences(self, presences): + def _plug_profile_got_presences(self, presences): for contact in presences: for res in presences[contact]: jabber_id = ("%s/%s" % (jid.JID(contact).bare, res)) if res else contact show = presences[contact][res][0] priority = presences[contact][res][1] statuses = presences[contact][res][2] - self.host.presenceUpdateHandler( + self.host.presence_update_handler( jabber_id, show, priority, statuses, self.profile ) # At this point, profile should be fully plugged # and we launch frontend specific method - self.host.profilePlugged(self.profile) + self.host.profile_plugged(self.profile) - def _getParamError(self, failure): + def _get_param_error(self, failure): log.error(_("Can't get profile parameter: {msg}").format(msg=failure)) @@ -242,7 +242,7 @@ del self._profiles[profile] - def chooseOneProfile(self): + def choose_one_profile(self): return list(self._profiles.keys())[0] @@ -260,7 +260,7 @@ def __init__(self, bridge_factory, xmlui, check_options=None, connect_bridge=True): """Create a frontend application - @param bridge_factory: method to use to create the Bridge + @param bridge_factory: method to use to create the bridge @param xmlui: xmlui module @param check_options: method to call to check options (usually command line arguments) @@ -297,7 +297,7 @@ self.bridge = bridge_factory() ProfileManager.bridge = self.bridge if connect_bridge: - self.connectBridge() + self.connect_bridge() # frontend notifications self._notif_id = 0 @@ -314,58 +314,58 @@ # state of synchronisation with backend self._sync = True - def connectBridge(self): - self.bridge.bridgeConnect(callback=self._bridgeCb, errback=self._bridgeEb) + def connect_bridge(self): + 
self.bridge.bridge_connect(callback=self._bridge_cb, errback=self._bridge_eb) - def _namespacesGetCb(self, ns_map): + def _namespaces_get_cb(self, ns_map): self.ns_map = ns_map - def _namespacesGetEb(self, failure_): + def _namespaces_get_eb(self, failure_): log.error(_("Can't get namespaces map: {msg}").format(msg=failure_)) - def _encryptionPluginsGetCb(self, plugins_ser): + def _encryption_plugins_get_cb(self, plugins_ser): self.encryption_plugins = data_format.deserialise(plugins_ser, type_check=list) - def _encryptionPluginsGetEb(self, failure_): + def _encryption_plugins_get_eb(self, failure_): log.warning(_("Can't retrieve encryption plugins: {msg}").format(msg=failure_)) - def onBridgeConnected(self): - self.bridge.getReady(self.onBackendReady) + def on_bridge_connected(self): + self.bridge.ready_get(self.on_backend_ready) - def _bridgeCb(self): - self.registerSignal("connected") - self.registerSignal("disconnected") - self.registerSignal("actionNew") - self.registerSignal("newContact") - self.registerSignal("messageNew") + def _bridge_cb(self): + self.register_signal("connected") + self.register_signal("disconnected") + self.register_signal("action_new") + self.register_signal("contact_new") + self.register_signal("message_new") if self.ENCRYPTION_HANDLERS: - self.registerSignal("messageEncryptionStarted") - self.registerSignal("messageEncryptionStopped") - self.registerSignal("presenceUpdate") - self.registerSignal("subscribe") - self.registerSignal("paramUpdate") - self.registerSignal("contactDeleted") - self.registerSignal("entityDataUpdated") - self.registerSignal("progressStarted") - self.registerSignal("progressFinished") - self.registerSignal("progressError") - self.registerSignal("mucRoomJoined", iface="plugin") - self.registerSignal("mucRoomLeft", iface="plugin") - self.registerSignal("mucRoomUserChangedNick", iface="plugin") - self.registerSignal("mucRoomNewSubject", iface="plugin") - self.registerSignal("chatStateReceived", iface="plugin") - self.registerSignal("messageState", iface="plugin") - self.registerSignal("psEvent", iface="plugin") + self.register_signal("message_encryption_started") + self.register_signal("message_encryption_stopped") + self.register_signal("presence_update") + self.register_signal("subscribe") + self.register_signal("param_update") + self.register_signal("contact_deleted") + self.register_signal("entity_data_updated") + self.register_signal("progress_started") + self.register_signal("progress_finished") + self.register_signal("progress_error") + self.register_signal("muc_room_joined", iface="plugin") + self.register_signal("muc_room_left", iface="plugin") + self.register_signal("muc_room_user_changed_nick", iface="plugin") + self.register_signal("muc_room_new_subject", iface="plugin") + self.register_signal("chat_state_received", iface="plugin") + self.register_signal("message_state", iface="plugin") + self.register_signal("ps_event", iface="plugin") # useful for debugging - self.registerSignal("_debug", iface="core") + self.register_signal("_debug", iface="core") # FIXME: do it dynamically - quick_games.Tarot.registerSignals(self) - quick_games.Quiz.registerSignals(self) - quick_games.Radiocol.registerSignals(self) - self.onBridgeConnected() + quick_games.Tarot.register_signals(self) + quick_games.Quiz.register_signals(self) + quick_games.Radiocol.register_signals(self) + self.on_bridge_connected() - def _bridgeEb(self, failure): + def _bridge_eb(self, failure): if isinstance(failure, exceptions.BridgeExceptionNoService): print((_("Can't 
connect to SàT backend, are you sure it's launched ?"))) sys.exit(C.EXIT_BACKEND_NOT_FOUND) @@ -375,15 +375,15 @@ else: print((_("Error while initialising bridge: {}".format(failure)))) - def onBackendReady(self): + def on_backend_ready(self): log.info("backend is ready") - self.bridge.namespacesGet( - callback=self._namespacesGetCb, errback=self._namespacesGetEb) + self.bridge.namespaces_get( + callback=self._namespaces_get_cb, errback=self._namespaces_get_eb) # we cache available encryption plugins, as we'll use them on each # new chat widget - self.bridge.encryptionPluginsGet( - callback=self._encryptionPluginsGetCb, - errback=self._encryptionPluginsGetEb) + self.bridge.encryption_plugins_get( + callback=self._encryption_plugins_get_cb, + errback=self._encryption_plugins_get_eb) @property @@ -392,7 +392,7 @@ try: return self.selected_widget.profile except (TypeError, AttributeError): - return self.profiles.chooseOneProfile() + return self.profiles.choose_one_profile() @property def visible_widgets(self): @@ -434,13 +434,13 @@ return self._selected_widget = wid try: - onSelected = wid.onSelected + on_selected = wid.on_selected except AttributeError: pass else: - onSelected() + on_selected() - self.callListeners("selected", wid) + self.call_listeners("selected", wid) # backend state management @@ -486,7 +486,7 @@ except AttributeError: pass - def registerSignal( + def register_signal( self, function_name, handler=None, iface="core", with_profile=True ): """Register a handler for a signal @@ -500,12 +500,12 @@ """ log.debug("registering signal {name}".format(name=function_name)) if handler is None: - handler = getattr(self, "{}{}".format(function_name, "Handler")) + handler = getattr(self, "{}{}".format(function_name, "_handler")) if not with_profile: self.bridge.register_signal(function_name, handler, iface) return - def signalReceived(*args, **kwargs): + def signal_received(*args, **kwargs): profile = kwargs.get("profile") if profile is None: if not args: @@ -522,7 +522,7 @@ return # we ignore signal for profiles we don't manage handler(*args, **kwargs) - self.bridge.register_signal(function_name, signalReceived, iface) + self.bridge.register_signal(function_name, signal_received, iface) def addListener(self, type_, callback, profiles_filter=None): """Add a listener for an event @@ -551,15 +551,15 @@ args: (widget_deleted,) - menu: called when a menu item is added or removed args: (type_, path, path_i18n, item) were values are: - type_: same as in [sat.core.sat_main.SAT.importMenu] - path: same as in [sat.core.sat_main.SAT.importMenu] + type_: same as in [sat.core.sat_main.SAT.import_menu] + path: same as in [sat.core.sat_main.SAT.import_menu] path_i18n: translated path (or None if the item is removed) item: instance of quick_menus.MenuItemBase or None if the item is removed - gotMenus: called only once when menu are available (no arg) - - progressFinished: called when a progressing action has just finished + - progress_finished: called when a progressing action has just finished args: (progress_id, metadata, profile) - - progressError: called when a progressing action failed + - progress_error: called when a progressing action failed args: (progress_id, error_msg, profile): @param callback: method to call on event @param profiles_filter (set[unicode]): if set and not empty, the @@ -585,7 +585,7 @@ f"Trying to remove an inexisting listener (type = {type_}): " f"{callback}") - def callListeners(self, type_, *args, **kwargs): + def call_listeners(self, type_, *args, **kwargs): """Call the 
methods which listen type_ event. If a profiles filter has been register with a listener and profile argument is not None, the listener will be called only if profile is in the profiles filter list. @@ -609,7 +609,7 @@ """Tell if the profile is currently followed by the application, and ready""" return profile in self.ready_profiles - def postInit(self, profile_manager): + def post_init(self, profile_manager): """Must be called after initialization is done, do all automatic task (auto plug profile) @@ -619,7 +619,7 @@ if self.options and self.options.profile: profile_manager.autoconnect([self.options.profile]) - def profilePlugged(self, profile): + def profile_plugged(self, profile): """Method called when the profile is fully plugged This will launch frontend specific workflow @@ -641,15 +641,15 @@ ) handler(*args, **kwargs) - self.callListeners("profilePlugged", profile=profile) + self.call_listeners("profile_plugged", profile=profile) if not self._plugs_in_progress: - self.contact_lists.lockUpdate(False) + self.contact_lists.lock_update(False) - def profileConnected(self, profile): + def profile_connected(self, profile): """Called when a plugged profile is connected - it is called independently of profilePlugged (may be called before or after - profilePlugged) + it is called independently of profile_plugged (may be called before or after + profile_plugged) """ pass @@ -676,9 +676,9 @@ module.startswith("twisted.words.protocols.jabber") and failure.condition == "not-authorized" ): - self.launchAction(C.CHANGE_XMPP_PASSWD_ID, {}, profile=profile) + self.action_launch(C.CHANGE_XMPP_PASSWD_ID, {}, profile=profile) else: - self.showDialog(message, fullname, "error") + self.show_dialog(message, fullname, "error") self.bridge.connect(profile, callback=callback, errback=errback) @@ -687,7 +687,7 @@ @param profiles: list of valid profile names """ - self.contact_lists.lockUpdate() + self.contact_lists.lock_update() self._plugs_in_progress.update(profiles) self.plugging_profiles() for profile in profiles: @@ -709,12 +709,12 @@ def clear_profile(self): self.profiles.clear() - def newWidget(self, widget): + def new_widget(self, widget): raise NotImplementedError # bridge signals hanlers - def connectedHandler(self, jid_s, profile): + def connected_handler(self, jid_s, profile): """Called when the connection is made. 
@param jid_s (unicode): the JID that we were assigned by the server, @@ -722,7 +722,7 @@ """ log.debug(_("Connected")) self.profiles[profile].whoami = jid.JID(jid_s) - self.setPresenceStatus(profile=profile) + self.set_presence_status(profile=profile) # FIXME: fill() is already called for all profiles when doing self.sync = True # a per-profile fill() should be done once, see below note self.contact_lists[profile].fill() @@ -732,25 +732,25 @@ # A mechanism similar to sync should be available # on a per-profile basis self.sync = True - self.profileConnected(profile) + self.profile_connected(profile) - def disconnectedHandler(self, profile): + def disconnected_handler(self, profile): """called when the connection is closed""" log.debug(_("Disconnected")) self.contact_lists[profile].disconnect() - # FIXME: see note on connectedHandler + # FIXME: see note on connected_handler self.sync = False - self.setPresenceStatus(C.PRESENCE_UNAVAILABLE, "", profile=profile) + self.set_presence_status(C.PRESENCE_UNAVAILABLE, "", profile=profile) - def actionNewHandler(self, action_data, id_, security_limit, profile): - self.actionManager(action_data, user_action=False, profile=profile) + def action_new_handler(self, action_data, id_, security_limit, profile): + self.action_manager(action_data, user_action=False, profile=profile) - def newContactHandler(self, jid_s, attributes, groups, profile): + def contact_new_handler(self, jid_s, attributes, groups, profile): entity = jid.JID(jid_s) groups = list(groups) - self.contact_lists[profile].setContact(entity, groups, attributes, in_roster=True) + self.contact_lists[profile].set_contact(entity, groups, attributes, in_roster=True) - def messageNewHandler( + def message_new_handler( self, uid, timestamp, from_jid_s, to_jid_s, msg, subject, type_, extra_s, profile): from_jid = jid.JID(from_jid_s) @@ -767,7 +767,7 @@ contact_list = self.contact_lists[profile] try: - is_room = contact_list.isRoom(target) + is_room = contact_list.is_room(target) except exceptions.NotFound: is_room = False @@ -776,7 +776,7 @@ # messages target = target # we want to be sure to have at least one QuickChat instance - self.widgets.getOrCreateWidget( + self.widgets.get_or_create_widget( quick_chat.QuickChat, target, type_ = C.CHAT_GROUP if is_room else C.CHAT_ONE2ONE, @@ -790,36 +790,36 @@ ): # XXX: needed to show entities which haven't sent any # presence information and which are not in roster - contact_list.setContact(from_jid) + contact_list.set_contact(from_jid) # we dispatch the message in the widgets - for widget in self.widgets.getWidgets( + for widget in self.widgets.get_widgets( quick_chat.QuickChat, target=target, profiles=(profile,) ): - widget.messageNew( + widget.message_new( uid, timestamp, from_jid, mess_to_jid, msg, subject, type_, extra, profile ) - def messageEncryptionStartedHandler(self, destinee_jid_s, plugin_data, profile): + def message_encryption_started_handler(self, destinee_jid_s, plugin_data, profile): destinee_jid = jid.JID(destinee_jid_s) plugin_data = data_format.deserialise(plugin_data) - for widget in self.widgets.getWidgets(quick_chat.QuickChat, + for widget in self.widgets.get_widgets(quick_chat.QuickChat, target=destinee_jid.bare, profiles=(profile,)): - widget.messageEncryptionStarted(plugin_data) + widget.message_encryption_started(plugin_data) - def messageEncryptionStoppedHandler(self, destinee_jid_s, plugin_data, profile): + def message_encryption_stopped_handler(self, destinee_jid_s, plugin_data, profile): destinee_jid = jid.JID(destinee_jid_s) - 
for widget in self.widgets.getWidgets(quick_chat.QuickChat, + for widget in self.widgets.get_widgets(quick_chat.QuickChat, target=destinee_jid.bare, profiles=(profile,)): - widget.messageEncryptionStopped(plugin_data) + widget.message_encryption_stopped(plugin_data) - def messageStateHandler(self, uid, status, profile): - for widget in self.widgets.getWidgets(quick_chat.QuickChat, profiles=(profile,)): - widget.onMessageState(uid, status, profile) + def message_state_handler(self, uid, status, profile): + for widget in self.widgets.get_widgets(quick_chat.QuickChat, profiles=(profile,)): + widget.on_message_state(uid, status, profile) - def messageSend(self, to_jid, message, subject=None, mess_type="auto", extra=None, callback=None, errback=None, profile_key=C.PROF_KEY_NONE): + def message_send(self, to_jid, message, subject=None, mess_type="auto", extra=None, callback=None, errback=None, profile_key=C.PROF_KEY_NONE): if not subject and not extra and (not message or message == {'': ''}): log.debug("Not sending empty message") return @@ -834,14 +834,14 @@ ) # FIXME: optional argument is here because pyjamas doesn't support callback # without arg with json proxy if errback is None: - errback = lambda failure: self.showDialog( + errback = lambda failure: self.show_dialog( message=failure.message, title=failure.fullname, type="error" ) if not self.trigger.point("messageSendTrigger", to_jid, message, subject, mess_type, extra, callback, errback, profile_key=profile_key): return - self.bridge.messageSend( + self.bridge.message_send( str(to_jid), message, subject, @@ -852,10 +852,10 @@ errback=errback, ) - def setPresenceStatus(self, show="", status=None, profile=C.PROF_KEY_NONE): + def set_presence_status(self, show="", status=None, profile=C.PROF_KEY_NONE): raise NotImplementedError - def presenceUpdateHandler(self, entity_s, show, priority, statuses, profile): + def presence_update_handler(self, entity_s, show, priority, statuses, profile): # XXX: this log is commented because it's really too verbose even for DEBUG logs # but it is kept here as it may still be useful for troubleshooting # log.debug( @@ -875,16 +875,16 @@ if entity == self.profiles[profile].whoami: if show == C.PRESENCE_UNAVAILABLE: - self.setPresenceStatus(C.PRESENCE_UNAVAILABLE, "", profile=profile) + self.set_presence_status(C.PRESENCE_UNAVAILABLE, "", profile=profile) else: # FIXME: try to retrieve user language status before fallback to default status = statuses.get(C.PRESENCE_STATUSES_DEFAULT, None) - self.setPresenceStatus(show, status, profile=profile) + self.set_presence_status(show, status, profile=profile) return - self.callListeners("presence", entity, show, priority, statuses, profile=profile) + self.call_listeners("presence", entity, show, priority, statuses, profile=profile) - def mucRoomJoinedHandler( + def muc_room_joined_handler( self, room_jid_s, occupants, user_nick, subject, statuses, profile): """Called when a MUC room is joined""" log.debug( @@ -893,8 +893,8 @@ ) ) room_jid = jid.JID(room_jid_s) - self.contact_lists[profile].setSpecial(room_jid, C.CONTACT_SPECIAL_GROUP) - self.widgets.getOrCreateWidget( + self.contact_lists[profile].set_special(room_jid, C.CONTACT_SPECIAL_GROUP) + self.widgets.get_or_create_widget( quick_chat.QuickChat, room_jid, type_=C.CHAT_GROUP, @@ -905,44 +905,44 @@ profile=profile, ) - def mucRoomLeftHandler(self, room_jid_s, profile): + def muc_room_left_handler(self, room_jid_s, profile): """Called when a MUC room is left""" log.debug( "Room [%(room_jid)s] left by %(profile)s" % 
{"room_jid": room_jid_s, "profile": profile} ) room_jid = jid.JID(room_jid_s) - chat_widget = self.widgets.getWidget(quick_chat.QuickChat, room_jid, profile) + chat_widget = self.widgets.get_widget(quick_chat.QuickChat, room_jid, profile) if chat_widget: - self.widgets.deleteWidget( + self.widgets.delete_widget( chat_widget, all_instances=True, explicit_close=True) - self.contact_lists[profile].removeContact(room_jid) + self.contact_lists[profile].remove_contact(room_jid) - def mucRoomUserChangedNickHandler(self, room_jid_s, old_nick, new_nick, profile): + def muc_room_user_changed_nick_handler(self, room_jid_s, old_nick, new_nick, profile): """Called when an user joined a MUC room""" room_jid = jid.JID(room_jid_s) - chat_widget = self.widgets.getOrCreateWidget( + chat_widget = self.widgets.get_or_create_widget( quick_chat.QuickChat, room_jid, type_=C.CHAT_GROUP, profile=profile ) - chat_widget.changeUserNick(old_nick, new_nick) + chat_widget.change_user_nick(old_nick, new_nick) log.debug( "user [%(old_nick)s] is now known as [%(new_nick)s] in room [%(room_jid)s]" % {"old_nick": old_nick, "new_nick": new_nick, "room_jid": room_jid} ) - def mucRoomNewSubjectHandler(self, room_jid_s, subject, profile): + def muc_room_new_subject_handler(self, room_jid_s, subject, profile): """Called when subject of MUC room change""" room_jid = jid.JID(room_jid_s) - chat_widget = self.widgets.getOrCreateWidget( + chat_widget = self.widgets.get_or_create_widget( quick_chat.QuickChat, room_jid, type_=C.CHAT_GROUP, profile=profile ) - chat_widget.setSubject(subject) + chat_widget.set_subject(subject) log.debug( "new subject for room [%(room_jid)s]: %(subject)s" % {"room_jid": room_jid, "subject": subject} ) - def chatStateReceivedHandler(self, from_jid_s, state, profile): + def chat_state_received_handler(self, from_jid_s, state, profile): """Called when a new chat state (XEP-0085) is received. 
@param from_jid_s (unicode): JID of a contact or C.ENTITY_ALL @@ -950,9 +950,9 @@ @param profile (unicode): current profile """ from_jid = jid.JID(from_jid_s) - for widget in self.widgets.getWidgets(quick_chat.QuickChat, target=from_jid.bare, + for widget in self.widgets.get_widgets(quick_chat.QuickChat, target=from_jid.bare, profiles=(profile,)): - widget.onChatState(from_jid, state, profile) + widget.on_chat_state(from_jid, state, profile) def notify(self, type_, entity=None, message=None, subject=None, callback=None, cb_args=None, widget=None, profile=C.PROF_KEY_NONE): @@ -986,9 +986,9 @@ type_notifs.append(notif_data) self._notifications[self._notif_id] = notif_data self._notif_id += 1 - self.callListeners("notification", entity, notif_data, profile=profile) + self.call_listeners("notification", entity, notif_data, profile=profile) - def getNotifs(self, entity=None, type_=None, exact_jid=None, profile=C.PROF_KEY_NONE): + def get_notifs(self, entity=None, type_=None, exact_jid=None, profile=C.PROF_KEY_NONE): """return notifications for given entity @param entity(jid.JID, None, C.ENTITY_ALL): jid of the entity to check @@ -1032,7 +1032,7 @@ continue yield notif - def clearNotifs(self, entity, type_=None, profile=C.PROF_KEY_NONE): + def clear_notifs(self, entity, type_=None, profile=C.PROF_KEY_NONE): """return notifications for given entity @param entity(jid.JID, None): bare jid of the entity to check @@ -1050,9 +1050,9 @@ del notif_dict[key][type_] except KeyError: return - self.callListeners("notificationsClear", entity, type_, profile=profile) + self.call_listeners("notificationsClear", entity, type_, profile=profile) - def psEventHandler(self, category, service_s, node, event_type, data, profile): + def ps_event_handler(self, category, service_s, node, event_type, data, profile): """Called when a PubSub event is received. @param category(unicode): event category (e.g. 
"PEP", "MICROBLOG") @@ -1073,8 +1073,8 @@ # FIXME: check if [] make sense (instead of None) _groups = data.get("group") - for wid in self.widgets.getWidgets(quick_blog.QuickBlog): - wid.addEntryIfAccepted(service_s, node, data, _groups, profile) + for wid in self.widgets.get_widgets(quick_blog.QuickBlog): + wid.add_entry_if_accepted(service_s, node, data, _groups, profile) try: comments_node, comments_service = ( @@ -1084,7 +1084,7 @@ except KeyError: pass else: - self.bridge.mbGet( + self.bridge.mb_get( comments_service, comments_node, C.NO_LIMIT, @@ -1093,13 +1093,13 @@ profile=profile, ) elif event_type == C.PS_RETRACT: - for wid in self.widgets.getWidgets(quick_blog.QuickBlog): - wid.deleteEntryIfPresent(service_s, node, data["id"], profile) + for wid in self.widgets.get_widgets(quick_blog.QuickBlog): + wid.delete_entry_if_present(service_s, node, data["id"], profile) pass else: log.warning("Unmanaged PubSub event type {}".format(event_type)) - def registerProgressCbs(self, progress_id, callback, errback): + def register_progress_cbs(self, progress_id, callback, errback): """Register progression callbacks @param progress_id(unicode): id of the progression to check @@ -1112,10 +1112,10 @@ callbacks = self._progress_ids.setdefault(progress_id, []) callbacks.append((callback, errback)) - def progressStartedHandler(self, pid, metadata, profile): + def progress_started_handler(self, pid, metadata, profile): log.info("Progress {} started".format(pid)) - def progressFinishedHandler(self, pid, metadata, profile): + def progress_finished_handler(self, pid, metadata, profile): log.info("Progress {} finished".format(pid)) try: callbacks = self._progress_ids.pop(pid) @@ -1125,9 +1125,9 @@ for callback, __ in callbacks: if callback is not None: callback(metadata, profile=profile) - self.callListeners("progressFinished", pid, metadata, profile=profile) + self.call_listeners("progress_finished", pid, metadata, profile=profile) - def progressErrorHandler(self, pid, err_msg, profile): + def progress_error_handler(self, pid, err_msg, profile): log.warning("Progress {pid} error: {err_msg}".format(pid=pid, err_msg=err_msg)) try: callbacks = self._progress_ids.pop(pid) @@ -1137,20 +1137,20 @@ for __, errback in callbacks: if errback is not None: errback(err_msg, profile=profile) - self.callListeners("progressError", pid, err_msg, profile=profile) + self.call_listeners("progress_error", pid, err_msg, profile=profile) def _subscribe_cb(self, answer, data): entity, profile = data type_ = "subscribed" if answer else "unsubscribed" self.bridge.subscription(type_, str(entity.bare), profile_key=profile) - def subscribeHandler(self, type, raw_jid, profile): + def subscribe_handler(self, type, raw_jid, profile): """Called when a subsciption management signal is received""" entity = jid.JID(raw_jid) if type == "subscribed": # this is a subscription confirmation, we just have to inform user # TODO: call self.getEntityMBlog to add the new contact blogs - self.showDialog( + self.show_dialog( _("The contact {contact} has accepted your subscription").format( contact=entity.bare ), @@ -1158,7 +1158,7 @@ ) elif type == "unsubscribed": # this is a subscription refusal, we just have to inform user - self.showDialog( + self.show_dialog( _("The contact {contact} has refused your subscription").format( contact=entity.bare ), @@ -1168,7 +1168,7 @@ elif type == "subscribe": # this is a subscriptionn request, we have to ask for user confirmation # TODO: use sat.stdui.ui_contact_list to display the groups selector - 
self.showDialog( + self.show_dialog( _( "The contact {contact} wants to subscribe to your presence" ".\nDo you accept ?" @@ -1179,7 +1179,7 @@ answer_data=(entity, profile), ) - def _debugHandler(self, action, parameters, profile): + def _debug_handler(self, action, parameters, profile): if action == "widgets_dump": from pprint import pformat log.info("Widgets dump:\n{data}".format(data=pformat(self.widgets._widgets))) @@ -1187,7 +1187,7 @@ log.warning("Unknown debug action: {action}".format(action=action)) - def showDialog(self, message, title, type="info", answer_cb=None, answer_data=None): + def show_dialog(self, message, title, type="info", answer_cb=None, answer_data=None): """Show a dialog to user Frontends must override this method @@ -1206,25 +1206,25 @@ # FIXME: misnamed method + types are not well chosen. Need to be rethought raise NotImplementedError - def showAlert(self, message): + def show_alert(self, message): # FIXME: doesn't seems used anymore, to remove? pass # FIXME - def dialogFailure(self, failure): + def dialog_failure(self, failure): log.warning("Failure: {}".format(failure)) - def progressIdHandler(self, progress_id, profile): + def progress_id_handler(self, progress_id, profile): """Callback used when an action result in a progress id""" log.info("Progress ID received: {}".format(progress_id)) - def isHidden(self): + def is_hidden(self): """Tells if the frontend window is hidden. @return bool """ raise NotImplementedError - def paramUpdateHandler(self, name, value, namespace, profile): + def param_update_handler(self, name, value, namespace, profile): log.debug( _("param update: [%(namespace)s] %(name)s = %(value)s") % {"namespace": namespace, "name": name, "value": value} @@ -1233,37 +1233,37 @@ log.debug(_("Changing JID to %s") % value) self.profiles[profile].whoami = jid.JID(value) elif (namespace, name) == ("General", C.SHOW_OFFLINE_CONTACTS): - self.contact_lists[profile].showOfflineContacts(C.bool(value)) + self.contact_lists[profile].show_offline_contacts(C.bool(value)) elif (namespace, name) == ("General", C.SHOW_EMPTY_GROUPS): - self.contact_lists[profile].showEmptyGroups(C.bool(value)) + self.contact_lists[profile].show_empty_groups(C.bool(value)) - def contactDeletedHandler(self, jid_s, profile): + def contact_deleted_handler(self, jid_s, profile): target = jid.JID(jid_s) - self.contact_lists[profile].removeContact(target) + self.contact_lists[profile].remove_contact(target) - def entityDataUpdatedHandler(self, entity_s, key, value_raw, profile): + def entity_data_updated_handler(self, entity_s, key, value_raw, profile): entity = jid.JID(entity_s) value = data_format.deserialise(value_raw, type_check=None) if key == "nicknames": assert isinstance(value, list) or value is None if entity in self.contact_lists[profile]: - self.contact_lists[profile].setCache(entity, "nicknames", value) - self.callListeners("nicknames", entity, value, profile=profile) + self.contact_lists[profile].set_cache(entity, "nicknames", value) + self.call_listeners("nicknames", entity, value, profile=profile) elif key == "avatar" and self.AVATARS_HANDLER: assert isinstance(value, dict) or value is None - self.contact_lists[profile].setCache(entity, "avatar", value) - self.callListeners("avatar", entity, value, profile=profile) + self.contact_lists[profile].set_cache(entity, "avatar", value) + self.call_listeners("avatar", entity, value, profile=profile) - def actionManager(self, action_data, callback=None, ui_show_cb=None, user_action=True, + def action_manager(self, action_data, 
callback=None, ui_show_cb=None, user_action=True, progress_cb=None, progress_eb=None, profile=C.PROF_KEY_NONE): """Handle backend action - @param action_data(dict): action dict as sent by launchAction or returned by an + @param action_data(dict): action dict as sent by action_launch or returned by an UI action @param callback(None, callback): if not None, callback to use on XMLUI answer @param ui_show_cb(None, callback): if not None, method to call to show the XMLUI @param user_action(bool): if True, the action is a result of a user interaction - else the action come from backend direclty (i.e. actionNew). + else the action come from backend direclty (i.e. action_new). This is useful to know if the frontend can display a popup immediately (if True) or if it should add it to a queue that the user can activate later. @param progress_cb(None, callable): method to call when progression is finished. @@ -1295,8 +1295,8 @@ pass else: if progress_cb or progress_eb: - self.registerProgressCbs(progress_id, progress_cb, progress_eb) - self.progressIdHandler(progress_id, profile) + self.register_progress_cbs(progress_id, progress_cb, progress_eb) + self.progress_id_handler(progress_id, profile) # we ignore metadata action_data = { @@ -1310,13 +1310,13 @@ ) ) - def _actionCb(self, data, callback, callback_id, profile): + def _action_cb(self, data, callback, callback_id, profile): if callback is None: - self.actionManager(data, profile=profile) + self.action_manager(data, profile=profile) else: callback(data=data, cb_id=callback_id, profile=profile) - def launchAction( + def action_launch( self, callback_id, data=None, callback=None, profile=C.PROF_KEY_NONE ): """Launch a dynamic action @@ -1324,7 +1324,7 @@ @param callback_id: id of the action to launch @param data: data needed only for certain actions @param callback(callable, None): will be called with the resut - if None, self.actionManager will be called + if None, self.action_manager will be called else the callable will be called with the following kw parameters: - data: action_data - cb_id: callback id @@ -1334,12 +1334,12 @@ """ if data is None: data = dict() - action_cb = lambda data: self._actionCb(data, callback, callback_id, profile) - self.bridge.launchAction( - callback_id, data, profile, callback=action_cb, errback=self.dialogFailure + action_cb = lambda data: self._action_cb(data, callback, callback_id, profile) + self.bridge.action_launch( + callback_id, data, profile, callback=action_cb, errback=self.dialog_failure ) - def launchMenu( + def launch_menu( self, menu_type, path, @@ -1354,7 +1354,7 @@ @param path(iterable[unicode]): path to the menu @param data: data needed only for certain actions @param callback(callable, None): will be called with the resut - if None, self.actionManager will be called + if None, self.action_manager will be called else the callable will be called with the following kw parameters: - data: action_data - cb_id: (menu_type, path) tuple @@ -1364,25 +1364,25 @@ """ if data is None: data = dict() - action_cb = lambda data: self._actionCb( + action_cb = lambda data: self._action_cb( data, callback, (menu_type, path), profile ) - self.bridge.menuLaunch( + self.bridge.menu_launch( menu_type, path, data, security_limit, profile, callback=action_cb, - errback=self.dialogFailure, + errback=self.dialog_failure, ) def disconnect(self, profile): log.info("disconnecting") - self.callListeners("disconnect", profile=profile) + self.call_listeners("disconnect", profile=profile) self.bridge.disconnect(profile) - def 
onExit(self): + def on_exit(self): """Must be called when the frontend is terminating""" to_unplug = [] for profile, profile_manager in self.profiles.items():
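One side effect of the rename worth noting in ``register_signal``: when no handler is given, the method name is now derived as ``<signal>_handler`` from the snake_case signal name (it used to be ``<signal>Handler``). A toy illustration of just that lookup, not the real QuickApp; bridge registration and the per-profile filtering wrapper are elided:

    class ToySignalApp:

        def message_new_handler(self, *args, **kwargs):
            print("message_new received:", args)

        def register_signal(self, function_name, handler=None):
            if handler is None:
                # same getattr convention as QuickApp.register_signal
                handler = getattr(self, f"{function_name}_handler")
            # the real method hands `handler` (or a profile-filtering wrapper
            # around it) over to bridge.register_signal
            return handler


    app = ToySignalApp()
    app.register_signal("message_new")("some-uid")  # -> message_new received: ('some-uid',)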
--- a/sat_frontends/quick_frontend/quick_blog.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_blog.py Sat Apr 08 13:54:42 2023 +0200 @@ -125,27 +125,27 @@ manager = manager.manager return level - def _addMBItems(self, items_tuple, service=None, node=None): + def _add_mb_items(self, items_tuple, service=None, node=None): """Add Microblog items to this panel update is NOT called after addition - @param items_tuple(tuple): (items_data,items_metadata) tuple as returned by mbGet + @param items_tuple(tuple): (items_data,items_metadata) tuple as returned by mb_get """ items, metadata = items_tuple for item in items: - self.addEntry(item, service=service, node=node, with_update=False) + self.add_entry(item, service=service, node=node, with_update=False) - def _addMBItemsWithComments(self, items_tuple, service=None, node=None): + def _add_mb_items_with_comments(self, items_tuple, service=None, node=None): """Add Microblog items to this panel update is NOT called after addition - @param items_tuple(tuple): (items_data,items_metadata) tuple as returned by mbGet + @param items_tuple(tuple): (items_data,items_metadata) tuple as returned by mb_get """ items, metadata = items_tuple for item, comments in items: - self.addEntry(item, comments, service=service, node=node, with_update=False) + self.add_entry(item, comments, service=service, node=node, with_update=False) - def addEntry(self, item=None, comments=None, service=None, node=None, + def add_entry(self, item=None, comments=None, service=None, node=None, with_update=True, editable=False, edit_entry=False): """Add a microblog entry @@ -160,7 +160,7 @@ entry regardless of sorting) """ new_entry = ENTRY_CLS(self, item, comments, service=service, node=node) - new_entry.setEditable(editable) + new_entry.set_editable(editable) if edit_entry: self.edit_entry = new_entry else: @@ -175,7 +175,7 @@ @param entry (Entry, None): if not None, must be the new entry. 
If None, all the items will be checked to update the display """ - # update is separated from addEntry to allow adding + # update is separated from add_entry to allow adding # several entries at once, and updating at the end raise NotImplementedError @@ -232,7 +232,7 @@ """Refresh the display when data have been modified""" pass - def setEditable(self, editable=True): + def set_editable(self, editable=True): """tell if the entry can be edited or not @param editable(bool): True if the entry can be edited @@ -240,17 +240,17 @@ # XXX: we don't use @property as property setter doesn't play well with pyjamas raise NotImplementedError - def addComments(self, comments_data): - """Add comments to this entry by calling addEntry repeatidly + def add_comments(self, comments_data): + """Add comments to this entry by calling add_entry repeatidly - @param comments_data(tuple): data as returned by mbGetFromMany*RTResults + @param comments_data(tuple): data as returned by mb_get_from_many*RTResults """ # TODO: manage seperator between comments of coming from different services/nodes for data in comments_data: service, node, failure, comments, metadata = data for comment in comments: if not failure: - self.addEntry(comment, service=jid.JID(service), node=node) + self.add_entry(comment, service=jid.JID(service), node=node) else: log.warning("getting comment failed: {}".format(failure)) self.update() @@ -286,7 +286,7 @@ if self.blog.new_message_target == C.GROUP: mb_data['groups'] = list(self.blog.targets) - self.blog.host.bridge.mbSend( + self.blog.host.bridge.mb_send( str(self.service or ""), self.node or "", data_format.serialise(mb_data), @@ -327,12 +327,12 @@ """ # TODO: manage several comments nodes case. if self.item.comments: - self.blog.host.bridge.psNodeDelete( + self.blog.host.bridge.ps_node_delete( str(self.item.comments_service) or "", self.item.comments_node, profile=self.blog.profile, ) - self.blog.host.bridge.mbRetract( + self.blog.host.bridge.mb_retract( str(self.service or ""), self.node or "", self.item.id, @@ -362,7 +362,7 @@ quick_widgets.QuickWidget.__init__(self, host, targets[0], C.PROF_KEY_NONE) for target in targets[1:]: assert isinstance(target, str) - self.addTarget(target) + self.add_target(target) self._targets_type = C.GROUP @property @@ -379,7 +379,7 @@ ", ".join(self.targets), self.profile ) - def _getResultsCb(self, data, rt_session): + def _get_results_cb(self, data, rt_session): remaining, results = data log.debug( "Got {got_len} results, {rem_len} remaining".format( @@ -393,37 +393,37 @@ for item_metadata in item_data[1]: item_metadata[3] = [data_format.deserialise(i) for i in item_metadata[3]] if not failure: - self._addMBItemsWithComments((items_data, metadata), + self._add_mb_items_with_comments((items_data, metadata), service=jid.JID(service)) self.update() if remaining: - self._getResults(rt_session) + self._get_results(rt_session) - def _getResultsEb(self, failure): - log.warning("microblog getFromMany error: {}".format(failure)) + def _get_results_eb(self, failure): + log.warning("microblog get_from_many error: {}".format(failure)) - def _getResults(self, rt_session): - """Manage results from mbGetFromMany RT Session + def _get_results(self, rt_session): + """Manage results from mb_get_from_many RT Session - @param rt_session(str): session id as returned by mbGetFromMany + @param rt_session(str): session id as returned by mb_get_from_many """ - self.host.bridge.mbGetFromManyWithCommentsRTResult( + self.host.bridge.mb_get_from_many_with_comments_rt_result( 
rt_session, profile=self.profile, - callback=lambda data: self._getResultsCb(data, rt_session), - errback=self._getResultsEb, + callback=lambda data: self._get_results_cb(data, rt_session), + errback=self._get_results_eb, ) - def getAll(self): + def get_all(self): """Get all (micro)blogs from self.targets""" - def gotSession(rt_session): - self._getResults(rt_session) + def got_session(rt_session): + self._get_results(rt_session) if self._targets_type in (C.ALL, C.GROUP): targets = tuple(self.targets) if self._targets_type is C.GROUP else () - self.host.bridge.mbGetFromManyWithComments( + self.host.bridge.mb_get_from_many_with_comments( self._targets_type, targets, 10, @@ -431,10 +431,10 @@ {}, {"subscribe": C.BOOL_TRUE}, profile=self.profile, - callback=gotSession, + callback=got_session, ) own_pep = self.host.whoami.bare - self.host.bridge.mbGetFromManyWithComments( + self.host.bridge.mb_get_from_many_with_comments( C.JID, (str(own_pep),), 10, @@ -442,14 +442,14 @@ {}, {}, profile=self.profile, - callback=gotSession, + callback=got_session, ) else: raise NotImplementedError( "{} target type is not managed".format(self._targets_type) ) - def isJidAccepted(self, jid_): + def is_jid_accepted(self, jid_): """Tell if a jid is actepted and must be shown in this panel @param jid_(jid.JID): jid to check @@ -459,11 +459,11 @@ return True assert self._targets_type is C.GROUP # we don't manage other types for now for group in self.targets: - if self.host.contact_lists[self.profile].isEntityInGroup(jid_, group): + if self.host.contact_lists[self.profile].is_entity_in_group(jid_, group): return True return False - def addEntryIfAccepted(self, service, node, mb_data, groups, profile): + def add_entry_if_accepted(self, service, node, mb_data, groups, profile): """add entry to this panel if it's acceptable This method check if the entry is new or an update, @@ -485,24 +485,24 @@ # The node is unknown, # we need to check that we can accept the entry if ( - self.isJidAccepted(service) + self.is_jid_accepted(service) or ( groups is None and service == self.host.profiles[self.profile].whoami.bare ) or (groups and groups.intersection(self.targets)) ): - self.addEntry(mb_data, service=service, node=node) + self.add_entry(mb_data, service=service, node=node) else: # the entry is a comment in a known node for parent_entry in parent_entries: - parent_entry.addEntry(mb_data, service=service, node=node) + parent_entry.add_entry(mb_data, service=service, node=node) else: # The entry exist, it's an update entry.reset(mb_data) entry.refresh() - def deleteEntryIfPresent(self, service, node, item_id, profile): + def delete_entry_if_present(self, service, node, item_id, profile): """Delete and entry if present in this QuickBlog @param sender(jid.JID): jid of the entry sender @@ -518,7 +518,7 @@ entry.delete() -def registerClass(type_, cls): +def register_class(type_, cls): global ENTRY_CLS, COMMENTS_CLS if type_ == "ENTRY": ENTRY_CLS = cls
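The ``get_all``/``_get_results*`` methods above rely on the bridge's real-time session pattern: ``mb_get_from_many_with_comments`` returns a session id, and results are then pulled with ``mb_get_from_many_with_comments_rt_result`` until nothing remains. A hedged sketch of that polling half only (``bridge`` and ``profile`` are assumed to be an existing bridge proxy and profile name; deserialisation and comment handling are left out):

    def pull_results(bridge, profile, rt_session):
        def got_results(data):
            remaining, results = data
            for service, node, failure, items, metadata in results:
                if failure:
                    print(f"retrieval from {service} failed: {failure}")
                else:
                    print(f"got {len(items)} item(s) from {service} ({node})")
            if remaining:
                # more results are still being gathered by the backend, poll again
                pull_results(bridge, profile, rt_session)

        bridge.mb_get_from_many_with_comments_rt_result(
            rt_session,
            profile=profile,
            callback=got_results,
            errback=lambda failure: print(f"RT result error: {failure}"),
        )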
--- a/sat_frontends/quick_frontend/quick_chat.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_chat.py Sat Apr 08 13:54:42 2023 +0200 @@ -55,7 +55,7 @@ self.subject = subject self.type = type_ self.extra = extra - self.nick = self.getNick(from_jid) + self.nick = self.get_nick(from_jid) self._status = None # own_mess is True if message was sent by profile's jid self.own_mess = ( @@ -69,7 +69,7 @@ if self.parent.nick.lower() in m.lower(): self._mention = True break - self.handleMe() + self.handle_me() self.widgets = set() # widgets linked to this message def __str__(self): @@ -147,14 +147,14 @@ return contact_list.getCache(entity, "avatar") except (exceptions.NotFound, KeyError): # we don't check the result as the avatar listener will be called - self.host.bridge.avatarGet(entity, True, self.profile) + self.host.bridge.avatar_get(entity, True, self.profile) return None @property def encrypted(self): return self.extra.get("encrypted", False) - def getNick(self, entity): + def get_nick(self, entity): """Return nick of an entity when possible""" contact_list = self.host.contact_lists[self.profile] if self.type == C.MESS_TYPE_INFO and self.info_type in ROOM_USER_MOVED: @@ -163,9 +163,9 @@ except KeyError: log.error("extra data is missing user nick for uid {}".format(self.uid)) return "" - # FIXME: converted getSpecials to list for pyjamas + # FIXME: converted get_specials to list for pyjamas if self.parent.type == C.CHAT_GROUP or entity in list( - contact_list.getSpecials(C.CONTACT_SPECIAL_GROUP) + contact_list.get_specials(C.CONTACT_SPECIAL_GROUP) ): return entity.resource or "" if entity.bare in contact_list: @@ -174,7 +174,7 @@ nicknames = contact_list.getCache(entity, "nicknames") except (exceptions.NotFound, KeyError): # we check result as listener will be called - self.host.bridge.identityGet( + self.host.bridge.identity_get( entity.bare, ["nicknames"], True, self.profile) return entity.node or entity @@ -200,7 +200,7 @@ for w in self.widgets: w.update({"status": status}) - def handleMe(self): + def handle_me(self): """Check if messages starts with "/me " and change them if it is the case if several messages (different languages) are presents, they all need to start with "/me " @@ -306,7 +306,7 @@ # True when resync is in progress, avoid resynchronising twice when resync is called # and history is still being updated. 
For internal use only self._resync_lock = False - self.setLocked() + self.set_locked() if type_ == C.CHAT_GROUP: if target.resource: raise exceptions.InternalError( @@ -317,7 +317,7 @@ self.nick = nick self.occupants = {} - self.setOccupants(occupants) + self.set_occupants(occupants) else: if occupants is not None or nick is not None: raise exceptions.InternalError( @@ -340,9 +340,9 @@ lt.tm_isdst, ) # struct_time of day changing time if self.host.AVATARS_HANDLER: - self.host.addListener("avatar", self.onAvatar, profiles) + self.host.addListener("avatar", self.on_avatar, profiles) - def setLocked(self): + def set_locked(self): """Set locked flag To be set when we are waiting for history/search @@ -353,37 +353,37 @@ log.warning("{wid} is already locked!".format(wid=self)) return self._locked = True - # messageNew signals are cached when locked + # message_new signals are cached when locked self._cache = OrderedDict() log.debug("{wid} is now locked".format(wid=self)) - def setUnlocked(self): + def set_unlocked(self): if not self._locked: log.debug("{wid} was already unlocked".format(wid=self)) return self._locked = False for uid, data in self._cache.items(): if uid not in self.messages: - self.messageNew(*data) + self.message_new(*data) else: log.debug("discarding message already in history: {data}, ".format(data=data)) del self._cache log.debug("{wid} is now unlocked".format(wid=self)) - def postInit(self): + def post_init(self): """Method to be called by frontend after widget is initialised handle the display of history and subject """ - self.historyPrint(profile=self.profile) + self.history_print(profile=self.profile) if self.subject is not None: - self.setSubject(self.subject) + self.set_subject(self.subject) if self.host.ENCRYPTION_HANDLERS: - self.getEncryptionState() + self.get_encryption_state() - def onDelete(self): + def on_delete(self): if self.host.AVATARS_HANDLER: - self.host.removeListener("avatar", self.onAvatar) + self.host.removeListener("avatar", self.on_avatar) @property def contact_list(self): @@ -403,13 +403,13 @@ def sync(self, state): quick_widgets.QuickWidget.sync.fset(self, state) if not state: - self.setLocked() + self.set_locked() - def _resyncComplete(self): + def _resync_complete(self): self.sync = True self._resync_lock = False - def occupantsClear(self): + def occupants_clear(self): """Remove all occupants Must be overridden by frontends to clear their own representations of occupants @@ -428,17 +428,17 @@ break else: # we have no message yet, we can get normal history - self.historyPrint(callback=self._resyncComplete, profile=self.profile) + self.history_print(callback=self._resync_complete, profile=self.profile) return if self.type == C.CHAT_GROUP: - self.occupantsClear() - self.host.bridge.mucOccupantsGet( - str(self.target), self.profile, callback=self.updateOccupants, + self.occupants_clear() + self.host.bridge.muc_occupants_get( + str(self.target), self.profile, callback=self.update_occupants, errback=log.error) - self.historyPrint( + self.history_print( size=C.HISTORY_LIMIT_NONE, filters={'timestamp_start': last_message.timestamp}, - callback=self._resyncComplete, + callback=self._resync_complete, profile=self.profile) ## Widget management ## @@ -449,26 +449,26 @@ ) @staticmethod - def getWidgetHash(target, profiles): + def get_widget_hash(target, profiles): profile = list(profiles)[0] return profile + "\n" + str(target.bare) @staticmethod - def getPrivateHash(target, profile): + def get_private_hash(target, profile): """Get unique hash for private 
conversations This method should be used with force_hash to get unique widget for private MUC conversations """ return (str(profile), target) - def addTarget(self, target): - super(QuickChat, self).addTarget(target) + def add_target(self, target): + super(QuickChat, self).add_target(target) if target.resource: self.current_target = ( target ) # FIXME: tmp, must use resource priority throught contactList instead - def recreateArgs(self, args, kwargs): + def recreate_args(self, args, kwargs): """copy important attribute for a new widget""" kwargs["type_"] = self.type if self.type == C.CHAT_GROUP: @@ -479,21 +479,21 @@ except AttributeError: pass - def onPrivateCreated(self, widget): + def on_private_created(self, widget): """Method called when a new widget for private conversation (MUC) is created""" raise NotImplementedError - def getOrCreatePrivateWidget(self, entity): + def get_or_create_private_widget(self, entity): """Create a widget for private conversation, or get it if it already exists @param entity: full jid of the target """ - return self.host.widgets.getOrCreateWidget( + return self.host.widgets.get_or_create_widget( QuickChat, entity, type_=C.CHAT_ONE2ONE, - force_hash=self.getPrivateHash(self.profile, entity), - on_new_widget=self.onPrivateCreated, + force_hash=self.get_private_hash(self.profile, entity), + on_new_widget=self.on_private_created, profile=self.profile, ) # we force hash to have a new widget, not this one again @@ -505,7 +505,7 @@ ## occupants ## - def setOccupants(self, occupants): + def set_occupants(self, occupants): """Set the whole list of occupants""" assert len(self.occupants) == 0 for nick, data in occupants.items(): @@ -515,10 +515,10 @@ # nick=nick, room=self.target)) self.occupants[nick] = Occupant(self, data, self.profile) - def updateOccupants(self, occupants): + def update_occupants(self, occupants): """Update occupants list - In opposition to setOccupants, this only add missing occupants and remove + In opposition to set_occupants, this only add missing occupants and remove occupants who have left """ # FIXME: occupants with modified status are not handled @@ -553,11 +553,11 @@ else: return occupant - def setUserNick(self, nick): + def set_user_nick(self, nick): """Set the nick of the user, usefull for e.g. 
change the color of the user""" self.nick = nick - def changeUserNick(self, old_nick, new_nick): + def change_user_nick(self, old_nick, new_nick): """Change nick of a user in group list""" log.info("{old} is now known as {new} in room {room_jid}".format( old = old_nick, @@ -566,11 +566,11 @@ ## Messages ## - def manageMessage(self, entity, mess_type): + def manage_message(self, entity, mess_type): """Tell if this chat widget manage an entity and message type couple @param entity (jid.JID): (full) jid of the sending entity - @param mess_type (str): message type as given by messageNew + @param mess_type (str): message type as given by message_new @return (bool): True if this Chat Widget manage this couple """ if self.type == C.CHAT_GROUP: @@ -584,32 +584,32 @@ return True return False - def updateHistory(self, size=C.HISTORY_LIMIT_DEFAULT, filters=None, profile="@NONE@"): + def update_history(self, size=C.HISTORY_LIMIT_DEFAULT, filters=None, profile="@NONE@"): """Called when history need to be recreated - Remove all message from history then call historyPrint + Remove all message from history then call history_print Must probably be overriden by frontend to clear widget @param size (int): number of messages @param filters (str): patterns to filter the history results @param profile (str): %(doc_profile)s """ - self.setLocked() + self.set_locked() self.messages.clear() - self.historyPrint(size, filters, profile=profile) + self.history_print(size, filters, profile=profile) - def _onHistoryPrinted(self): + def _on_history_printed(self): """Method called when history is printed (or failed) unlock the widget, and can be used to refresh or scroll down the focus after the history is printed """ - self.setUnlocked() + self.set_unlocked() - def historyPrint(self, size=C.HISTORY_LIMIT_DEFAULT, filters=None, callback=None, + def history_print(self, size=C.HISTORY_LIMIT_DEFAULT, filters=None, callback=None, profile="@NONE@"): """Print the current history - Note: self.setUnlocked will be called once history is printed + Note: self.set_unlocked will be called once history is printed @param size (int): number of messages @param search (str): pattern to filter the history results @param callback(callable, None): method to call when history has been printed @@ -619,7 +619,7 @@ filters = {} if size == 0: log.debug("Empty history requested, skipping") - self._onHistoryPrinted() + self._on_history_printed() return log_msg = _("now we print the history") if size != C.HISTORY_LIMIT_DEFAULT: @@ -647,12 +647,12 @@ self.history_filters = filters - def _historyGetCb(history): + def _history_get_cb(history): # day_format = "%A, %d %b %Y" # to display the day change # previous_day = datetime.now().strftime(day_format) # message_day = datetime.fromtimestamp(timestamp).strftime(self.day_format) # if previous_day != message_day: - # self.printDayChange(message_day) + # self.print_day_change(message_day) # previous_day = message_day for data in history: uid, timestamp, from_jid, to_jid, message, subject, type_, extra_s = data @@ -675,49 +675,49 @@ extra, profile, ) - self._onHistoryPrinted() + self._on_history_printed() if callback is not None: callback() - def _historyGetEb(err): + def _history_get_eb(err): log.error(_("Can't get history: {}").format(err)) - self._onHistoryPrinted() + self._on_history_printed() if callback is not None: callback() - self.host.bridge.historyGet( + self.host.bridge.history_get( str(self.host.profiles[profile].whoami.bare), str(target), size, True, {k: str(v) for k,v in filters.items()}, 
profile, - callback=_historyGetCb, - errback=_historyGetEb, + callback=_history_get_cb, + errback=_history_get_eb, ) - def messageEncryptionGetCb(self, session_data): + def message_encryption_get_cb(self, session_data): if session_data: session_data = data_format.deserialise(session_data) - self.messageEncryptionStarted(session_data) + self.message_encryption_started(session_data) - def messageEncryptionGetEb(self, failure_): + def message_encryption_get_eb(self, failure_): log.error(_("Can't get encryption state: {reason}").format(reason=failure_)) - def getEncryptionState(self): + def get_encryption_state(self): """Retrieve encryption state with current target. - Once state is retrieved, default messageEncryptionStarted will be called if + Once state is retrieved, default message_encryption_started will be called if suitable """ if self.type == C.CHAT_GROUP: return - self.host.bridge.messageEncryptionGet(str(self.target.bare), self.profile, - callback=self.messageEncryptionGetCb, - errback=self.messageEncryptionGetEb) + self.host.bridge.message_encryption_get(str(self.target.bare), self.profile, + callback=self.message_encryption_get_cb, + errback=self.message_encryption_get_eb) - def messageNew(self, uid, timestamp, from_jid, to_jid, msg, subject, type_, extra, + def message_new(self, uid, timestamp, from_jid, to_jid, msg, subject, type_, extra, profile): if self._locked: self._cache[uid] = ( @@ -742,8 +742,8 @@ if to_jid.resource and type_ != C.MESS_TYPE_GROUPCHAT: # we have a private message, we forward it to a private conversation # widget - chat_widget = self.getOrCreatePrivateWidget(to_jid) - chat_widget.messageNew( + chat_widget = self.get_or_create_private_widget(to_jid) + chat_widget.message_new( uid, timestamp, from_jid, to_jid, msg, subject, type_, extra, profile ) return @@ -768,28 +768,28 @@ if "received_timestamp" in extra: log.warning("Delayed message received after history, this should not happen") - self.createMessage(message) + self.create_message(message) - def messageEncryptionStarted(self, session_data): + def message_encryption_started(self, session_data): self.encrypted = True log.debug(_("message encryption started with {target} using {encryption}").format( target=self.target, encryption=session_data['name'])) - def messageEncryptionStopped(self, session_data): + def message_encryption_stopped(self, session_data): self.encrypted = False log.debug(_("message encryption stopped with {target} (was using {encryption})") .format(target=self.target, encryption=session_data['name'])) - def createMessage(self, message, append=False): + def create_message(self, message, append=False): """Must be implemented by frontend to create and show a new message widget - This is only called on messageNew, not on history. - You need to override historyPrint to handle the later + This is only called on message_new, not on history. 
+ You need to override history_print to handle the later @param message(Message): message data """ raise NotImplementedError - def isUserMoved(self, message): + def is_user_moved(self, message): """Return True if message is a user left/joined message @param message(Message): message to check @@ -804,7 +804,7 @@ else: return info_type in ROOM_USER_MOVED - def handleUserMoved(self, message): + def handle_user_moved(self, message): """Check if this message is a UserMoved one, and merge it when possible "merge it" means that info message indicating a user joined/left will be @@ -813,7 +813,7 @@ @return (bool): True if this message has been merged if True, a new MessageWidget must not be created and appended to history """ - if self.isUserMoved(message): + if self.is_user_moved(message): for wid in self.message_widgets_rev: # we merge in/out messages if no message was sent meanwhile if not isinstance(wid, MessageWidget): @@ -841,7 +841,7 @@ return True return False - def printDayChange(self, day): + def print_day_change(self, day): """Display the day on a new line. @param day(unicode): day to display (or not if this method is not overwritten) @@ -851,7 +851,7 @@ ## Room ## - def setSubject(self, subject): + def set_subject(self, subject): """Set title for a group chat""" if self.type != C.CHAT_GROUP: raise exceptions.InternalError( @@ -859,22 +859,22 @@ ) self.subject = subject - def changeSubject(self, new_subject): + def change_subject(self, new_subject): """Change the subject of the room This change the subject on the room itself (i.e. via XMPP), - while setSubject change the subject of this widget + while set_subject change the subject of this widget """ - self.host.bridge.mucSubject(str(self.target), new_subject, self.profile) + self.host.bridge.muc_subject(str(self.target), new_subject, self.profile) - def addGamePanel(self, widget): + def add_game_panel(self, widget): """Insert a game panel to this Chat dialog. @param widget (Widget): the game panel """ raise NotImplementedError - def removeGamePanel(self, widget): + def remove_game_panel(self, widget): """Remove the game panel from this Chat dialog. @param widget (Widget): the game panel @@ -891,7 +891,7 @@ ## events ## - def onChatState(self, from_jid, state, profile): + def on_chat_state(self, from_jid, state, profile): """A chat state has been received""" if self.type == C.CHAT_GROUP: nick = from_jid.resource @@ -904,7 +904,7 @@ ) ) - def onMessageState(self, uid, status, profile): + def on_message_state(self, uid, status, profile): try: mess_data = self.messages[uid] except KeyError: @@ -912,7 +912,7 @@ else: mess_data.status = status - def onAvatar(self, entity, avatar_data, profile): + def on_avatar(self, entity, avatar_data, profile): if self.type == C.CHAT_GROUP: if entity.bare == self.target: try:
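For frontend authors, the chat widget hooks keep their behaviour under the new names; below is a minimal sketch of a subclass, assuming a hypothetical console frontend (the class name and the ``print`` rendering are assumptions, only the overridden method names come from quick_chat)::

    from sat_frontends.quick_frontend import quick_chat

    class ConsoleChat(quick_chat.QuickChat):
        """Hypothetical widget, shown only to illustrate the renamed hooks."""

        def create_message(self, message, append=False):
            # called for each new message arriving through message_new();
            # past messages go through history_print(), overridden separately
            print(message.nick, message.timestamp)

        def print_day_change(self, day):
            print("--- {} ---".format(day))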
--- a/sat_frontends/quick_frontend/quick_contact_list.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_contact_list.py Sat Apr 08 13:54:42 2023 +0200 @@ -83,7 +83,7 @@ # options self.show_disconnected = False - self.show_empty_groups = True + self._show_empty_groups = True self.show_resources = False self.show_status = False # do we show entities with notifications? @@ -91,45 +91,45 @@ # (e.g. not in contact list) if they have notifications attached self.show_entities_with_notifs = True - self.host.bridge.asyncGetParamA( + self.host.bridge.param_get_a_async( C.SHOW_EMPTY_GROUPS, "General", profile_key=profile, - callback=self._showEmptyGroups, + callback=self._show_empty_groups_cb, ) - self.host.bridge.asyncGetParamA( + self.host.bridge.param_get_a_async( C.SHOW_OFFLINE_CONTACTS, "General", profile_key=profile, - callback=self._showOfflineContacts, + callback=self._show_offline_contacts, ) - self.host.addListener("presence", self.onPresenceUpdate, [self.profile]) - self.host.addListener("nicknames", self.onNicknamesUpdate, [self.profile]) - self.host.addListener("notification", self.onNotification, [self.profile]) - # onNotification only updates the entity, so we can re-use it - self.host.addListener("notificationsClear", self.onNotification, [self.profile]) + self.host.addListener("presence", self.on_presence_update, [self.profile]) + self.host.addListener("nicknames", self.on_nicknames_update, [self.profile]) + self.host.addListener("notification", self.on_notification, [self.profile]) + # on_notification only updates the entity, so we can re-use it + self.host.addListener("notificationsClear", self.on_notification, [self.profile]) @property def whoami(self): return self.host.profiles[self.profile].whoami - def _showEmptyGroups(self, show_str): + def _show_empty_groups_cb(self, show_str): # Called only by __init__ # self.update is not wanted here, as it is done by # handler when all profiles are ready - self.showEmptyGroups(C.bool(show_str)) + self.show_empty_groups(C.bool(show_str)) - def _showOfflineContacts(self, show_str): - # same comments as for _showEmptyGroups - self.showOfflineContacts(C.bool(show_str)) + def _show_offline_contacts(self, show_str): + # same comments as for _show_empty_groups + self.show_offline_contacts(C.bool(show_str)) def __contains__(self, entity): """Check if entity is in contact list An entity can be in contact list even if not in roster - use isInRoster to check if entity is in roster. + use is_in_roster to check if entity is in roster. 
@param entity (jid.JID): jid of the entity (resource is not ignored, use bare jid if needed) """ @@ -209,10 +209,10 @@ return { jid_: cache for jid_, cache in self._cache.items() - if self.entityVisible(jid_) + if self.entity_visible(jid_) } - def getItem(self, entity): + def get_item(self, entity): """Return item representation of requested entity @param entity(jid.JID): bare jid of entity @@ -220,7 +220,7 @@ """ return self._cache[entity] - def _gotContacts(self, contacts): + def _got_contacts(self, contacts): """Add contacts and notice parent that contacts are filled Called during initial contact list filling @@ -235,16 +235,16 @@ "Roster entities with resources are not managed, ignoring {entity}" .format(entity=entity)) continue - self.host.newContactHandler(*contact, profile=self.profile) - handler._contactsFilled(self.profile) + self.host.contact_new_handler(*contact, profile=self.profile) + handler._contacts_filled(self.profile) def _fill(self): """Get all contacts from backend Contacts will be cleared before refilling them """ - self.clearContacts(keep_cache=True) - self.host.bridge.getContacts(self.profile, callback=self._gotContacts) + self.clear_contacts(keep_cache=True) + self.host.bridge.contacts_get(self.profile, callback=self._got_contacts) def fill(self): handler.fill(self.profile) @@ -285,7 +285,7 @@ cache = self._cache[entity.bare] except KeyError: if create_if_not_found: - self.setContact(entity) + self.set_contact(entity) cache = self._cache[entity.bare] else: raise exceptions.NotFound @@ -319,7 +319,7 @@ return cache[C.CONTACT_RESOURCES][entity.resource][name] except KeyError as e: if bare_default is None: - bare_default = not self.isRoom(entity.bare) + bare_default = not self.is_room(entity.bare) if not bare_default: if default is Exception: raise e @@ -334,15 +334,15 @@ else: return default - def setCache(self, entity, name, value): + def set_cache(self, entity, name, value): """Set or update value for one data in cache @param entity(JID): entity to update @param name(str): value to set or update """ - self.setContact(entity, attributes={name: value}) + self.set_contact(entity, attributes={name: value}) - def getFullJid(self, entity): + def get_full_jid(self, entity): """Get full jid from a bare jid @param entity(jid.JID): must be a bare jid @@ -350,11 +350,11 @@ @raise ValueError: the entity is not bare """ if entity.resource: - raise ValueError("getFullJid must be used with a bare jid") + raise ValueError("get_full_jid must be used with a bare jid") main_resource = self.getCache(entity, C.CONTACT_MAIN_RESOURCE) return jid.JID("{}/{}".format(entity, main_resource)) - def setGroupData(self, group, name, value): + def set_group_data(self, group, name, value): """Register a data for a group @param group: a valid (existing) group name @@ -364,7 +364,7 @@ assert name != "jids" self._groups[group][name] = value - def getGroupData(self, group, name=None): + def get_group_data(self, group, name=None): """Return value associated to group data @param group: a valid (existing) group name @@ -375,7 +375,7 @@ return self._groups[group] return self._groups[group][name] - def isInRoster(self, entity): + def is_in_roster(self, entity): """Tell if an entity is in roster @param entity(jid.JID): jid of the entity @@ -383,7 +383,7 @@ """ return entity.bare in self._roster - def isRoom(self, entity): + def is_room(self, entity): """Helper method to know if entity is a MUC room @param entity(jid.JID): jid of the entity @@ -391,9 +391,9 @@ @return (bool): True if entity is a room """ 
assert entity.resource is None # FIXME: this may change when MIX will be handled - return self.isSpecial(entity, C.CONTACT_SPECIAL_GROUP) + return self.is_special(entity, C.CONTACT_SPECIAL_GROUP) - def isSpecial(self, entity, special_type): + def is_special(self, entity, special_type): """Tell if an entity is of a specialy _type @param entity(jid.JID): jid of the special entity @@ -403,7 +403,7 @@ """ return self.getCache(entity, C.CONTACT_SPECIAL, default=None) == special_type - def setSpecial(self, entity, special_type): + def set_special(self, entity, special_type): """Set special flag on an entity @param entity(jid.JID): jid of the special entity @@ -412,9 +412,9 @@ or None to remove special flag """ assert special_type in C.CONTACT_SPECIAL_ALLOWED + (None,) - self.setCache(entity, C.CONTACT_SPECIAL, special_type) + self.set_cache(entity, C.CONTACT_SPECIAL, special_type) - def getSpecials(self, special_type=None, bare=False): + def get_specials(self, special_type=None, bare=False): """Return all the bare JIDs of the special roster entities of with given type. @param special_type(unicode, None): if not None, filter by special type @@ -434,9 +434,9 @@ def disconnect(self): # for now we just clear contacts on disconnect - self.clearContacts() + self.clear_contacts() - def clearContacts(self, keep_cache=False): + def clear_contacts(self, keep_cache=False): """Clear all the contact list @param keep_cache: if True, don't reset the cache @@ -449,7 +449,7 @@ self._roster.clear() self.update() - def setContact(self, entity, groups=None, attributes=None, in_roster=False): + def set_contact(self, entity, groups=None, attributes=None, in_roster=False): """Add a contact to the list if it doesn't exist, else update it. This method can be called with groups=None for the purpose of updating @@ -473,7 +473,7 @@ # we check if the entity is visible before changing anything # this way we know if we need to do an UPDATE_ADD, UPDATE_MODIFY # or an UPDATE_DELETE - was_visible = self.entityVisible(entity_bare) + was_visible = self.entity_visible(entity_bare) if in_roster: self._roster.add(entity_bare) @@ -526,7 +526,7 @@ else cache ) for attribute, value in attributes.items(): - if attribute == "nicknames" and self.isSpecial( + if attribute == "nicknames" and self.is_special( entity, C.CONTACT_SPECIAL_GROUP ): # we don't want to keep nicknames for MUC rooms @@ -540,7 +540,7 @@ cache_attr[attribute] = value # we can update the display if needed - if self.entityVisible(entity_bare): + if self.entity_visible(entity_bare): # if the contact was not visible, we need to add a widget # else we just update id update_type = C.UPDATE_MODIFY if was_visible else C.UPDATE_ADD @@ -549,7 +549,7 @@ # the entity was visible and is not anymore, we remove it self.update([entity], C.UPDATE_DELETE, self.profile) - def entityVisible(self, entity, check_resource=False): + def entity_visible(self, entity, check_resource=False): """Tell if the contact should be showed or hidden. 
@param entity (jid.JID): jid of the contact @@ -571,12 +571,12 @@ or entity in selected or ( self.show_entities_with_notifs - and next(self.host.getNotifs(entity.bare, profile=self.profile), None) + and next(self.host.get_notifs(entity.bare, profile=self.profile), None) ) - or entity.resource is None and self.isRoom(entity.bare) + or entity.resource is None and self.is_room(entity.bare) ) - def anyEntityVisible(self, entities, check_resources=False): + def any_entity_visible(self, entities, check_resources=False): """Tell if in a list of entities, at least one should be shown @param entities (list[jid.JID]): list of jids @@ -585,26 +585,26 @@ """ # FIXME: looks inefficient, really needed? for entity in entities: - if self.entityVisible(entity, check_resources): + if self.entity_visible(entity, check_resources): return True return False - def isEntityInGroup(self, entity, group): + def is_entity_in_group(self, entity, group): """Tell if an entity is in a roster group @param entity(jid.JID): jid of the entity @param group(unicode): group to check @return (bool): True if the entity is in the group """ - return entity in self.getGroupData(group, "jids") + return entity in self.get_group_data(group, "jids") - def removeContact(self, entity): + def remove_contact(self, entity): """remove a contact from the list @param entity(jid.JID): jid of the entity to remove (bare jid is used) """ entity_bare = entity.bare - was_visible = self.entityVisible(entity_bare) + was_visible = self.entity_visible(entity_bare) try: groups = self._cache[entity_bare].get(C.CONTACT_GROUPS, set()) except KeyError: @@ -629,7 +629,7 @@ if was_visible: self.update([entity], C.UPDATE_DELETE, self.profile) - def onPresenceUpdate(self, entity, show, priority, statuses, profile): + def on_presence_update(self, entity, show, priority, statuses, profile): """Update entity's presence status @param entity(jid.JID): entity updated @@ -638,9 +638,9 @@ @param statuses: dict of statuses @param profile: %(doc_profile)s """ - # FIXME: cache modification should be done with setContact + # FIXME: cache modification should be done with set_contact # the resources/presence handling logic should be moved there - was_visible = self.entityVisible(entity.bare) + was_visible = self.entity_visible(entity.bare) cache = self.getCache(entity, create_if_not_found=True) if show == C.PRESENCE_UNAVAILABLE: if not entity.resource: @@ -680,13 +680,13 @@ ), ) cache[C.CONTACT_MAIN_RESOURCE] = priority_resource - if self.entityVisible(entity.bare): + if self.entity_visible(entity.bare): update_type = C.UPDATE_MODIFY if was_visible else C.UPDATE_ADD self.update([entity], update_type, self.profile) elif was_visible: self.update([entity], C.UPDATE_DELETE, self.profile) - def onNicknamesUpdate(self, entity, nicknames, profile): + def on_nicknames_update(self, entity, nicknames, profile): """Update entity's nicknames @param entity(jid.JID): entity updated @@ -694,9 +694,9 @@ @param profile: %(doc_profile)s """ assert profile == self.profile - self.setCache(entity, "nicknames", nicknames) + self.set_cache(entity, "nicknames", nicknames) - def onNotification(self, entity, notif, profile): + def on_notification(self, entity, notif, profile): """Update entity with notification @param entity(jid.JID): entity updated @@ -704,7 +704,7 @@ @param profile: %(doc_profile)s """ assert profile == self.profile - if entity is not None and self.entityVisible(entity): + if entity is not None and self.entity_visible(entity): self.update([entity], C.UPDATE_MODIFY, profile) def 
unselect(self, entity): @@ -747,7 +747,7 @@ self._selected.add(entity) self.update([entity], C.UPDATE_SELECTION, profile=self.profile) - def showOfflineContacts(self, show): + def show_offline_contacts(self, show): """Tell if offline contacts should be shown @param show(bool): True if offline contacts should be shown @@ -758,14 +758,14 @@ self.show_disconnected = show self.update(type_=C.UPDATE_STRUCTURE, profile=self.profile) - def showEmptyGroups(self, show): + def show_empty_groups(self, show): assert isinstance(show, bool) - if self.show_empty_groups == show: + if self._show_empty_groups == show: return - self.show_empty_groups = show + self._show_empty_groups = show self.update(type_=C.UPDATE_STRUCTURE, profile=self.profile) - def showResources(self, show): + def show_resources(self, show): assert isinstance(show, bool) if self.show_resources == show: return @@ -773,10 +773,10 @@ self.update(type_=C.UPDATE_STRUCTURE, profile=self.profile) def plug(self): - handler.addProfile(self.profile) + handler.add_profile(self.profile) def unplug(self): - handler.removeProfile(self.profile) + handler.remove_profile(self.profile) def update(self, entities=None, type_=None, profile=None): handler.update(entities, type_, profile) @@ -930,7 +930,7 @@ """ self._widgets.remove(widget) - def addProfiles(self, profiles): + def add_profiles(self, profiles): """Add a contact list for plugged profiles @param profile(iterable[unicode]): plugged profiles @@ -940,10 +940,10 @@ self._clist[profile] = ProfileContactList(profile) return [self._clist[profile] for profile in profiles] - def addProfile(self, profile): - return self.addProfiles([profile])[0] + def add_profile(self, profile): + return self.add_profiles([profile])[0] - def removeProfiles(self, profiles): + def remove_profiles(self, profiles): """Remove given unplugged profiles from contact list @param profile(iterable[unicode]): unplugged profiles @@ -951,10 +951,10 @@ for profile in profiles: del self._clist[profile] - def removeProfile(self, profile): - self.removeProfiles([profile]) + def remove_profile(self, profile): + self.remove_profiles([profile]) - def getSpecialExtras(self, special_type=None): + def get_special_extras(self, special_type=None): """Return special extras with given type If special_type is None, return all special extras. 
@@ -966,16 +966,16 @@ """ entities = set() for contact_list in self._clist.values(): - entities.update(contact_list.getSpecialExtras(special_type)) + entities.update(contact_list.get_special_extras(special_type)) return entities - def _contactsFilled(self, profile): + def _contacts_filled(self, profile): self._to_fill.remove(profile) if not self._to_fill: del self._to_fill # we need a full update when all contacts are filled self.update() - self.host.callListeners("contactsFilled", profile=profile) + self.host.call_listeners("contactsFilled", profile=profile) def fill(self, profile=None): """Get all contacts from backend, and fill the widget @@ -1009,13 +1009,13 @@ for profile in remaining: self._clist[profile]._fill() - def clearContacts(self, keep_cache=False): + def clear_contacts(self, keep_cache=False): """Clear all the contact list @param keep_cache: if True, don't reset the cache """ for contact_list in self._clist.values(): - contact_list.clearContacts(keep_cache) + contact_list.clear_contacts(keep_cache) # we need a full update self.update() @@ -1027,7 +1027,7 @@ for contact_list in self._clist.values(): contact_list.select(entity) - def lockUpdate(self, locked=True, do_update=True): + def lock_update(self, locked=True, do_update=True): """Forbid contact list updates Used mainly while profiles are plugged, as many updates can occurs, causing @@ -1067,11 +1067,11 @@ # for next values, None means use indivual value per profile # True or False mean override these values for all profiles self.show_disconnected = None # TODO - self.show_empty_groups = None # TODO + self._show_empty_groups = None # TODO self.show_resources = None # TODO self.show_status = None # TODO - def postInit(self): + def post_init(self): """Method to be called by frontend after widget is initialised""" handler.register(self) @@ -1108,6 +1108,6 @@ """ raise NotImplementedError - def onDelete(self): - QuickWidget.onDelete(self) + def on_delete(self): + QuickWidget.on_delete(self) handler.unregister(self)
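A short sketch of querying the renamed contact-list helpers, assuming ``clist`` is a ``ProfileContactList`` instance and ``"friends"`` an existing roster group (both names are illustrative)::

    def is_known_friend(clist, entity):
        # entity is a jid.JID; is_in_roster() checks its bare form
        if not clist.is_in_roster(entity):
            return False
        # group membership now goes through the snake_case helpers
        return clist.is_entity_in_group(entity, "friends")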
--- a/sat_frontends/quick_frontend/quick_contact_management.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_contact_management.py Sat Apr 08 13:54:42 2023 +0200 @@ -49,7 +49,7 @@ self.__contactlist[entity.bare]["resources"].remove(entity.resource) self.__contactlist[entity.bare]["resources"].append(entity.resource) - def getContFromGroup(self, group): + def get_cont_from_group(self, group): """Return all contacts which are in given group""" result = [] for contact in self.__contactlist: @@ -58,7 +58,7 @@ result.append(JID(contact)) return result - def getAttr(self, entity, name): + def get_attr(self, entity, name): """Return a specific attribute of contact, or all attributes @param entity: jid of the contact @param name: name of the attribute @@ -73,7 +73,7 @@ log.debug(_("Trying to get attribute for an unknown contact")) return None - def isConnected(self, entity): + def is_connected(self, entity): """Tell if the contact is online""" return entity.bare in self.__contactlist
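Likewise for the older QuickContactManagement helpers; a hedged sketch, where ``qcm``, the group name and the ``"name"`` attribute are assumptions::

    for contact_jid in qcm.get_cont_from_group("coworkers"):
        if qcm.is_connected(contact_jid):
            # get_attr() returns a single attribute of the cached contact data
            print(contact_jid, qcm.get_attr(contact_jid, "name"))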
--- a/sat_frontends/quick_frontend/quick_game_tarot.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_game_tarot.py Sat Apr 08 13:54:42 2023 +0200 @@ -48,7 +48,7 @@ self.to_show = [] self.state = None - def resetRound(self): + def reset_round(self): """Reset the game's variables to be reatty to start the next round""" del self.selected[:] del self.hand[:] @@ -57,14 +57,14 @@ for pl in self.played: self.played[pl] = None - def getPlayerLocation(self, nick): + def get_player_location(self, nick): """return player location (top,bottom,left or right)""" for location in ["top", "left", "bottom", "right"]: if getattr(self, "%s_nick" % location) == nick: return location assert False - def loadCards(self): + def load_cards(self): """Load all the cards in memory @param dir: directory where the PNG files are""" self.cards = {} @@ -77,7 +77,7 @@ self.cards["carreau"] = {} # diamond self.cards["trefle"] = {} # club - def tarotGameNewHandler(self, hand): + def tarot_game_new_handler(self, hand): """Start a new game, with given hand""" assert len(self.hand) == 0 for suit, value in hand: @@ -85,12 +85,12 @@ self.hand.sort() self.state = "init" - def tarotGameChooseContratHandler(self, xml_data): + def tarot_game_choose_contrat_handler(self, xml_data): """Called when the player as to select his contrat @param xml_data: SàT xml representation of the form""" raise NotImplementedError - def tarotGameShowCardsHandler(self, game_stage, cards, data): + def tarot_game_show_cards_handler(self, game_stage, cards, data): """Display cards in the middle of the game (to show for e.g. chien ou poignée)""" self.to_show = [] for suit, value in cards: @@ -100,14 +100,14 @@ else: self.state = "chien" - def tarotGameYourTurnHandler(self): + def tarot_game_your_turn_handler(self): """Called when we have to play :)""" if self.state == "chien": self.to_show = [] self.state = "play" - self.__fakePlay() + self.__fake_play() - def __fakePlay(self): + def __fake_play(self): """Convenience method for stupid autoplay /!\ don't forgot to comment any interactive dialog for invalid card""" if self._autoplay == None: @@ -115,21 +115,21 @@ if self._autoplay >= len(self.hand): self._autoplay = 0 card = self.hand[self._autoplay] - self.parent.host.bridge.tarotGamePlayCards( + self.parent.host.bridge.tarot_game_play_cards( self.player_nick, self.referee, [(card.suit, card.value)], self.parent.profile ) del self.hand[self._autoplay] self.state = "wait" self._autoplay += 1 - def tarotGameScoreHandler(self, xml_data, winners, loosers): + def tarot_game_score_handler(self, xml_data, winners, loosers): """Called at the end of a game @param xml_data: SàT xml representation of the scores @param winners: list of winners' nicks @param loosers: list of loosers' nicks""" raise NotImplementedError - def tarotGameCardsPlayedHandler(self, player, cards): + def tarot_game_cards_played_handler(self, player, cards): """A card has been played by player""" if self.to_show: self.to_show = [] @@ -141,7 +141,7 @@ pl_cards.append(self.cards[suit, value]) self.played[player] = pl_cards[0] - def tarotGameInvalidCardsHandler(self, phase, played_cards, invalid_cards): + def tarot_game_invalid_cards_handler(self, phase, played_cards, invalid_cards): """Invalid cards have been played @param phase: phase of the game @param played_cards: all the cards played @@ -158,4 +158,4 @@ self.hand.append(self.cards[suit, value]) self.hand.sort() - self.__fakePlay() + self.__fake_play()
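The tarot handlers keep their signatures; playing a card now goes through the renamed bridge call, as in this sketch mirroring ``__fake_play()`` above (the ``game`` variable is an assumption standing for the game instance)::

    card = game.hand[0]
    game.parent.host.bridge.tarot_game_play_cards(
        game.player_nick,
        game.referee,
        [(card.suit, card.value)],
        game.parent.profile,
    )
    # the played card leaves the hand, as __fake_play() does
    del game.hand[0]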
--- a/sat_frontends/quick_frontend/quick_games.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_games.py Sat Apr 08 13:54:42 2023 +0200 @@ -36,26 +36,26 @@ _signal_suffixes = None @classmethod - def registerSignals(cls, host): + def register_signals(cls, host): def make_handler(suffix, signal): def handler(*args): if suffix in ("Started", "Players"): - return cls.startedHandler(host, suffix, *args) - return cls.genericHandler(host, signal, *args) + return cls.started_handler(host, suffix, *args) + return cls.generic_handler(host, signal, *args) return handler for suffix in cls._signal_suffixes: signal = cls._signal_prefix + suffix - host.registerSignal( + host.register_signal( signal, handler=make_handler(suffix, signal), iface="plugin" ) @classmethod - def startedHandler(cls, host, suffix, *args): + def started_handler(cls, host, suffix, *args): room_jid, args, profile = jid.JID(args[0]), args[1:-1], args[-1] referee, players, args = args[0], args[1], args[2:] - chat_widget = host.widgets.getOrCreateWidget( + chat_widget = host.widgets.get_or_create_widget( quick_chat.QuickChat, room_jid, type_=C.CHAT_GROUP, profile=profile ) @@ -66,8 +66,8 @@ index = 0 contact_list = host.contact_lists[profile] for occupant in chat_widget.occupants: - occupant_jid = jid.newResource(room_jid, occupant) - contact_list.setCache( + occupant_jid = jid.new_resource(room_jid, occupant) + contact_list.set_cache( occupant_jid, cls._game_name, symbols[index % len(symbols)] if occupant in players else None, @@ -78,9 +78,9 @@ return # waiting for other players to join, or not playing if cls._game_name in chat_widget.games: return # game panel is already there - real_class = host.widgets.getRealClass(cls) + real_class = host.widgets.get_real_class(cls) if real_class == cls: - host.showDialog( + host.show_dialog( _( "A {game} activity between {players} has been started, but you couldn't take part because your client doesn't support it." ).format(game=cls._game_name, players=", ".join(players)), @@ -89,12 +89,12 @@ return panel = real_class(chat_widget, referee, players, *args) chat_widget.games[cls._game_name] = panel - chat_widget.addGamePanel(panel) + chat_widget.add_game_panel(panel) @classmethod - def genericHandler(cls, host, signal, *args): + def generic_handler(cls, host, signal, *args): room_jid, args, profile = jid.JID(args[0]), args[1:-1], args[-1] - chat_widget = host.widgets.getWidget(quick_chat.QuickChat, room_jid, profile) + chat_widget = host.widgets.get_widget(quick_chat.QuickChat, room_jid, profile) if chat_widget: try: game_panel = chat_widget.games[cls._game_name]
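Frontends hook these handlers once at startup through the renamed ``register_signals`` classmethod; a minimal sketch, assuming ``QuickTarotGame`` is a concrete game class defined by the frontend and ``host`` the running application::

    # each backend signal built as _signal_prefix + suffix is dispatched to
    # started_handler() (for the "Started"/"Players" suffixes) or generic_handler()
    QuickTarotGame.register_signals(host)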
--- a/sat_frontends/quick_frontend/quick_menus.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_menus.py Sat Apr 08 13:54:42 2023 +0200 @@ -27,7 +27,7 @@ str = str from sat.core.log import getLogger -from sat.core.i18n import _, languageSwitch +from sat.core.i18n import _, language_switch log = getLogger(__name__) from sat_frontends.quick_frontend.constants import Const as C @@ -43,10 +43,10 @@ def __init__(self, name, extra=None): """ @param name(unicode): canonical name of the item - @param extra(dict[unicode, unicode], None): same as in [addMenus] + @param extra(dict[unicode, unicode], None): same as in [add_menus] """ self._name = name - self.setExtra(extra) + self.set_extra(extra) @property def canonical(self): @@ -58,7 +58,7 @@ """Return the name of the container, can be translated""" return self._name - def setExtra(self, extra): + def set_extra(self, extra): if extra is None: extra = {} self.icon = extra.get("icon") @@ -73,8 +73,8 @@ """ @param name(unicode): canonical name of the item @param name_i18n(unicode): translated name of the item - @param extra(dict[unicode, unicode], None): same as in [addMenus] - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] + @param extra(dict[unicode, unicode], None): same as in [add_menus] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] """ MenuBase.__init__(self, name, extra) self._name_i18n = name_i18n if name_i18n else name @@ -84,13 +84,13 @@ def name(self): return self._name_i18n - def collectData(self, caller): + def collect_data(self, caller): """Get data according to data_collector @param caller: Menu caller """ assert self.type is not None # if data collector are used, type must be set - data_collector = QuickMenusManager.getDataCollector(self.type) + data_collector = QuickMenusManager.get_data_collector(self.type) if data_collector is None: return {} @@ -124,20 +124,20 @@ def __init__(self, host, type_, name, name_i18n, id_, extra=None): """ @param host: %(doc_host)s - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] @param name(unicode): canonical name of the item @param name_i18n(unicode): translated name of the item @param id_(unicode): id of the distant callback - @param extra(dict[unicode, unicode], None): same as in [addMenus] + @param extra(dict[unicode, unicode], None): same as in [add_menus] """ MenuItem.__init__(self, name, name_i18n, extra, type_) self.host = host self.id = id_ def call(self, caller, profile=C.PROF_KEY_NONE): - data = self.collectData(caller) + data = self.collect_data(caller) log.debug("data collected: %s" % data) - self.host.launchAction(self.id, data, profile=profile) + self.host.action_launch(self.id, data, profile=profile) class MenuItemLocal(MenuItem): @@ -147,24 +147,24 @@ def __init__(self, type_, name, name_i18n, callback, extra=None): """ - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] @param name(unicode): canonical name of the item @param name_i18n(unicode): translated name of the item @param callback(callable): local callback. 
Will be called with no argument if data_collector is None and with caller, profile, and requested data otherwise - @param extra(dict[unicode, unicode], None): same as in [addMenus] + @param extra(dict[unicode, unicode], None): same as in [add_menus] """ MenuItem.__init__(self, name, name_i18n, extra, type_) self.callback = callback def call(self, caller, profile=C.PROF_KEY_NONE): - data_collector = QuickMenusManager.getDataCollector(self.type) + data_collector = QuickMenusManager.get_data_collector(self.type) if data_collector is None: # FIXME: would not it be better if caller and profile where used as arguments? self.callback() else: - self.callback(caller, self.collectData(caller), profile) + self.callback(caller, self.collect_data(caller), profile) class MenuHook(MenuItemLocal): @@ -216,9 +216,9 @@ except KeyError: raise KeyError(item) - def getOrCreate(self, item): + def get_or_create(self, item): log.debug( - "MenuContainer getOrCreate: item=%s name=%s\nlist=%s" + "MenuContainer get_or_create: item=%s name=%s\nlist=%s" % (item, item.canonical, list(self._items.keys())) ) try: @@ -227,7 +227,7 @@ self.append(item) return item - def getActiveMenus(self): + def get_active_menus(self): """Return an iterator on active children""" for child in self._items.values(): if child.ACTIVE: @@ -284,50 +284,50 @@ def __init__(self, host, menus=None, language=None): """ @param host: %(doc_host)s - @param menus(iterable): menus as in [addMenus] - @param language: same as in [i18n.languageSwitch] + @param menus(iterable): menus as in [add_menus] + @param language: same as in [i18n.language_switch] """ self.host = host MenuBase.host = host self.language = language self.menus = {} if menus is not None: - self.addMenus(menus) + self.add_menus(menus) - def _getPathI18n(self, path): + def _get_path_i_1_8_n(self, path): """Return translated version of path""" - languageSwitch(self.language) + language_switch(self.language) path_i18n = [_(elt) for elt in path] - languageSwitch() + language_switch() return path_i18n - def _createCategories(self, type_, path, path_i18n=None, top_extra=None): + def _create_categories(self, type_, path, path_i18n=None, top_extra=None): """Create catogories of the path - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] - @param path(list[unicode]): same as in [sat.core.sat_main.SAT.importMenu] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] + @param path(list[unicode]): same as in [sat.core.sat_main.SAT.import_menu] @param path_i18n(list[unicode], None): translated menu path (same lenght as path) or None to get deferred translation of path @param top_extra: extra data to use on the first element of path only. If the first element already exists and is reused, top_extra will be ignored (you'll have to manually change it if you really want to). 
@return (MenuContainer): last category created, or MenuType if path is empty """ if path_i18n is None: - path_i18n = self._getPathI18n(path) + path_i18n = self._get_path_i_1_8_n(path) assert len(path) == len(path_i18n) menu_container = self.menus.setdefault(type_, MenuType(type_)) for idx, category in enumerate(path): menu_category = MenuCategory(category, path_i18n[idx], extra=top_extra) - menu_container = menu_container.getOrCreate(menu_category) + menu_container = menu_container.get_or_create(menu_category) top_extra = None return menu_container @staticmethod - def addDataCollector(type_, data_collector): + def add_data_collector(type_, data_collector): """Associate a data collector to a menu type A data collector is a method or a map which allow to collect context data to construct the dictionnary which will be sent to the bridge method managing the menu item. - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] @param data_collector(dict[unicode,unicode], callable, None): can be: - a dict which map data name to local name. The attribute named after the dict values will be getted from caller, and put in data. @@ -338,10 +338,10 @@ QuickMenusManager._data_collectors[type_] = data_collector @staticmethod - def getDataCollector(type_): + def get_data_collector(type_): """Get data_collector associated to type_ - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] @return (callable, dict, None): data_collector """ try: @@ -350,20 +350,20 @@ log.error("No data collector registered for {}".format(type_)) return None - def addMenuItem(self, type_, path, item, path_i18n=None, top_extra=None): + def add_menu_item(self, type_, path, item, path_i18n=None, top_extra=None): """Add a MenuItemBase instance - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] - @param path(list[unicode]): same as in [sat.core.sat_main.SAT.importMenu], stop at the last parent category + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] + @param path(list[unicode]): same as in [sat.core.sat_main.SAT.import_menu], stop at the last parent category @param item(MenuItem): a instancied item @param path_i18n(list[unicode],None): translated menu path (same lenght as path) or None to use deferred translation of path - @param top_extra: same as in [_createCategories] + @param top_extra: same as in [_create_categories] """ if path_i18n is None: - path_i18n = self._getPathI18n(path) + path_i18n = self._get_path_i_1_8_n(path) assert path and len(path) == len(path_i18n) - menu_container = self._createCategories(type_, path, path_i18n, top_extra) + menu_container = self._create_categories(type_, path, path_i18n, top_extra) if item in menu_container: if isinstance(item, MenuHook): @@ -384,9 +384,9 @@ else: log.debug("Adding menu [{type_}] {path}".format(type_=type_, path=path)) menu_container.append(item) - self.host.callListeners("menu", type_, path, path_i18n, item) + self.host.call_listeners("menu", type_, path, path_i18n, item) - def addMenu( + def add_menu( self, type_, path, @@ -398,16 +398,16 @@ ): """Add a menu item - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] - @param path(list[unicode]): same as in [sat.core.sat_main.SAT.importMenu] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] + @param path(list[unicode]): same as in [sat.core.sat_main.SAT.import_menu] @param 
path_i18n(list[unicode], None): translated menu path (same lenght as path), or None to get deferred translation - @param extra(dict[unicode, unicode], None): same as in [addMenus] - @param top_extra: same as in [_createCategories] + @param extra(dict[unicode, unicode], None): same as in [add_menus] + @param top_extra: same as in [_create_categories] @param id_(unicode): callback id (mutually exclusive with callback) @param callback(callable): local callback (mutually exclusive with id_) """ if path_i18n is None: - path_i18n = self._getPathI18n(path) + path_i18n = self._get_path_i_1_8_n(path) assert bool(id_) ^ bool(callback) # we must have id_ xor callback defined if id_: menu_item = MenuItemDistant( @@ -417,71 +417,71 @@ menu_item = MenuItemLocal( type_, path[-1], path_i18n[-1], callback=callback, extra=extra ) - self.addMenuItem(type_, path[:-1], menu_item, path_i18n[:-1], top_extra) + self.add_menu_item(type_, path[:-1], menu_item, path_i18n[:-1], top_extra) - def addMenus(self, menus, top_extra=None): + def add_menus(self, menus, top_extra=None): """Add several menus at once @param menus(iterable): iterable with: id_(unicode,callable): id of distant callback or local callback - type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] - path(iterable[unicode]): same as in [sat.core.sat_main.SAT.importMenu] + type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] + path(iterable[unicode]): same as in [sat.core.sat_main.SAT.import_menu] path_i18n(iterable[unicode]): translated menu path (same lenght as path) extra(dict[unicode,unicode]): dictionary of extra data (used on the leaf menu), can be: - "icon": icon name - @param top_extra: same as in [_createCategories] + @param top_extra: same as in [_create_categories] """ # TODO: manage icons for id_, type_, path, path_i18n, extra in menus: if callable(id_): - self.addMenu( + self.add_menu( type_, path, path_i18n, callback=id_, extra=extra, top_extra=top_extra ) else: - self.addMenu( + self.add_menu( type_, path, path_i18n, id_=id_, extra=extra, top_extra=top_extra ) - def addMenuHook( + def add_menu_hook( self, type_, path, path_i18n=None, extra=None, top_extra=None, callback=None ): """Helper method to add a menu hook Menu hooks are local menus which override menu given by backend - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] - @param path(list[unicode]): same as in [sat.core.sat_main.SAT.importMenu] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] + @param path(list[unicode]): same as in [sat.core.sat_main.SAT.import_menu] @param path_i18n(list[unicode], None): translated menu path (same lenght as path), or None to get deferred translation - @param extra(dict[unicode, unicode], None): same as in [addMenus] - @param top_extra: same as in [_createCategories] + @param extra(dict[unicode, unicode], None): same as in [add_menus] + @param top_extra: same as in [_create_categories] @param callback(callable): local callback (mutually exclusive with id_) """ if path_i18n is None: - path_i18n = self._getPathI18n(path) + path_i18n = self._get_path_i_1_8_n(path) menu_item = MenuHook( type_, path[-1], path_i18n[-1], callback=callback, extra=extra ) - self.addMenuItem(type_, path[:-1], menu_item, path_i18n[:-1], top_extra) + self.add_menu_item(type_, path[:-1], menu_item, path_i18n[:-1], top_extra) log.info("Menu hook set on {path} ({type_})".format(path=path, type_=type_)) - def addCategory(self, type_, path, path_i18n=None, extra=None, top_extra=None): + def add_category(self, type_, path, 
path_i18n=None, extra=None, top_extra=None): """Create a category with all parents, and set extra on the last one - @param type_(unicode): same as in [sat.core.sat_main.SAT.importMenu] - @param path(list[unicode]): same as in [sat.core.sat_main.SAT.importMenu] + @param type_(unicode): same as in [sat.core.sat_main.SAT.import_menu] + @param path(list[unicode]): same as in [sat.core.sat_main.SAT.import_menu] @param path_i18n(list[unicode], None): translated menu path (same lenght as path), or None to get deferred translation of path - @param extra(dict[unicode, unicode], None): same as in [addMenus] (added on the leaf category only) - @param top_extra: same as in [_createCategories] + @param extra(dict[unicode, unicode], None): same as in [add_menus] (added on the leaf category only) + @param top_extra: same as in [_create_categories] @return (MenuCategory): last category add """ if path_i18n is None: - path_i18n = self._getPathI18n(path) - last_container = self._createCategories( + path_i18n = self._get_path_i_1_8_n(path) + last_container = self._create_categories( type_, path, path_i18n, top_extra=top_extra ) - last_container.setExtra(extra) + last_container.set_extra(extra) return last_container - def getMainContainer(self, type_): + def get_main_container(self, type_): """Get a main MenuType container @param type_: a C.MENU_* constant
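A sketch of declaring a local menu with the renamed manager; ``menus_manager``, the ``C.MENU_GLOBAL`` constant and the ``on_about`` callback are assumptions for illustration::

    from sat_frontends.quick_frontend.constants import Const as C

    def on_about():
        # local callback: called with no argument when no data collector is
        # registered for this menu type
        print("about dialog would open here")

    menus_manager.add_menu(
        C.MENU_GLOBAL,          # a C.MENU_* type constant (name assumed here)
        ("Help", "About"),      # canonical path; the last element becomes the item
        callback=on_about,
    )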
--- a/sat_frontends/quick_frontend/quick_profile_manager.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_profile_manager.py Sat Apr 08 13:54:42 2023 +0200 @@ -106,38 +106,38 @@ else: # a profile is not validated, we go to manual mode self._autoconnect = False - self.host.actionManager(data, callback=authenticate_cb, profile=profile) + self.host.action_manager(data, callback=authenticate_cb, profile=profile) - def getProfileNameCb(profile): + def get_profile_name_cb(profile): if not profile: # FIXME: this method is not handling manual mode correclty anymore # must be thought to be handled asynchronously self._autoconnect = False # manual mode msg = _("Trying to plug an unknown profile key ({})".format(profile_key)) log.warning(msg) - self.host.showDialog(_("Profile plugging in error"), msg, "error") + self.host.show_dialog(_("Profile plugging in error"), msg, "error") else: - self.host.launchAction( + self.host.action_launch( C.AUTHENTICATE_PROFILE_ID, callback=authenticate_cb, profile=profile ) - def getProfileNameEb(failure): + def get_profile_name_eb(failure): log.error("Can't retrieve profile name: {}".format(failure)) for profile_key in profile_keys: - self.host.bridge.profileNameGet( - profile_key, callback=getProfileNameCb, errback=getProfileNameEb + self.host.bridge.profile_name_get( + profile_key, callback=get_profile_name_cb, errback=get_profile_name_eb ) - def getParamError(self, __): - self.host.showDialog(_("Error"), _("Can't get profile parameter"), "error") + def get_param_error(self, __): + self.host.show_dialog(_("Error"), _("Can't get profile parameter"), "error") ## Helping methods ## - def _getErrorMessage(self, reason): + def _get_error_message(self, reason): """Return an error message corresponding to profile creation error - @param reason (str): reason as returned by profileCreate + @param reason (str): reason as returned by profile_create @return (unicode): human readable error message """ if reason == "ConflictError": @@ -152,31 +152,31 @@ message = _("Can't create profile ({})").format(reason) return message - def _deleteProfile(self): + def _delete_profile(self): """Delete the currently selected profile""" if self.current.profile: - self.host.bridge.asyncDeleteProfile( - self.current.profile, callback=self.refillProfiles + self.host.bridge.profile_delete_async( + self.current.profile, callback=self.refill_profiles ) - self.resetFields() + self.reset_fields() ## workflow methods (events occuring during the profiles selection) ## # These methods must be called by the frontend at some point - def _onConnectProfiles(self): + def _on_connect_profiles(self): """Connect the profiles and start the main widget""" if self._autoconnect: - self.host.showDialog( + self.host.show_dialog( _("Internal error"), _("You can't connect manually and automatically at the same time"), "error", ) return - self.updateConnectionParams() - profiles = self.getProfiles() + self.update_connection_params() + profiles = self.get_profiles() if not profiles: - self.host.showDialog( + self.host.show_dialog( _("No profile selected"), _("You need to create and select at least one profile before connecting"), "error", @@ -185,40 +185,40 @@ # All profiles in the list are already validated, so we can plug them directly self.host.plug_profiles(profiles) - def getConnectionParams(self, profile): + def get_connection_params(self, profile): """Get login and password and display them @param profile: %(doc_profile)s """ - self.host.bridge.asyncGetParamA( + 
self.host.bridge.param_get_a_async( "JabberID", "Connection", profile_key=profile, - callback=self.setJID, - errback=self.getParamError, + callback=self.set_jid, + errback=self.get_param_error, ) - self.host.bridge.asyncGetParamA( + self.host.bridge.param_get_a_async( "Password", "Connection", profile_key=profile, - callback=self.setPassword, - errback=self.getParamError, + callback=self.set_password, + errback=self.get_param_error, ) - def updateConnectionParams(self): + def update_connection_params(self): """Check if connection parameters have changed, and update them if so""" if self.current.profile: - login = self.getJID() + login = self.get_jid() password = self.getPassword() if login != self.current.login and self.current.login is not None: self.current.login = login - self.host.bridge.setParam( + self.host.bridge.param_set( "JabberID", login, "Connection", profile_key=self.current.profile ) log.info("login updated for profile [{}]".format(self.current.profile)) if password != self.current.password and self.current.password is not None: self.current.password = password - self.host.bridge.setParam( + self.host.bridge.param_set( "Password", password, "Connection", profile_key=self.current.profile ) log.info( @@ -227,23 +227,23 @@ ## graphic updates (should probably be overriden in frontends) ## - def resetFields(self): + def reset_fields(self): """Set profile to None, and reset fields""" self.current.profile = None - self.setJID("") - self.setPassword("") + self.set_jid("") + self.set_password("") - def refillProfiles(self): + def refill_profiles(self): """Rebuild the list of profiles""" - profiles = self.host.bridge.profilesListGet() + profiles = self.host.bridge.profiles_list_get() profiles.sort() - self.setProfiles(profiles) + self.set_profiles(profiles) ## Method which must be implemented by frontends ## # get/set data - def getProfiles(self): + def get_profiles(self): """Return list of selected profiles Must be implemented by frontends @@ -251,11 +251,11 @@ """ raise NotImplementedError - def setProfiles(self, profiles): + def set_profiles(self, profiles): """Update the list of profiles""" raise NotImplementedError - def getJID(self): + def get_jid(self): """Get current jid Must be implemented by frontends @@ -271,7 +271,7 @@ """ raise NotImplementedError - def setJID(self, jid_): + def set_jid(self, jid_): """Set current jid Must be implemented by frontends @@ -279,7 +279,7 @@ """ raise NotImplementedError - def setPassword(self, password): + def set_password(self, password): """Set current password Must be implemented by frontends
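For reference, the renamed bridge methods used above keep the callback/errback style shown in the hunks; a minimal sketch of reading and updating a connection parameter (``host`` is assumed to be the frontend's QuickApp instance, parameter names are the ones used by ``QuickProfileManager``)::

    def show_jid(jid_s):
        # called with the value returned by the bridge
        print("current JabberID:", jid_s)

    def on_param_error(failure):
        print("can't get parameter:", failure)

    def read_and_update_login(host, profile, new_login):
        # asynchronous read, same signature as in QuickProfileManager above
        host.bridge.param_get_a_async(
            "JabberID",
            "Connection",
            profile_key=profile,
            callback=show_jid,
            errback=on_param_error,
        )
        # write through the renamed setter
        host.bridge.param_set(
            "JabberID", new_login, "Connection", profile_key=profile
        )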
--- a/sat_frontends/quick_frontend/quick_utils.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_utils.py Sat Apr 08 13:54:42 2023 +0200 @@ -22,7 +22,7 @@ from optparse import OptionParser -def getNewPath(path): +def get_new_path(path): """ Check if path exists, and find a non existant path if needed """ idx = 2 if not exists(path):
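The body of ``get_new_path`` is cut by the hunk (only ``idx = 2`` is visible), so the following is only a sketch of the idea, with an assumed ``(N)`` suffix format, not the actual implementation::

    from os.path import exists, splitext

    def get_new_path_sketch(path):
        """Return path if it is free, else append an increasing index."""
        if not exists(path):
            return path
        root, ext = splitext(path)
        idx = 2
        while exists("{}({}){}".format(root, idx, ext)):
            idx += 1
        return "{}({}){}".format(root, idx, ext)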
--- a/sat_frontends/quick_frontend/quick_widgets.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/quick_frontend/quick_widgets.py Sat Apr 08 13:54:42 2023 +0200 @@ -68,7 +68,7 @@ for widget in widget_instances: yield widget - def getRealClass(self, class_): + def get_real_class(self, class_): """Return class registered for given class_ @param class_: subclass of QuickWidget @@ -86,20 +86,20 @@ ) return cls - def getWidgetInstances(self, widget): + def get_widget_instances(self, widget): """Get all instance of a widget - This is a helper method which call getWidgets + This is a helper method which call get_widgets @param widget(QuickWidget): retrieve instances of this widget @return: iterator on widgets """ - return self.getWidgets(widget.__class__, widget.target, widget.profiles) + return self.get_widgets(widget.__class__, widget.target, widget.profiles) - def getWidgets(self, class_, target=None, profiles=None, with_duplicates=True): + def get_widgets(self, class_, target=None, profiles=None, with_duplicates=True): """Get all subclassed widgets instances @param class_: subclass of QuickWidget, same parameter as used in - [getOrCreateWidget] + [get_or_create_widget] @param target: if not None, construct a hash with this target and filter corresponding widgets recreated widgets are handled @@ -109,14 +109,14 @@ returned @return: iterator on widgets """ - class_ = self.getRealClass(class_) + class_ = self.get_real_class(class_) try: widgets_map = self._widgets[class_.__name__] except KeyError: return else: if target is not None: - filter_hash = str(class_.getWidgetHash(target, profiles)) + filter_hash = str(class_.get_widget_hash(target, profiles)) else: filter_hash = None if filter_hash is not None: @@ -133,11 +133,11 @@ # we only return the first widget of the list break - def getWidget(self, class_, target=None, profiles=None): + def get_widget(self, class_, target=None, profiles=None): """Get a widget without creating it if it doesn't exist. if several instances of widgets with this hash exist, the first one is returned - @param class_: subclass of QuickWidget, same parameter as used in [getOrCreateWidget] + @param class_: subclass of QuickWidget, same parameter as used in [get_or_create_widget] @param target: target depending of the widget, usually a JID instance @param profiles (unicode, iterable[unicode], None): profile(s) to use (may or may not be used, depending of the widget class) @@ -146,17 +146,17 @@ assert (target is not None) or (profiles is not None) if profiles is not None and isinstance(profiles, str): profiles = [profiles] - class_ = self.getRealClass(class_) - hash_ = class_.getWidgetHash(target, profiles) + class_ = self.get_real_class(class_) + hash_ = class_.get_widget_hash(target, profiles) try: return self._widgets[class_.__name__][hash_][0] except KeyError: return None - def getOrCreateWidget(self, class_, target, *args, **kwargs): + def get_or_create_widget(self, class_, target, *args, **kwargs): """Get an existing widget or create a new one when necessary - If the widget is new, self.host.newWidget will be called with it. + If the widget is new, self.host.new_widget will be called with it. 
@param class_(class): class of the widget to create @param target: target depending of the widget, usually a JID instance @param args(list): optional args to create a new instance of class_ @@ -164,22 +164,22 @@ if 'profile' key is present, it will be popped and put in 'profiles' if there is neither 'profile' nor 'profiles', None will be used for 'profiles' if 'on_new_widget' is present it can have the following values: - C.WIDGET_NEW [default]: self.host.newWidget will be called on widget creation - [callable]: this method will be called instead of self.host.newWidget + C.WIDGET_NEW [default]: self.host.new_widget will be called on widget creation + [callable]: this method will be called instead of self.host.new_widget None: do nothing if 'on_existing_widget' is present it can have the following values: C.WIDGET_KEEP [default]: return the existing widget C.WIDGET_RAISE: raise WidgetAlreadyExistsError C.WIDGET_RECREATE: create a new widget - if the existing widget has a "recreateArgs" method, it will be called with args list and kwargs dict + if the existing widget has a "recreate_args" method, it will be called with args list and kwargs dict so the values can be completed to create correctly the new instance [callable]: this method will be called with existing widget as argument, the widget to use must be returned - if 'force_hash' is present, the hash given in value will be used instead of the one returned by class_.getWidgetHash + if 'force_hash' is present, the hash given in value will be used instead of the one returned by class_.get_widget_hash other keys will be used to instanciate class_ if the case happen (e.g. if type_ is present and class_ is a QuickChat subclass, it will be used to create a new QuickChat instance). @return: a class_ instance, either new or already existing """ - cls = self.getRealClass(class_) + cls = self.get_real_class(class_) ## arguments management ## _args = [self.host, target] + list( @@ -212,7 +212,7 @@ try: hash_ = _kwargs.pop("force_hash") except KeyError: - hash_ = cls.getWidgetHash(target, _kwargs["profiles"]) + hash_ = cls.get_widget_hash(target, _kwargs["profiles"]) ## widget creation or retrieval ## @@ -227,17 +227,17 @@ except KeyError: widget = None else: - widget.addTarget(target) + widget.add_target(target) if widget is None: # we need to create a new widget log.debug(f"Creating new widget for target {target} {cls}") widget = cls(*_args, **_kwargs) widgets_map.setdefault(hash_, []).append(widget) - self.host.callListeners("widgetNew", widget) + self.host.call_listeners("widgetNew", widget) if on_new_widget == C.WIDGET_NEW: - self.host.newWidget(widget) + self.host.new_widget(widget) elif callable(on_new_widget): on_new_widget(widget) else: @@ -250,11 +250,11 @@ raise WidgetAlreadyExistsError(hash_) elif on_existing_widget == C.WIDGET_RECREATE: try: - recreateArgs = widget.recreateArgs + recreate_args = widget.recreate_args except AttributeError: pass else: - recreateArgs(_args, _kwargs) + recreate_args(_args, _kwargs) widget = cls(*_args, **_kwargs) widgets_map[hash_].append(widget) log.debug("widget <{wid}> already exists, a new one has been recreated" @@ -274,22 +274,22 @@ return widget - def deleteWidget(self, widget_to_delete, *args, **kwargs): + def delete_widget(self, widget_to_delete, *args, **kwargs): """Delete a widget instance this method must be called by frontends when a widget is deleted - widget's onDelete method will be called before deletion, and deletion will be + widget's on_delete method will be called before deletion, and 
deletion will be stopped if it returns False. @param widget_to_delete(QuickWidget): widget which need to deleted - @param *args: extra arguments to pass to onDelete - @param *kwargs: extra keywords arguments to pass to onDelete + @param *args: extra arguments to pass to on_delete + @param *kwargs: extra keywords arguments to pass to on_delete the extra arguments are not used by QuickFrontend, it's is up to the frontend to use them or not. following extra arguments are well known: - "all_instances" can be used as kwarg, if it evaluate to True, - all instances of the widget will be deleted (if onDelete is + all instances of the widget will be deleted (if on_delete is not returning False for any of the instance). This arguments - is not sent to onDelete methods. + is not sent to on_delete methods. - "explicit_close" is used when the deletion is requested by the user or a leave signal, "all_instances" is usually set at the same time. @@ -299,26 +299,26 @@ all_instances = kwargs.get('all_instances', False) if all_instances: - for w in self.getWidgetInstances(widget_to_delete): - if w.onDelete(**kwargs) == False: + for w in self.get_widget_instances(widget_to_delete): + if w.on_delete(**kwargs) == False: log.debug( f"Deletion of {widget_to_delete} cancelled by widget itself") return else: - if widget_to_delete.onDelete(**kwargs) == False: + if widget_to_delete.on_delete(**kwargs) == False: log.debug(f"Deletion of {widget_to_delete} cancelled by widget itself") return if self.host.selected_widget == widget_to_delete: self.host.selected_widget = None - class_ = self.getRealClass(widget_to_delete.__class__) + class_ = self.get_real_class(widget_to_delete.__class__) try: widgets_map = self._widgets[class_.__name__] except KeyError: log.error("no widgets_map found for class {cls}".format(cls=class_)) return - widget_hash = str(class_.getWidgetHash(widget_to_delete.target, + widget_hash = str(class_.get_widget_hash(widget_to_delete.target, widget_to_delete.profiles)) try: widget_instances = widgets_map[widget_hash] @@ -342,7 +342,7 @@ del widgets_map[widget_hash] log.debug("All instances of {cls} with hash {widget_hash!r} have been deleted" .format(cls=class_, widget_hash=widget_hash)) - self.host.callListeners("widgetDeleted", widget_to_delete) + self.host.call_listeners("widgetDeleted", widget_to_delete) class QuickWidget(object): @@ -368,17 +368,17 @@ """ self.host = host self.targets = set() - self.addTarget(target) + self.add_target(target) self.profiles = set() self._sync = True if isinstance(profiles, str): - self.addProfile(profiles) + self.add_profile(profiles) elif profiles is None: if not self.PROFILES_ALLOW_NONE: raise ValueError("profiles can't have a value of None") else: for profile in profiles: - self.addProfile(profile) + self.add_profile(profile) if not self.profiles: raise ValueError("no profile found, use None for no profile classes") @@ -402,7 +402,7 @@ @property def widget_hash(self): """Return quick widget hash""" - return self.getWidgetHash(self.target, self.profiles) + return self.get_widget_hash(self.target, self.profiles) # synchronisation state @@ -429,14 +429,14 @@ # target/profile - def addTarget(self, target): + def add_target(self, target): """Add a target if it doesn't already exists @param target: target to add """ self.targets.add(target) - def addProfile(self, profile): + def add_profile(self, profile): """Add a profile is if doesn't already exists @param profile: profile to add @@ -448,7 +448,7 @@ # widget identitication @staticmethod - def getWidgetHash(target, 
profiles): + def get_widget_hash(target, profiles): """Return the hash associated with this target for this widget class some widget classes can manage several target on the same instance @@ -465,7 +465,7 @@ # widget life events - def onDelete(self, *args, **kwargs): + def on_delete(self, *args, **kwargs): """Called when a widget is being deleted @return (boot, None): False to cancel deletion @@ -473,6 +473,6 @@ """ return True - def onSelected(self): + def on_selected(self): """Called when host.selected_widget is this instance""" pass
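The ``get_or_create_widget``/``delete_widget`` contract documented above can be summarised with a short hypothetical frontend snippet; the import path of the constants and the ``host.widgets`` attribute are assumptions based on the usual QuickApp layout::

    from sat_frontends.quick_frontend import quick_widgets
    from sat_frontends.quick_frontend.constants import Const as C

    class DummyWidget(quick_widgets.QuickWidget):
        """Minimal widget: default get_widget_hash, deletable."""

        def on_delete(self, *args, **kwargs):
            # returning False here would cancel deletion (see delete_widget)
            return True

    def open_dummy(host, target, profile):
        # reuse an existing instance if one matches the (target, profiles) hash
        return host.widgets.get_or_create_widget(
            DummyWidget,
            target,
            profiles=profile,
            on_existing_widget=C.WIDGET_KEEP,
        )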
--- a/sat_frontends/tools/host_listener.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/host_listener.py Sat Apr 08 13:54:42 2023 +0200 @@ -31,7 +31,7 @@ listeners.append(cb) -def callListeners(host): +def call_listeners(host): """Must be called by frontend when host is ready. The call will launch all the callbacks, then remove the listeners list.
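``host_listener`` keeps a module-level ``listeners`` list; the registration helper is outside the hunk, so its name below (``add_listener``) is an assumption mirroring the new naming scheme::

    from sat_frontends.tools import host_listener

    def on_host_ready(host):
        # run once the frontend calls call_listeners(host)
        print("host is ready:", host)

    host_listener.add_listener(on_host_ready)  # name assumed, see note above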
--- a/sat_frontends/tools/jid.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/jid.py Sat Apr 08 13:54:42 2023 +0200 @@ -124,7 +124,7 @@ return self.domain != "" -def newResource(entity, resource): +def new_resource(entity, resource): """Build a new JID from the given entity and resource. @param entity (JID): original JID
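Usage of the renamed helper, with illustrative values (``JID`` here is the frontend string-based class from the same module)::

    from sat_frontends.tools.jid import JID, new_resource

    entity = JID("louise@example.org/desktop")
    mobile = new_resource(entity, "mobile")
    # mobile is a JID for the same bare entity with the "mobile" resource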
--- a/sat_frontends/tools/misc.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/misc.py Sat Apr 08 13:54:42 2023 +0200 @@ -19,7 +19,7 @@ class InputHistory(object): - def _updateInputHistory(self, text=None, step=None, callback=None, mode=""): + def _update_input_history(self, text=None, step=None, callback=None, mode=""): """Update the lists of previously sent messages. Several lists can be handled as they are stored in a dictionary, the argument "mode" being used as the entry key. There's also a temporary list to allow you play
--- a/sat_frontends/tools/strings.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/strings.py Sat Apr 08 13:54:42 2023 +0200 @@ -28,7 +28,7 @@ # TODO: merge this class with an other module or at least rename it (strings is not a good name) -def getURLParams(url): +def get_url_params(url): """This comes from pyjamas.Location.makeUrlDict with a small change to also parse full URLs, and parameters with no value specified (in that case the default value "" is used). @@ -51,13 +51,13 @@ return dict_ -def addURLToText(string, new_target=True): +def add_url_to_text(string, new_target=True): """Check a text for what looks like an URL and make it clickable. @param string (unicode): text to process @param new_target (bool): if True, make the link open in a new window """ - # XXX: report any change to libervia.browser.strings.addURLToText + # XXX: report any change to libervia.browser.strings.add_url_to_text def repl(match): url = match.group(0) if not re.match(r"""[a-z]{3,}://|mailto:|xmpp:""", url): @@ -68,12 +68,12 @@ return RE_URL.sub(repl, string) -def addURLToImage(string): +def add_url_to_image(string): """Check a XHTML text for what looks like an imageURL and make it clickable. @param string (unicode): text to process """ - # XXX: report any change to libervia.browser.strings.addURLToImage + # XXX: report any change to libervia.browser.strings.add_url_to_image def repl(match): url = match.group(1) return '<a href="%s" target="_blank">%s</a>' % (url, match.group(0)) @@ -82,7 +82,7 @@ return re.sub(pattern, repl, string) -def fixXHTMLLinks(xhtml): +def fix_xhtml_links(xhtml): """Add http:// if the scheme is missing and force opening in a new window. @param string (unicode): XHTML Content
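A short, indicative usage of the renamed helper (the exact markup produced depends on ``RE_URL`` and the replacement callback above, so the comment is an approximation, not a guaranteed output)::

    from sat_frontends.tools import strings

    text = "more details on https://libervia.org"
    html = strings.add_url_to_text(text)
    # "https://libervia.org" is now wrapped in a clickable <a> element which
    # opens in a new window (new_target defaults to True)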
--- a/sat_frontends/tools/xmltools.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/xmltools.py Sat Apr 08 13:54:42 2023 +0200 @@ -23,7 +23,7 @@ # (e.g. NativeDOM in Libervia) -def inlineRoot(doc): +def inline_root(doc): """ make the root attribute inline @param root_node: minidom's Document compatible class @return: plain XML
--- a/sat_frontends/tools/xmlui.py Fri Apr 07 15:18:39 2023 +0200 +++ b/sat_frontends/tools/xmlui.py Sat Apr 08 13:54:42 2023 +0200 @@ -40,8 +40,8 @@ pass -# FIXME: this method is duplicated in frontends.tools.xmlui.getText -def getText(node): +# FIXME: this method is duplicated in frontends.tools.xmlui.get_text +def get_text(node): """Get child text nodes @param node: dom Node @return: joined unicode text of all nodes @@ -168,7 +168,7 @@ """Widget which can contain other ones with a specific layout""" @classmethod - def _xmluiAdapt(cls, instance): + def _xmlui_adapt(cls, instance): """Make cls as instance.__class__ cls must inherit from original instance class @@ -217,24 +217,24 @@ def __init__(self, _xmlui_parent): self._xmlui_parent = _xmlui_parent - def _xmluiValidated(self, data=None): + def _xmlui_validated(self, data=None): if data is None: data = {} - self._xmluiSetData(C.XMLUI_STATUS_VALIDATED, data) - self._xmluiSubmit(data) + self._xmlui_set_data(C.XMLUI_STATUS_VALIDATED, data) + self._xmlui_submit(data) - def _xmluiCancelled(self): + def _xmlui_cancelled(self): data = {C.XMLUI_DATA_CANCELLED: C.BOOL_TRUE} - self._xmluiSetData(C.XMLUI_STATUS_CANCELLED, data) - self._xmluiSubmit(data) + self._xmlui_set_data(C.XMLUI_STATUS_CANCELLED, data) + self._xmlui_submit(data) - def _xmluiSubmit(self, data): + def _xmlui_submit(self, data): if self._xmlui_parent.submit_id is None: log.debug(_("Nothing to submit")) else: self._xmlui_parent.submit(data) - def _xmluiSetData(self, status, data): + def _xmlui_set_data(self, status, data): pass @@ -253,7 +253,7 @@ class ConfirmDialog(Dialog): """Dialog with a OK/Cancel type configuration""" - def _xmluiSetData(self, status, data): + def _xmlui_set_data(self, status, data): if status == C.XMLUI_STATUS_VALIDATED: data[C.XMLUI_DATA_ANSWER] = C.BOOL_TRUE elif status == C.XMLUI_STATUS_CANCELLED: @@ -283,12 +283,12 @@ - NO_CANCEL: the UI can't be cancelled - FROM_BACKEND: the UI come from backend (i.e. it's not the direct result of user operation) - @param callback(callable, None): if not None, will be used with launchAction: + @param callback(callable, None): if not None, will be used with action_launch: - if None is used, default behaviour will be used (closing the dialog and - calling host.actionManager) + calling host.action_manager) - if a callback is provided, it will be used instead, so you'll have to manage dialog closing or new xmlui to display, or other action (you can call - host.actionManager) + host.action_manager) The callback will have data, callback_id and profile as arguments """ self.host = host @@ -300,20 +300,20 @@ if flags is None: flags = [] self.flags = flags - self.callback = callback or self._defaultCb + self.callback = callback or self._default_cb self.profile = profile @property def user_action(self): return "FROM_BACKEND" not in self.flags - def _defaultCb(self, data, cb_id, profile): - # TODO: when XMLUI updates will be managed, the _xmluiClose + def _default_cb(self, data, cb_id, profile): + # TODO: when XMLUI updates will be managed, the _xmlui_close # must be called only if there is no update - self._xmluiClose() - self.host.actionManager(data, profile=profile) + self._xmlui_close() + self.host.action_manager(data, profile=profile) - def _isAttrSet(self, name, node): + def _is_attr_set(self, name, node): """Return widget boolean attribute status @param name: name of the attribute (e.g. 
"read_only") @@ -323,7 +323,7 @@ read_only = node.getAttribute(name) or C.BOOL_FALSE return read_only.lower().strip() == C.BOOL_TRUE - def _getChildNode(self, node, name): + def _get_child_node(self, node, name): """Return the first child node with the given name @param node: Node instance @@ -337,7 +337,7 @@ return None def submit(self, data): - self._xmluiClose() + self._xmlui_close() if self.submit_id is None: raise ValueError("Can't submit is self.submit_id is not set") if "session_id" in data: @@ -347,14 +347,14 @@ ) if self.session_id is not None: data["session_id"] = self.session_id - self._xmluiLaunchAction(self.submit_id, data) + self._xmlui_launch_action(self.submit_id, data) - def _xmluiLaunchAction(self, action_id, data): - self.host.launchAction( + def _xmlui_launch_action(self, action_id, data): + self.host.action_launch( action_id, data, callback=self.callback, profile=self.profile ) - def _xmluiClose(self): + def _xmlui_close(self): """Close the window/popup/... where the constructor XMLUI is this method must be overrided @@ -427,7 +427,7 @@ self._whitelist = whitelist else: self._whitelist = None - self.constructUI(parsed_dom) + self.construct_ui(parsed_dom) @staticmethod def escape(name): @@ -449,10 +449,10 @@ raise ValueError(_("XMLUI can have only one main container")) self._main_cont = value - def _parseChilds(self, _xmlui_parent, current_node, wanted=("container",), data=None): + def _parse_childs(self, _xmlui_parent, current_node, wanted=("container",), data=None): """Recursively parse childNodes of an element - @param _xmlui_parent: widget container with '_xmluiAppend' method + @param _xmlui_parent: widget container with '_xmlui_append' method @param current_node: element from which childs will be parsed @param wanted: list of tag names that can be present in the childs to be SàT XMLUI compliant @@ -473,16 +473,16 @@ self.main_cont = _xmlui_parent if type_ == "tabs": cont = self.widget_factory.createTabsContainer(_xmlui_parent) - self._parseChilds(_xmlui_parent, node, ("tab",), {"tabs_cont": cont}) + self._parse_childs(_xmlui_parent, node, ("tab",), {"tabs_cont": cont}) elif type_ == "vertical": cont = self.widget_factory.createVerticalContainer(_xmlui_parent) - self._parseChilds(cont, node, ("widget", "container")) + self._parse_childs(cont, node, ("widget", "container")) elif type_ == "pairs": cont = self.widget_factory.createPairsContainer(_xmlui_parent) - self._parseChilds(cont, node, ("widget", "container")) + self._parse_childs(cont, node, ("widget", "container")) elif type_ == "label": cont = self.widget_factory.createLabelContainer(_xmlui_parent) - self._parseChilds( + self._parse_childs( # FIXME: the "None" value for CURRENT_LABEL doesn't seem # used or even useful, it should probably be removed # and all "is not None" tests for it should be removed too @@ -507,15 +507,15 @@ "can't have selectable=='no' and callback_id at the same time" ) cont._xmlui_callback_id = callback_id - cont._xmluiOnSelect(self.onAdvListSelect) + cont._xmlui_on_select(self.on_adv_list_select) - self._parseChilds(cont, node, ("row",), data) + self._parse_childs(cont, node, ("row",), data) else: log.warning(_("Unknown container [%s], using default one") % type_) cont = self.widget_factory.createVerticalContainer(_xmlui_parent) - self._parseChilds(cont, node, ("widget", "container")) + self._parse_childs(cont, node, ("widget", "container")) try: - xmluiAppend = _xmlui_parent._xmluiAppend + xmluiAppend = _xmlui_parent._xmlui_append except ( AttributeError, TypeError, @@ -524,7 +524,7 @@ 
self.main_cont = cont else: raise Exception( - _("Internal Error, container has not _xmluiAppend method") + _("Internal Error, container has not _xmlui_append method") ) else: xmluiAppend(cont) @@ -540,8 +540,8 @@ name ) # XXX: awful hack because params need category and we don't keep parent tab_cont = data["tabs_cont"] - new_tab = tab_cont._xmluiAddTab(label or name, selected) - self._parseChilds(new_tab, node, ("widget", "container")) + new_tab = tab_cont._xmlui_add_tab(label or name, selected) + self._parse_childs(new_tab, node, ("widget", "container")) elif node.nodeName == "row": try: @@ -550,8 +550,8 @@ index = node.getAttribute("index") or None else: data["index"] += 1 - _xmlui_parent._xmluiAddRow(index) - self._parseChilds(_xmlui_parent, node, ("widget", "container")) + _xmlui_parent._xmlui_add_row(index) + self._parse_childs(_xmlui_parent, node, ("widget", "container")) elif node.nodeName == "widget": name = node.getAttribute("name") @@ -565,12 +565,12 @@ curr_label = data.pop(CURRENT_LABEL) if curr_label is not None: # if so, we remove it from parent - _xmlui_parent._xmluiRemove(curr_label) + _xmlui_parent._xmlui_remove(curr_label) continue type_ = node.getAttribute("type") - value_elt = self._getChildNode(node, "value") + value_elt = self._get_child_node(node, "value") if value_elt is not None: - value = getText(value_elt) + value = get_text(value_elt) else: value = ( node.getAttribute("value") if node.hasAttribute("value") else "" @@ -597,39 +597,39 @@ ctrl = self.widget_factory.createDividerWidget(_xmlui_parent, style) elif type_ == "string": ctrl = self.widget_factory.createStringWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "jid_input": ctrl = self.widget_factory.createJidInputWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "password": ctrl = self.widget_factory.createPasswordWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "textbox": ctrl = self.widget_factory.createTextBoxWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "xhtmlbox": ctrl = self.widget_factory.createXHTMLBoxWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "bool": ctrl = self.widget_factory.createBoolWidget( _xmlui_parent, value == C.BOOL_TRUE, - self._isAttrSet("read_only", node), + self._is_attr_set("read_only", node), ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "int": ctrl = self.widget_factory.createIntWidget( - _xmlui_parent, value, self._isAttrSet("read_only", node) + _xmlui_parent, value, self._is_attr_set("read_only", node) ) self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "list": @@ -652,7 +652,7 @@ self.ctrl_list[name] = {"type": type_, "control": ctrl} elif type_ == "jids_list": style = [] - jids = [getText(jid_) for jid_ in node.getElementsByTagName("jid")] + jids = [get_text(jid_) for jid_ in 
node.getElementsByTagName("jid")] ctrl = self.widget_factory.createJidsListWidget( _xmlui_parent, jids, style ) @@ -660,7 +660,7 @@ elif type_ == "button": callback_id = node.getAttribute("callback") ctrl = self.widget_factory.createButtonWidget( - _xmlui_parent, value, self.onButtonPress + _xmlui_parent, value, self.on_button_press ) ctrl._xmlui_param_id = ( callback_id, @@ -683,7 +683,7 @@ if self.type == "param" and type_ not in ("text", "button"): try: - ctrl._xmluiOnChange(self.onParamChange) + ctrl._xmlui_on_change(self.on_param_change) ctrl._param_category = self._current_category except ( AttributeError, @@ -702,15 +702,15 @@ field.getAttribute("name") for field in node.getElementsByTagName("internal_field") ] - cb_data = self.getInternalCallbackData(callback, node) + cb_data = self.get_internal_callback_data(callback, node) ctrl._xmlui_param_internal = (callback, fields, cb_data) if type_ == "button": - ctrl._xmluiOnClick(self.onChangeInternal) + ctrl._xmlui_on_click(self.on_change_internal) else: - ctrl._xmluiOnChange(self.onChangeInternal) + ctrl._xmlui_on_change(self.on_change_internal) ctrl._xmlui_name = name - _xmlui_parent._xmluiAppend(ctrl) + _xmlui_parent._xmlui_append(ctrl) if CURRENT_LABEL in data and not isinstance(ctrl, LabelWidget): curr_label = data.pop(CURRENT_LABEL) if curr_label is not None: @@ -721,7 +721,7 @@ else: raise NotImplementedError(_("Unknown tag [%s]") % node.nodeName) - def constructUI(self, parsed_dom, post_treat=None): + def construct_ui(self, parsed_dom, post_treat=None): """Actually construct the UI @param parsed_dom: main parsed dom @@ -741,17 +741,17 @@ if self.type == "param": self.param_changed = set() - self._parseChilds(self, parsed_dom.documentElement) + self._parse_childs(self, parsed_dom.documentElement) if post_treat is not None: post_treat() - def _xmluiSetParam(self, name, value, category): - self.host.bridge.setParam(name, value, category, profile_key=self.profile) + def _xmlui_set_param(self, name, value, category): + self.host.bridge.param_set(name, value, category, profile_key=self.profile) ##EVENTS## - def onParamChange(self, ctrl): + def on_param_change(self, ctrl): """Called when type is param and a widget to save is modified @param ctrl: widget modified @@ -759,29 +759,29 @@ assert self.type == "param" self.param_changed.add(ctrl) - def onAdvListSelect(self, ctrl): + def on_adv_list_select(self, ctrl): data = {} - widgets = ctrl._xmluiGetSelectedWidgets() + widgets = ctrl._xmlui_get_selected_widgets() for wid in widgets: try: name = self.escape(wid._xmlui_name) - value = wid._xmluiGetValue() + value = wid._xmlui_get_value() data[name] = value except ( AttributeError, TypeError, ): # XXX: TypeError is here because pyjamas raise a TypeError instead of an AttributeError pass - idx = ctrl._xmluiGetSelectedIndex() + idx = ctrl._xmlui_get_selected_index() if idx is not None: data["index"] = idx callback_id = ctrl._xmlui_callback_id if callback_id is None: log.info(_("No callback_id found")) return - self._xmluiLaunchAction(callback_id, data) + self._xmlui_launch_action(callback_id, data) - def onButtonPress(self, button): + def on_button_press(self, button): """Called when an XMLUI button is clicked Launch the action associated to the button @@ -795,16 +795,16 @@ escaped = self.escape(field) ctrl = self.ctrl_list[field] if isinstance(ctrl["control"], ListWidget): - data[escaped] = "\t".join(ctrl["control"]._xmluiGetSelectedValues()) + data[escaped] = "\t".join(ctrl["control"]._xmlui_get_selected_values()) else: - data[escaped] = 
ctrl["control"]._xmluiGetValue() - self._xmluiLaunchAction(callback_id, data) + data[escaped] = ctrl["control"]._xmlui_get_value() + self._xmlui_launch_action(callback_id, data) - def onChangeInternal(self, ctrl): + def on_change_internal(self, ctrl): """Called when a widget that has been bound to an internal callback is changed. This is used to perform some UI actions without communicating with the backend. - See sat.tools.xml_tools.Widget.setInternalCallback for more details. + See sat.tools.xml_tools.Widget.set_internal_callback for more details. @param ctrl: widget modified """ action, fields, data = ctrl._xmlui_param_internal @@ -817,32 +817,32 @@ """Depending of 'action' value, copy or move from source to target.""" if isinstance(target, ListWidget): if isinstance(source, ListWidget): - values = source._xmluiGetSelectedValues() + values = source._xmlui_get_selected_values() else: - values = [source._xmluiGetValue()] + values = [source._xmlui_get_value()] if action == "move": - source._xmluiSetValue("") + source._xmlui_set_value("") values = [value for value in values if value] if values: - target._xmluiAddValues(values, select=True) + target._xmlui_add_values(values, select=True) else: if isinstance(source, ListWidget): - value = ", ".join(source._xmluiGetSelectedValues()) + value = ", ".join(source._xmlui_get_selected_values()) else: - value = source._xmluiGetValue() + value = source._xmlui_get_value() if action == "move": - source._xmluiSetValue("") - target._xmluiSetValue(value) + source._xmlui_set_value("") + target._xmlui_set_value(value) def groups_of_contact(source, target): """Select in target the groups of the contact which is selected in source.""" assert isinstance(source, ListWidget) assert isinstance(target, ListWidget) try: - contact_jid_s = source._xmluiGetSelectedValues()[0] + contact_jid_s = source._xmlui_get_selected_values()[0] except IndexError: return - target._xmluiSelectValues(data[contact_jid_s]) + target._xmlui_select_values(data[contact_jid_s]) pass source = None @@ -857,11 +857,11 @@ groups_of_contact(source, widget) source = None - def getInternalCallbackData(self, action, node): + def get_internal_callback_data(self, action, node): """Retrieve from node the data needed to perform given action. 
@param action (string): a value from the one that can be passed to the - 'callback' parameter of sat.tools.xml_tools.Widget.setInternalCallback + 'callback' parameter of sat.tools.xml_tools.Widget.set_internal_callback @param node (DOM Element): the node of the widget that triggers the callback """ # TODO: it would be better to not have a specific way to retrieve @@ -883,7 +883,7 @@ data[jid_s].append(value_elt.getAttribute("name")) return data - def onFormSubmitted(self, ignore=None): + def on_form_submitted(self, ignore=None): """An XMLUI form has been submited call the submit action associated with this form @@ -894,10 +894,10 @@ ctrl = self.ctrl_list[ctrl_name] if isinstance(ctrl["control"], ListWidget): selected_values.append( - (escaped, "\t".join(ctrl["control"]._xmluiGetSelectedValues())) + (escaped, "\t".join(ctrl["control"]._xmlui_get_selected_values())) ) else: - selected_values.append((escaped, ctrl["control"]._xmluiGetValue())) + selected_values.append((escaped, ctrl["control"]._xmlui_get_value())) data = dict(selected_values) for key, value in self.hidden.items(): data[self.escape(key)] = value @@ -908,9 +908,9 @@ log.warning( _("The form data is not sent back, the type is not managed properly") ) - self._xmluiClose() + self._xmlui_close() - def onFormCancelled(self, *__): + def on_form_cancelled(self, *__): """Called when a form is cancelled""" log.debug(_("Cancelling form")) if self.submit_id is not None: @@ -920,9 +920,9 @@ log.warning( _("The form data is not sent back, the type is not managed properly") ) - self._xmluiClose() + self._xmlui_close() - def onSaveParams(self, ignore=None): + def on_save_params(self, ignore=None): """Params are saved, we send them to backend self.type must be param @@ -930,13 +930,13 @@ assert self.type == "param" for ctrl in self.param_changed: if isinstance(ctrl, ListWidget): - value = "\t".join(ctrl._xmluiGetSelectedValues()) + value = "\t".join(ctrl._xmlui_get_selected_values()) else: - value = ctrl._xmluiGetValue() + value = ctrl._xmlui_get_value() param_name = ctrl._xmlui_name.split(C.SAT_PARAM_SEPARATOR)[1] - self._xmluiSetParam(param_name, value, ctrl._param_category) + self._xmlui_set_param(param_name, value, ctrl._param_category) - self._xmluiClose() + self._xmlui_close() def show(self, *args, **kwargs): pass @@ -945,7 +945,7 @@ class AIOXMLUIPanel(XMLUIPanel): """Asyncio compatible version of XMLUIPanel""" - async def onFormSubmitted(self, ignore=None): + async def on_form_submitted(self, ignore=None): """An XMLUI form has been submited call the submit action associated with this form @@ -956,10 +956,10 @@ ctrl = self.ctrl_list[ctrl_name] if isinstance(ctrl["control"], ListWidget): selected_values.append( - (escaped, "\t".join(ctrl["control"]._xmluiGetSelectedValues())) + (escaped, "\t".join(ctrl["control"]._xmlui_get_selected_values())) ) else: - selected_values.append((escaped, ctrl["control"]._xmluiGetValue())) + selected_values.append((escaped, ctrl["control"]._xmlui_get_value())) data = dict(selected_values) for key, value in self.hidden.items(): data[self.escape(key)] = value @@ -970,9 +970,9 @@ log.warning( _("The form data is not sent back, the type is not managed properly") ) - self._xmluiClose() + self._xmlui_close() - async def onFormCancelled(self, *__): + async def on_form_cancelled(self, *__): """Called when a form is cancelled""" log.debug(_("Cancelling form")) if self.submit_id is not None: @@ -982,10 +982,10 @@ log.warning( _("The form data is not sent back, the type is not managed properly") ) - self._xmluiClose() 
+ self._xmlui_close() async def submit(self, data): - self._xmluiClose() + self._xmlui_close() if self.submit_id is None: raise ValueError("Can't submit is self.submit_id is not set") if "session_id" in data: @@ -995,10 +995,10 @@ ) if self.session_id is not None: data["session_id"] = self.session_id - await self._xmluiLaunchAction(self.submit_id, data) + await self._xmlui_launch_action(self.submit_id, data) - async def _xmluiLaunchAction(self, action_id, data): - await self.host.launchAction( + async def _xmlui_launch_action(self, action_id, data): + await self.host.action_launch( action_id, data, callback=self.callback, profile=self.profile ) @@ -1021,13 +1021,13 @@ host, parsed_dom, title=title, flags=flags, callback=callback, profile=profile ) top = parsed_dom.documentElement - dlg_elt = self._getChildNode(top, "dialog") + dlg_elt = self._get_child_node(top, "dialog") if dlg_elt is None: raise ValueError("Invalid XMLUI: no Dialog element found !") dlg_type = dlg_elt.getAttribute("type") or C.XMLUI_DIALOG_MESSAGE try: - mess_elt = self._getChildNode(dlg_elt, C.XMLUI_DATA_MESS) - message = getText(mess_elt) + mess_elt = self._get_child_node(dlg_elt, C.XMLUI_DATA_MESS) + message = get_text(mess_elt) except ( TypeError, AttributeError, @@ -1045,7 +1045,7 @@ ) elif dlg_type == C.XMLUI_DIALOG_CONFIRM: try: - buttons_elt = self._getChildNode(dlg_elt, "buttons") + buttons_elt = self._get_child_node(dlg_elt, "buttons") buttons_set = ( buttons_elt.getAttribute("set") or C.XMLUI_DATA_BTNS_SET_DEFAULT ) @@ -1059,7 +1059,7 @@ ) elif dlg_type == C.XMLUI_DIALOG_FILE: try: - file_elt = self._getChildNode(dlg_elt, "file") + file_elt = self._get_child_node(dlg_elt, "file") filetype = file_elt.getAttribute("type") or C.XMLUI_DATA_FILETYPE_DEFAULT except ( TypeError, @@ -1073,13 +1073,13 @@ raise ValueError("Unknown dialog type [%s]" % dlg_type) def show(self): - self.dlg._xmluiShow() + self.dlg._xmlui_show() - def _xmluiClose(self): - self.dlg._xmluiClose() + def _xmlui_close(self): + self.dlg._xmlui_close() -def registerClass(type_, class_): +def register_class(type_, class_): """Register the class to use with the factory @param type_: one of: @@ -1090,7 +1090,7 @@ # TODO: remove this method, as there are seme use cases where different XMLUI # classes can be used in the same frontend, so a global value is not good assert type_ in (CLASS_PANEL, CLASS_DIALOG) - log.warning("registerClass for XMLUI is deprecated, please use partial with " + log.warning("register_class for XMLUI is deprecated, please use partial with " "xmlui.create and class_map instead") if type_ in _class_map: log.debug(_("XMLUI class already registered for {type_}, ignoring").format( @@ -1132,7 +1132,7 @@ cls = class_map[CLASS_DIALOG] except KeyError: raise ClassNotRegistedError( - _("You must register classes with registerClass before creating a XMLUI") + _("You must register classes with register_class before creating a XMLUI") ) xmlui = cls(
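As suggested by the deprecation warning above, frontends are expected to bind their widget classes with ``functools.partial`` instead of ``register_class``; a hypothetical sketch (the two subclasses are placeholders, the dialog base class name is assumed, and it supposes ``xmlui.create`` accepts the ``class_map`` argument referred to in the warning)::

    from functools import partial
    from sat_frontends.tools import xmlui

    class MyPanel(xmlui.XMLUIPanel):       # placeholder frontend panel class
        def _xmlui_close(self):
            pass

    class MyDialog(xmlui.XMLUIDialog):     # placeholder, base name assumed
        def _xmlui_close(self):
            pass

    create = partial(
        xmlui.create,
        class_map={xmlui.CLASS_PANEL: MyPanel, xmlui.CLASS_DIALOG: MyDialog},
    )
    # ``create`` is then used wherever the frontend previously relied on
    # register_class() followed by xmlui.create()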
--- a/tests/e2e/libervia-cli/test_libervia-cli.py Fri Apr 07 15:18:39 2023 +0200 +++ b/tests/e2e/libervia-cli/test_libervia-cli.py Sat Apr 08 13:54:42 2023 +0200 @@ -219,7 +219,7 @@ assert metadata['node'] == self.MICROBLOG_NS assert metadata['rsm'].keys() <= {"first", "last", "index", "count"} item_id = item['id'] - expected_uri = uri.buildXMPPUri( + expected_uri = uri.build_xmpp_uri( 'pubsub', subtype="microblog", path="account1@server1.test", node=self.MICROBLOG_NS, item=item_id )
--- a/tests/unit/conftest.py Fri Apr 07 15:18:39 2023 +0200 +++ b/tests/unit/conftest.py Sat Apr 08 13:54:42 2023 +0200 @@ -29,8 +29,8 @@ @fixture(scope="session") def bridge(): bridge = AsyncMock() - bridge.addSignal = MagicMock() - bridge.addMethod = MagicMock() + bridge.add_signal = MagicMock() + bridge.add_method = MagicMock() return bridge @@ -49,21 +49,21 @@ self.profiles = {} self.plugins = {} # map for short name to whole namespace, - # extended by plugins with registerNamespace + # extended by plugins with register_namespace self.ns_map = { "x-data": xmpp.NS_X_DATA, "disco#info": xmpp.NS_DISCO_INFO, } self.memory = MagicMock() self.memory.storage = storage - self.memory.getConfig.side_effect = self.get_test_config + self.memory.config_get.side_effect = self.get_test_config self.trigger = trigger.TriggerManager() self.bridge = bridge defer.ensureDeferred(self._post_init()) self.common_cache = AsyncMock() self._import_plugins() - self._addBaseMenus() + self._add_base_menus() self.initialised = defer.Deferred() self.initialised.callback(None)
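The ``bridge`` fixture above is an AsyncMock whose ``add_signal``/``add_method`` are plain MagicMock attributes, so a test can check registrations without any real bridge backend; the arguments below are purely illustrative, the mock records any call::

    def test_bridge_records_registrations(bridge):
        bridge.add_signal("some_signal", ".core", signature="s")
        bridge.add_method("some_method", ".core", in_sign="s", out_sign="s",
                          method=str)
        bridge.add_signal.assert_called_once()
        bridge.add_method.assert_called_once()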
--- a/tests/unit/test_ap-gateway.py Fri Apr 07 15:18:39 2023 +0200 +++ b/tests/unit/test_ap-gateway.py Sat Apr 08 13:54:42 2023 +0200 @@ -348,8 +348,8 @@ return dict(data) -async def mock_getItems(client, service, node, *args, **kwargs): - """Mock getItems +async def mock_get_items(client, service, node, *args, **kwargs): + """Mock get_items special kwargs can be used: ret_items (List[Domish.Element]): items to be returned, by default XMPP_ITEMS are @@ -367,8 +367,8 @@ return ret_items, {"rsm": rsm_resp.toDict(), "complete": True} -async def mock_getPubsubNode(client, service, node, with_subscriptions=False, **kwargs): - """Mock storage's getPubsubNode +async def mock_get_pubsub_node(client, service, node, with_subscriptions=False, **kwargs): + """Mock storage's get_pubsub_node return an MagicMock with subscription attribute set to empty list """ @@ -377,7 +377,7 @@ return fake_cached_node -def mockClient(jid): +def mock_client(jid): client = MagicMock() client.jid = jid client.host = "test.example" @@ -386,8 +386,8 @@ return client -def getVirtualClient(jid): - return mockClient(jid) +def get_virtual_client(jid): + return mock_client(jid) class FakeTReqPostResponse: @@ -398,10 +398,10 @@ def ap_gateway(host): gateway = plugin_comp_ap_gateway.APGateway(host) gateway.initialised = True - gateway.isPubsub = AsyncMock() - gateway.isPubsub.return_value = False - client = mockClient(jid.JID("ap.test.example")) - client.getVirtualClient = getVirtualClient + gateway.is_pubsub = AsyncMock() + gateway.is_pubsub.return_value = False + client = mock_client(jid.JID("ap.test.example")) + client.get_virtual_client = get_virtual_client gateway.client = client gateway.local_only = True gateway.public_url = PUBLIC_URL @@ -416,7 +416,7 @@ class TestActivityPubGateway: - def getTitleXHTML(self, item_elt: domish.Element) -> domish.Element: + def get_title_xhtml(self, item_elt: domish.Element) -> domish.Element: return next( t for t in item_elt.entry.elements(NS_ATOM, "title") @@ -426,7 +426,7 @@ @ed async def test_jid_and_node_convert_to_ap_handle(self, ap_gateway): """JID and pubsub node are converted correctly to an AP actor handle""" - get_account = ap_gateway.getAPAccountFromJidAndNode + get_account = ap_gateway.get_ap_account_from_jid_and_node # local jid assert ( @@ -447,16 +447,16 @@ ) # local pubsub node - with patch.object(ap_gateway, "isPubsub") as isPubsub: - isPubsub.return_value = True + with patch.object(ap_gateway, "is_pubsub") as is_pubsub: + is_pubsub.return_value = True assert ( await get_account(jid_=jid.JID("pubsub.test.example"), node="some_node") == "some_node@pubsub.test.example" ) # non local pubsub node - with patch.object(ap_gateway, "isPubsub") as isPubsub: - isPubsub.return_value = True + with patch.object(ap_gateway, "is_pubsub") as is_pubsub: + is_pubsub.return_value = True assert ( await get_account(jid_=jid.JID("pubsub.example.org"), node="some_node") == "___some_node.40pubsub.2eexample.2eorg@ap.test.example" @@ -465,11 +465,11 @@ @ed async def test_ap_handle_convert_to_jid_and_node(self, ap_gateway, monkeypatch): """AP actor handle convert correctly to JID and pubsub node""" - get_jid_node = ap_gateway.getJIDAndNode + get_jid_node = ap_gateway.get_jid_and_node # for following assertion, host is not a pubsub service - with patch.object(ap_gateway, "isPubsub") as isPubsub: - isPubsub.return_value = False + with patch.object(ap_gateway, "is_pubsub") as is_pubsub: + is_pubsub.return_value = False # simple local jid assert await get_jid_node("toto@test.example") == ( @@ -498,8 
+498,8 @@ ) # for following assertion, host is a pubsub service - with patch.object(ap_gateway, "isPubsub") as isPubsub: - isPubsub.return_value = True + with patch.object(ap_gateway, "is_pubsub") as is_pubsub: + is_pubsub.return_value = True # simple local node assert await get_jid_node("toto@pubsub.test.example") == ( @@ -517,18 +517,18 @@ """AP requests are converted to pubsub""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) - actor_data = await ap_gateway.getAPActorDataFromAccount(TEST_AP_ACCOUNT) - outbox = await ap_gateway.apGetObject(actor_data, "outbox") - items, rsm_resp = await ap_gateway.getAPItems(outbox, 2) + actor_data = await ap_gateway.get_ap_actor_data_from_account(TEST_AP_ACCOUNT) + outbox = await ap_gateway.ap_get_object(actor_data, "outbox") + items, rsm_resp = await ap_gateway.get_ap_items(outbox, 2) assert rsm_resp.count == 4 assert rsm_resp.index == 0 assert rsm_resp.first == "https://example.org/users/test_user/statuses/4" assert rsm_resp.last == "https://example.org/users/test_user/statuses/3" - title_xhtml = self.getTitleXHTML(items[0]) + title_xhtml = self.get_title_xhtml(items[0]) assert title_xhtml.toXml() == ( "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>" "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 4</p></div>" @@ -540,7 +540,7 @@ assert author_uri == "xmpp:test_user\\40example.org@ap.test.example" assert str(items[0].entry.published) == "2021-12-16T17:25:03Z" - title_xhtml = self.getTitleXHTML(items[1]) + title_xhtml = self.get_title_xhtml(items[1]) assert title_xhtml.toXml() == ( "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>" "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 3</p></div>" @@ -552,7 +552,7 @@ assert author_uri == "xmpp:test_user\\40example.org@ap.test.example" assert str(items[1].entry.published) == "2021-12-16T17:26:03Z" - items, rsm_resp = await ap_gateway.getAPItems( + items, rsm_resp = await ap_gateway.get_ap_items( outbox, max_items=2, after_id="https://example.org/users/test_user/statuses/3", @@ -563,7 +563,7 @@ assert rsm_resp.first == "https://example.org/users/test_user/statuses/2" assert rsm_resp.last == "https://example.org/users/test_user/statuses/1" - title_xhtml = self.getTitleXHTML(items[0]) + title_xhtml = self.get_title_xhtml(items[0]) assert title_xhtml.toXml() == ( "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>" "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 2</p></div>" @@ -575,7 +575,7 @@ assert author_uri == "xmpp:test_user\\40example.org@ap.test.example" assert str(items[0].entry.published) == "2021-12-16T17:27:03Z" - title_xhtml = self.getTitleXHTML(items[1]) + title_xhtml = self.get_title_xhtml(items[1]) assert title_xhtml.toXml() == ( "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>" "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 1</p></div>" @@ -587,7 +587,7 @@ assert author_uri == "xmpp:test_user\\40example.org@ap.test.example" assert str(items[1].entry.published) == "2021-12-16T17:28:03Z" - items, rsm_resp = await ap_gateway.getAPItems(outbox, max_items=1, start_index=2) + items, rsm_resp = await ap_gateway.get_ap_items(outbox, max_items=1, start_index=2) assert rsm_resp.count == 4 assert rsm_resp.index == 2 @@ -595,7 +595,7 @@ assert rsm_resp.last == "https://example.org/users/test_user/statuses/2" assert 
len(items) == 1 - title_xhtml = self.getTitleXHTML(items[0]) + title_xhtml = self.get_title_xhtml(items[0]) assert title_xhtml.toXml() == ( "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>" "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 2</p></div>" @@ -603,7 +603,7 @@ ) assert str(items[0].entry.published) == "2021-12-16T17:27:03Z" - items, rsm_resp = await ap_gateway.getAPItems( + items, rsm_resp = await ap_gateway.get_ap_items( outbox, max_items=3, chronological_pagination=False ) assert rsm_resp.count == 4 @@ -611,13 +611,13 @@ assert rsm_resp.first == "https://example.org/users/test_user/statuses/3" assert rsm_resp.last == "https://example.org/users/test_user/statuses/1" assert len(items) == 3 - title_xhtml = self.getTitleXHTML(items[0]) + title_xhtml = self.get_title_xhtml(items[0]) assert title_xhtml.toXml() == ( "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>" "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 3</p></div>" "</title>" ) - title_xhtml = self.getTitleXHTML(items[2]) + title_xhtml = self.get_title_xhtml(items[2]) assert title_xhtml.toXml() == ( "<title xmlns='http://www.w3.org/2005/Atom' type='xhtml'>" "<div xmlns='http://www.w3.org/1999/xhtml'><p>test message 1</p></div>" @@ -680,8 +680,8 @@ @ed async def test_pubsub_to_ap_conversion(self, ap_gateway, monkeypatch): """Pubsub nodes are converted to AP collections""" - monkeypatch.setattr(ap_gateway._p, "getItems", mock_getItems) - outbox = await ap_gateway.server.resource.APOutboxRequest( + monkeypatch.setattr(ap_gateway._p, "get_items", mock_get_items) + outbox = await ap_gateway.server.resource.ap_outbox_request( **self.ap_request_params(ap_gateway, "outbox") ) assert outbox["@context"] == ["https://www.w3.org/ns/activitystreams"] @@ -691,7 +691,7 @@ assert outbox["first"] assert outbox["last"] - first_page = await ap_gateway.server.resource.APOutboxPageRequest( + first_page = await ap_gateway.server.resource.ap_outbox_page_request( **self.ap_request_params(ap_gateway, url=outbox["first"]) ) assert first_page["@context"] == ["https://www.w3.org/ns/activitystreams"] @@ -725,11 +725,11 @@ """AP following items are converted to Public Pubsub Subscription subscriptions""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) items, __ = await ap_gateway.pubsub_service.items( jid.JID("toto@example.org"), - ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT), + ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT), ap_gateway._pps.subscriptions_node, None, None, @@ -749,12 +749,12 @@ """AP followers items are converted to Public Pubsub Subscription subscribers""" monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get) monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json) - monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get) + monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get) items, __ = await ap_gateway.pubsub_service.items( jid.JID("toto@example.org"), - ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT), - ap_gateway._pps.getPublicSubscribersNode(ap_gateway._m.namespace), + ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT), + ap_gateway._pps.get_public_subscribers_node(ap_gateway._m.namespace), None, None, None, @@ -773,22 +773,22 @@ subscriptions = [ pubsub.Item( id="subscription_1", - 
payload=ap_gateway._pps.buildSubscriptionElt(
+                payload=ap_gateway._pps.build_subscription_elt(
                     ap_gateway._m.namespace, jid.JID("local_user@test.example")
                 ),
             ),
             pubsub.Item(
                 id="subscription_2",
-                payload=ap_gateway._pps.buildSubscriptionElt(
+                payload=ap_gateway._pps.build_subscription_elt(
                     ap_gateway._m.namespace,
                     jid.JID("ext_user\\40example.org@ap.test.example"),
                 ),
             ),
         ]
         monkeypatch.setattr(
-            ap_gateway._p, "getItems", partial(mock_getItems, ret_items=subscriptions)
+            ap_gateway._p, "get_items", partial(mock_get_items, ret_items=subscriptions)
         )
-        following = await ap_gateway.server.resource.APFollowingRequest(
+        following = await ap_gateway.server.resource.ap_following_request(
             **self.ap_request_params(ap_gateway, "following")
         )
         assert following["@context"] == ["https://www.w3.org/ns/activitystreams"]
@@ -812,21 +812,21 @@
         subscribers = [
             pubsub.Item(
                 id="subscriber_1",
-                payload=ap_gateway._pps.buildSubscriberElt(
+                payload=ap_gateway._pps.build_subscriber_elt(
                     jid.JID("local_user@test.example")
                 ),
             ),
             pubsub.Item(
                 id="subscriber_2",
-                payload=ap_gateway._pps.buildSubscriberElt(
+                payload=ap_gateway._pps.build_subscriber_elt(
                     jid.JID("ext_user\\40example.org@ap.test.example")
                 ),
             ),
         ]
         monkeypatch.setattr(
-            ap_gateway._p, "getItems", partial(mock_getItems, ret_items=subscribers)
+            ap_gateway._p, "get_items", partial(mock_get_items, ret_items=subscribers)
         )
-        followers = await ap_gateway.server.resource.APFollowersRequest(
+        followers = await ap_gateway.server.resource.ap_followers_request(
             **self.ap_request_params(ap_gateway, "followers")
         )
         assert followers["@context"] == ["https://www.w3.org/ns/activitystreams"]
@@ -849,17 +849,17 @@
         """XMPP message are sent as AP direct message"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)
         mess_data = {
             "from": TEST_JID,
-            "to": ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT),
+            "to": ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT),
             "type": "chat",
             "message": {"": "This is a test message."},
             "extra": {"origin-id": "123"},
         }
-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
             await ap_gateway.onMessage(ap_gateway.client, mess_data)
-        url, actor_id, doc = signAndPost.call_args[0]
+        url, actor_id, doc = sign_and_post.call_args[0]
         assert url == "https://example.org/users/test_user/inbox"
         assert actor_id == "https://test.example/_ap/actor/some_user@test.example"
         obj = doc["object"]
@@ -883,11 +883,11 @@
         """AP direct message are sent as XMPP message (not Pubsub)"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)
         # we have to patch DeferredList to not wait forever
         monkeypatch.setattr(defer, "DeferredList", AsyncMock())

-        xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())
         direct_ap_message = {
             "attributedTo": TEST_AP_ACTOR_ID,
             "cc": [],
@@ -898,11 +898,11 @@
             "to": [xmpp_actor_id],
             "type": "Note",
         }
-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )
         with patch.object(client, "sendMessage") as sendMessage:
-            await ap_gateway.newAPItem(
+            await ap_gateway.new_ap_item(
                 client, None, ap_gateway._m.namespace, direct_ap_message
             )

@@ -917,38 +917,38 @@
         """Pubsub retract requests are converted to AP delete activity"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)
         retract_id = "retract_123"
         retract_elt = domish.Element((pubsub.NS_PUBSUB_EVENT, "retract"))
         retract_elt["id"] = retract_id
         items_event = pubsub.ItemsEvent(
             sender=TEST_JID,
-            recipient=ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT),
+            recipient=ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT),
             nodeIdentifier=ap_gateway._m.namespace,
             items=[retract_elt],
             headers={},
         )
-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
-            signAndPost.return_value = FakeTReqPostResponse()
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
+            sign_and_post.return_value = FakeTReqPostResponse()
             # we simulate the reception of a retract event
-            await ap_gateway._itemsReceived(ap_gateway.client, items_event)
-        url, actor_id, doc = signAndPost.call_args[0]
-        jid_account = await ap_gateway.getAPAccountFromJidAndNode(TEST_JID, None)
-        jid_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, jid_account)
+            await ap_gateway._items_received(ap_gateway.client, items_event)
+        url, actor_id, doc = sign_and_post.call_args[0]
+        jid_account = await ap_gateway.get_ap_account_from_jid_and_node(TEST_JID, None)
+        jid_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, jid_account)
         assert url == f"{TEST_BASE_URL}/inbox"
         assert actor_id == jid_actor_id
         assert doc["type"] == "Delete"
         assert doc["actor"] == jid_actor_id
         obj = doc["object"]
         assert obj["type"] == ap_const.TYPE_TOMBSTONE
-        url_item_id = ap_gateway.buildAPURL(ap_const.TYPE_ITEM, jid_account, retract_id)
+        url_item_id = ap_gateway.build_apurl(ap_const.TYPE_ITEM, jid_account, retract_id)
         assert obj["id"] == url_item_id

     @ed
     async def test_ap_delete_to_pubsub_retract(self, ap_gateway):
         """AP delete activity is converted to pubsub retract"""
-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )

         ap_item = {
@@ -962,12 +962,12 @@
         with patch.multiple(
             ap_gateway.host.memory.storage,
             get=DEFAULT,
-            getPubsubNode=DEFAULT,
-            deletePubsubItems=DEFAULT,
+            get_pubsub_node=DEFAULT,
+            delete_pubsub_items=DEFAULT,
         ) as mock_objs:
             mock_objs["get"].return_value = None
             cached_node = MagicMock()
-            mock_objs["getPubsubNode"].return_value = cached_node
+            mock_objs["get_pubsub_node"].return_value = cached_node
             subscription = MagicMock()
             subscription.state = SubscriptionState.SUBSCRIBED
             subscription.subscriber = TEST_JID
@@ -976,7 +976,7 @@
                 ap_gateway.pubsub_service, "notifyRetract"
             ) as notifyRetract:
                 # we simulate a received Delete activity
-                await ap_gateway.newAPDeleteItem(
+                await ap_gateway.new_ap_delete_item(
                     client=client,
                     destinee=None,
                     node=ap_gateway._m.namespace,
@@ -984,9 +984,9 @@
                 )

             # item is deleted from database
-            deletePubsubItems = mock_objs["deletePubsubItems"]
-            assert deletePubsubItems.call_count == 1
-            assert deletePubsubItems.call_args.args[1] == [ap_item["id"]]
+            delete_pubsub_items = mock_objs["delete_pubsub_items"]
+            assert delete_pubsub_items.call_count == 1
+            assert delete_pubsub_items.call_args.args[1] == [ap_item["id"]]

             # retraction notification is sent to subscribers
             assert notifyRetract.call_count == 1
@@ -1007,42 +1007,42 @@
         """Message retract requests are converted to AP delete activity"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

         # origin ID is the ID of the message to retract
         origin_id = "mess_retract_123"
-        # we call retractByOriginId to get the message element of a retraction request
+        # we call retract_by_origin_id to get the message element of a retraction request
        fake_client = MagicMock()
        fake_client.jid = TEST_JID
-        dest_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
-        ap_gateway._r.retractByOriginId(fake_client, dest_jid, origin_id)
+        dest_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
+        ap_gateway._r.retract_by_origin_id(fake_client, dest_jid, origin_id)
         # message_retract_elt is the message which would be sent for a retraction
         message_retract_elt = fake_client.send.call_args.args[0]
         apply_to_elt = next(message_retract_elt.elements(NS_FASTEN, "apply-to"))
         retract_elt = apply_to_elt.retract

-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
-            signAndPost.return_value = FakeTReqPostResponse()
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
+            sign_and_post.return_value = FakeTReqPostResponse()
             fake_fastened_elts = MagicMock()
             fake_fastened_elts.id = origin_id
             # we simulate the reception of a retract event using the message element that
             # we generated above
-            await ap_gateway._onMessageRetract(
+            await ap_gateway._on_message_retract(
                 ap_gateway.client, message_retract_elt, retract_elt, fake_fastened_elts
             )
-        url, actor_id, doc = signAndPost.call_args[0]
+        url, actor_id, doc = sign_and_post.call_args[0]

-        # the AP delete activity must have been sent through signAndPost
+        # the AP delete activity must have been sent through sign_and_post
         # we check its values
-        jid_account = await ap_gateway.getAPAccountFromJidAndNode(TEST_JID, None)
-        jid_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, jid_account)
+        jid_account = await ap_gateway.get_ap_account_from_jid_and_node(TEST_JID, None)
+        jid_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, jid_account)
         assert url == f"{TEST_BASE_URL}/users/{TEST_USER}/inbox"
         assert actor_id == jid_actor_id
         assert doc["type"] == "Delete"
         assert doc["actor"] == jid_actor_id
         obj = doc["object"]
         assert obj["type"] == ap_const.TYPE_TOMBSTONE
-        url_item_id = ap_gateway.buildAPURL(ap_const.TYPE_ITEM, jid_account, origin_id)
+        url_item_id = ap_gateway.build_apurl(ap_const.TYPE_ITEM, jid_account, origin_id)
         assert obj["id"] == url_item_id

     @ed
@@ -1053,11 +1053,11 @@
         # by ``test_ap_delete_to_pubsub_retract``)

         # we don't want actual queries in database
-        retractDBHistory = AsyncMock()
-        monkeypatch.setattr(ap_gateway._r, "retractDBHistory", retractDBHistory)
+        retract_db_history = AsyncMock()
+        monkeypatch.setattr(ap_gateway._r, "retract_db_history", retract_db_history)

-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )
         fake_send = MagicMock()
         monkeypatch.setattr(client, "send", fake_send)
@@ -1077,14 +1077,14 @@
             fake_history.origin_id = ap_item["id"]
             storage_get.return_value = fake_history
             # we simulate a received Delete activity
-            await ap_gateway.newAPDeleteItem(
+            await ap_gateway.new_ap_delete_item(
                 client=client, destinee=None, node=ap_gateway._m.namespace, item=ap_item
             )

         # item is deleted from database
-        assert retractDBHistory.call_count == 1
-        assert retractDBHistory.call_args.args[0] == client
-        assert retractDBHistory.call_args.args[1] == fake_history
+        assert retract_db_history.call_count == 1
+        assert retract_db_history.call_args.args[0] == client
+        assert retract_db_history.call_args.args[1] == fake_history

         # retraction notification is sent to destinee
         assert fake_send.call_count == 1
@@ -1103,11 +1103,11 @@
         """AP actor metadata are converted to XMPP/vCard4"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

         items, __ = await ap_gateway.pubsub_service.items(
             jid.JID("toto@example.org"),
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT),
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT),
             # VCard4 node
             ap_gateway._v.node,
             None,
@@ -1116,7 +1116,7 @@
         )
         assert len(items) == 1
         vcard_elt = next(items[0].elements(ap_gateway._v.namespace, "vcard"))
-        vcard = ap_gateway._v.vcard2Dict(vcard_elt)
+        vcard = ap_gateway._v.vcard_2_dict(vcard_elt)
         assert "test_user nickname" in vcard["nicknames"]
         assert vcard["description"] == "test account"

@@ -1125,12 +1125,12 @@
         """XMPP identity is converted to AP actor metadata"""
         # XXX: XMPP identity is normally an amalgam of metadata from several
         # XEPs/locations (vCard4, vcard-tmp, etc)
-        with patch.object(ap_gateway._i, "getIdentity") as getIdentity:
-            getIdentity.return_value = {
+        with patch.object(ap_gateway._i, "get_identity") as get_identity:
+            get_identity.return_value = {
                 "nicknames": ["nick1", "nick2"],
                 "description": "test description",
             }
-            actor_data = await ap_gateway.server.resource.APActorRequest(
+            actor_data = await ap_gateway.server.resource.ap_actor_request(
                 **self.ap_request_params(ap_gateway, ap_const.TYPE_ACTOR)
             )

@@ -1143,9 +1143,9 @@
         """AP mentions by direct addressing are converted to XEP-0372 references"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())

         direct_addr_mention = {
             "attributedTo": TEST_AP_ACTOR_ID,
@@ -1156,37 +1156,37 @@
             "to": [ap_const.NS_AP_PUBLIC, xmpp_actor_id],
             "type": "Note",
         }
-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )
         monkeypatch.setattr(client, "sendMessage", MagicMock())

-        with patch.object(ap_gateway._refs, "sendReference") as sendReference:
-            await ap_gateway.newAPItem(
+        with patch.object(ap_gateway._refs, "send_reference") as send_reference:
+            await ap_gateway.new_ap_item(
                 client, None, ap_gateway._m.namespace, direct_addr_mention
             )

-        assert sendReference.call_count == 1
-        assert sendReference.call_args.kwargs["to_jid"] == TEST_JID
+        assert send_reference.call_count == 1
+        assert send_reference.call_args.kwargs["to_jid"] == TEST_JID

-        local_actor_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
-        expected_anchor = xmpp_uri.buildXMPPUri(
+        local_actor_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
+        expected_anchor = xmpp_uri.build_xmpp_uri(
             "pubsub",
             path=local_actor_jid.full(),
             node=ap_gateway._m.namespace,
             item=direct_addr_mention["id"],
         )
-        assert sendReference.call_args.kwargs["anchor"] == expected_anchor
+        assert send_reference.call_args.kwargs["anchor"] == expected_anchor

     @ed
     async def test_tag_mention_to_reference(self, ap_gateway, monkeypatch):
         """AP mentions in "tag" field are converted to XEP-0372 references"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())

         direct_addr_mention = {
             "attributedTo": TEST_AP_ACTOR_ID,
@@ -1198,35 +1198,35 @@
             "tag": [{"type": "Mention", "href": xmpp_actor_id, "name": f"@{TEST_JID}'"}],
             "type": "Note",
         }
-        client = ap_gateway.client.getVirtualClient(
-            ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        client = ap_gateway.client.get_virtual_client(
+            ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         )
         monkeypatch.setattr(client, "sendMessage", MagicMock())

-        with patch.object(ap_gateway._refs, "sendReference") as sendReference:
-            await ap_gateway.newAPItem(
+        with patch.object(ap_gateway._refs, "send_reference") as send_reference:
+            await ap_gateway.new_ap_item(
                 client, None, ap_gateway._m.namespace, direct_addr_mention
             )

-        assert sendReference.call_count == 1
-        assert sendReference.call_args.kwargs["to_jid"] == TEST_JID
+        assert send_reference.call_count == 1
+        assert send_reference.call_args.kwargs["to_jid"] == TEST_JID

-        local_actor_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
-        expected_anchor = xmpp_uri.buildXMPPUri(
+        local_actor_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
+        expected_anchor = xmpp_uri.build_xmpp_uri(
             "pubsub",
             path=local_actor_jid.full(),
             node=ap_gateway._m.namespace,
             item=direct_addr_mention["id"],
         )
-        assert sendReference.call_args.kwargs["anchor"] == expected_anchor
+        assert send_reference.call_args.kwargs["anchor"] == expected_anchor

     @ed
     async def test_auto_mentions(self, ap_gateway, monkeypatch):
         """Check that mentions in body are converted to AP mentions"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

         mb_data = {
             "author_jid": TEST_JID.full(),
@@ -1254,7 +1254,7 @@
         # in mb_data_2_ap_item
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

         mb_data = {
             "author_jid": TEST_JID.full(),
@@ -1275,11 +1275,11 @@
         """Check that XEP-0372 references are converted to AP mention"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        local_actor_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        local_actor_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         item_elt = XMPP_ITEMS[0]
-        anchor = xmpp_uri.buildXMPPUri(
+        anchor = xmpp_uri.build_xmpp_uri(
             "pubsub",
             path=TEST_JID.full(),
             node=ap_gateway._m.namespace,
@@ -1287,18 +1287,18 @@
         )

         ref_data: Dict[str, Union[str, int, dict]] = {
-            "uri": xmpp_uri.buildXMPPUri(None, path=local_actor_jid.full()),
+            "uri": xmpp_uri.build_xmpp_uri(None, path=local_actor_jid.full()),
             "type_": "mention",
             "anchor": anchor,
         }
-        reference_elt = ap_gateway._refs.buildRefElement(**ref_data)
+        reference_elt = ap_gateway._refs.build_ref_element(**ref_data)

         # we now update ref_data to look like what is received in the trigger
-        ref_data["parsed_uri"] = xmpp_uri.parseXMPPUri(ref_data["uri"])
-        ref_data["parsed_anchor"] = xmpp_uri.parseXMPPUri(ref_data["anchor"])
+        ref_data["parsed_uri"] = xmpp_uri.parse_xmpp_uri(ref_data["uri"])
+        ref_data["parsed_anchor"] = xmpp_uri.parse_xmpp_uri(ref_data["anchor"])

-        # "type" is a builtin function, thus "type_" is used in buildRefElement, but in
+        # "type" is a builtin function, thus "type_" is used in build_ref_element, but in
         # ref_data is "type" without underscore
         ref_data["type"] = ref_data["type_"]
         del ref_data["type_"]
@@ -1306,22 +1306,22 @@
         message_elt = domish.Element((None, "message"))
         message_elt.addChild(reference_elt)

-        with patch.object(ap_gateway.host.memory.storage, "getItems") as getItems:
-            # getItems returns a sqla_mapping.PubsubItem, thus we need to fake it and set
+        with patch.object(ap_gateway.host.memory.storage, "get_items") as get_items:
+            # get_items returns a sqla_mapping.PubsubItem, thus we need to fake it and set
             # the item_elt we want to use in its "data" attribute
             mock_pubsub_item = MagicMock
             mock_pubsub_item.data = item_elt
-            getItems.return_value = ([mock_pubsub_item], {})
-            with patch.object(ap_gateway, "signAndPost") as signAndPost:
-                signAndPost.return_value.code = 202
-                await ap_gateway._onReferenceReceived(
+            get_items.return_value = ([mock_pubsub_item], {})
+            with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
+                sign_and_post.return_value.code = 202
+                await ap_gateway._on_reference_received(
                     ap_gateway.client, message_elt, ref_data
                 )

         # when reference is received, the referencing item must be sent to referenced
         # actor, and they must be in "to" field and in "tag"
-        assert signAndPost.call_count == 1
-        send_ap_item = signAndPost.call_args.args[-1]
+        assert sign_and_post.call_count == 1
+        send_ap_item = sign_and_post.call_args.args[-1]
         ap_object = send_ap_item["object"]
         assert TEST_AP_ACTOR_ID in ap_object["to"]
         expected_mention = {
@@ -1338,13 +1338,13 @@
         """XEP-0272 post repeat is converted to AP Announce activity"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

         # JID repeated AP actor (also the recipient of the message)
-        recipient_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        recipient_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         # repeated item
         ap_item = TEST_AP_ITEMS[0]
-        ap_item_url = xmpp_uri.buildXMPPUri(
+        ap_item_url = xmpp_uri.build_xmpp_uri(
             "pubsub",
             path=recipient_jid.full(),
             node=ap_gateway._m.namespace,
@@ -1374,9 +1374,9 @@
         )
         item_elt.uri = pubsub.NS_PUBSUB_EVENT

-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
-            signAndPost.return_value.code = 202
-            await ap_gateway.convertAndPostItems(
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
+            sign_and_post.return_value.code = 202
+            await ap_gateway.convert_and_post_items(
                 ap_gateway.client,
                 TEST_AP_ACCOUNT,
                 TEST_JID,
@@ -1384,10 +1384,10 @@
                 [item_elt],
             )

-        assert signAndPost.called
-        url, actor_id, doc = signAndPost.call_args.args
+        assert sign_and_post.called
+        url, actor_id, doc = sign_and_post.call_args.args
         assert url == TEST_USER_DATA["endpoints"]["sharedInbox"]
-        assert actor_id == ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        assert actor_id == ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())
         assert doc["type"] == "Announce"
         assert ap_const.NS_AP_PUBLIC in doc["to"]
         assert doc["object"] == ap_item["id"]
@@ -1397,12 +1397,12 @@
         """AP Announce activity is converted to XEP-0272 post repeat"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())
         # announced item
         xmpp_item = XMPP_ITEMS[0]
-        xmpp_item_url = ap_gateway.buildAPURL(
+        xmpp_item_url = ap_gateway.build_apurl(
             ap_const.TYPE_ITEM, TEST_JID.userhost(), xmpp_item["id"]
         )
         announce = {
@@ -1415,25 +1415,25 @@
             "published": "2022-07-22T09:24:12Z",
             "to": [ap_const.NS_AP_PUBLIC],
         }
-        with patch.object(ap_gateway.host.memory.storage, "getItems") as getItems:
+        with patch.object(ap_gateway.host.memory.storage, "get_items") as get_items:
             mock_pubsub_item = MagicMock
             mock_pubsub_item.data = xmpp_item
-            getItems.return_value = ([mock_pubsub_item], {})
+            get_items.return_value = ([mock_pubsub_item], {})
             with patch.object(
-                ap_gateway.host.memory.storage, "cachePubsubItems"
-            ) as cachePubsubItems:
-                await ap_gateway.server.resource.handleAnnounceActivity(
+                ap_gateway.host.memory.storage, "cache_pubsub_items"
+            ) as cache_pubsub_items:
+                await ap_gateway.server.resource.handle_announce_activity(
                     Request(MagicMock()), announce, None, None, None, "", TEST_AP_ACTOR_ID
                 )

-        assert cachePubsubItems.called
+        assert cache_pubsub_items.called
         # the microblog data put in cache correspond to the item sent to subscribers
-        __, __, __, [mb_data] = cachePubsubItems.call_args.args
+        __, __, __, [mb_data] = cache_pubsub_items.call_args.args
         extra = mb_data["extra"]
         assert "repeated" in extra
         repeated = extra["repeated"]
-        assert repeated["by"] == ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT).full()
-        xmpp_item_xmpp_url = xmpp_uri.buildXMPPUri(
+        assert repeated["by"] == ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT).full()
+        xmpp_item_xmpp_url = xmpp_uri.build_xmpp_uri(
             "pubsub",
             path=TEST_JID.full(),
             node=ap_gateway._m.namespace,
@@ -1446,12 +1446,12 @@
         """Pubsub-attachments ``noticed`` is converted to AP Like activity"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        recipient_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        recipient_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         # noticed item
         ap_item = TEST_AP_ITEMS[0]
-        attachment_node = ap_gateway._pa.getAttachmentNodeName(
+        attachment_node = ap_gateway._pa.get_attachment_node_name(
             recipient_jid, ap_gateway._m.namespace, ap_item["id"]
         )
         item_elt = xml_tools.parse(
@@ -1468,14 +1468,14 @@
             TEST_JID, recipient_jid, attachment_node, [item_elt], {}
         )

-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
-            signAndPost.return_value.code = 202
-            await ap_gateway._itemsReceived(ap_gateway.client, items_event)
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
+            sign_and_post.return_value.code = 202
+            await ap_gateway._items_received(ap_gateway.client, items_event)

-        assert signAndPost.called
-        url, actor_id, doc = signAndPost.call_args.args
+        assert sign_and_post.called
+        url, actor_id, doc = sign_and_post.call_args.args
         assert url == TEST_USER_DATA["endpoints"]["sharedInbox"]
-        assert actor_id == ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        assert actor_id == ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())
         assert doc["type"] == "Like"
         assert ap_const.NS_AP_PUBLIC in doc["cc"]
         assert doc["object"] == ap_item["id"]
@@ -1485,12 +1485,12 @@
         """AP Like activity is converted to ``noticed`` attachment"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())
         # liked item
         xmpp_item = XMPP_ITEMS[0]
-        xmpp_item_url = ap_gateway.buildAPURL(
+        xmpp_item_url = ap_gateway.build_apurl(
             ap_const.TYPE_ITEM, TEST_JID.userhost(), xmpp_item["id"]
         )
         like = {
@@ -1503,23 +1503,23 @@
             "published": "2022-07-22T09:24:12Z",
             "to": [ap_const.NS_AP_PUBLIC],
         }
-        with patch.object(ap_gateway.host.memory.storage, "getItems") as getItems:
-            getItems.return_value = ([], {})
-            with patch.object(ap_gateway._p, "sendItems") as sendItems:
-                await ap_gateway.server.resource.APInboxRequest(
+        with patch.object(ap_gateway.host.memory.storage, "get_items") as get_items:
+            get_items.return_value = ([], {})
+            with patch.object(ap_gateway._p, "send_items") as send_items:
+                await ap_gateway.server.resource.ap_inbox_request(
                     **self.ap_request_params(
                         ap_gateway, "inbox", data=like, signing_actor=TEST_AP_ACTOR_ID
                     )
                 )

-        assert sendItems.called
-        si_client, si_service, si_node, [si_item] = sendItems.call_args.args
-        assert si_client.jid == ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        assert send_items.called
+        si_client, si_service, si_node, [si_item] = send_items.call_args.args
+        assert si_client.jid == ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         assert si_service == TEST_JID
-        assert si_node == ap_gateway._pa.getAttachmentNodeName(
+        assert si_node == ap_gateway._pa.get_attachment_node_name(
             TEST_JID, ap_gateway._m.namespace, xmpp_item["id"]
         )
-        [parsed_item] = ap_gateway._pa.items2attachmentData(si_client, [si_item])
+        [parsed_item] = ap_gateway._pa.items_2_attachment_data(si_client, [si_item])
         assert parsed_item["from"] == si_client.jid.full()
         assert "noticed" in parsed_item
         assert parsed_item["noticed"]["noticed"] == True
@@ -1529,18 +1529,18 @@
         """Pubsub-attachments ``reactions`` is converted to AP EmojiReact activity"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        recipient_jid = ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        recipient_jid = ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         # noticed item
         ap_item = TEST_AP_ITEMS[0]
-        ap_item_url = xmpp_uri.buildXMPPUri(
+        ap_item_url = xmpp_uri.build_xmpp_uri(
             "pubsub",
             path=recipient_jid.full(),
             node=ap_gateway._m.namespace,
             item=ap_item["id"],
         )
-        attachment_node = ap_gateway._pa.getAttachmentNodeName(
+        attachment_node = ap_gateway._pa.get_attachment_node_name(
             recipient_jid, ap_gateway._m.namespace, ap_item["id"]
         )
         reactions = ["🦁", "🥜", "🎻"]
@@ -1562,15 +1562,15 @@
             TEST_JID, recipient_jid, attachment_node, [item_elt], {}
         )

-        with patch.object(ap_gateway, "signAndPost") as signAndPost:
-            signAndPost.return_value.code = 202
-            await ap_gateway._itemsReceived(ap_gateway.client, items_event)
+        with patch.object(ap_gateway, "sign_and_post") as sign_and_post:
+            sign_and_post.return_value.code = 202
+            await ap_gateway._items_received(ap_gateway.client, items_event)

-        assert signAndPost.call_count == 3
-        for idx, call_args in enumerate(signAndPost.call_args_list):
+        assert sign_and_post.call_count == 3
+        for idx, call_args in enumerate(sign_and_post.call_args_list):
             url, actor_id, doc = call_args.args
             assert url == TEST_USER_DATA["endpoints"]["sharedInbox"]
-            assert actor_id == ap_gateway.buildAPURL(
+            assert actor_id == ap_gateway.build_apurl(
                 ap_const.TYPE_ACTOR, TEST_JID.userhost()
             )
             assert doc["type"] == "EmojiReact"
@@ -1588,12 +1588,12 @@
         """AP EmojiReact activity is converted to ``reactions`` attachment"""
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "get", mock_ap_get)
         monkeypatch.setattr(plugin_comp_ap_gateway.treq, "json_content", mock_treq_json)
-        monkeypatch.setattr(ap_gateway, "apGet", mock_ap_get)
+        monkeypatch.setattr(ap_gateway, "ap_get", mock_ap_get)

-        xmpp_actor_id = ap_gateway.buildAPURL(ap_const.TYPE_ACTOR, TEST_JID.userhost())
+        xmpp_actor_id = ap_gateway.build_apurl(ap_const.TYPE_ACTOR, TEST_JID.userhost())
         # item on which reaction is attached
         xmpp_item = XMPP_ITEMS[0]
-        xmpp_item_url = ap_gateway.buildAPURL(
+        xmpp_item_url = ap_gateway.build_apurl(
             ap_const.TYPE_ITEM, TEST_JID.userhost(), xmpp_item["id"]
         )
         like = {
@@ -1607,23 +1607,23 @@
             "published": "2022-07-22T09:24:12Z",
             "to": [ap_const.NS_AP_PUBLIC],
         }
-        with patch.object(ap_gateway.host.memory.storage, "getItems") as getItems:
-            getItems.return_value = ([], {})
-            with patch.object(ap_gateway._p, "sendItems") as sendItems:
-                await ap_gateway.server.resource.APInboxRequest(
+        with patch.object(ap_gateway.host.memory.storage, "get_items") as get_items:
+            get_items.return_value = ([], {})
+            with patch.object(ap_gateway._p, "send_items") as send_items:
+                await ap_gateway.server.resource.ap_inbox_request(
                    **self.ap_request_params(
                        ap_gateway, "inbox", data=like, signing_actor=TEST_AP_ACTOR_ID
                    )
                )

-        assert sendItems.called
-        si_client, si_service, si_node, [si_item] = sendItems.call_args.args
-        assert si_client.jid == ap_gateway.getLocalJIDFromAccount(TEST_AP_ACCOUNT)
+        assert send_items.called
+        si_client, si_service, si_node, [si_item] = send_items.call_args.args
+        assert si_client.jid == ap_gateway.get_local_jid_from_account(TEST_AP_ACCOUNT)
         assert si_service == TEST_JID
-        assert si_node == ap_gateway._pa.getAttachmentNodeName(
+        assert si_node == ap_gateway._pa.get_attachment_node_name(
             TEST_JID, ap_gateway._m.namespace, xmpp_item["id"]
         )
-        [parsed_item] = ap_gateway._pa.items2attachmentData(si_client, [si_item])
+        [parsed_item] = ap_gateway._pa.items_2_attachment_data(si_client, [si_item])
         assert parsed_item["from"] == si_client.jid.full()
         assert "reactions" in parsed_item
         assert parsed_item["reactions"]["reactions"] == ["🐅"]
--- a/tests/unit/test_pubsub-cache.py Fri Apr 07 15:18:39 2023 +0200
+++ b/tests/unit/test_pubsub-cache.py Sat Apr 08 13:54:42 2023 +0200
@@ -27,22 +27,22 @@

     @ed
     async def test_cache_is_used_transparently(self, host, client):
-        """Cache is used when a pubsub getItems operation is done"""
+        """Cache is used when a pubsub get_items operation is done"""
         items_ret = defer.Deferred()
         items_ret.callback(([], {}))
         client.pubsub_client.items = MagicMock(return_value=items_ret)
-        host.memory.storage.getPubsubNode.return_value = None
-        pubsub_node = host.memory.storage.setPubsubNode.return_value = PubsubNode(
+        host.memory.storage.get_pubsub_node.return_value = None
+        pubsub_node = host.memory.storage.set_pubsub_node.return_value = PubsubNode(
             sync_state = None
         )
-        with patch.object(host.plugins["PUBSUB_CACHE"], "cacheNode") as cacheNode:
-            await host.plugins["XEP-0060"].getItems(
+        with patch.object(host.plugins["PUBSUB_CACHE"], "cache_node") as cache_node:
+            await host.plugins["XEP-0060"].get_items(
                 client,
                 None,
                 "urn:xmpp:microblog:0",
             )
-            assert cacheNode.call_count == 1
-            assert cacheNode.call_args.args[-1] == pubsub_node
+            assert cache_node.call_count == 1
+            assert cache_node.call_args.args[-1] == pubsub_node

     @ed
     async def test_cache_is_skipped_with_use_cache_false(self, host, client):
@@ -50,18 +50,18 @@
         items_ret = defer.Deferred()
         items_ret.callback(([], {}))
         client.pubsub_client.items = MagicMock(return_value=items_ret)
-        host.memory.storage.getPubsubNode.return_value = None
-        host.memory.storage.setPubsubNode.return_value = PubsubNode(
+        host.memory.storage.get_pubsub_node.return_value = None
+        host.memory.storage.set_pubsub_node.return_value = PubsubNode(
             sync_state = None
         )
-        with patch.object(host.plugins["PUBSUB_CACHE"], "cacheNode") as cacheNode:
-            await host.plugins["XEP-0060"].getItems(
+        with patch.object(host.plugins["PUBSUB_CACHE"], "cache_node") as cache_node:
+            await host.plugins["XEP-0060"].get_items(
                 client,
                 None,
                 "urn:xmpp:microblog:0",
                 extra = {C.KEY_USE_CACHE: False}
             )
-            assert not cacheNode.called
+            assert not cache_node.called

     @ed
     async def test_cache_is_not_used_when_no_cache(self, host, client):
@@ -70,17 +70,17 @@
         items_ret = defer.Deferred()
         items_ret.callback(([], {}))
         client.pubsub_client.items = MagicMock(return_value=items_ret)
-        host.memory.storage.getPubsubNode.return_value = None
-        host.memory.storage.setPubsubNode.return_value = PubsubNode(
+        host.memory.storage.get_pubsub_node.return_value = None
+        host.memory.storage.set_pubsub_node.return_value = PubsubNode(
             sync_state = None
         )
-        with patch.object(host.plugins["PUBSUB_CACHE"], "cacheNode") as cacheNode:
-            await host.plugins["XEP-0060"].getItems(
+        with patch.object(host.plugins["PUBSUB_CACHE"], "cache_node") as cache_node:
+            await host.plugins["XEP-0060"].get_items(
                 client,
                 None,
                 "urn:xmpp:microblog:0",
             )
-            assert not cacheNode.called
+            assert not cache_node.called

     @ed
@@ -89,20 +89,20 @@
         items_ret = defer.Deferred()
         items_ret.callback(([], {}))
         client.pubsub_client.items = MagicMock(return_value=items_ret)
-        host.memory.storage.getPubsubNode.return_value = PubsubNode(
+        host.memory.storage.get_pubsub_node.return_value = PubsubNode(
             sync_state = SyncState.COMPLETED
         )
         with patch.object(
             host.plugins["PUBSUB_CACHE"],
-            "getItemsFromCache"
-        ) as getItemsFromCache:
-            getItemsFromCache.return_value = ([], {})
-            await host.plugins["XEP-0060"].getItems(
+            "get_items_from_cache"
+        ) as get_items_from_cache:
+            get_items_from_cache.return_value = ([], {})
+            await host.plugins["XEP-0060"].get_items(
                 client,
                 None,
                 "urn:xmpp:microblog:0",
             )
-            assert getItemsFromCache.call_count == 1
+            assert get_items_from_cache.call_count == 1
             assert not client.pubsub_client.items.called

     @ed
@@ -111,21 +111,21 @@
         items_ret = defer.Deferred()
         items_ret.callback(([], {}))
         client.pubsub_client.items = MagicMock(return_value=items_ret)
-        host.memory.storage.getPubsubNode.return_value = PubsubNode(
+        host.memory.storage.get_pubsub_node.return_value = PubsubNode(
             sync_state = SyncState.IN_PROGRESS
         )
-        with patch.object(host.plugins["PUBSUB_CACHE"], "analyseNode") as analyseNode:
-            analyseNode.return_value = {"to_sync": True}
+        with patch.object(host.plugins["PUBSUB_CACHE"], "analyse_node") as analyse_node:
+            analyse_node.return_value = {"to_sync": True}
             with patch.object(
                 host.plugins["PUBSUB_CACHE"],
-                "getItemsFromCache"
-            ) as getItemsFromCache:
-                getItemsFromCache.return_value = ([], {})
+                "get_items_from_cache"
+            ) as get_items_from_cache:
+                get_items_from_cache.return_value = ([], {})
                 assert client.pubsub_client.items.call_count == 0
-                await host.plugins["XEP-0060"].getItems(
+                await host.plugins["XEP-0060"].get_items(
                     client,
                     None,
                     "urn:xmpp:microblog:0",
                 )
-                assert not getItemsFromCache.called
+                assert not get_items_from_cache.called
                 assert client.pubsub_client.items.call_count == 1
--- a/twisted/plugins/sat_plugin.py Fri Apr 07 15:18:39 2023 +0200
+++ b/twisted/plugins/sat_plugin.py Sat Apr 08 13:54:42 2023 +0200
@@ -36,7 +36,7 @@
     """Method to initialise global modules"""
     # XXX: We need to configure logs before any log method is used, so here is the best place.
     from sat.core import log_config
-    log_config.satConfigure(C.LOG_BACKEND_TWISTED, C, backend_data=options)
+    log_config.sat_configure(C.LOG_BACKEND_TWISTED, C, backend_data=options)


 class Options(usage.Options):
@@ -50,7 +50,7 @@
     description = _("%s XMPP client backend") % C.APP_NAME_FULL
     options = Options

-    def setDebugger(self):
+    def set_debugger(self):
         from twisted.internet import defer
         if defer.Deferred.debug:
             # if we are in debug mode, we want to use ipdb instead of pdb
@@ -65,7 +65,7 @@
     def makeService(self, options):
         from twisted.internet import asyncioreactor
         asyncioreactor.install()
-        self.setDebugger()
+        self.set_debugger()
         # XXX: Libervia must be imported after log configuration,
         # because it write stuff to logs
         initialise(options.parent)