# MarseyWorld/files/routes/notifications.py

import time
from sqlalchemy.sql.expression import not_, and_, or_
from files.classes.mod_logs import ModAction
from files.classes.sub_logs import SubAction
from files.helpers.config.const import *
from files.helpers.config.modaction_types import *
from files.helpers.get import *
from files.routes.wrappers import *
from files.__main__ import app

@app.post("/clear")
@limiter.limit('1/second', scope=rpath)
@limiter.limit(DEFAULT_RATELIMIT)
@limiter.limit(DEFAULT_RATELIMIT, key_func=get_ID)
@auth_required
def clear(v):
	notifs = g.db.query(Notification).join(Notification.comment).filter(Notification.read == False, Notification.user_id == v.id).all()
	for n in notifs:
		n.read = True
		g.db.add(n)

	v.last_viewed_post_notifs = int(time.time())
	v.last_viewed_log_notifs = int(time.time())
	g.db.add(v)
	return {"message": "Notifications marked as read!"}
@app.get("/unread")
@limiter.limit(DEFAULT_RATELIMIT)
@limiter.limit(DEFAULT_RATELIMIT, key_func=get_ID)
@auth_required
def unread(v):
	listing = g.db.query(Notification, Comment).join(Notification.comment).filter(
		Notification.read == False,
		Notification.user_id == v.id,
		Comment.is_banned == False,
		Comment.deleted_utc == 0,
	).order_by(Notification.created_utc.desc()).all()

	for n, c in listing:
		n.read = True
		g.db.add(n)
	return {"data":[x[1].json(g.db) for x in listing]}
@app.get("/notifications/modmail")
@limiter.limit(DEFAULT_RATELIMIT)
@limiter.limit(DEFAULT_RATELIMIT, key_func=get_ID)
@admin_level_required(PERMS['VIEW_MODMAIL'])
def notifications_modmail(v):
	try: page = max(int(request.values.get("page", 1)), 1)
	except: page = 1

	comments = g.db.query(Comment).filter_by(
		sentto=MODMAIL_ID,
		level=1,
	).order_by(Comment.id.desc()).offset(PAGE_SIZE*(page-1)).limit(PAGE_SIZE+1).all()

	next_exists = (len(comments) > PAGE_SIZE)
	listing = comments[:PAGE_SIZE]
	g.db.commit()

	if v.client: return {"data":[x.json(g.db) for x in listing]}
	return render_template("notifications.html",
		v=v,
		notifications=listing,
		next_exists=next_exists,
		page=page,
		standalone=True,
		render_replies=True,
	)
@app.get("/notifications/messages")
@limiter.limit(DEFAULT_RATELIMIT)
@limiter.limit(DEFAULT_RATELIMIT, key_func=get_ID)
@auth_required
def notifications_messages(v:User):
	try: page = max(int(request.values.get("page", 1)), 1)
	except: page = 1

	# All of these queries are horrible. For whoever comes here after me,
	# PLEASE just turn DMs into their own table and get them out of
	# Notifications & Comments. It's worth it. Save yourself.
	message_threads = g.db.query(Comment).filter(
		Comment.sentto != None,
		or_(Comment.author_id == v.id, Comment.sentto == v.id),
		Comment.parent_submission == None,
		Comment.level == 1,
	)
	thread_order = g.db.query(Comment.top_comment_id, Comment.created_utc) \
		.distinct(Comment.top_comment_id) \
		.filter(
			Comment.sentto != None,
			or_(Comment.author_id == v.id, Comment.sentto == v.id),
		).order_by(
			Comment.top_comment_id.desc(),
			Comment.created_utc.desc()
		).subquery()
	message_threads = message_threads.join(thread_order,
		thread_order.c.top_comment_id == Comment.top_comment_id)
	message_threads = message_threads.order_by(thread_order.c.created_utc.desc()) \
		.offset(PAGE_SIZE*(page-1)).limit(PAGE_SIZE+1).all()
	# Clear notifications (used for unread indicator only) for all user messages.
	notifs_unread = []
	if not session.get("GLOBAL"):
		notifs_unread_row = g.db.query(Notification.comment_id).join(Comment).filter(
			Notification.user_id == v.id,
			Notification.read == False,
			or_(Comment.author_id == v.id, Comment.sentto == v.id),
		).all()
		notifs_unread = [n.comment_id for n in notifs_unread_row]
		g.db.query(Notification).filter(
			Notification.user_id == v.id,
			Notification.comment_id.in_(notifs_unread),
		).update({Notification.read: True})
		g.db.commit()

	next_exists = (len(message_threads) > PAGE_SIZE)
	listing = message_threads[:PAGE_SIZE]
	list_to_preserve_unread_attribute = []
	comments_unread = g.db.query(Comment).filter(Comment.id.in_(notifs_unread))
	for c in comments_unread:
		c.unread = True
		list_to_preserve_unread_attribute.append(c)
	if v.client: return {"data":[x.json(g.db) for x in listing]}
	return render_template("notifications.html",
		v=v,
		notifications=listing,
		next_exists=next_exists,
		page=page,
		standalone=True,
		render_replies=True,
	)
@app.get("/notifications/posts")
@limiter.limit(DEFAULT_RATELIMIT)
@limiter.limit(DEFAULT_RATELIMIT, key_func=get_ID)
@auth_required
def notifications_posts(v:User):
	try: page = max(int(request.values.get("page", 1)), 1)
	except: page = 1

	listing = [x[0] for x in g.db.query(Submission.id).filter(
		or_(
			Submission.author_id.in_(v.followed_users),
			Submission.sub.in_(v.followed_subs)
		),
		Submission.deleted_utc == 0,
		Submission.is_banned == False,
		Submission.private == False,
		Submission.notify == True,
		Submission.author_id != v.id,
		Submission.ghost == False,
		Submission.author_id.notin_(v.userblocks)
	).order_by(Submission.created_utc.desc()).offset(PAGE_SIZE * (page - 1)).limit(PAGE_SIZE+1).all()]
	next_exists = (len(listing) > PAGE_SIZE)
	listing = listing[:PAGE_SIZE]
	listing = get_posts(listing, v=v, eager=True)
	for p in listing:
		p.unread = p.created_utc > v.last_viewed_post_notifs

	if not session.get("GLOBAL"):
		v.last_viewed_post_notifs = int(time.time())
		g.db.add(v)
	if v.client: return {"data":[x.json(g.db) for x in listing]}
	return render_template("notifications.html",
		v=v,
		notifications=listing,
		next_exists=next_exists,
		page=page,
		standalone=True,
		render_replies=True,
	)
@app.get("/notifications/modactions")
@limiter.limit(DEFAULT_RATELIMIT)
@limiter.limit(DEFAULT_RATELIMIT, key_func=get_ID)
@auth_required
def notifications_modactions(v:User):
	try: page = max(int(request.values.get("page", 1)), 1)
	except: page = 1
	if v.admin_level >= PERMS['NOTIFICATIONS_MODERATOR_ACTIONS']:
		cls = ModAction
	elif v.moderated_subs:
		cls = SubAction
	else:
		abort(403)

	listing = g.db.query(cls).filter(cls.user_id != v.id)
	if v.id == AEVANN_ID:
		listing = listing.filter(cls.kind.in_(AEVANN_MODACTION_TYPES))

	if v.admin_level < PERMS['PROGSTACK']:
		listing = listing.filter(cls.kind.notin_(MODACTION_PRIVILEGED__TYPES))

	if cls == SubAction:
		listing = listing.filter(cls.sub.in_(v.moderated_subs))
	listing = listing.order_by(cls.id.desc()).offset(PAGE_SIZE*(page-1)).limit(PAGE_SIZE+1).all()
	next_exists = len(listing) > PAGE_SIZE
	listing = listing[:PAGE_SIZE]
	for ma in listing:
		ma.unread = ma.created_utc > v.last_viewed_log_notifs

	if not session.get("GLOBAL"):
		v.last_viewed_log_notifs = int(time.time())
		g.db.add(v)
	return render_template("notifications.html",
		v=v,
		notifications=listing,
		next_exists=next_exists,
		page=page,
		standalone=True,
		render_replies=True,
	)
@app.get("/notifications/reddit")
@limiter.limit(DEFAULT_RATELIMIT)
@limiter.limit(DEFAULT_RATELIMIT, key_func=get_ID)
@auth_required
def notifications_reddit(v:User):
	try: page = max(int(request.values.get("page", 1)), 1)
	except: page = 1

	if not v.can_view_offsitementions: abort(403)

	listing = g.db.query(Comment).filter(
		Comment.body_html.like('%<p>New site mention%<a href="https://old.reddit.com/r/%'),
		Comment.parent_submission == None,
		Comment.author_id == AUTOJANNY_ID
	).order_by(Comment.created_utc.desc()).offset(PAGE_SIZE*(page-1)).limit(PAGE_SIZE+1).all()
	next_exists = len(listing) > PAGE_SIZE
	listing = listing[:PAGE_SIZE]

	for ma in listing:
		ma.unread = ma.created_utc > v.last_viewed_reddit_notifs
	if not session.get("GLOBAL"):
		v.last_viewed_reddit_notifs = int(time.time())
		g.db.add(v)
	if v.client: return {"data":[x.json(g.db) for x in listing]}
	return render_template("notifications.html",
		v=v,
		notifications=listing,
		next_exists=next_exists,
		page=page,
		standalone=True,
		render_replies=True,
	)
@app.get("/notifications")
@limiter.limit(DEFAULT_RATELIMIT)
@limiter.limit(DEFAULT_RATELIMIT, key_func=get_ID)
@auth_required
def notifications(v:User):
	try: page = max(int(request.values.get("page", 1)), 1)
	except: page = 1

	if v.admin_level < PERMS['USER_SHADOWBAN']:
		unread_and_inaccessible = g.db.query(Notification).join(Notification.comment).join(Comment.author).filter(
			Notification.user_id == v.id,
			Notification.read == False,
			or_(
				User.shadowbanned != None,
				Comment.is_banned != False,
				Comment.deleted_utc != 0,
			)
		).all()
		for n in unread_and_inaccessible:
			n.read = True
			g.db.add(n)
	comments = g.db.query(Comment, Notification).join(Notification.comment).join(Comment.author).filter(
		Notification.user_id == v.id,
		or_(Comment.sentto == None, Comment.sentto == MODMAIL_ID),
		not_(and_(Comment.sentto == MODMAIL_ID, User.is_muted)),
	)

	if v.admin_level < PERMS['USER_SHADOWBAN']:
		comments = comments.filter(
			Comment.is_banned == False,
			Comment.deleted_utc == 0,
		)
	comments = comments.order_by(Notification.created_utc.desc())
	comments = comments.offset(PAGE_SIZE * (page - 1)).limit(PAGE_SIZE+1).all()

	next_exists = (len(comments) > PAGE_SIZE)
	comments = comments[:PAGE_SIZE]
	cids = [x[0].id for x in comments]
	listing = []
	total = [x[0] for x in comments]

	for c, n in comments:
		c.notified_utc = n.created_utc
		c.collapse = n.read
	for c, n in comments:
		if n.created_utc > 1620391248: c.notif_utc = n.created_utc
		if c.parent_submission or c.wall_user_id:
			total.append(c)
			if c.replies2 == None:
				c.replies2 = g.db.query(Comment).filter_by(parent_comment_id=c.id).filter(or_(Comment.author_id == v.id, Comment.id.in_(cids))).order_by(Comment.id.desc()).all()
				total.extend(c.replies2)
				for x in c.replies2:
					if x.replies2 == None: x.replies2 = []

			count = 0
			while count < 50 and c.parent_comment and (c.parent_comment.author_id == v.id or c.parent_comment.id in cids):
				count += 1
				c = c.parent_comment

				if c.replies2 == None:
					c.replies2 = g.db.query(Comment).filter_by(parent_comment_id=c.id).filter(or_(Comment.author_id == v.id, Comment.id.in_(cids))).order_by(Comment.id.desc()).all()
					total.extend(c.replies2)
					for x in c.replies2:
						if x.replies2 == None:
							x.replies2 = g.db.query(Comment).filter_by(parent_comment_id=x.id).filter(or_(Comment.author_id == v.id, Comment.id.in_(cids))).order_by(Comment.id.desc()).all()
							total.extend(x.replies2)

				if not hasattr(c, "notified_utc") or n.created_utc > c.notified_utc:
					c.notified_utc = n.created_utc
					c.collapse = n.read

				c.replies2 = sorted(c.replies2, key=lambda x: x.notified_utc if hasattr(x, "notified_utc") else x.id, reverse=True)
		else:
			while c.parent_comment_id:
				c = c.parent_comment
			c.replies2 = g.db.query(Comment).filter_by(parent_comment_id=c.id).order_by(Comment.id).all()

		if c not in listing: listing.append(c)

		if not n.read and not session.get("GLOBAL"):
			n.read = True
			c.unread = True
			g.db.add(n)
	total.extend(listing)
	listing2 = {}
	for x in listing:
		if x.parent_comment_id:
			parent = x.parent_comment
			if parent.replies2 == None:
				parent.replies2 = g.db.query(Comment).filter_by(parent_comment_id=parent.id).filter(or_(Comment.author_id == v.id, Comment.id.in_(cids+[x.id]))).order_by(Comment.id.desc()).all()
				total.extend(parent.replies2)
				for y in parent.replies2:
					if y.replies2 == None:
						y.replies2 = []
			listing2[parent] = ''
		else:
			listing2[x] = ''
	listing = listing2.keys()
	total.extend(listing)
	total_cids = [x.id for x in total]
	total_cids.extend(cids)
	total_cids = set(total_cids)
	# called for its side effects: attaches v-specific properties to the fetched comments
	output = get_comments_v_properties(v, None, Comment.id.in_(total_cids))[1]
	g.db.commit()

	if v.client: return {"data":[x.json(g.db) for x in listing]}
	return render_template("notifications.html",
		v=v,
		notifications=listing,
		next_exists=next_exists,
		page=page,
		standalone=True,
		render_replies=True,
	)